Swetha Sadashivappa
Summary
Over **+ years of experience in the IT industry specializing in Data Warehousing and QA engineering for large-scale data warehouse applications, including Data Extraction/Conversion, Data Transformation (DT), User Interface, Integration Testing, Optical Networking, and Software Testing.
Experienced in structuring and leading Quality Assurance teams to build quality software products in complex, fast-paced, multi-project environments.
Well versed in B2B technology for implementing Data Transformation (DT) and Data Exchange (DX).
Experience with the full Software Development Life Cycle (SDLC), which includes Requirements, Planning, Analysis, Design, Agile Development, Testing, Implementation and Support.
Experience in testing UI applications.
Expertise in mapping requirements in a Requirement Traceability Matrix (RTM).
Experience in developing XSDs and in testing involving XML and HTML.
Experience working as a Senior QA, implementing test automation and mentoring the testing team.
Expertise in Functional, Integration, Regression, Performance, and Compatibility Testing.
Experienced in performing System Testing, User Acceptance Testing, Web Services testing and Database Testing.
Good exposure to e-business, order management, and stock maintenance applications.
Extensive experience in developing Test Plans/Strategies, Test Cases, Test Scenarios and Test Scripts (Manual and Automated) for various applications to ensure proper business compliance.
Extensive experience in performing back-end database tests on Oracle and SQL Server by developing and executing SQL queries to verify the data.
Expert in manual and automated testing using Mercury test tools: Test Director, HP Quality Center, and WinRunner.
Experience in understanding Business Process from the requirements and converting them to test scenarios.
Excellent knowledge and working experience in Test Execution and Test Result Analysis, Test Reviews and Defect Management (Quality Center)
Knowledge of Visual Basic
Expertise in Data Warehouse, ETL Verification.
Experienced in web and client/server applications QA in manual and automated testing.
Experience with quality improvement processes; implemented Client Quality Sigma for a client at Alcatel-Lucent Technologies.
Strong believer in the documentation process, with hands-on experience producing professional documentation during all phases of the development life cycle.
Good domain knowledge of US Banking and Telecommunications.
Good domain knowledge of US Healthcare, Welfare Benefits, COBRA changes and specifications, HIPAA 4010/5010, Medicare, and EDI X12 transactions (837/835/834/820, 270/271, 276/277, 278).
Experience in scripting with TCL/Tk and in maintaining existing scripts for automated regression testing.
Ensured coverage of TL1 commands using a client proprietary tool.
A self-starter and team player, working in fast-paced development environments and committed to the deliverables.
Flexible and versatile to adapt to any new environment and work on any project.
Technical Skills
Testing Tools: Test Director 8.0, Quality Center 8.2/9.1/10.0, Bugzilla, LoadRunner, WinRunner, QTP
GUI Tools: Visual Basic 6.0, Developer 2000
Databases & RDBMS: Oracle 7.x/8/9i/10g, MySQL, MS Access, SQL Server 2005, SQL Server 2008
Operating Systems: MS-DOS 6.22, Windows 95/98/NT/2000/XP, Unix
Languages: C, C++, SQL, PL/SQL, VB Script, Unix shell script, Perl
Networking: TCP/IP, FTP, UDP, ATM, Ethernet, LAN & WAN
Web Browsers: Internet Explorer 7.0/6.0/5.0, Netscape 7.0/6.0, Mozilla Firefox 1.5
Mobile: iPhone and Android
Web Technologies: XML, HTML, XSD
Data Warehousing Tools: Conduit, Informatica (PowerCenter 8.x/7.x), ContentMaster
Tools: TOAD, SQL Developer, MS Outlook, Lotus Notes, UltraEdit
Domain Knowledge: US Banking, Telecommunication, Healthcare & Welfare Benefits, Insurance, Mortgage-Accounting, COBRA, HIPAA (EDI X12 transactions), Medicare, FACETS, PORTICO, NASCO
Professional Experience
Bright MLS 03/18 – Present
Senior QA Analyst/Test Lead
SALESFORCE
Synopsis:
Bright MLS was created from a desire to be the change that real estate professionals demand, with a focus on data migration/conversion.
Bright provides subscribers with access to listing information for a larger geographic area; modern, easy-to-use systems and apps to serve their clients and run their business; and a comprehensive ecosystem of robust property information and analytics.
Responsibilities:
Created the Test scenarios, Test procedures as well as documentation
Responsible for test environment setup for the complete end-to-end testing process
Used VersionOne/JIRA to execute test cases and track execution against the plan during testing
Managed defects from inception to resolution using JIRA
Created Test Data using Conduit tool, Salesforce and Bright frontend application
Tested UI applications such as BRIGHT, TREND, and MRIS
Used salesforce to create the data and test the Agent and Broker products/Billing information
Performed Mobile testing
Performed performance testing with the maximum volume of data injected into the DB server
Practiced the Scaled Agile process
Documented and communicated test results on a daily standup call
Participated in project meetings, specification reviews, and deployment calls, giving recommendations and suggestions.
Mentored team members
Conducted defect review meetings on a weekly basis
Coordinated the end-to-end testing process
UPS 06/13 – 02/18
Senior QA Analyst
ODS and RM
Synopsis:
Upgrade of Route Manager (RM), which delivers the driver's manifest when a driver performs a GetEDD on the DIAD and also feeds the ODSe system through the MORE server.
Responsibilities:
Created the Test scenarios, Test procedures as well as documentation
Responsible for test environment setup (SIT/Performance/UAT).
Used HP-ALM (previously known as HP Quality Center) to execute the test cases, track execution against the plan during testing
Managed defects from inception to resolution using Serena Team Track
Created Test Data to inject in Database
Created SQL queries to test the data against the SQL Server 2008 database
Tested UI application such as ODS Legacy and SOSS
Performed performance testing with the maximum volume of data injected into the DB server
Installed Microsoft patches on the test environments
Documented and communicated test results on a daily/weekly basis.
Participated in project meetings, review specifications, and give recommendations and suggestions.
Mentored team members
Conducted defect review meetings on a weekly basis
Coordinated the end-to-end testing process
Coordinated with the business for UAT, held status meetings, and conducted walkthroughs.
Environment: Windows 2007, Quality Center 10.0, SQL Server 2008, UNIX, QTP 11.0, LoadRunner, Java, C++
CareFirst 02/12 – 05/13
Data Analyst
802 EDM Optimization (Enterprise Dimensional Model)
Synopsis:
CareFirst BlueCross BlueShield has embarked on a major initiative to upgrade systems, increase administrative efficiencies, and improve performance metrics through consolidation of data and analytics to standardized platforms. A major piece of this work had been the creation of a new Enterprise Data Warehouse, referred to as the Enterprise Dimensional Model (EDM).
Responsibilities:
Created the Test Strategy/Plan, test scenarios, test procedures as well as documentation.
Responsible for test environment setup (SIT/Performance/UAT).
Used HP-ALM (previously known as HP Quality Center) to execute the test cases, track execution against the plan during testing and manage defects from inception to resolution.
Created extensive SQL queries to test the data against the Oracle databases for the counts and data validation/reconciliation.
Created EDI X12 837I/P files for testing as part of the end-to-end verification.
Tested the UI application
Verified the performance of the upgraded ETL workflows, compared the old and new models, and generated a report.
Verified the ETL transformation logic by creating test SQL’s and running them against the source and target entities.
Maintained traceability of test cases with the requirements on HP-ALM
Documented and communicated test results on a daily/weekly basis.
Conducted walkthroughs across the testing deliverables.
Created test execution metrics and defect metrics for the daily status meeting.
Participated in project meetings, reviewed specifications, and gave recommendations and suggestions.
Coordinated with the business for UAT, held status meetings, and conducted walkthroughs.
Environment: Informatica PowerCenter 8.6, Windows 2007, Quality Center 10.0, SQL Server 2008, Oracle 11, SQL, TOAD, UNIX
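The source-to-target count reconciliation used in the ETL verification above can be sketched roughly as follows. This is an illustrative sketch only: the table names are hypothetical, and sqlite3 stands in for the actual Oracle source and target connections.

```python
# Hypothetical sketch of source-to-target row-count reconciliation for ETL
# verification. Table names are made up; sqlite3 replaces the real databases.
import sqlite3

def row_count(conn, table):
    """Return the number of rows in a table."""
    (count,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    return count

def reconcile(src_conn, tgt_conn, src_table, tgt_table):
    """Compare source and target row counts and report whether they match."""
    src = row_count(src_conn, src_table)
    tgt = row_count(tgt_conn, tgt_table)
    status = "MATCH" if src == tgt else "MISMATCH"
    return {"source": src, "target": tgt, "status": status}

if __name__ == "__main__":
    # Stand-in databases: a source staging table and its target dimension.
    src_conn = sqlite3.connect(":memory:")
    tgt_conn = sqlite3.connect(":memory:")
    src_conn.execute("CREATE TABLE stg_claims (id INTEGER)")
    tgt_conn.execute("CREATE TABLE dim_claims (id INTEGER)")
    src_conn.executemany("INSERT INTO stg_claims VALUES (?)", [(i,) for i in range(100)])
    tgt_conn.executemany("INSERT INTO dim_claims VALUES (?)", [(i,) for i in range(100)])
    print(reconcile(src_conn, tgt_conn, "stg_claims", "dim_claims"))
```

In practice such count checks were one part of the validation; data-level reconciliation compared actual column values between source and target.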
Verizon Communication Inc. 05/11 – 12/11
Senior QA
BRP (Billing Replacement Program)
Synopsis:
Verizon Communications Inc. is one of the leading telecom companies in the US. BRP is a corporate initiative for customer billing, covering real-time provisioning, account migration, and instant bill generation on request via the Internet or mobile.
Responsibilities
Involved in the design, analysis and testing of the Migration solution/system/application.
Created the Test Strategy/Plan, test scenarios, test procedures as well as documentation.
Responsible for test environment setup (SIT/Performance/UAT).
Used Quality Center to execute the test cases, track execution against the plan during testing and manage defects from inception to resolution.
Created extensive SQL queries to test the data against the SQL Server, Oracle databases.
Created test case scripts from the functional specifications and maintained traceability to the requirements.
Tested the user interface application.
Performed the role of module lead for a team of 2 members.
Created test data for the customer billing and accounting.
Created UNIX shell scripts for comparison of the output files.
Performed verification and validation of migrated data in different targets: SQL Server, Oracle, DB2, LDAP, and UNIX
Documented and communicated test results.
Conducted walkthroughs across the testing deliverables.
Created test execution metrics and defect metrics for the daily status meeting.
Participated in project meetings, reviewed specifications, and gave recommendations and suggestions.
Coordinated with the business for UAT, held status meetings, and conducted walkthroughs.
Environment: Informatica PowerCenter 8.6, Windows 2007, Quality Center 10.0, SQL Server 2008, Oracle 11, SQL, TOAD, UNIX
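The UNIX shell scripts used above to compare output files can be sketched along these lines. The file names and record layout are hypothetical stand-ins; the real scripts compared extracts from the legacy and migrated billing systems.

```shell
# Hypothetical sketch of an output-file comparison script. File names and
# record contents are made up for illustration.
set -eu

old_file=legacy_bill_extract.txt
new_file=migrated_bill_extract.txt

# Create stand-in extract files for the demo (normally produced by each system).
printf 'acct1|100.00\nacct2|250.50\n' > "$old_file"
printf 'acct2|250.50\nacct1|100.00\n' > "$new_file"

# Sort both extracts so record order alone cannot cause false mismatches,
# then diff them; an empty diff means the migration preserved the data.
sort "$old_file" > old.sorted
sort "$new_file" > new.sorted
if diff old.sorted new.sorted > bill_extract.diff; then
    echo "PASS: extracts match"
else
    echo "FAIL: differences written to bill_extract.diff"
fi
```

Sorting before diffing is the key design choice here: migration jobs often reorder records, so a raw diff would flag spurious differences.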
Care Source 10/10 – 04/11
Senior QA
Care Source Broker Express
Synopsis:
CareSource Broker Express is an E-commerce application that will allow the Sales Representative, Broker, General Producer or Full Service Producer to perform a number of tasks, including obtaining quotes and preparing proposals. CareSource Broker Express will focus on new groups that have between 1 and 199 subscribers. Care Source Broker Express integrates Prospect Creation, Quote/Proposal Creation, Sold Quote Processing, Group Setup, Renewals and Facets Installation into the work streams in the Small-Mid Group channel. It also provides CareSource a platform to electronically collect underwriting and group packet paperwork to support all group sales.
Responsibilities
Analyzed the business requirements and was involved in the review discussions.
Coordinated with the development team to identify functional areas and test scenarios that needed to be tested.
Maintained the Test Plan, including a description of the item being tested and its scope, user requirements, environmental needs, risk assessment, approach, and methodology
Participated in Test Case walkthroughs, inspection meetings
Wrote and executed detailed test cases including prerequisites, detailed instructions and anticipated results.
Created multiple test scenarios based on business rules to create data driven testing.
Prepared Requirement Traceability Matrix (RTM) to establish traceability between requirements and test cases.
Performed various types of testing, such as unit, black box, functional, performance, negative, system integration, regression, and user acceptance.
Modified and maintained test cases due to changes in the requirements.
Debugged the test cases, verified the test results and reported the defects using Quality center.
Investigated software defects and interacted with developers to resolve technical issues.
Performed Regression Testing to ensure that existing functionality is not impacted by bug fixes.
Tested deployment of applications in various environments during all phases of product lifecycle
Ran SQL queries to fetch test data and performed back-end (database) testing to validate the database.
Participated in daily status meetings to report any bugs, issues and risks.
Environment: Oracle 9i, Windows XP/NT
Wells Fargo 08/09 – 09/10
QA Analyst
Wholesale DMD
Synopsis:
Wells Fargo & Co. (WFC) has recently announced that it will be converting 65 Wachovia branches in California to display its own name and stagecoach logo. San Francisco-based Wells Fargo has been moving slowly to consolidate Wachovia Bank into its system. During December of last year, all 4,800 branches of the residential lender Wachovia Mortgage had been folded into Wells Fargo Home Mortgage.
The integration began in Colorado last month, where both the institutions had overlapping branches. After California, the conversion process will move to other states such as Arizona, Illinois, Nevada and Texas, where Wells Fargo and Wachovia offices overlap. Conversion in these states is expected to begin in 2010, though the conversion dates haven’t been announced for the Wachovia-only markets in the Carolinas and along the East Coast. Wachovia Corp. was purchased by Wells Fargo on Dec. 31, 2008, when it ceased to be an independent corporation. The company had been battered by heavy losses, especially in its portfolio of flexible, interest-only home loans, and was suffering a run on its deposits.
Responsibilities:
Prepared Test plans, Test Strategies and Test cases to meet business requirements.
Responsible for test environment setup (SIT/Performance/UAT).
Maintained the Test Plan, including a description of the item being tested and its scope, user requirements, environmental needs, risk assessment, approach, and methodology.
Designed the Test Schedules.
Used Quality Center to execute the test cases, track execution against the plan during testing and manage defects from inception to resolution.
Created extensive SQL queries to test the data against the SQL Server, Oracle databases.
Created basic PL/SQL stored procedure for execution of few SQL transactions.
Created Test cases scripts from the functional specifications and maintained traceability with the requirements.
Created test data in the Source Wachovia system.
Created the test batch data file via FTP through Unix.
Validated the tilde-delimited file converted from the VSAM file.
Created Unix shell scripts and Perl scripts.
Performed verification and validation of migrated data in different targets: SQL Server, Oracle, DB2, LDAP, and UNIX
Documented and communicated test results.
Conducted walkthroughs across the testing deliverables.
Created Test execution metrics, Defect Metrics for daily status meeting.
Coordinated with the business for UAT, held status meetings, and conducted walkthroughs.
Trained and mentored newcomers as required.
Environment: Windows 2000/XP, Quality Center 9.0, SQL Server 2008
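The validation of the tilde-delimited file converted from VSAM, mentioned above, can be sketched as a simple structural check. The record layout here is hypothetical (the real layout came from the conversion specification); this only illustrates the field-count and empty-field checks.

```python
# Hypothetical sketch of validating a tilde-delimited file converted from a
# VSAM source. EXPECTED_FIELDS is an assumed record layout, not the real one.
EXPECTED_FIELDS = 5

def validate_tilde_file(lines, expected_fields=EXPECTED_FIELDS):
    """Return a list of (line_number, reason) for records that fail validation."""
    errors = []
    for lineno, line in enumerate(lines, start=1):
        fields = line.rstrip("\n").split("~")
        if len(fields) != expected_fields:
            errors.append((lineno, f"expected {expected_fields} fields, got {len(fields)}"))
        elif any(f == "" for f in fields):
            errors.append((lineno, "empty field"))
    return errors

if __name__ == "__main__":
    sample = [
        "0001~SMITH~JOHN~19800101~A\n",
        "0002~DOE~JANE~19751231\n",   # missing a field -> flagged
    ]
    print(validate_tilde_file(sample))
```

Real validation would also cover data-type and value checks against the source VSAM records; this sketch shows only the structural pass.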
Alcatel-Lucent 01/08 – 07/09
Project Engineer
Credit Score Analysis (Data & Informatics – ODS)
Synopsis:
Alcatel-Lucent is the trusted transformation partner of service providers, enterprises, strategic industries such as defense, energy, healthcare, transportation and governments worldwide, providing solutions to deliver voice, data and video communication services to end-users.
The data within the Credit Scoring Analysis (CSA) database is used primarily for the analysis of current risk strategies. This is done by ongoing monitoring of application characteristics and customer behavior with a view to finding likely indicators of increased risk and cost to Alcatel-Lucent. The majority of CSA data is sourced from CSS. The ETL application is divided into two processes: a publisher, which publishes data from the CRED database (Oracle) to the Public Interface Server (SQL Server), and a subscriber, which subscribes to data from the Public Interface Server (SQL Server) and populates the CSA database (Oracle), applying business rules.
Responsibilities:
Involved in the design, analysis and testing of the application.
Created the Test Strategy, test scenarios, test procedures as well as documentation.
Responsible for test environment setup (Systest/integration/performance/UAT).
Used Quality Center to execute the test cases, track execution against the plan during testing and manage defects from inception to resolution.
Created extensive SQL queries to test the data against the database.
Created PL/SQL stored procedures for the execution of a few SQL transactions.
Created test case scripts from the functional specifications and maintained traceability to the requirements
Monitored metrics like CPU utilization, response time, memory activities and network delay.
Documented and communicated test results.
Conducted walkthroughs across the testing deliverables.
Reviewed the testing deliverables of other projects.
Trained and mentored newcomers as required.
Environment: Informatica PowerCenter 8.1.1, Windows 2000/XP, Quality Center
Nortel Technologies 06/07 – 12/07
Project Engineer QA
Metro Ethernet Management (MEM)
Synopsis:
The project “Metro Ethernet Manager” provides a network element management system for the metro Ethernet network. MEM allows us to access the fault management and Service Configuration from one user interface. The MEM integrates several sub-systems so that we can configure and fault manage the ME network. The main sub-systems of the MEM solution are as follows:
MEM security which helps in centralizing authentication and security mechanisms to control access to the MEM and the ME network elements. MEM fault management provides the Common Web Desktop and fault collection infrastructure. MEM configuration uses the Product and Service Provisioning system to provide network-wide service configuration management, inventory management, and administration.
The overall MEM data was proposed to be acquired into a Data acquisition layer in ODS using the ETL process.
Responsibilities:
Involved in Client meetings to gather Requirements to be mapped to the system.
Created Requirements specifications to be presented and signed off by the client.
Created and tested test scenarios to validate the integration, routing, transformation, and delivery of data such as Ethernet element data, security data, and fault management data.
Led efforts to improve QA practices by identifying, proposing, and implementing the move of test case development from MS Excel and Word into the Quality Center tool.
Coordinated with the business analysts and developers and discussed issues in interpreting the requirements and effectively managed the finalized requirements using Quality Center
Maintained Test Plan which includes description of the item being tested and its scope, user requirements, environmental needs, risk assessment, approach and methodology.
Performed Web testing.
Modified and used automated test scripts which are called from Quality Center during execution.
Created multiple test scenarios based on business rules to create data driven testing.
Participated in Test Case walkthroughs, inspection meetings.
Performed various types of testing, such as unit, black box, functional, performance, negative, system integration, regression, and user acceptance.
Participated in project meetings, reviewed specifications and release notes, and gave recommendations and suggestions.
Tested deployment of applications in various environments during all phases of product lifecycle.
Managed defect tracking, defect reporting for assigned projects using Quality Center/Test Director.
Maintained Daily Project Status Report.
Worked with development teams to investigate, prioritize and resolve software bugs / defects based on the testing result.
Participated in review meetings and took the initiative to meet the QA testing targets
Ran SQL queries to fetch test data and performed back-end (database) testing to validate the database.
Environment: Informatica PowerCenter 7.x, Manual Testing, Quality Center 9.2
Ordyn Technologies 01/05 – 12/06
Production Engineer
MADM (Multiple Add and Drop Multiplexer)
Synopsis:
MADM is a multiple add and drop multiplexer; the MADM/ADM consists of a Main Chassis (MC) and an optional Expansion Chassis (EC). The expansion chassis is used when a large number of low-speed tributaries are to be added and dropped. Only E1, E3/DS3, STM-1, and STM-4 cards can be supported on the expansion chassis.
The main chassis contains 14 slots, of which two are reserved for redundant switch cards and 4 are reserved for STM-16 line (OHT, overhead termination) cards. The remaining 8 slots are available for tributary cards.
Responsibilities:
Tested MADM system parameters, including jitter (both input and output), order-wire, bit error, AIS, clock frequency, alarm injection, optical power, receiver sensitivity, automatic laser shutoff, pulse mask, and return loss
Tested the parameters using analyzers (ANT-20, EDT 135).
Tested the system using LCT and EMS.
Connected the system in a ring topology for stability testing and monitoring in the NMS.
Environment: Windows NT, Unix/Linux, Manual Testing, Test Director
Education:
Bachelor of Engineering in Electronics and Communication, VTU University, Bangalore, India
Certification: Salesforce Certified Administrator, Scaled Agile certified