
QA Lead / Business Analyst

Location:
Charlotte, NC
Posted:
June 25, 2023


Professional Summary

Extensive IT experience in requirements gathering, analysis, design, development, testing, deployment, maintenance, support and management.

12+ years of experience as a QA Lead in data warehouse ETL/BI/MDM testing, and 19 years overall in the IT industry.

Around 1.5 years of experience as a QA Lead in IVR testing using the Cyara automation tool, deploying code with Jenkins/Portainer, and testing web services APIs with Postman (REST), SoapUI and JSON payloads.

Experience in leading/managing large testing teams, preparing the Master Test Plan and Test Strategy, providing testing effort estimates, and driving test design and test execution activities in an onsite/offshore model.

Experience in test automation using a Python data testing framework, Jupiter and Platinum tool setup, data comparisons between source and target layers, and building regression test suites.

Good experience across the entire life cycle of data warehouse/data mart development using Informatica PowerCenter (Designer, Repository Manager, Workflow Manager and Workflow Monitor).

Experience working with heterogeneous databases such as Oracle, DB2, SQL Server, Teradata and Snowflake, as well as flat files.

Experience with Agile Requirements Designer (ARD) for ARD flow creation and preparation of agile test reports.

Experience working in Agile Scrum and Waterfall methodologies, including daily scrum stand-up meetings and sprint planning, grooming, review and retrospective meetings.

Very good understanding of change/incident management processes and ticketing tools across infrastructures.

Experience in the Utilities, Retail, Insurance, Healthcare, Banking & Financial, Life Sciences, Telecom, and Media & Comm domains.

Experience designing test scenarios and test cases for System Testing (ST), Integration/Functional Testing and User Acceptance Testing (UAT); executing tests; facilitating defect triage and status meetings; and using test management tools such as HP ALM Quality Center for defect tracking, test metrics and status reporting.

Experience testing applications using tools such as Jira, Test Director, HP Quality Center/ALM/Octane, QuickTest Professional and WinRunner.

Technical Skills

Database

IBM DB2, Oracle 11, SQL Server, Teradata, S4/HANA, Big SQL, Snowflake

MDM/MDG Tools

SAP MDG 9.0, Informatica MDM 10.1 and IBM WCC 9.0

Data Warehouse Tools

Business Objects XI R2/6.5/5.1 and Web Intelligence, MicroStrategy 10.4, Informatica 10.1, Tableau 10.x/2018.x

Functional/Automation Test Tools

Quality Center 11.5/Octane, Team Foundation Server (TFS), Platinum Tool, Jupiter Tool, Autosys, Control-M, Python data testing framework, JSON payloads, Cyara Automation Tool, web services testing using SoapUI, Postman (REST) and Swagger, IVR, PBX, Avaya Experience Portal, DRM, Appbar, ETS Studio, QMS, etc.

Other Tools/ Platforms/ Cloud Computing

Test Director 8, Quality Center 9/10/11, QuickTest Professional, SQL, TSL, SQA Basic, VBScript, JavaScript, Agile Testing, Clarity, JIRA, Confluence, TFS, SharePoint/Box, MS Project, AWS Cloud, DevOps, Pega, etc.

Languages /Scripting Tools

SQL, C, HTML, JavaScript, CSS, Java, Unix

Database Utilities/Modeling

TOAD, Advanced Query Technique (AQT), Erwin v8.0.1, IBM Data Studio, ARD

Operating Systems

Windows 95/98/NT/XP, UNIX

Domains

Retail, BFS, Utilities, Insurance, Telecom, Media & Comm, Healthcare and Life Sciences

Professional Experience:

Charter Communications, NC Nov 2021 – Present

QA Lead

Roles and Responsibilities:

Working as QA Lead, interfacing with the IVR/Web Services development teams, UAT and Business for all test activities, and resolving issues in an onsite/offshore model.

Involved in QA estimation of IVR and Web Services testing effort and validation of IVR call flows and Web Services requests and responses for residential/commercial customers; ran each Cyara script successfully for multiple iterations.

Running health-check campaigns in Cyara to validate that call flows complete successfully within the threshold limit.

Running regression automation scripts in Cyara for SIT and PVT, identifying failures and reporting results to management.

Validating backend responses and data flowing from Pega to the web services and IVR.
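
The backend response validation described above can be illustrated by the following minimal Python sketch of a generic REST/JSON check; it is not the actual Charter service, and the base URL, account number and field names are hypothetical placeholders. The real checks were performed with Postman and SoapUI.

    # Illustrative sketch: generic REST/JSON response validation similar to the
    # Postman/SoapUI checks described above. URL, account number and expected
    # fields are hypothetical placeholders.
    import requests

    def validate_account_response(base_url: str, account_number: str) -> None:
        resp = requests.get(f"{base_url}/accounts/{account_number}", timeout=30)
        resp.raise_for_status()  # fail fast on HTTP errors
        payload = resp.json()

        # Assert the fields the IVR call flow depends on are present and populated.
        for field in ("accountStatus", "serviceType", "outageFlag"):
            assert field in payload, f"missing field: {field}"
            assert payload[field] not in (None, ""), f"empty field: {field}"

    if __name__ == "__main__":
        validate_account_response("https://example.internal/api", "123456789")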

Involved in sprint schedule/planning meetings, grooming sessions, daily stand-up calls, technical and retrospective meetings, etc.

Running SQL queries in the database to validate that data is correctly logged based on the front-end IVR call flows.

Identified potential quality issues per the defined process and immediately escalated them to management.

Involved in BRD and UI design walkthroughs and test scenario reviews, providing review comments.

Involved in outage support, validation of TFN repointing to new DNIS in the PVT environment, code migrations to the QA environment, updates to config parameter files, etc.

Responsible for delivery of SIT summary report and Weekly Status report to all stakeholders.

Communicate and work directly with the scrum team on testing issues, and ensure documentation and testing deliverables are met before software releases are implemented in the production environment.

Ensure proper hand-off to UAT and support UAT users for any issues related to IVR calls, DNIS, account numbers, TFNs to use, etc.

Environment: Windows, UNIX, Oracle, Postman (REST), SoapUI, SQL, JIRA, ALM Quality Center/Octane, SharePoint, Pega, Mobax, Cyara, IVR Technology, Avaya Experience Portal, DRM, Chalk Page, etc.

Morgan Stanley, Alpharetta, GA March 2021 – Oct 2021

QA Lead

Roles and Responsibilities:

Worked as QA Lead and as the interface between the development team, UAT and Business for all test coordination activities in an onsite/offshore model across different Agile squads.

Designed ARD process flows using Agile Requirements Designer (ARD) and obtained approval from the Product Owner.

Validated initial load files and daily delta files for data coming from Etrade to Morgan Stanley, from files to staging and from staging to target tables, using a Python data framework for automation.
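
As an illustration of the file-to-staging validations above, the sketch below shows a generic reconciliation in Python/pandas; it is not the internal data framework, and the file names and key column are hypothetical placeholders.

    # Illustrative sketch: generic file-to-staging reconciliation in the spirit
    # of the Python data framework validations above. File names and key columns
    # are hypothetical placeholders.
    import pandas as pd

    def reconcile(delta_file: str, staging_extract: str, keys: list) -> pd.DataFrame:
        source = pd.read_csv(delta_file, dtype=str)
        target = pd.read_csv(staging_extract, dtype=str)

        # Row-count check between the delta file and the staging extract.
        assert len(source) == len(target), f"row count mismatch: {len(source)} vs {len(target)}"

        # Full outer join on the business keys; '_merge' flags rows present on one side only.
        diff = source.merge(target, on=keys, how="outer", indicator=True,
                            suffixes=("_src", "_tgt"))
        return diff[diff["_merge"] != "both"]

    if __name__ == "__main__":
        mismatches = reconcile("daily_delta.csv", "staging_extract.csv", keys=["account_id"])
        print(f"{len(mismatches)} key mismatches found")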

Performed near-real-time data validations between Etrade and Morgan Stanley for new account creation, using Party and Account payloads in Postman (REST).

Involved in daily scrum stand-ups, grooming of backlog/epic/sprint user stories, review of the run book for production movement, and coordination with the Etrade team to create the test data accounts required for Morgan Stanley testing.

Prepared test cases and scripts adhering to the requirements and performed functional testing, integration testing and post-production validations.

Involved in test planning for each sprint, covering user stories, test cases, test scenarios and test data requirements.

Communicated and worked directly with the scrum team on testing issues, and ensured documentation and testing deliverables were met before software releases were implemented in the production environment.

Involved in test execution of near-real-time data and batch files in different QA environments, logging defects as sprint bugs, uploading test execution results to Jira and obtaining Product Owner approval after review in Jira.

Interacted with the Etrade QA team on integration testing regarding test data requirements, initial load and daily batch files, test account creation, highlighting any issues, etc.

Environment: Windows, Unix, IBM DB2, Informatica 10.2, Postman (REST), Python data framework, Data Lake, SQL, Box, JIRA, ALM Quality Center, SharePoint, Agile Requirements Designer (ARD), etc.

MGM Studios, Los Angeles Jan 2020 – June 2020

ETL/BI Test Lead

Roles and Responsibilities:

Worked as ETL/BI Test Lead and as the interface between the development team, UAT and Business for all test coordination activities in an onsite/offshore model.

Designed the project Test Strategy and approach, and provided effort estimation and test planning.

Validated tables/views with migration data and ETL delta data (Informatica) by comparing the Teradata and Snowflake QA databases using automation tools such as Jupiter and Platinum during system integration testing.
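
The Teradata-to-Snowflake comparisons above were automated with the Jupiter and Platinum tools; the Python sketch below only illustrates the general idea of running the same query against both databases and diffing the result sets. The connections, query and table name are hypothetical placeholders, and any DB-API drivers are assumed.

    # Illustrative sketch: generic source-vs-target result-set comparison of the
    # kind automated by Jupiter/Platinum. 'src_conn'/'tgt_conn' are assumed open
    # DB-API connections; the query below is a hypothetical placeholder.
    def compare_result_sets(src_conn, tgt_conn, query: str):
        """Run the same query on both databases and return rows that differ."""
        def fetch(conn):
            cur = conn.cursor()
            cur.execute(query)
            rows = cur.fetchall()
            cur.close()
            return set(map(tuple, rows))

        src_rows, tgt_rows = fetch(src_conn), fetch(tgt_conn)
        return sorted(src_rows.symmetric_difference(tgt_rows))

    # Example usage (connections would come from the Teradata/Snowflake drivers):
    # diffs = compare_result_sets(td_conn, sf_conn,
    #     "SELECT order_id, order_amt FROM sales.orders ORDER BY order_id")
    # assert not diffs, f"{len(diffs)} rows differ between Teradata and Snowflake"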

Validated data in MicroStrategy BI reports using Integrity Manager, comparing Teradata and Snowflake web reports.

Completed performance testing of MicroStrategy web reports against Teradata and Snowflake to benchmark 50 critical reports.

Involved in requirements gathering: attended turnover meetings with BAs/Business and provided QA input on the overall test approach.

Provided testing templates, including test case templates, test SQL drafting plans, defect reports and test summary reports.

Responsible for data analysis, resolving missing/incomplete information and inconsistencies/anomalies in complex data using SQL queries.

Worked on test automation of source-to-target data comparisons using the Jupiter and Platinum tools.

Prepared test cases and scripts adhering to the requirements and performed functional, non-functional, system integration, user acceptance and post-production testing.

Developed test plans for each phase, QA test cases, test scenarios and test summary reports.

Communicated and worked directly with the Data Architect, Developers, Business Analysts and Project Manager on testing issues, and ensured documentation and testing deliverables were met before software releases were implemented in the production environment.

Responsible for verifying data migrations/code upgrades across environments in all phases of the testing SDLC.

Responsible for reporting daily test metrics and analysis to the Project Manager and all stakeholders.

Validated various Microstrategy reports and dashboards.

Prepared and ran production validation queries on a daily basis, providing production count/data mismatch issues to the customer for analysis and resolution.

Coordinated with DBA/server admin teams to resolve disk space issues, back up test logs, etc.

Reviewed offshore test deliverables daily before delivery to the customer and provided support during UAT.

Environment: Windows, Teradata 16.10, Informatica 10.2, SQL, Snowflake, Microstrategy, Migro, Box, Unix, ALM Quality Center, MSTR Integrity Manager, Jupiter, Platinum, SharePoint, AWS Cloud, etc.

Cigna Healthspring, Nashville Oct 2018 – Dec 2019

QA Lead

Roles and Responsibilities:

Responsible for overall delivery of quality engineering testing artifacts, leading a team of nine in an Agile testing process and methodology.

Prepared and delivered the Test Strategy covering the test approach, deliverables, process flow, roles and responsibilities, etc.

Involved in Agile user story grooming sessions, sprint planning and retrospective sessions, and daily stand-up calls to discuss sprint deliverables, risks, issues, etc.

Reviewed user stories, acceptance criteria, test scenarios and scripts, and escalated any issues to the team up front.

Provided quality metrics and status reporting to the customer at the end of each sprint/release.

Worked on focus areas such as process standardization, test automation, the regression test suite and domain data QA knowledge.

Collaborated with system analysts, the development team and UAT business teams at various stages of the Agile methodology to deliver quality testing artifacts on time for each sprint.

Experienced with high-volume datasets from sources such as Teradata, Oracle, text files and mainframe, and with XML targets.

Prepared documentation for all testing methodologies, including the test strategy, test plan, QA release notes, estimations and the QA project charter used to validate the data warehouse.

Executed tests in accordance with the test strategy and test schedule, and performed reviews, issue identification and data mining exercises in line with project plan timelines.

Provided technical leadership and expertise in reorganizing a complex end-to-end testing effort, and performed system analysis and organizational review of the existing QA methodology with special focus on custom test tools.

Compared, verified and validated data between various sources and targets for the business functionality.

Created QA templates for source-to-target validations, covering both ETL and reporting validations.

Wrote SQL queries on EDW tables and data mart staging tables to validate the data results.

Responsible for testing ETL loads to verify data completeness, data transformation and data quality, as well as for integration, UAT and regression testing.

Validated reports and frameworks against back-end databases as per the BI rules.

Responsible for periodic updating of Test Plans and Test Cases as per Requirement Specifications and Business rules.

Facilitated daily defect/bug review meetings with the business, environment and development teams to understand problems, set severity levels and obtain resolutions for defects logged by the team.

Involved in the FTP process to transfer files between servers, performed quality control analysis and validated results of conversion programs.

Environment: Teradata, Oracle SQL, Oracle 11, SQL Server 2008, UNIX, Mainframe, Windows, SSIS, Quality Center 11.0, Toad, Teradata SQL assistant, DB2

KeyBank, Cleveland Jan 2017 – Sept 2018

Test Lead

Roles and Responsibilities:

As onsite ETL Test Lead, responsible for quality delivery of all testing artifacts related to data testing during integration and end-to-end testing, from source to consumption and downstream layers.

Responsible for updates to the end-to-end test strategy, running the sourcing and consumption mainframe jobs, and validating the output files against requirements and mapping documents.

Coordinated with downstream systems such as GL, NXG, MDM, EDW, CIX, etc. during testing.

Coordinated with the functional team, offshore data testing team, development team and project teams in day-to-day testing activities, including data preparation.

Involved in ETL test execution, test automation using the Platinum tool, defect management using ALM Quality Center, defect triage calls and status reporting.

Performed various testing phases, including functional testing, non-functional testing (implementation, back-out, performance, purge, backup & recovery), system integration testing, validation testing, static testing, user acceptance testing and regression testing.

Developed various SQL queries to validate the test case results for back-end test cases.

Worked on daily transition templates and meetings for onshore-offshore coordination.

Well exposed to Software Development Life Cycle and Test methodologies.

Tested the data and data integrity among various sources and targets including reports.

Involved in testing to verify that all data was synchronized after troubleshooting, and used SQL to verify/validate test cases.

Created test data and coordinated with developers/leads to resolve bugs.

Supported and worked as single point of contact for any QA issues.

Prepared QA test results after test case execution. Worked on data validation, constraints, record counts, source-to-target row counts, random sampling, boundary values and error processing.
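
A minimal Python sketch of the row-count and random-sampling checks mentioned above follows; the table names, key column and qmark parameter style are assumptions for illustration, and 'conn' stands for any open DB-API connection to the warehouse.

    # Illustrative sketch: row-count and random-sample source-to-target checks.
    # Table/column names are hypothetical; 'conn' is an open DB-API connection
    # and the '?' (qmark) parameter style is assumed.
    import random

    def count_check(conn, src_table: str, tgt_table: str) -> None:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {src_table}")
        src_count = cur.fetchone()[0]
        cur.execute(f"SELECT COUNT(*) FROM {tgt_table}")
        tgt_count = cur.fetchone()[0]
        assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"

    def sample_check(conn, src_table: str, tgt_table: str, key: str, sample_size: int = 25) -> None:
        cur = conn.cursor()
        cur.execute(f"SELECT {key} FROM {src_table}")
        keys = [row[0] for row in cur.fetchall()]
        for k in random.sample(keys, k=min(sample_size, len(keys))):
            cur.execute(f"SELECT * FROM {src_table} WHERE {key} = ?", (k,))
            src_row = cur.fetchone()
            cur.execute(f"SELECT * FROM {tgt_table} WHERE {key} = ?", (k,))
            tgt_row = cur.fetchone()
            assert src_row == tgt_row, f"row mismatch for {key} = {k}"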

Provided testing input to the project team (managers, designers and developers) during planning sessions to finalize the work breakdown structure.

Hands-on experience setting up Quality Center for projects and domains, creating user groups and project folder structures.

Built test cycles in Quality Center during the QA execution phase.

Delivered several quality assurance documents, such as the test strategy, test plans, test cases, metrics, QA entry and exit criteria documents, weekly reports and end-of-phase summary reports.

Knowledge of logical data models; identified source tables and columns and tested the attributes, facts, drill maps, hierarchies and relationships necessary to create a MicroStrategy project.

Environment: Informatica, Oracle, SQL, Oracle 11, SQL Server 2008, PL/SQL, UNIX, CA Clarity Project & Portfolio Manager 12, AS 400, Mainframe, JIRA, Cognos 8, Windows, XML, Quality Center 11.0, Toad

PetSmart, India Oct 2015 – Dec 2016

QA Manager

Roles and Responsibilities:

As offshore Test Manager, responsible for quality delivery of all the testing artifacts related to Integration Testing including estimations and resource loading.

Prepared the Test Strategy and Test Plan, and interacted with TIBCO/MuleSoft service teams on integration testing support activities.

Used HP Quality Center for requirements traceability, test case & defect management and test execution in test lab.

Responsible for sending status reports to all the project stakeholders and providing guidance to the QA team regarding MDM testing of Match/Merge rules and Survivorship rules as per the business requirements.

Wrote test cases for ETL to compare source and target database systems.

Performed backend database testing by writing SQL and PL/SQL scripts to verify data integrity.

Responsible for Data mapping testing by writing complex SQL Queries using Toad.

Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.

Ensured data integrity and verified all data modifications and calculations during database migration using ETL tools. Developed test cases to accomplish ETL data migration.

Used workflow manager for session management, database connection management and scheduling of jobs.

Involved in Data analysis and prepared required test data for validation of test cases.

Performed manual testing of the data in various staging locations.

Liaised with Business Analysts and was responsible for driving issue-log meetings on a regular basis.

Extensively used HP Quality Center 9.2 for Test management and execution of test cases.

Coordinated with the project team to achieve project goals through status meetings and reviews.

Conducted regression testing whenever a code module changed, and participated in peer reviews of test artifacts.

Responsible for periodic updating of Test Plans and Test Cases as per Requirement Specifications.

Conducted sanity testing on the load environment before performance testing began.

End-to-end testing responsibility for a CIDW database, involving requirements testing, ETL testing, data validation, and testing of reports and cubes.

Environment: Informatica, Oracle 10g, Quality Center 9.2, Business Objects, TIBCO, MuleSoft, Windows 2000, UNIX, Spotfire 3.0, Toad, Autosys, JADE, FileZilla, WinSCP.

AMEX, Phoenix Jan 2014 – Sept 2015

BIDW Test Lead/Manager

Roles and Responsibilities:

Worked on various BIDW projects at AMEX. As Test Manager, was responsible for delivery of the BIDW Test Strategy and Test Plan, provided input to the end-to-end Test Strategy and aligned on the key test deliverables.

Responsible for hands-on sessions with vendors on test conditions, test data and test planning; provided estimations for ETL and report testing; and handled resource management and coordination with the offshore team for daily deliverables.

Interacted with various stakeholders (development teams, business team, source system teams, etc.) spread across different locations and time zones.

Used HP Quality Center 11 for requirements traceability, test case and defect management, and test execution in the Test Lab.

Created test plan, test cases, test scripts and test steps for every release and stored in TFS.

Created a Requirements Traceability Matrix (RTM) for test coverage analysis and test completeness.

Created Risk Assessment matrix document based on defects identified and risks associated in system/production environment.

Incorporated integration within system test plans, which adequately exercised the functioning of the system and its various components.

Validated various DataStage jobs according to Business Requirements and Functional Specifications.

Created manual Test cases and utilized QTP to automate test scripts for Regression Testing.

Integrated the Automation test scripts in Quality Center and created baselines.

Wrote queries on Oracle and DB2 databases for data validation, and tested the Java web interface application against the legacy mainframe programs.

Validated various Cognos reports for data quality and layouts.

Involved in validating various Informatica workflows, mappings, tasks.

Extensively developed SQL queries using Rapid SQL to test the business processes of financial accounting modules in PeopleSoft.

Created an RTM between Quality Center and DOORS.

Coordinated the Testing efforts during Integration, System and User Acceptance Testing phases.

Managed defects through Rational ClearQuest defect tracking; assigned defect priorities, analyzed, followed up and retested defects across builds.

Environment: Microsoft TFS, DataStage, Informatica, Oracle 9i, Toad, SQL, PL/SQL, DB2, CDW, PeopleSoft (GL, AP and AR), Autosys, Windows XP, QTP, SQL Server 2005, Crystal Reports, Cognos, Rational ClearQuest, Rational ClearCase, MS Office, Visio and Project.

TARGET BI, Minneapolis Jan 2009 – Dec 2013

Business Analyst and Test Lead

Roles and Responsibilities:

Worked as Business Analyst and Test Lead in multiple sub-projects across Target in DWH and BI area.

Responsible for Design and development of test plans based on high-level documents (BRD & FRS).

Prepared and implemented the Test plan, Test cases, Test Procedures, Test sets using Use cases and requirement specifications.

Analyzed customer and business needs to determine business and functional requirements.

Gathered and documented business requirements for system gaps that require development in credit card project.

Configured Test Environment Management (TEM) for specific test cases, created test data, executed automated or manual unit tests, documented results and updated defect-tracking systems.

Participated in functional requirement reviews and code reviews.

Analyzed existing business processes and provided recommendations for improvements and efficiencies in credit card project.

Worked with Business Analyst and Developer to resolve the defects.

Designed and developed Use Cases, Use Cases diagrams, Activity Diagrams, Sequence Diagrams.

Extensively involved in testing the application developed in Agile Methodology and detailed designs.

Analyzed the user/business requirements and functional specification documents.

Analyzed and optimized the use cases and created the Test cases based on them for Change Requests.

Led a small offshore team in developing the Test Plan and Test Scripts.

Validated the responses of web services by passing various requests using Soap UI.

Executed all Test cases in different phases of testing like Smoke, Regression and System testing of the application.

Validated SSIS and SSRS packages according to business and functional specification documents.

Validated custom reports developed in Cognos Report Studio, ad hoc reports in Cognos Query Studio, and multi-dimensional cubes in Cognos Analysis Studio.

Tested scheduler scenarios on Autosys.

Validated the Informatica workflows according to business requirement documents.

Used Informatica workflow manager to run the workflows and workflow monitor to verify their status and logs.

Validated the data warehouse (DWH) table structures as part of ETL testing.

Executed batch processing and verified ETL job status and data in database tables.

Validated the parameter-driven ETL process mapping source systems to the target data warehouse, with complete source system profiling in Informatica.

Tested all the rules implemented in ETL jobs that move data from source to the target data warehouse (DWH).

Developed complex SQL queries to test ETL jobs from source to the target data warehouse (DWH).

Involved in defining Test Scenarios for the applications and performed manual testing in HP Quality Center.

Created and executed different types of test cases for the Change Request and existing functionality of the application.

Responsible for checking of data in database by writing and executing SQL statements.

Extensively involved in testing the applications developed in both Agile Methodology and Waterfall.

Involved in meetings with the automation team on developing automation scripts for change requests, to run alongside the regression test suites for business priorities on the loan application.

Involved in backend testing using UNIX commands.

Environment: HP ALM, Oracle 11g, AS 400 IBM iSeries DW, Informatica Power Center 9.5, SQL Server 2012, SSIS, SSRS, SSAS, Autosys 11, Toad 10.0, WinSCP, Cognos 10, Soap UI, UNIX, DB Visualizer 10.0, VB Script, Windows Web Services, SQL, PL/SQL, Java, XML.

AEG, New Jersey July 2007 – Dec 2008

Project Lead

Roles and Responsibilities:

Led the Business Objects development team in requirements gathering and delivery of report design and testing documents.

Involved in the installation of BO software and its customization with the JBoss application server.

Interacted with the client on day-to-day development activities and escalation of risks and issues.

Created and designed the BO universe semantic layer and designed some of the canned BO and Web Intelligence reports.

Provided end-user training on the usage of Web Intelligence reports and creation of ad hoc BO reports, including preparation of training manuals.

Managed the administrative aspects of BO XI R2 support work like user management, reports management and scheduling of canned reports.

Involved in the failover process setup and testing.

Created reports with business logic using universes, free-hand SQL, personal data files and stored procedures.

Involved in Unit testing of Business Objects and Web Intelligence Reports.

Reviewed unit test cases and test results and certified the code for integration testing.

Prepared Test Scenarios by creating Mock data based on the different test cases.

Extensively used SQL programming in backend and front-end functions, procedures, packages to test business rules and security.

Wrote several complex SQL queries for data verification and data quality checks.

Environment: Business Objects Crystal Decisions XI R2, JBoss v4.2, Oracle 10g database running on Windows 2003 Server, Toad, etc.

SHELL, Netherlands March 2004 – June 2007

Technical Lead

Worked as Technical Lead on various data warehouse projects, delivering BI/DWH artifacts.

Transco, UK Dec 2002 – Feb 2004

BI Developer

Worked as Technical Development Lead on various data warehouse projects, delivering BI/DWH artifacts.


