
ETL/BI Test Lead

Location:
Frisco, TX
Posted:
October 14, 2020


PAVANKUMAR CHINTHALA JANAKIRAM 612-***-****

SENIOR ETL/BI TEST LEAD adgy97@r.postjobfree.com

PROFESSIONAL SUMMARY:

Skilled IT professional with 10+ years of experience in ETL/BI (DWH) report development and testing.

Worked closely with QA Managers, Developers, Business Analysts, Business Operations, and other QA staff to ensure that business requirements are testable and verifiable.

Proven customer-facing skills during test case review and User Acceptance Testing (UAT), coordinating and interfacing with end users on business process testing, scope creep, data set-up, and business scenario clarification.

Strong proficiency in the complete Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC), and Bug Life Cycle (BLC).

Extensive experience in onsite-offshore test coordination and communication, building productive working relationships between onsite and offshore teams.

Led efforts to plan testing, build test cases, execute tests, report results, and troubleshoot issues found on ETL platforms.

Experienced in Agile and Waterfall methodologies and techniques.

Worked as Center of Excellence lead.

Extensive experience in Black Box Testing, Functional Testing, Integration Testing, System Testing, Regression Testing, and Manual Testing on enterprise applications.

Strong working experience in data analysis, design, development, implementation, and testing of DWH using data conversion, data extraction, data transformation, and data loading (ETL).

Sound knowledge of and experience with metadata and star schemas. Analyzed source systems, staging areas, and fact and dimension tables in the target data warehouse.

Strong working knowledge of the SSIS ETL tool and SSRS.

Expertise in OLTP/OLAP system study, analysis, and E-R modeling, with knowledge of star and snowflake schemas used in relational, dimensional, and multidimensional modeling.

Involved in preparation of Test Plans, Test Cases, and Test Summary Reports.

Experienced in managing an 18-member team as QS Test Lead.

Involved in preparation of test data for testing the functionality of Informatica sources, mappings, and targets.

Extensive experience working with data warehouse ETL tools such as Informatica and Pentaho, and with Informatica DVO to automate workflows.

Good hands-on experience in Informatica, Tableau dashboards, COGNOS, and Business Objects report development and testing, as well as Excel macros, Python scripting, Control-M, Ab Initio, and mainframes.

Expert in reviewing, reporting, tracking, and maintaining defects with a high level of accuracy and informative recreation steps, using Quality Center, JIRA, QUEST, and Bugzilla.

Hands-on experience writing SQL queries, joins, functions, and PL/SQL procedures against Oracle, SQL Server, and Netezza databases. Involved in preparation of weekly status reports.
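
The source-versus-target record count check that underpins much of the ETL validation described above can be sketched as follows. This is a minimal illustration using Python's sqlite3 in place of the Oracle/Netezza databases actually used; the table names are hypothetical.

```python
import sqlite3

# Minimal sketch of a source-vs-target row count reconciliation,
# the most basic ETL load validation. Table names are illustrative;
# real work ran equivalent queries against Oracle/Netezza.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_customers (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE tgt_customers (id INTEGER, name TEXT)")
cur.executemany("INSERT INTO src_customers VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (3, "Cho")])
cur.executemany("INSERT INTO tgt_customers VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob")])  # one row dropped by the load

src_count = cur.execute("SELECT COUNT(*) FROM src_customers").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_customers").fetchone()[0]
print(f"source={src_count} target={tgt_count} match={src_count == tgt_count}")
```

A count mismatch like this one is typically the first signal raised as a load defect before deeper column-level comparison begins.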

EDUCATION:

Bachelor of Engineering in Information Technology from JNTU, Anantapur, India.

TECHNICAL SKILLS:

Testing Tools

HP Quality Center 11.0, Rational Test Manager, JIRA, Version One

Data Modeling

Dimensional and relational Data Modeling in COGNOS Framework Manager.

Databases

Oracle 12c, SQL Server 2008, Microsoft Access 2000, DB2, Netezza

Database Tools

PL/SQL Developer

ETL TOOLS

Informatica Power Center 10.2.0, SSIS, Pentaho, MDM, Ab Initio

Programming

SQL, PL/SQL, Core Java, C, C++

Operating Systems

Windows XP/2003/2007, Unix

BI & Visualization Tools

COGNOS, Business Objects, Tableau Desktop, Tableau Public, and SAS

PROFESSIONAL EXPERIENCE:

DFW Airport.

ETL/BI Test Lead Jan 2020 – Present

EDL Projects

DFW Airport does business with airline companies, airport stores, and parking lots. We loaded passenger data and travel history data from the airlines' different source systems into one centralized database, the EDL (Enterprise Data Liberation). In this module I delivered 17 projects. I validated loads from Oracle and SQL Server source databases into a Snowflake database for the Oracle Financials (AP, AR, GL, PO, and FA), PropWorks, ACI Benchmark Dashboard, Waste Management, and NASA & SWIM projects. For the COS Concession POS Phase 2.3 projects, different source files were loaded into target tables in an Oracle database.

Responsibilities:

Analyzed business requirements, system requirements, and data mapping requirement specifications, interacting with clients and developers.

Wrote the Test Strategy, had it reviewed with the team, and then prepared the Test Cases.

Involved in extensive data validation using SQL queries and back-end testing.

Built, maintained, and tuned dashboards, reporting queries, and ETL mappings to load data from XML sources to relational pre-staging, from pre-staging to staging, and from staging to the data warehouse; from there, validated Tableau Public and Tableau Desktop dashboards and MicroStrategy reports.

Performed defect tracking and reporting with a strong emphasis on root-cause analysis to determine where and why defects were introduced in the development process.

Involved in preparation and maintenance of the Requirement Traceability Matrix (RTM) to measure progress, along with test matrix and test script reviews.

Worked extensively with PL/SQL batch programs and was responsible for reporting defects to the development team.

Tested the ETL SSIS mappings and other ETL processes (data warehouse testing).

Created UNIX scripts for file transfer and file manipulation.

Wrote several complex SQL queries for validating reports.

Managed automation of file processing as well as all ETL processes within a job workflow.

Led a few complex projects, performed testing, and created QA documents.

Automated test case execution in ALM using VBScript and validated JSON and XML using SoapUI.
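
The SQL-based data validation described above often goes beyond row counts to row-by-row comparison. A common pattern is a source-minus-target query: any rows returned are missing or altered in the target. Below is a hedged sketch using sqlite3 with made-up table names; the real queries ran against the Oracle and Snowflake databases named in the project description.

```python
import sqlite3

# Source-minus-target comparison: rows present in the source but
# missing (or altered) in the target surface as differences.
# sqlite3 stands in for Oracle/Snowflake; names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_ap_invoice (invoice_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_ap_invoice (invoice_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_ap_invoice VALUES (?, ?)",
                [(100, 25.0), (101, 40.0), (102, 9.5)])
cur.executemany("INSERT INTO tgt_ap_invoice VALUES (?, ?)",
                [(100, 25.0), (101, 41.0), (102, 9.5)])  # 101 loaded wrong

missing = cur.execute("""
    SELECT invoice_id, amount FROM src_ap_invoice
    EXCEPT
    SELECT invoice_id, amount FROM tgt_ap_invoice
""").fetchall()
print(missing)  # each row returned is a candidate load defect
```

Running the same query in the other direction (target EXCEPT source) catches rows the load invented, so both directions are usually checked.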

ALCON Laboratories, Inc.

ETL/BI Test Lead Aug 2019 – Dec 2019

Project Name: Alcon L-Quality Analytics Exec (QAC) Project

QAC stands for Quality Analytics Exec. In this project we developed and thoroughly tested 47 Tableau dashboards. The project involved different source systems such as historical scorecards, an online scorecard via a SharePoint list, iTrack and CAPA metrics, and the AQC data mart. We converted source-system data to the target database using Informatica and then created the Tableau dashboards for ALCON quality purposes. This system is used globally for ALCON quality metrics calculations.

Responsibilities:

Wrote complex structured queries in Oracle SQL and PL/SQL to load data from the source database to the target database, then compared Tableau dashboards and IBM COGNOS reports with the database output.

Used Agile methodologies such as Scrum to implement product features and maintain the software.

Used the VersionOne application to track user story status and for project collaboration.

Designed and implemented test scripts and managed the SQL test automation process. Led continuous improvement efforts within the delivery teams to increase productivity and streamline processes.

Developed and maintained sanity and regression suites. Supported solution design activities and automated test scripts.

Prepared for and participated in all stages from requirement analysis through production deployment, including test kick-off meetings, team meetings, defect triage, product demos, and weekly status meetings.

Responsible for preparing various reports (daily, weekly, and monthly status reports, and the test summary report).

Built, maintained, and tuned dashboards, reporting queries, and ETL mappings to load data from XML sources to relational pre-staging, from pre-staging to staging, and from staging to the data warehouse; from there, validated Tableau Public and Tableau Desktop dashboards and COGNOS reports.

Coordinated with the business intelligence testing team and reviewed deliverables.

Provided support for technical discussions and implemented application changes. Interacted with the Scrum team daily to resolve technical issues.

Helped client end users during the UAT phase; defects found were tracked to closure.

Identified test data scenario-wise and took part in peer reviews of test cases and the test strategy. Held daily meetings and updates with the team on current projects.

Monitored team activities for compliance with the process. Reviewed regression and integration suites prepared by the team, tracked various issues to closure, and provided essential support to the team to ensure product quality.

Analyzed business requirements and worked with BAs, end users, and the development team to understand requirements; analyzed the data model and source-to-target mapping documents to build visualization dashboards and reports, through QA, deployment, and support, using Tableau, IBM COGNOS, Informatica, Oracle SQL, PL/SQL, and Python scripting.

PNC Bank

ETL/BI Senior Test Lead Oct 2017 – Aug 2019

Anti-Money Laundering (AML) - PXP UI and Feed Upgrade Project

PXP stands for Politically Exposed Persons. In this project, we upgraded the PXP application UI and feed to a new version of the FIRCO software. The FIRCO Filter Engine is the component of the FIRCO application suite that interrogates transaction and structured records against sanctioned/watch lists to identify matches between those sources. The PXP system contains algorithms that provide instructions to the filter. The parameters of the filter engine can be, and are, adapted to optimize detection for transactions and customer/structured records.

PNC sanction screening screens against all units (OFAC list), creates alerts in the database for matched entities, and displays hits in PXP. PXP compliance users decision the generated hits based on their research.

Responsibilities:

Analyzed the business requirements, system requirements, and data mapping requirement specifications, interacting with the client, developers, and QA team.

Wrote the Test Strategy and Test Cases, reported bugs, and tracked defects using Quality Center.

Acted as a compliance user for PXP, performing decision-making for the generated alerts.

Involved in extensive data validation using SQL queries and back-end testing.

Performed all aspects of verification and validation, including functional, regression, load, and system testing.

Performed defect tracking and reporting with a strong emphasis on root-cause analysis to determine where and why defects were introduced in the development process.

Apart from the project, worked on Tableau report development for various portfolios in PNC Bank.

Responsible for testing initial and daily loads using Informatica.

Involved in preparation and maintenance of the Requirement Traceability Matrix (RTM) to measure progress, along with test matrix and test script reviews.

Worked extensively with PL/SQL batch programs and was responsible for reporting defects to the development team.

Tested the ETL SSIS mappings and validated database results.

Working knowledge of VBScript to automate data testing using macros.

Created UNIX scripts for file transfer and file manipulation.

Wrote several complex SQL queries for validating reports.
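
One recurring check when validating alert counts in reports like those above is a duplicate test: the same alert loaded twice silently inflates totals. The sketch below uses the standard GROUP BY ... HAVING pattern; the table and column names are made up for illustration, with sqlite3 standing in for the production Oracle database.

```python
import sqlite3

# Duplicate detection used when validating report totals: any alert_id
# appearing more than once inflates counts downstream.
# Table/column names are illustrative, not the bank's actual schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE alerts (alert_id INTEGER, entity TEXT)")
cur.executemany("INSERT INTO alerts VALUES (?, ?)",
                [(1, "ACME"), (2, "GLOBEX"), (2, "GLOBEX")])  # 2 is duplicated

dupes = cur.execute("""
    SELECT alert_id, COUNT(*) AS n
    FROM alerts
    GROUP BY alert_id
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # (alert_id, occurrence count) pairs needing investigation
```

An empty result here is the expected pass condition; anything returned becomes a defect with the offending keys attached as recreation steps.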

Environment: SSIS, HPQC 12, Oracle PL/SQL Developer, Tableau, SQL, Mainframes, UNIX, VersionOne.

MoneyGram International, MN

ETL/BI Test Lead Mar 2015-Oct 2017

Project Name: Consumer Profile Project Phase II

A consumer can complete a MoneyGram transaction without a profile created. Without a profile, there is no option for the consumer to create preferences or opt in/out of receiving information or transaction notifications from MoneyGram. Moving forward, we would like to create a profile for every consumer at the time of his/her first transaction. On the first transaction, we will also collect the consumer's basic opt-in/opt-out preferences for transaction notifications and special offers from MoneyGram. The profile is created regardless of the channel used to complete the transaction, for money transfer send/receive transactions as well as bill pay transactions.

Responsibilities:

Assisted in gathering business requirements and in the analysis, testing, and design of the flow and logic for the data warehouse project.

Extensively used ETL methodology for testing and supporting data extraction, transformation, and load processing in the ETL solution using Informatica.

Created ETL test data for all ETL mapping rules to test the functionality.

Assisted in performing data validation, data verification, record counts, and data transformation checks.

Worked on the star schema with the fact and dimension tables.

Used SQL extensively to perform the ETL testing.

Followed Agile methodologies in the project.

Used SQL to ensure data was populated consistently between source and target.

Assisted in testing the reports using the Cognos reporting solution.

Tested several different types of reports, including report layout, naming conventions, totals and sub-totals, drilling options, prompts, metric calculations, drill maps, and COGNOS filters.

Used Informatica Workflow Monitor to monitor workflows and review session logs.

Worked with Source Analyzer, PowerCenter Designer, Workflow Manager, and Mapping Designer.

Wrote test cases for data extracts to make sure vendors were sending correct data files.

Maintained source definitions, transformation rules, and target definitions.

Raised defects in HP Quality Center as the defect tracking system.

Performed defect tracking and reporting with a strong emphasis on root-cause analysis to determine where and why defects were introduced in the development process.

Logged all defects in HP Quality Center; once a defect was assigned and fixed by the development team, retested it and closed it.

Environment: ETL INFORMATICA, HPQC 11.0, COGNOS, Oracle PLSQL developer, MDM database, UNIX.

Consumer Profile Project Phase I

This project phase (Phase 1) encompassed the creation of consumer profiles from transactions processed via the MGI OLTP environment. These are termed 'implicit profiles'. It did not include the migration of consumer profiles defined under other MGI consumer products/programs, which include MGO and Plus Program members. The integration of these profiles, termed 'explicit profiles', will be undertaken in later phases of the project.

This phase included loading initial implicit profiles into the MDM environment as profiles. The project sponsor asked that 18 months' worth of transactions be used for loading the initial baseline of consumer profiles.

This project phase also supported extracting and migrating profile data to the enterprise data warehouse, and the creation of several reports the business may use to analyze the profiles being created, as well as errors (insufficient data, for example) encountered in trying to create or maintain consumer profiles.

Responsibilities:

As an ETL QA Tester, I was involved in analyzing the existing system process.

Created master test plans and test cases for manual testing from the BRD to match the project's initiatives.

Developed the ETL test cases from the mapping document based on the various business rules.

Performed ETL testing and extensively used SQL. Used PL/SQL Developer to connect to the Oracle database.

Expertise in writing SQL statements to verify whether data was populated in the data mart/data warehouse according to the business rules in Framework Manager.

Assisted in data validation, such as record counts against the source, staging, and target tables.

Investigated issues found during smoke testing, system testing, integration testing, regression testing, and end-to-end testing.

Implemented AutoSys scheduled jobs in the UNIX environment.

Raised defects in HP Quality Center as the defect tracking system.

Performed defect tracking and reporting with a strong emphasis on root-cause analysis to determine where and why defects were introduced in the development process.

Logged defects in HPQC; once a defect was fixed, retested it and closed it.

Collected evidence for each step of the process to ensure that any errors were captured in time and resolved immediately.

Assisted in data migration testing against the dimension tables and fact tables.
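
Testing an ETL mapping rule, as described above, means applying the documented transformation to source test data and comparing against what the load actually wrote. The sketch below uses a hypothetical rule (trim and upper-case consumer names before loading into the profile table); the rule and data are invented for illustration, not taken from the actual MDM mapping document.

```python
# Sketch of testing one ETL mapping rule. The rule here is hypothetical:
# consumer names are trimmed and upper-cased before loading. The test
# applies the rule to source test data and compares against the values
# the (simulated) ETL job loaded into the target.
def mapping_rule(name: str) -> str:
    """Documented transformation: trim whitespace, upper-case."""
    return name.strip().upper()

source_rows = ["  alice ", "Bob", "CARLOS  "]
target_rows = ["ALICE", "BOB", "CARLOS"]   # as loaded by the ETL job

mismatches = [(s, t) for s, t in zip(source_rows, target_rows)
              if mapping_rule(s) != t]
print("defects:", mismatches)
```

Preparing source rows that deliberately exercise edge cases of the rule (leading spaces, mixed case, already-clean values) is what "creating ETL test data for mapping rules" amounts to in practice.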

Environment: ETL (Informatica and MDM), COGNOS, Oracle, HP Quality Center, PL/SQL developer, UNIX.

Compliance Reporting Portal

CRP stands for Compliance Reporting Portal; the portal fetches and displays the required data from the EFA database, which is populated using the ETL workflows. The QS test environments used will be BIRQ5 and EFAQ5. EFA is a new database procured as part of CRP. QS will validate the ETL loads and the portal for its functionality and the data displayed within each portal view.

Responsibilities:

Analyzed the business requirements, system requirements, and data mapping requirement specifications, interacting with the client, developers, and QA team.

Wrote Test Plans, the Test Strategy, Test Scripts, and Test Cases; reported bugs and tracked defects using Quality Center.

Performed all aspects of verification and validation, including functional, structural, regression, load, and system testing.

Worked as COE lead for Business Intelligence.

Wrote complex SQL queries for data validation, verifying the ETL mapping rules.

Tested table metadata, including data types, constraints, and default values on the columns.

Involved in maintaining the test environments, with activities such as requesting data loads, database backups, restarting servers, requesting deployments, and troubleshooting issues.

Used Toad to run SQL queries and validate the data loaded into the target tables.

Involved in user training sessions and assisted in UAT (User Acceptance Testing).

Involved in Functional Testing, Regression Testing, Sanity Testing, Ad-hoc Testing, and User Acceptance Testing.

Responsible for testing initial and daily loads of ETL jobs.

Involved in preparation and maintenance of the Requirement Traceability Matrix (RTM) to measure progress, along with test matrix and test script reviews.

Worked extensively with PL/SQL batch programs and was responsible for reporting defects to the development team.

Tested the ETL Informatica mappings and other ETL processes (data warehouse testing).

Created UNIX scripts for file transfer and file manipulation.

Wrote several complex SQL queries for validating reports.

Created test input requirements and prepared the test data for data-driven testing.

Involved in extensive data validation using SQL queries and back-end testing.

Tested different sources, such as flat files, mainframe legacy flat files, SQL Server 2005, and Oracle, loaded into the Oracle data warehouse.

Extensively used Rational ClearQuest to track and manage defects.

Performed periodic checks against the QA/SIT/PROD environments to ensure they were up and running.
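
The table-metadata testing mentioned above (data types, constraints, default values) can be sketched like this. sqlite3's PRAGMA table_info stands in for querying Oracle's data dictionary (e.g. ALL_TAB_COLUMNS); the table and expected values are illustrative, not the actual EFA schema.

```python
import sqlite3

# Sketch of a table-metadata test: verify column types, NOT NULL
# constraints, and default values against the design. PRAGMA table_info
# is SQLite's stand-in for Oracle's data dictionary views; the table
# name and columns are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE efa_portal_view (
        view_id   INTEGER NOT NULL,
        view_name TEXT    NOT NULL,
        row_limit INTEGER DEFAULT 100
    )
""")
# Each row: (cid, name, type, notnull, dflt_value, pk)
cols = {row[1]: row for row in cur.execute("PRAGMA table_info(efa_portal_view)")}

assert cols["view_id"][2] == "INTEGER" and cols["view_id"][3] == 1
assert cols["view_name"][3] == 1          # NOT NULL enforced
assert cols["row_limit"][4] == "100"      # default value present
print("metadata checks passed")
```

Checks like these catch deployment drift, where a target environment was built from an older DDL script than the one the mappings assume.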

Environment: Informatica, Oracle 10g, HP Quality Center, Netezza, Windows XP, UNIX.

THE HARTFORD, Connecticut

ETL/BI Developer and Test Analyst Jun 2013 – Mar 2015

Audit Central

The Hartford Insurance Company focuses mainly on home, vehicle, and personal insurance. Audits are performed by auditors: the Supervisor, Team Leader, and Claim Manager. The process starts when a claimant applies for a claim with The Hartford. Once The Hartford receives the claim, the Supervisor, Team Leader, and Claim Manager review it.

Responsibilities:

Created test plans and test cases for manual and automated testing from the business requirements to match the project's initiatives.

Tested various ETL transformation rules based on log files and data movement, with the help of SQL.

Created and executed detailed test cases with systematic procedures and expected results. Maintained the test logs, test reports, and test issues, with defect tracking using Quality Center 9.2.

Extensively used DataStage for the extraction, transformation, and loading process.

Tested to verify that all data were synchronized after data issues were resolved, and used SQL to verify my test cases.

Created and executed SQL queries to perform data integrity testing on an Oracle database, validating and testing data using TOAD/Exceed.

Worked with the ETL group to understand DataStage graphs for dimensions and facts.

Wrote complex SQL queries and PL/SQL for data validation, verifying the packages and business rules.

Involved in testing the OBIEE reports by writing complex SQL queries.

Tested DataStage hashed files used for extracting and writing data as an intermediate file in a job.

Ran workflows created in DataStage by developers, then compared data before and after transformation to ensure the transformation was successful.

Good experience with cubes.

Interacted with users to understand the bugs in business reports generated from OBIEE and COGNOS.

Worked with UNIX shell scripts; released scripts live and tested them.

Tracked bugs using Quality Center and generated defect reports for review by the client and the management teams.

Involved in debugging scripts according to the modifications made to the build, using QTP.

Performed ETL testing and extensively used SQL. Used TOAD to connect to Oracle.

Tested the reports generated by OBIEE, and verified and validated the reports using SQL.

Performed defect tracking and reporting with a strong emphasis on root-cause analysis to determine where and why defects were introduced in the development process.

Facilitated the sharing of structured data across different information systems using XML. Created and executed automation scripts using QTP.

Tested reports such as drill-down, drill-up, and pivot reports generated from OBIEE and COGNOS.

Automated and scheduled DataStage jobs using UNIX shell scripting.

Participated in weekly meetings and walkthroughs with the management team.

Environment: Control M and Informatica, COGNOS, OBIEE, Clear Quest, MS Access 2003, UNIX, TOAD

Citicorp Credit Services Inc. (USA)

ETL /Data warehouse Tester June 2010 – May 2012

Debt Restructuring

Project Description: Debt restructuring comes down to the notion of providing the customer with options for servicing their entire debt in one consolidated communication effort. One can also extend this effort to make the debt more manageable by breaking down the debts into several convenient repayment plans or partial or deferred payments, based on individual cases.

Responsibilities:

As an ETL Tester, I was involved in analyzing the existing system process.

Enhanced the legacy technical specifications to meet the enhanced system specs.

Designed and maintained logical and physical enterprise data warehouse schemas.

Worked on the star schema; loaded data using fact tables into the staging tables.

Used SQL scripts to investigate and resolve data issues found during testing.

Created naming conventions and system standards for lookup (connected and unconnected) transformations and target tables.

Experienced in writing complex SQL and PL/SQL queries for testing data in Oracle targets.

Worked on issues with migration from development to testing.

Prepared test data to validate ETL rules wherever necessary.

Extensively used ETL and Informatica to load data from flat files, MS Access, and MS SQL Server 2005.

Used Informatica Workflow Monitor to monitor workflows and review session logs.

Designed, developed, and deployed new data marts, along with modifying existing ones to support additional business requirements.

Worked with Source Analyzer, PowerCenter Designer, Workflow Manager, and Mapping Designer.

Performed regression testing as and when bugs were fixed, checking that the application was free of regressions.

Wrote test plans to incorporate the testing needs of the data warehouse, using TestDirector.

Wrote test cases for the data extracts to make sure vendors were sending the correct data files.

Maintained source definitions, transformation rules, and target definitions.

Implemented scheduled jobs in the UNIX environment and the EDW scheduler.

Worked on testing complex mappings, importing procedures/functions to build business rules to load data.

As part of the testing process, created database scripts, procedures, and database constraints.

Designed, developed, and tested processes for validating and conditioning data; maintained test documents for UAT, system testing, and integration testing.

Environment: Informatica Power center 8.1/8.6.0, Oracle, SQL, PL/SQL, Clear Quest, Main frames, MS Access 2003, UNIX, TOAD
