ETL Tester / Informatica Developer

Location:
Charlotte, NC
Posted:
May 24, 2023

SUMMARY

Over **+ years of experience in Software Quality Assurance, specializing in Data Warehouse (ETL & BI) testing.

Experience in defining Testing Methodologies; creating Test Plans and Test Cases; and verifying and validating application software and documentation against Software Development standards, with effective QA implementation in all phases of the Software Development Life Cycle (SDLC).

Strong in Software Analysis, Planning, Design, Development, Testing, Maintenance and Augmentation for various applications in data warehousing, metadata repositories, data migration, data mining and Enterprise Business Intelligence.

Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Ab Initio, Informatica PowerCenter, Podium, and Informatica Big Data Management.

Extensive experience in writing SQL to validate the database systems and for backend database testing.

Implemented all stages of the development process, including extraction, transformation, and loading (ETL) of data from various sources into Data Warehouses and Data Marts, using Informatica PowerCenter: Designer (Source Qualifier, Transformation Developer, Mapping and Mapplet Designer), Repository Manager, Workflow Manager, and Workflow Monitor.

Experience writing Hive queries to analyze data in the Hive warehouse using Hive Query Language (HQL) in a Big Data environment.

Experience in validating tables with partitions and bucketing, and in loading data into Hive tables in a Big Data environment.

Good knowledge of testing methodologies such as Agile (Scrum) and the Waterfall model.

SKILLS

ETL Tools: Informatica PowerCenter 9.x/8.x, Ab Initio, Podium, Informatica Big Data Management

Databases: Oracle 10g / 9i / 8i / 7.x, DB2 UDB, MS SQL Server 2008, MS-Access, Teradata, Informix

Languages: SQL, T-SQL, PL/SQL, Unix Shell Scripting

Operating Systems: UNIX, Linux, MS-DOS, Windows Vista / XP / 2000 / NT / 98 / 95

Scheduling Tools: Autosys, Control M, One Automation

BI Tools: Cognos 8, Crystal Reports, Business Objects XI, OBIEE

Data Modeling Tools: ERWIN 4.1 / 4.0

Methodologies: Agile, Kanban, SAFe, Waterfall

Testing Tools: HP ALM/QC, JIRA, UFT, Selenium, Ranorex

EXPERIENCE

BB&T (now Truist), Winston-Salem, NC

Test Manager · APR 2021 – Present

Led a team of 12 in an onsite-offshore model across multiple projects and multiple releases.

Responsible for project estimation, risk mitigation planning, resource planning and allocation, and training team members.

Responsible for providing Client Acceptance Test Sign-off document for multiple projects.

Worked with project teams to identify and review requirements.

Responsible for preparing the Test Strategy and Test Plan for Data Migration as part of the MOE (Merger of Equals) and obtaining approvals from stakeholders.

Developed Test Cases and Test Scripts from data mapping documents and functional specification documents.

Mapped the test cases with the requirements for generating the Requirement Traceability Matrix (RTM).

Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files to target tables.

Tested the field mappings and transformation rules for data migration.

Used PACE (Python Automation Comparison Engine) and the QuerySurge automation tool for validating data file to file, file to table, table to file, and table to table.

Verified the Transformation rules against Workflow mapping and Error logs.

Interacted with the developers on a regular basis for load updates and defect fixes.

Analyzed the root cause of the failed Test cases and provided the related data samples to the client.

Responsible for Process & Compliance for each of the projects.

Participated in defect triage calls and all scrum ceremonies.

Coordinated with Offshore team members.

Environment: SQL, PL/SQL, Mainframes, Data Lake, Unix, IBM DB2, Informatica, Oracle, PACE, HP ALM, Agile, Hive, Hue, HDFS, Rally.

MedPro, Fort Wayne, IN

Senior Test Lead · FEB 2021 – APR 2021

Gained good knowledge of the medical malpractice insurance domain.

Worked closely with product owner and development team to understand the user stories and capture test scenarios.

Responsible for Regression and Integration testing.

Reviewed mapping documents and designed SQL queries according to the transformations.

Analyzed Source-to-Stage and Stage-to-Target mapping documents based on Facts & Dimensions, and validated the data model, counts, not-nulls, duplicates, primary and foreign keys, and the business rules applied to target tables.
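
For illustration, checks of this kind typically reduce to short SQL queries; a minimal sketch, assuming hypothetical STG_POLICY (stage), DW_POLICY (target), and DW_COVERAGE tables:

  -- Row count reconciliation between stage and target
  SELECT (SELECT COUNT(*) FROM STG_POLICY) AS stage_cnt,
         (SELECT COUNT(*) FROM DW_POLICY) AS target_cnt;

  -- Not-null check on a mandatory column
  SELECT COUNT(*) FROM DW_POLICY WHERE policy_id IS NULL;

  -- Duplicate check on the business key
  SELECT policy_id, COUNT(*) FROM DW_POLICY
  GROUP BY policy_id HAVING COUNT(*) > 1;

  -- Foreign key check: policies referencing a missing coverage row
  SELECT p.policy_id FROM DW_POLICY p
  LEFT JOIN DW_COVERAGE c ON p.coverage_id = c.coverage_id
  WHERE c.coverage_id IS NULL;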

Documented test results in JIRA.

Responsible for debugging issues found during data validation.

Validated source data from relational and non-relational systems.

Tested Full and Incremental loads by entering sample test data for a couple of entities and verifying that inserts and updates were applied correctly.
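
As a sketch of the incremental-load checks described above (table and column names are hypothetical assumptions), missed inserts and missed updates can be detected by comparing the source delta with the target:

  -- Missed inserts: new source rows absent from the target
  SELECT s.entity_id FROM SRC_ENTITY s
  LEFT JOIN TGT_ENTITY t ON s.entity_id = t.entity_id
  WHERE t.entity_id IS NULL;

  -- Missed updates: source rows changed after the target was last refreshed
  SELECT s.entity_id FROM SRC_ENTITY s
  JOIN TGT_ENTITY t ON s.entity_id = t.entity_id
  WHERE s.last_updated_ts > t.last_updated_ts;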

Validated conversion data loading from source to target for Policy, Coverage, Risk and Source Party.

Used Python scripts to compare datasets.

Validated the data in QA, UAT and PROD environments.

Environment: MS SQL Server Management Studio, PL/SQL, Jira, QA, UAT, Agile, Informatica, Oracle, Mainframes, Python, Alteryx, Spyder (Anaconda 3) (Python 3.7)

Wells Fargo, Charlotte, NC

Senior Test Lead · MAY 2020 – NOV 2020

Worked closely with tech team and business analysts to understand the requirements and capture test scenarios.

Reviewed mapping documents and designed SQL queries according to the transformations.

Designed and executed test cases for each of the requirements.

Validated source data from relational and non-relational systems.

Created test data using Postman.

Validated the winner logic for various test scenarios in selfheal.

Validated data from core tables and update matrix.

Analyzed the data using values from update matrix.

Validated data from update matrix to downstream applications.

Ran automation scripts for regression testing using Cucumber.

Ran Data Quality Engines (DQEs) for every release as part of regression testing.

Responsible for sanity testing in production for every release.

Environment: MS SQL, PL/SQL, API, Postman, Cucumber, Webservices, XML, Jira, UAT, SIT, SIT2, IST.

Charter Communications, Charlotte, NC

ETL Test Lead · SEP 2019 – APR 2020

Worked closely with developers/ product owners to develop and execute SQL test cases to validate the data.

Analyzed Source to Target mapping documents.

Designed and executed test cases in HP ALM.

Responsible for checking source-to-target mapping documents from ICOMS and CSG tables and fields to Core Tables.

Scheduled jobs in One Automation.

Designed SQL queries according to the mapping document to validate source and target systems.

Executed queries in Bline (through PuTTY) and Teradata Studio.

Executed jobs in Unix and One Automation for reload and incremental load scenarios.

Developed and executed ETL-related functional, performance, integration, and regression test cases.

Performed data validations using SQL queries for various SCD types, particularly SCD Type 4 and SCD Type 2.
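
For illustration, two common SCD Type 2 checks of this kind, sketched against a hypothetical DIM_CUSTOMER table (the column names, including the row_wid surrogate key, are assumptions):

  -- Each business key should have exactly one current row
  SELECT cust_key, COUNT(*) FROM DIM_CUSTOMER
  WHERE curr_flag = 'Y'
  GROUP BY cust_key HAVING COUNT(*) <> 1;

  -- No overlapping effective-date ranges for the same business key
  SELECT a.cust_key FROM DIM_CUSTOMER a
  JOIN DIM_CUSTOMER b
    ON a.cust_key = b.cust_key
   AND a.row_wid <> b.row_wid           -- row_wid: hypothetical surrogate key
   AND a.eff_start_dt < b.eff_end_dt
   AND b.eff_start_dt < a.eff_end_dt;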

Responsible for validating data model changes.

Tested data across all layers of data platforms (Data Lake, database).

Extracted flat files by subject area from the source billing systems (CSG, ICOMS, RDM) and validated the data loaded into Stage and Persist tables in Teradata.

Responsible for reviewing all test reports and providing QA sign-off for all deployments.

Environment: SQL, PL/SQL, Informatica PowerCenter, Oracle 11g, DB2, Unix, One Automation, HP ALM, SVN, Teradata Studio/Teradata SQL Assistant, Agile, Podium, Ambari, Hadoop, HDFS, Hive, Data Lake, Big Data.

State Street, Quincy, MA

Technical Lead · FEB 2017 – AUG 2019

Participated in Planning and story writing meetings as part of scrum methodology.

Prepared Test plan and Test strategy documents.

Analyzed and understood all the user stories and prepared test cases.

Designed SQL queries according to the source and target systems.

Ensured that all the mappings have been covered in the test cases.

Designed test queries with multiple joins, indexes, temporary tables, analytical functions, and subqueries linked by UNION, INTERSECT, and MINUS operators.
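
For example, a two-way MINUS comparison (shown here against hypothetical staging and target tables) surfaces rows present on one side but not the other:

  -- Rows in staging that never reached the target
  SELECT acct_id, balance FROM STG_ACCOUNT
  MINUS
  SELECT acct_id, balance FROM TGT_ACCOUNT;

  -- Rows in the target with no corresponding staging row
  SELECT acct_id, balance FROM TGT_ACCOUNT
  MINUS
  SELECT acct_id, balance FROM STG_ACCOUNT;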

Validated the data and transformations between source to staging and staging to target tables.

Extensively involved in implementing performance tuning techniques for the queries.

Coordinated between onshore and offshore team members and conducted weekly connect meetings.

Responsible for executing automation scripts for regression testing using UFT.

Involved in executing a Selenium-based cleanup job to improve region performance.

Executed Unix-based jobs when records were stuck.

Environment: SQL, PL/SQL, Informatica PowerCenter, Oracle 11g, DB2, Mainframes, UFT, Selenium, Unix

Bright House Networks, Riverview, FL

ETL Test Lead · MAY 2016 – DEC 2016

Analyzed business requirements document, functional specifications document to prepare Test plan and Test cases.

Implemented and followed a Scrum Agile development methodology within the cross functional team and coordinated between the business user group and the technical team.

Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirements.

Established Traceability between the Requirements and the Tests in Traceability Matrix using HP ALM.

Performed source system data analysis in order to manage source to target data mapping.

Involved in creating Use Case specifications, business flow diagrams, and sequence diagrams to help developers and other stakeholders understand the business process from their own perspectives, including possible alternate scenarios.

Extensively used SQL for accessing and manipulating database systems.

Created data migration scripts, data mappings.

Identified, resolved, and documented database issues.

Involved in defining the source to target data mappings, business rules, and business and data definitions.

Interacted with the team members in designing database, data integration and business intelligence infrastructures and other reporting tools with best practices.

Involved in Functionality testing during the various phases of the development.

Verified the correlation between the UML diagrams and developed detailed diagrams.

Validated the system End-to-End Testing to meet the Approved Functional Requirements.

Mapped Test Scenarios back to business requirements and defined entry and exit criteria for test execution.

Responsible for data validation in Hive (HQL) – Big Data validation.
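
As an illustrative HQL sketch (the usage_events table and load_dt partition column are hypothetical), partition-level validation usually means confirming the partitions a load created and reconciling per-partition counts against the source extract:

  -- List the partitions created by the load
  SHOW PARTITIONS usage_events;

  -- Row counts per partition, to compare against source extract counts
  SELECT load_dt, COUNT(*) AS row_cnt
  FROM usage_events
  GROUP BY load_dt;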

Responsible for Sanity testing during production deployments.

Extensive experience in writing SQL and PL/SQL scripts to validate the database systems and for backend database testing.

Environment: SQL, PL/SQL, Informatica PowerCenter 9.6.1.2, Oracle 11g, Linux, PuTTY, WinSCP3, HP ALM, Teradata, AquaData Studio, Teradata SQL Assistant, Hive, Optymyze, Data Lake, Big Data, Hadoop.

State Street, Quincy, MA

ETL/Automation Test Lead · SEP 2015 – APR 2016

Reviewed the Business Requirement Documents and the Functional Specification.

Developed Test Cases for Deployment Verification, ETL Data Validation and Report testing.

Used HP Quality Center to perform manual testing and to log and track defects.

Extensive experience in writing SQL and PL/SQL scripts to validate the database systems and for backend database testing.

Verified that all data were synchronized after troubleshooting, and used SQL to validate test cases.

Wrote complex SQL, PL/SQL Testing scripts for Backend Testing of the data warehouse application.

Created high-level use case diagrams, process composition diagrams, and data models.

Reviewed Informatica mappings and test cases before delivering to Client.

Experienced in writing complex SQL queries for extracting data from multiple tables.

Wrote SQL queries to validate source data versus data in the data warehouse including identification of duplicate records.

Worked on automating test scripts using the Test Agnostic Framework (TAF), which is based on UFT.

Extensively created VB scripts to automate manual test scripts.

Environment: SQL, PL/SQL, Informatica PowerCenter 9.6.1.2, Oracle 11g, Linux, PuTTY, WinSCP3, HP ALM, Teradata, TOAD, XML, UFT.

State of Illinois (Healthcare & Family Services), Springfield, IL

Sr. Informatica Developer · JAN 2015 – AUG 2015

Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Parsed high-level design specifications into simple ETL coding and mapping standards.

Worked on enhancements of the existing data warehouse for producing health care data reports for HFS.

Involved in extraction of Data from source systems, transformation in the stage layer and finally loading into Marts.

Used debugger to validate the mappings and gain troubleshooting information about data and error conditions.

Worked with the joiner transformation using Normal Join, Master outer join, detail outer join and full outer join. Implemented slowly changing dimensions type 1 and type 2 for change data capture.

Implemented Expression, Lookup, Sequence Generator, Source Qualifier, Rank, and Aggregator transformations to load data from sources to targets. Extensively worked with Active transformations such as Filter, Sorter, Aggregator, Router, and Joiner.

Extensively worked with Passive transformations such as Expression, Lookup, and Sequence Generator. Created complex mappings using unconnected and connected Lookup transformations.

Transformed data was loaded into relational tables and finally loaded into fact table.

Monitored data quality, resolving problem data issues, and ensuring timely updates.

Migrated code to the test and production environments.

Worked with IT Development team to ensure feasibility of design and clarify user requirements.

Environment: SQL, PL/SQL, UNIX, DB2, Teradata, PuTTY, Business Objects XI R3, XML files, Informatica, Oracle

Credit Suisse (Wipro Technologies), Manhattan, NY

ETL/BI Test Analyst · FEB 2012 – JAN 2015

Performed ETL testing based on ETL mapping document for data movement from source to target.

Transformed EDI data into a format understandable to back-end systems.

Developed, executed, and maintained Test Plans, Test Cases, Test Scripts, and Test Data for manual testing using the defect tracking and management tool HP Quality Center (QC).

Extensively used SQL programming in backend and front-end functions, procedures, packages to implement business rules and security.

Performed backend database testing by writing SQL and PL/SQL scripts to verify data integrity.

Extensively used Informatica to load data from Flat Files to Teradata, Teradata to Flat Files and Teradata to Teradata.

Automated detailed test cases by using Quick Test Pro.

Used Quick Test Pro to write the automated test scripts by using various actions and reusable actions.

Drafted Test Plan for flow of data from front end to back end legacy applications.

Wrote several complex SQL queries to validate the Data Transformation Rules for ETL testing.
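
One common pattern for such checks, sketched here with a hypothetical mapping rule (source status codes folded into a target category; all names are assumptions), is to re-derive the documented rule in SQL and flag any target rows that disagree:

  -- Flag target rows that do not match the re-derived transformation rule
  SELECT s.txn_id FROM SRC_TXN s
  JOIN TGT_TXN t ON s.txn_id = t.txn_id
  WHERE t.txn_category <>
        CASE WHEN s.status_cd IN ('A', 'P') THEN 'OPEN'
             ELSE 'CLOSED' END;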

Performed extensive data validations against Data Warehouse.

Loaded flat-file data into Teradata tables using Unix shell scripts.

Wrote Test Cases for ETL to compare source and target database systems.

Monitored the data movement process: extraction to flat files through Informatica execution flows, and loading of data to the Data Mart through the NZLOAD utility.

Tested the ETL data movement from Oracle Data mart to Teradata on an Incremental and full load basis.

Environment: SQL, PL/SQL, Excel Pivot, Informatica PowerCenter 8.6.1, Oracle 10g/9i, Business Objects, DB2, UNIX, PuTTY, WinSCP3, Quality Center, Control M, Teradata, TOAD

Walmart (Wipro Technologies), Bangalore

ETL/SQL Analyst · AUG 2011 – FEB 2012

Worked with Data Profiling for various business needs.

Scrubbed data to accurately generate customer pulls. Provided output files in various file formats based on customer requests.

Worked on calling shell scripts from post-session and pre-session events. Extensively involved in implementing performance tuning techniques for the queries.

Tested different reports generated through Cognos, Crystal Reports, and OBIEE.

Extensively used various types of transformations such as Expression, Joiner, Update strategy, Look up, filter for developing mappings.

Optimized QTP scripts for Regression testing of the application with various data sources and data types.

Executed regression tests at each new build in QTP.

Prepared the Test Plan and Testing Strategies for Data Warehousing Application.

Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text).

Tested Database Triggers, Stored Procedures, Functions, and Packages.

Optimized queries using rule-based and cost-based approaches.

Extensively used SQL programming in backend and front-end functions, procedures, and packages to implement business rules and security.

Tuned SQL statements and schemas for optimal performance.

Environment: OBIEE, Erwin 4.1, TOAD 7.6, Oracle 9i, DB2, Cognos, SQL, PL/SQL, UNIX, Shell Scripts.

Capital One (Wipro Technologies), Bangalore

Data/SQL/ETL Tester · DEC 2010 – AUG 2011

Tested ETL jobs as per business rules using ETL design document.

Assisted in creating fact and dimension table implementation in Star schema model based on requirements.

Wrote complex SQL and PL/SQL scripts to query Teradata and Oracle.

Tested the database schema with help of data architects using ERWIN.

Identified and documented additional data cleansing needs and consistent error patterns that could be diverted by modifying the ETL code.

Involved in developing Unix Korn Shell wrappers to run various Ab Initio Scripts.

Tested several data validation graphs developed in the Ab Initio environment.

Tested and developed graphs for extracting, cleansing, transforming, integrating, and loading data using the Ab Initio ETL tool.

Extensively used the Teradata load utilities FastLoad, MultiLoad, and FastExport to extract, transform, and load the Teradata data warehouse.

Queried test database results using a variety of internal tools, SQL statements, and Quest's Toad database UI product to verify back-end results.

Involved in writing test scripts and functions in Test Script Language using QTP for automated testing.

Responsible for different Data mapping activities from Source systems to Teradata.

Extensively used Autosys for automating job scheduling on a daily, bi-weekly, weekly, and monthly basis with proper dependencies.

Wrote complex SQL queries using joins, sub queries and correlated sub queries.

Performed System Integration testing by developing and documenting test cases in Quality Center.

Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling of data.

Development Environment: Ab Initio, Flat files, Perl, Erwin 4.0, MS SQL Server 2008, Oracle 10g, SQL, PL/SQL, IBM DB2 8.0, Agile, Teradata SQL Assistant, Mercury Quality Center 10, QTP 10.0, Autosys, TOAD, Unix Shell Scripting, Windows XP.

Medtronic Inc (Wipro Technologies), Bangalore

ETL Tester · DEC 2009 – DEC 2010

Performed unit and integration testing at the Informatica and database levels.

Created technical specifications, mapping documents and managed test cases.

Participated in development of an estimation tracking tool for level of effort projections.

Gathered information, compiled findings and prepared management reports for staffing levels.

Wrote several complex SQL queries to validate the data conditions as per the mapping document.

Provided data analysis, identified trends, summarized results and generated reports for Card Solutions Delivery reorganization effort.

Extracted data from different sources (Oracle, flat files, XML), loaded it into the Operational Data Store, and tested the same.

Experience in a consolidation MDM environment where customer data was sourced from different systems: SAP R/3, Siebel CRM, and SAP CRM.

Extensively used SQL and PL/SQL Procedures and Worked in both UNIX and Windows environment.

Created UNIX shell scripts for Informatica ETL tool to automate sessions.

Development Environment: Oracle, SQL, PL/SQL, SQL*Plus, Mainframe, Windows 2000/NT/98/95, Microsoft Office 97, Informatica 6.1, UNIX, Perl, Shell Scripting, Selenium, XML

EDUCATION

Bachelor of Technology, GIET, Rajahmundry, India

Master of Science (Computer Science), University of Toledo, Toledo, USA

Master of Business Applications (Strategic Leadership), UC, USA

ACHIEVEMENTS:

Certified SAFe 5 Agilist.

Certified SAFe SSM.

Received Agile Bronze certification from State Street Corporation for scaled (SAFe) teams.

Received Certificate of Appreciation from Bright House Networks.
