
Sr. ETL Tester

Location:
Dallas, TX
Salary:
$55/hr
Posted:
May 07, 2021

Resume:

Divya

Sr. ETL Tester

adl70n@r.postjobfree.com

512-***-****

BACKGROUND SUMMARY:

8+ years of experience in DW/BI testing and development.

Strong QA testing experience in leading the QA effort for large projects from requirements inception to production release.

Strong background working with large data sets in Data Warehousing and Business Intelligence environment.

Good working experience in the technical planning and requirements-gathering phases, including designing, coding, testing, troubleshooting, and documenting engineering software applications.

Collecting and reporting test results through various metrics-reporting tools and communicating them to stakeholders on a regular basis.

Involved in implementing Agile (Scrum) methodologies for complex project modules.

Executing back-end, data-driven test efforts with a focus on data transformations between various source systems and the data warehouse.

Developed and tested applications built with ETL tools, including open-source data integration products.

Strong in ETL data validation in Informatica, Ab Initio, DataStage, and SSIS environments.

Strong in BI report validation for reports developed using Cognos, Business Objects, MicroStrategy, and SSRS.

Experienced in SQL Server 2008/2012 Business Intelligence tools: SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), and SQL Server Integration Services (SSIS).

Migrating data between applications or databases and exporting data from databases to flat files.

Developed Test Cases for Deployment Verification, ETL Data Validation, Cube Testing and Report testing.

Strong debugging, problem-solving, and investigative skills; able to assimilate disparate information (log files, error messages, etc.) and pursue leads to find the root cause of problems.

Firm understanding of functional end-to-end testing, system testing, batch processing, and regression testing.

Strong experience in writing SQL and PL/SQL queries.

Good working experience with databases such as MS SQL Server and Oracle.

Testing and validating results of new or updated data operations/processes.

General programming and debugging knowledge.

Knowledge of test strategies, test cases, procedures, and test scripts.

Knowledge of translating business requirements into test cases.

Knowledge of and experience with Business Intelligence (BI) data warehousing environments.

Demonstrated experience in scripting languages.

Demonstrated analytical experience, such as data analysis, data usage analysis, and business data analysis.

Good knowledge of data warehousing fundamentals, concepts, and models.

Experience with Agile/Scrum methodology.

Good exposure to the onsite/offshore model.

Extensive experience in formulating error-handling mechanisms.

Excellent analytical skills in understanding clients’ organizational structures.

Good knowledge of and experience in Agile and Waterfall methodologies.

Excellent problem-solving skills with a strong technical background; results-oriented team player with excellent communication and interpersonal skills.
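A minimal sketch of the kind of source-to-target ETL validation described above: reconciling row counts between a staging table and the warehouse table after a load. This illustration uses Python with SQLite and hypothetical table names; the actual work was done against Informatica/Oracle/SQL Server environments.

```python
import sqlite3

# Illustrative only: stand-in staging (source) and warehouse (target) tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_customer (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE dw_customer (id INTEGER, name TEXT)")
cur.executemany("INSERT INTO stg_customer VALUES (?, ?)",
                [(1, "A"), (2, "B"), (3, "C")])
cur.executemany("INSERT INTO dw_customer VALUES (?, ?)",
                [(1, "A"), (2, "B"), (3, "C")])

# Row-count reconciliation: the most basic post-load ETL check.
src_count = cur.execute("SELECT COUNT(*) FROM stg_customer").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM dw_customer").fetchone()[0]
assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"
print("counts match:", src_count)
```

In practice this check runs per load batch, and count checks are followed by column-level checksum or set-difference comparisons.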

TECHNICAL SKILLS:

ETL Tools

Informatica PowerCenter 10.x/9.x, SSIS, SSRS

Database

Oracle 12c/11g/10g/9i, SQL Server 2014/2008/2005, TOAD

Data Modelling Tools

Erwin

Languages

SQL, T-SQL, PL/SQL, C, C++

Scripting Languages

UNIX shell scripting (Korn shell, Bash)

Operating Systems

Windows, MS-DOS, Linux, UNIX, Sun Solaris

PROFESSIONAL EXPERIENCE:

PRA Group Inc., Schaumburg, IL Sept ’19 – Present

Sr. ETL Tester

Responsibilities:

Worked closely with software developers/project owners and BAs to develop and execute thorough test suites in all phases of the software development cycle.

Developed Test strategy, test plan/design, execute test cases and defect management for the ETL & BI systems.

Developed and executed detailed ETL related functional, performance, integration and regression test cases, and documented the results.

Analyzed and understood the ETL workflows developed.

Applied quality management methods, tools, and techniques.

Created UNIX shell scripts to access and move data from production to development environment.

Experience in writing stored procedures, packages using joins, cursors, and functions.

Performed analytical data testing for the BI systems.

Validated data transformations and performed End-to-End data validation for ETL & BI systems.

Strong knowledge in ETL data validation developed using Informatica environments.

Strong knowledge on ETL and BI processes.

Communicated timely status, including any potential risks and issues, to the appropriate teams to ensure completion of all deliverables within schedule, budget, and quality constraints.

Experience communicating with and managing management teams and affected stakeholders during the planning and rollout of project releases.

Good experience working in Agile environment.

Environment: Informatica PowerCenter 10.2, Oracle 11g, TOAD/PL/SQL Developer, SQL*Plus, UNIX, Windows, TFS.

CVS Health, Buffalo Grove IL Feb 2017 – Aug 2019

ETL Tester

Description: CVS Caremark, Inc. is one of the leading providers of “prescription benefit services” in the United States. Through its affiliates, it provides comprehensive drug benefit services to health plan sponsors and their plan participants throughout the US. CVS Caremark Rebates System - CVS Caremark negotiates rebates with manufacturers and passes on the benefits to plan sponsors and participants. Rebate System involves processing of claims received from pharmacies through Adjudication Engines to calculate the rebates. Calculation of rebates is a complex process that depends on a wide variety of contract arrangements between PBM, plan sponsors and manufacturers. Formularies also play a critical role in rebates. Inclusion of a drug on formulary can leverage greater manufacturer rebates.

Responsibilities:

Developed, implemented, and maintained quality and test procedures, processes and best practices for QA.

Developed and maintained test plan and test cases with associated test data based upon functional and non-functional requirements.

Identified test automation opportunities to improve efficiency and effectiveness of test services.

Contributed to test automation scripting standards and best practices.

Gained a good understanding of the data architecture and ETL workflows.

Involved in validating various SSIS and SSRS packages according to functional requirements.

Created ETL test data for all ETL mapping rules to test the functionality of the SSIS Mapping.

Tested complex ETL SSIS and SSRS packages and sessions, based on business user requirements and business rules, to load data from different sources.

Developed Test Cases for Deployment Verification, ETL Data Validation, Cube Testing and Report testing.

Wrote SQL scripts to extract data from various databases and systems and compare it against the data warehouse.

Experience in understanding ETL testing requirements.

Good experience in documenting functional and integration acceptance test evidence.

Assisted the clients during the User Acceptance Testing (UAT).

Conducted tests, documented and analyzed test results, and reported findings to development teams.

Reported and documented defects found during test cycles and participated in defect prioritization sessions.

Proficient in writing SQL queries independently, including complex joins.

Experience in root cause analysis and defect diagnosis techniques.

Created and executed end-to-end test plans for our Enterprise Data Warehouse and associated ETLs.
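The source-versus-warehouse comparisons mentioned above can be sketched with a set-difference query, which returns source rows that never reached the target. This is an illustrative Python/SQLite example with hypothetical table names, not the project's actual scripts:

```python
import sqlite3

# Illustrative stand-ins for an extract table and the warehouse table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE dw_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(100, 25.0), (101, 40.0), (102, 15.5)])
cur.executemany("INSERT INTO dw_orders VALUES (?, ?)",
                [(100, 25.0), (101, 40.0)])

# Set difference (EXCEPT; MINUS in Oracle): source rows absent from target.
missing = cur.execute(
    "SELECT order_id, amount FROM src_orders "
    "EXCEPT "
    "SELECT order_id, amount FROM dw_orders").fetchall()
print("rows missing from warehouse:", missing)
```

Running the same query with the tables swapped would surface extra or duplicated rows in the warehouse; together the two directions give a full content comparison.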

Environment: SSIS, SSRS, PL/SQL, Oracle 11g/10g, SQL Developer, TOAD, JIRA, Mainframes (MVS, CICS, JCL, DB2)

Horace Mann - Springfield, IL June ’15 – Jan ’17

Informatica Developer& Tester

Description:

Horace Mann primarily serves the insurance market for educators. Horace Mann has multiple insurance products across lines of business such as Life, Annuity, Group, Auto, and Property. ALG (Annuity, Life & Group) receives policy, premium, and coverage data from McCammish in the form of daily files. McCammish processes the life insurance product data and sends the policy, premium, and coverage data on a daily basis.

The Enterprise Data Warehouse (EDW) is required to hold all policy-related data at one enterprise level to support operational and analytical reporting. Policy data includes policies, coverages, events, relationships, invoices, etc. The EDW was designed to hold both the Life legacy data and the converted Life and Annuity data.

Responsibilities:

Interacted with business users and business analysts to understand reporting requirements and the mapping documentation.

Prepared technical specifications for the development of Informatica (ETL) mappings to load data into various target tables and defining ETL standards.

Created test plans and test cases, performed unit testing, SIT, and UAT, and sent results to the project lead for approval.

Used Informatica to develop objects for extracting, cleaning, transforming and loading data into data warehouse.

Performed system analysis and requirement analysis, and designed and wrote technical documents and test plans.

Worked with Management for creating the requirements and estimates on the Project.

Assisted Business Analyst with drafting the requirements, implementing design and development of various components of ETL for various applications.

Performed planning and requirement analysis for a large-scale centralized Wealth Data Warehouse.

Coordinated with DBA in creating and managing tables, indexes, table spaces and data quality checks.

Developed numerous mappings using the various transformations including Address Validator, Association, Case Converter, Classifier, Comparison, Consolidation, Match, Merge, Parser.

Created Complex ETL Mappings to load data using transformations like Source Qualifier, Sorter, Aggregator, Expression, Joiner, Dynamic Lookup, and Connected and unconnected lookups, Filters, Sequence, Router and Update Strategy.

Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.

Implemented Slowly Changing Dimensions both SCD Type 1 & SCD Type 2.

Developed Test Scripts, Test Cases, and SQL QA Scripts to perform Unit Testing, System Testing.

Communicated with business users to understand their problems, sent workarounds to resolve issues, found root causes, and advised on enhancements where required.

Responsible for tuning of ETL processes.

Performed data analysis and data profiling using complex SQL on various source systems, including Oracle.

Developed numerous Informatica workflows to load data from various upstream systems using different methods, e.g., trigger-based pull, direct pull, and file-based push.

Extracted data stored in a multi-level hierarchy using Oracle Stored Procedures.

Published records to downstream applications from PowerCenter via a message queue.

Designed and developed mappings, sessions, and workflows per requirements and standards.

Created various AutoSys jobs for the scheduling of the underlying ETL flows.

Created the ETL exception reports and validation reports after the data is loaded in the warehouse database.

Used the Informatica debugger to find bugs in existing mappings by analyzing data flow and evaluating transformations, and performed unit testing on individually developed mappings.

Responsible for making changes in the existing configuration wherever required, making customizations according to the business requirements, testing and successfully moving the changes into production.

Modified existing mappings for enhancements of new business requirements.

Translated the PL/SQL logic into Informatica mappings including Database packages, stored procedures and views.

Worked on performance tuning of the mappings, ETL Environments, procedures, queries and processes.
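The SCD Type 2 behavior implemented above can be sketched as follows. This is an illustrative Python model with hypothetical column names, not the project's actual Informatica mappings: when a tracked attribute changes, the current dimension row is expired and a new current row is inserted, preserving history.

```python
from datetime import date

def apply_scd2(dim_rows, incoming, today):
    """SCD Type 2 sketch. dim_rows: list of dicts with keys
    key, city, eff_date, end_date, current (column names are illustrative)."""
    for row in dim_rows:
        if row["current"] and row["key"] == incoming["key"]:
            if row["city"] == incoming["city"]:
                return dim_rows          # attribute unchanged: no action
            row["current"] = False       # Type 2: expire the current version
            row["end_date"] = today
    # Insert the new version as the current row.
    dim_rows.append({"key": incoming["key"], "city": incoming["city"],
                     "eff_date": today, "end_date": None, "current": True})
    return dim_rows

dim = [{"key": 1, "city": "Dallas", "eff_date": date(2020, 1, 1),
        "end_date": None, "current": True}]
dim = apply_scd2(dim, {"key": 1, "city": "Austin"}, date(2021, 5, 7))
print(len(dim), dim[-1]["city"], dim[0]["current"])  # 2 Austin False
```

An SCD Type 1 change, by contrast, would overwrite the attribute in place and keep a single row per key, losing history.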

Environment: Informatica 9.6.1, Oracle 11g, PL/SQL, MS-Excel, MS-Access.

Mahindra Satyam, Hyderabad, India July ’13 – Mar ’15

ETL Developer

Description: UTC Fire & Security, formerly known as GE Security, works to make the world a safer place. It is a market-leading brand delivering a full range of fire safety and security solutions to a diverse customer base around the world.

The Enterprise Data Warehouse contains information from various product areas such as heat, smoke, and CO detection, water blowers, emergency speakers, fire panels, and maintenance requests. Involved in the design, implementation, testing, and maintenance of the UTC Enterprise Data Warehouse.

Responsibilities:

As a team member involved in collecting all the requirements for building the data warehouse.

Involved in Creation and design of required tables, indexes and constraints for all production activities.

Played a key role in designing the application and migrating existing data from relational sources to the corporate warehouse effectively using Informatica PowerCenter.

Responsible for Extracting, Transforming and Loading data from Oracle, Flat files and placing them into targets.

Developed various mappings using Mapping Designer, working with Source Qualifier, Aggregator, Connected and Unconnected Lookup, Filter, and Sequence Generator transformations.

Designed and created an Autosys job plan to schedule our processes.

Involved in Data Modeling and design of Data Warehouse and Data Marts in Star Schema methodology with conformed, granular dimensions and fact tables.

Used SQL for database-related functionality.

Environment: Informatica Power Center 7.1, Erwin, SQL Server 2005, Oracle, UNIX, Autosys.


