Shilpa Marri 715-***-****

Sr. ETL QA Analyst *****.******@*****.***

PROFESSIONAL SUMMARY:

Around 10 years of IT experience in analysis and testing of ETL/BI solutions for data migration, the Hadoop CDS ecosystem, Data Warehouse/BI reports, and manual testing of client/server applications.

Experience using the Informatica 10.2.0 ETL tool, running ETL sessions and workflows through Workflow Monitor, and testing Informatica Web Services applications with SOAPUI as a third-party tool.

Led activities from onsite and actively communicated across all teams with Program/Product Managers, PMs, Development, and BA/System Analysts to ensure a smooth final end-user deliverable.

Worked in various domains including P&C Insurance (Guidewire), Healthcare, Financial, Tax/Audit, Manufacturing, Retail, E-Commerce and Supply Chain.

Strong work experience in Property & Casualty Insurance with Guidewire PolicyCenter, excellent knowledge of the complete policy life cycle, and good knowledge of Guidewire ClaimCenter.

Worked with heterogeneous databases, with expertise in writing complex SQL queries against Teradata, Oracle and SQL Server and in validating data in Data Warehouse/ETL applications.
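
A minimal sketch of the kind of source-to-target comparison query used in this validation work; the table and column names (src_customer, tgt_dim_customer) are hypothetical placeholders, not actual project objects.

-- Rows present in source but missing or different in target
-- (MINUS in Oracle/Teradata; use EXCEPT in SQL Server)
SELECT customer_id, customer_name, status
FROM   src_customer
MINUS
SELECT customer_id, customer_name, status
FROM   tgt_dim_customer;

-- Row-count reconciliation between source and target
SELECT (SELECT COUNT(*) FROM src_customer)     AS src_count,
       (SELECT COUNT(*) FROM tgt_dim_customer) AS tgt_count
FROM   dual;  -- DUAL is Oracle-specific; omit the FROM clause in SQL Server/Teradata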

Extensively used Waterfall and Agile-Scrum methodologies; followed the agile ceremonies Sprint Planning, Daily Scrum, Sprint Review, and Sprint Retrospective, and used JIRA to break epics into stories and sub-tasks and track progress on a Scrum board.

Extensively tested the CDS Hadoop ecosystem and the Informatica file ingestion process from SFTP SOR files to the Hadoop HDFS Conformed Zone, and from the Conformed Zone to the Semantic Zone in Teradata.

Used the Hadoop ecosystem components HDFS and Apache Hive for querying and analyzing datasets stored in Hadoop, using the HUE tool.
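
An illustrative HiveQL query of the sort run through HUE against data landed in HDFS; the database, table and partition names below are hypothetical.

-- Profile a partitioned feed table in the conformed zone
SELECT load_dt,
       COUNT(*)                AS row_cnt,
       COUNT(DISTINCT acct_id) AS distinct_accts
FROM   conformed_zone.member_feed
WHERE  load_dt = '2019-06-30'
GROUP  BY load_dt;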

Expertise in OLTP/OLAP system study, Slowly Changing Dimension types 1/2/3 (SCD) analysis and E-R modeling, and in developing database schemas such as Star and Snowflake schemas used in relational, dimensional and multidimensional modeling.

Strong working experience in data analysis, data verification and validation of ETL applications using backend/database testing.

Tested MSI/Cognos report drilling options, prompt ordering, defaults, metric calculations, filters, export/print functionality and formatting properties, and validated the report data on the backend against the target data marts.

Performed data profiling and data quality (DQ) checks to verify that data is delivered according to quality rules and that all jobs work as per business requirements.
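
A sketch of two typical DQ checks (mandatory-column nulls and natural-key duplicates) behind this kind of profiling; the table and column names are illustrative only.

-- Not-null check on a mandatory business key
SELECT COUNT(*) AS null_key_cnt
FROM   stg_policy
WHERE  policy_nbr IS NULL;

-- Duplicate check on the natural key
SELECT policy_nbr, COUNT(*) AS dup_cnt
FROM   stg_policy
GROUP  BY policy_nbr
HAVING COUNT(*) > 1;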

Experience with database tools such as Teradata SQL Assistant 15.10, TOAD 12.1/Oracle SQL Developer 19.2.1, and SQL Server Management Studio 2017 for creating and executing SQL queries, and with FileZilla and WinSCP as open-source FTP/SFTP tools for accessing Unix files.

Validated sales data from POS to Financials in SAP/database, and validated different tender types on POS: cash, cards, gift cards, mall certificates, payment gateway processing, promotions, discounts, and return and exchange transactions.
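
As an illustration of this kind of POS-to-financials reconciliation, a hedged sketch comparing daily totals by tender type; the table names (pos_sales, sap_fin_postings) and columns are hypothetical.

-- Tender-type totals that do not tie out between POS and the financial postings
SELECT COALESCE(p.tender_type, f.tender_type) AS tender_type,
       p.pos_total,
       f.fin_total
FROM  (SELECT tender_type, SUM(sale_amt) AS pos_total
       FROM   pos_sales
       WHERE  business_dt = DATE '2012-06-30'
       GROUP  BY tender_type) p
FULL OUTER JOIN
      (SELECT tender_type, SUM(post_amt) AS fin_total
       FROM   sap_fin_postings
       WHERE  business_dt = DATE '2012-06-30'
       GROUP  BY tender_type) f
  ON  f.tender_type = p.tender_type
WHERE p.pos_total <> f.fin_total
   OR p.pos_total IS NULL
   OR f.fin_total IS NULL;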

Tested functionality of the e-commerce (ETAIL) application from POS that interacts with the database.

Extensive experience in Black Box, Functional, Integration, System, Regression and Performance testing of data warehouse applications.

Involved in preparing the Traceability Matrix based on high-level/detailed requirements and test cases using ALM 12.0/Azure DevOps, preparing QA sign-off documents, and maintaining weekly status reports.

American Modern Insurance Group, Cincinnati, OH Oct’2019 – Present

Sr. ETL QA Analyst

AMIG is the holding company for a number of subsidiary P&C insurance companies that provide specialty products for the Homeowners and Auto/Recreational Vehicle lines of business. InfoCenter (IC) is a conversion project built through SAP BODS, where source data is pulled from the Guidewire UI, loaded into TRF tables, staged into the Datahub using Oracle, and finally loaded into the target DIM and FACT tables in the EDW. EDW data is consumed to build Premium, Loss, Financial (Billing), Operational and Control reports using Cognos 11 for business analysis.

Responsibilities:

Analyzed the source-to-target mapping document, technical design document (TDD), physical data model (PDM) and DFDs to understand the business process.

Understood the Guidewire UI product structure for the Homeowners/Personal Auto LOBs, following the PolicyCenter lifecycle and how data is loaded into the DIM and FACT tables in the EDW architecture.

Involved in major policy business processes such as creating a quote, policy issuance, endorsements, renewals, cancellations and reinstatements; these processes were tested and evaluated under various business situations that integrate data with other modules/products such as PC (PolicyCenter) and BC (BillingCenter) for Direct/Agency Bill policies.

Tested the conversion data flow from the Guidewire UI, which is loaded into TRF tables, staged into the Datahub using Oracle, and finally loaded into the target DIM and FACT tables in the EDW.

Validated the entire data flow with Oracle SQL queries against the Guidewire PC tables and lookup tables based on the mapping logic, retrieving the user information to cross-validate between the UI and the databases.
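
A hedged example of the style of Oracle lookup-join validation described here; pc_policy_stg, typelist_lkp, dim_policy and their columns are hypothetical placeholders, not actual Guidewire or project table names.

-- Resolve a coded value through a lookup (typelist) table per the mapping
-- and compare it against the target dimension
SELECT s.policy_nbr,
       l.type_desc     AS expected_status,
       d.policy_status AS actual_status
FROM   pc_policy_stg s
JOIN   typelist_lkp  l ON l.type_code  = s.status_code
JOIN   dim_policy    d ON d.policy_nbr = s.policy_nbr
WHERE  l.type_desc <> d.policy_status;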

Performed validations on history and go-forward fixes by preparing data to cover various scenarios, and wrote SQL scripts to verify backdated, current and future transactions.

Validated the EDW migration for truncate-and-load DIM tables and for large volumes of data maintained in delta (SCD) tables, covering history updates, inserts and deletions of records across the Test2/Datemove2/EDWTemp environments.
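
A sketch of one SCD history check applied to such delta tables, assuming hypothetical effective/expiration date columns (eff_dt, expr_dt) and a high-date convention for the current row.

-- More than one open (current) version per business key
SELECT policy_key, COUNT(*) AS open_versions
FROM   dim_policy
WHERE  expr_dt = DATE '9999-12-31'
GROUP  BY policy_key
HAVING COUNT(*) > 1;

-- Overlapping effective-date ranges within the same key
SELECT a.policy_key
FROM   dim_policy a
JOIN   dim_policy b
  ON   b.policy_key = a.policy_key
 AND   b.rowid     <> a.rowid      -- Oracle ROWID; use a surrogate key in other DBs
 AND   a.eff_dt    <  b.expr_dt
 AND   b.eff_dt    <  a.expr_dt;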

Tested the PolicyCenter UI drill-down structure of data at the Policy, PolicyTerm, Transaction, Unit (Dwelling/Vehicle), Coverage/Coverage Term (policy/unit level), Cost object, Taxes & Fees, Producer, Pay Plan, Agency/Direct bill and Premium/Commission amount levels.

Analyzed the mapping document and validated extract files, staging, reference and EDW tables based on DDL, DQ/referential checks and the business logic to be applied to target tables.

Used Serv-U SFTP for inbound and outbound file transfers and validated extract files between the EDW and various third-party vendors/feeds such as Rivington, Alpha, NMU, Bordereau files, Enterprise and GL.

Tested Serv-U SFTP file transfers, fetching the NSM or RIV target extract files from the inbound or outbound folders to compare against Cognos reporting for the Accounting/Premium feeds.

Used Azure DevOps to enter and execute test cases, log defects, create work items for the product backlog, and link related or child work items.

Validated data using DB links pointing to different schemas to compare and confirm that data matches between the SRC and TGT tables.
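
A minimal Oracle sketch of this DB-link comparison pattern; the link name edw_link and the table names are hypothetical.

-- Compare the local target table against the source schema over a DB link
SELECT policy_nbr, premium_amt
FROM   dim_policy
MINUS
SELECT policy_nbr, premium_amt
FROM   dim_policy@edw_link;

-- Quick count check across the link
SELECT (SELECT COUNT(*) FROM dim_policy)          AS tgt_cnt,
       (SELECT COUNT(*) FROM dim_policy@edw_link) AS src_cnt
FROM   dual;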

Per BI requirements, tested the Cognos package by creating dynamic reports and using existing static reports in Cognos Analytics to make sure the respective data matches the EDW.

Checked report layout (attributes/header/footer), drilling options, prompts, defaults, metric calculations, filters, export/print functionality and formatting properties (alignment, decimal places).

Environment: Oracle SQL Developer, Azure DevOps(VSTS), Serv-U, SAP BODS, DataStage, IBM Cognos Analytics 11.0.13, Control-M, MS Excel, MS Access 10.0 and Windows XP.

Anthem, Inc. Cincinnati, OH Aug’2017 – Sep’2019

Sr. ETL QA Analyst

Anthem is one of the country's premier consumer health insurers, offering different kinds of health insurance plans. This project covers ETL data migration of Membership Enrollment for source systems such as ISG and MEDISYS for the Medicaid, Original Medicare and Medicare Advantage plans of the modernized legacy (AR) system, based on an eHub architecture using Informatica 10.2 to load data from mainframe files to a SQL Server database and finally present the data for business end-user reporting using Tableau.

Responsibilities:

Worked closely with the Product Owner in analyzing functional design specifications of the ETL data migration applications for ISG (Individual and Small Group) and the MEDISYS federal program via the modernized AR broker portal.

Validated ISG data for Original Medicare (Part A and Part B) and Medicare Advantage (the Part C "bundled" plans including Part A, Part B and usually Part D) to make sure all mapping transformation logic was applied end-to-end in the eHub architecture.

Every registration process that happens in ISG/MEDISYS is stored as mainframe DB2 segments in raw text format; each segment has a specific structured layout on the mainframe. The data flows from the source of truth through the RAW -> DOMN -> SDS -> SDS_BLUE layers in eHub.

Tested the complete ETL process for the Member domain tables Group, GRP Contract, MBR, MBR Contract and Product to make sure the ETL mapping was properly implemented across all ETL layers.

Followed an iterative Agile-Scrum methodology for rapidly changing or urgent requirements, with parallel development and testing, attending daily standup calls, sprint planning and execution, retrospectives and sprint reviews.

Used the Hadoop ecosystem component Apache Hive for querying and analyzing datasets stored in Hadoop, using the HUE tool.

Built and automated regression tests using the Informatica DVO automation tool, which reads table definitions from PowerCenter metadata repositories, and checked data for inconsistencies using Single Table, Table Pair, SQL and Join Views against flat files and heterogeneous databases.

Analyzed the source-to-target mapping document for master data, translate (lookup) tables and summary (metrics) tables, validating DDL, counts, not-nulls, duplicates, PK and FK constraints, and the business logic to be applied to target tables.
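
A hedged sketch of the duplicate-key and orphan-key (PK/FK) checks behind this kind of validation; mbr, mbr_contract and their key columns are hypothetical.

-- Duplicate primary-key check on the member table
SELECT mbr_id, COUNT(*) AS dup_cnt
FROM   mbr
GROUP  BY mbr_id
HAVING COUNT(*) > 1;

-- Orphaned foreign keys: contracts pointing to a non-existent member
SELECT c.mbr_contract_id, c.mbr_id
FROM   mbr_contract c
LEFT   JOIN mbr m ON m.mbr_id = c.mbr_id
WHERE  m.mbr_id IS NULL;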

Tested full and incremental (delta) loads for inserted/updated/deleted records in each table based on the Date LastUpdated timestamp.
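
One way such a delta-load window can be checked, sketched with hypothetical table names; the etl_load_control audit table and its columns are illustrative assumptions.

-- Source rows changed since the last load that have no equally fresh target row
SELECT s.mbr_id
FROM   src_mbr s
LEFT   JOIN tgt_mbr t
       ON  t.mbr_id        = s.mbr_id
       AND t.last_updt_dtm >= s.last_updt_dtm
WHERE  s.last_updt_dtm > (SELECT MAX(last_load_dtm)
                          FROM   etl_load_control
                          WHERE  table_nm = 'TGT_MBR')
  AND  t.mbr_id IS NULL;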

Validated the architecture for migrating millions of records from mainframe files to the SQL Server database, and tested the reverse flow to make sure SCD Type II updates are properly applied back from SQL Server to the mainframe files as well.

Worked with the TDM team to analyze all the requirements and create test data for the required delta-load validation scenarios based on LAST_UPDT_DTM.

Used SQL Server Management Studio to create and execute complex SQL queries for testing the ETL process based on the SRC-TRG mapping document.

Extensively used JIRA 7.2, breaking epics into stories and sub-tasks, tracking progress on a Scrum board, and making sure sub-tasks were closed with work estimates entered.

Performed data quality checks on data points to verify the validity of member data, which decides whether the data is processed from the Domain layer to SDS, using the available Valid Record Indicator (VLD-RCRD-IND) values.
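
A small illustrative query for this kind of validity-indicator check; the physical table names and the underscored column spelling (vld_rcrd_ind) are assumptions.

-- Distribution of valid vs. invalid records feeding the Domain-to-SDS promotion
SELECT vld_rcrd_ind, COUNT(*) AS rec_cnt
FROM   domn_mbr
GROUP  BY vld_rcrd_ind;

-- Invalid records should not appear downstream in SDS
SELECT COUNT(*) AS invalid_in_sds
FROM   sds_mbr s
JOIN   domn_mbr d ON d.mbr_key = s.mbr_key
WHERE  d.vld_rcrd_ind = 'N';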

Environment: SQL Developer, SQL Server Management Studio, IP Switch, MONGO DB, STUDIO 3T, HIVE, HUE, Putty, Informatica 10.2, JIRA 7.2.3, MS Excel, DVO tool 9.1.2 and Windows XP.

Wells Fargo Bank, N.A, Southfield, MI Jan’17 – Jul’17

Sr. ETL/Hadoop Tester

CDF DW (Commercial Distribution & Finance) is a data migration project for the North American business Wells Fargo acquired from GE Capital. CDF migrates customers from four global regions: Asia; Australia and New Zealand; Europe and the Middle East; and North America. CDF is built through Informatica, where source data is pulled from flat files/DB2/Oracle, staged into Teradata, and finally loaded into the target access layers CDF DW/Finance DW/Risk DW, with reports generated using QlikView/OBIEE for external users.

Responsibilities:

Worked with Business/Technical Analysts in analyzing functional design specifications of ETL/data warehouse applications for four global regions: Asia; Australia and New Zealand; Europe and the Middle East; and North America.

Involved in creating test plans and test cases and in test execution based on mapping specs to validate data from source to staging to the target access layer.

Analyzed mapping documents for the FINANCE/RISK/ADHOC applications based on facts and dimensions, validating DDL, counts, not-nulls, duplicates, PK and FK constraints, and the business rules to be applied to target tables.

Tested SCD Type II scenarios on application data interfaces, making sure new inserts, updates, deletes and no-change records are loaded correctly.
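
A hedged sketch of classifying incoming records as insert, update or no-change against the current target version; the staging and dimension table names and the current-row indicator are illustrative.

-- Classify staged rows against the current target version
SELECT s.acct_id,
       CASE
         WHEN t.acct_id IS NULL              THEN 'INSERT'
         WHEN s.acct_status <> t.acct_status
           OR s.credit_line <> t.credit_line THEN 'UPDATE'
         ELSE 'NO CHANGE'
       END AS expected_action
FROM   stg_account s
LEFT   JOIN dim_account t
       ON  t.acct_id      = s.acct_id
       AND t.curr_row_ind = 'Y';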

Validated counts and JSON structure using Swagger sample API responses, and compared data from the source Oracle DB to the Hive/MongoDB targets using queries in the Studio 3T tool.

Analyzed and tested the BTEQ scripts that load data from the staging to the access layer, where the final data is used by end users for reporting.

Tested the complete inventory financing process, from customers such as manufacturers/distributors and dealers, invoices, transaction statements/payment terms and shipped/ordered inventory, through to the access layers CDF DW/RISK DW/FINANCE DW.

Experience using the Informatica DVO automation tool to validate source and target data; created test cases in DVO to automate regression validation for file-to-table and table-to-table comparisons.

Participated in business meetings and was involved in preparing UAT test scenarios and test cases to achieve complete requirement and testing coverage.

Wrote complex SQL queries based on mapping rules in the staging area using SQL Server Management Studio and TOAD, and used Teradata to test from the staging to the access layer.

Logged project-related issues in the defect tracking tool identified for the project, and created a Traceability Matrix for each entity, recording the test requirement, test headline, and actual and expected results.

Used WinSCP SFTP file transfer to fetch target files from the archive folder on the shared drive after running the target feeds, to compare against the source.

Followed the Agile-Scrum methodology, with face-to-face conversations with the project team for frequent deliveries of the product to end users.

Estimated project requirements and escalated requirement issues to the Project Manager/Test Manager, as well as escalating application issues to the client.

Used ALM 12.0 as Test management tool to manage and organize Requirements Coverage, Test Cases, Test Scripts and Creation of defects.

Environment: Informatica Power Center 9, OBIEE, Oracle 10, SQL Server Management Studio, TOAD for Oracle, ALM 12.0, Teradata SQL Assistant, MS Excel and Windows XP.

JPMorgan Chase & Co., Columbus, OH Dec’13 – Jan’17

(Implementation Partner: TCS)

Onsite Lead, ETL/Hadoop QA Analyst

JPMorgan is one of the leading global financial services firms and a leader in investment banking, financial services for consumers and small businesses, commercial banking, financial transaction processing and asset management. The project falls under the CCB LOB for the Enterprise Release ICDW application, a central repository where new changes are implemented on top of existing functionality with impacts across the different DW marts such as Customer, Channel, Deposit, CIG, Credit, Compliance and AML-KYC; business user reports are built from the ICDW mart tables using the Tableau/Cognos reporting tools.

Responsibilities:

Analyzed the functional specification document (FSD), technical design document (TDD), physical data model (PDM) and source-to-target mapping document (STTM) to understand the business process.

Led and coordinated with the offshore team on a daily basis, responsible for planning activities, assigning tasks, status reports and attending onsite calls.

Thoroughly followed the Agile methodology by attending daily standup calls, breakout sessions, sprint planning and execution, PRB sessions, sprint reviews and retrospectives, and writing user stories and story cards in Jira.

Expertise in writing complex SQL queries in Teradata SQL Assistant across all three layers to make sure data is populated into the final mart tables as expected per the FSD.
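
A small Teradata-flavoured example of the kind of layer-to-mart check written in SQL Assistant; the database, table and column names are hypothetical.

-- Latest record per account in the semantic/mart layer (Teradata QUALIFY)
SELECT acct_id, acct_status, load_dt
FROM   icdw_mart.acct_dim
QUALIFY ROW_NUMBER() OVER (PARTITION BY acct_id ORDER BY load_dt DESC) = 1;

-- Counts reconciled between the integration and semantic layers
SELECT (SELECT COUNT(*) FROM icdw_int.acct)      AS int_cnt,
       (SELECT COUNT(*) FROM icdw_mart.acct_dim) AS mart_cnt;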

Tested the pre/post-processing flow in the Hadoop ecosystem using the CAIP architecture, from the SOR to the HDFS Conformed Zone and from the Conformed Zone to the Semantic Zone in Teradata.

Validated the file ingestion manifest/marker files, DML/PI classification, TDQ validations, record counts, column counts, file-to-file and file-to-table data validations, and CDC validations across the CAIP flow in the HDFS Conformed and Semantic Zones.
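
A hedged Hive-style sketch of the record-count check against an ingestion manifest; the ingest_manifest table, feed names and columns are hypothetical.

-- Compare the record count declared in the manifest with the rows actually landed
SELECT m.feed_nm,
       MAX(m.expected_rec_cnt) AS manifest_cnt,
       COUNT(f.rec_id)         AS landed_cnt
FROM   ingest_manifest m
LEFT   JOIN conformed_zone.member_feed f
       ON  f.feed_nm = m.feed_nm
       AND f.load_dt = m.load_dt
WHERE  m.load_dt = '2015-06-30'
GROUP  BY m.feed_nm
HAVING MAX(m.expected_rec_cnt) <> COUNT(f.rec_id);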

Tested the ICDW data warehouse architecture in the ETL environment across all three layers:

1. Validated the landing path to the staging layer by comparing the Unix files against the stage tables to make sure data is loaded correctly.

2. Validated the data from the staging to the integration layer by applying the transformation logic, filtering out records and loading into tables.

3. Finally, validated the data from the integration to the semantic layer, the final mart tables used by business users for data analysis and for generating reports and dashboards.

Performed DDL, negative, count and data validations based on the ETL mapping logic across the ETL architecture.

Tested SCD Type II and full and delta loads of large volumes of data across several interfaces, making sure inserts and updates are loaded correctly.

Validated report layout (attribute/metric/header/footer positions), naming conventions, totals and grand totals, drilling options, prompt ordering, prompts, defaults, metric calculations, filters, export/print functionality, formatting properties such as alignment and decimal places, and report performance.

Performed ETL testing in both IST and UAT environments, and assisted end users during UAT on how to test and document, walking through the test results to get business sign-off.

Expertise in using UNIX commands in PuTTY to fetch the DAT/ZIP/GZIP/token files, export them to a local server using the WinSCP tool, and compare the data against staging tables.

Involved in testing the XML files and checking whether data is parsed and loaded into the staging tables correctly.

Worked with the ALM 11.0 tool to enter and execute test cases, log defects, generate the Traceability Matrix and fetch defect reports.

Involved in end-to-end testing, fetching ETL test data from the upstream application front-end systems per the ETL mapping rules and making sure records are loaded correctly into the ICDW tables.

Took part in Triage Meetings with the required parties after defect analysis to prioritize defect resolution.

Environment: Informatica 9.0.1, Teradata SQL Assistant 13.0, UNIX, Putty, Hadoop, HIVE, HUE, Control-M, ALM 11.0, MS Excel, MS Access, WINSCP, MSI.

GRANT THORNTON, Oakbrook Terrace, IL Apr’13 – Nov’13

Sr. ETL QA Analyst

Grant Thornton is one of the largest accounting firms, specializing in Audit, Tax (managing a company's tax strategy) and Advisory, providing different services to private and public companies. The SHPS/Mercer project handles the ETL HR Payroll/Lawson application, where source data is pulled from XML/CSV files and the Lawson payroll application database, loaded into the GTODS staging area in SQL Server and Oracle, and loaded into the target web services DWH environment.

Responsibilities:

Worked with Business/Technical Analysts in analyzing T-Specs and source-to-target mappings for the Data Transform ETL/data warehouse applications.

Thoroughly tested data integration for different modules such as Census, Wired Commute and HR Payroll (Employee Deductions, Partner Deductions, Employee Wellness, Employee Earnings), where the source is pulled from four different Lawson CSV files, loaded into the GTODS staging DB and finally into the target DB.

Ran the Informatica jobs (sessions and workflows) through Workflow Manager or Workflow Monitor to test SCD Type II, checking applied and rejected rows to validate the target data.

For .COM applications, validated by selecting valid Informatica Web Services, sending requests with the invoked parameters/inputs, and testing the response in XML format using SOAPUI.

Validated by entering test data in the front-end Lawson payroll system to confirm that the changes made for insert and update scenarios are reflected in the backend as per mapping rules.

Wrote complex SQL queries based on mapping rules in the staging area using SQL Server Management Studio and TOAD, and used TFS as the test management tool to log defects.

Validated full refresh and delta loads, SCD Type II scenarios, DDL, counts, not-nulls, rejected error logs, duplicates, PK and FK constraints, and the business logic to be applied to target tables.

Used WinSCP SFTP file transfer to fetch target files from the archive folder on the shared drive after running the target feeds, to compare against the source.

Involved in verification and validation through functional, regression and system testing of the different PAC applications across integration modules.

Involved in writing test cases and executing tests based on the source-to-target mapping specs to validate the initial and subsequent loads of data.

Followed the Waterfall model, with weekly status updates and defect triage meetings for project updates and close collaboration with Business Analysts and developers.

Logged test cases against requirements and defects using Microsoft TFS and MTM as repository tools.

Environment: Informatica 9.0.1, Web Services, SOAPUI, MS SQL Server Management Studio, TFS, TOAD, MTM, MS Excel, MS Access, WINSCP.

METTLER TOLEDO, Columbus, OH Oct’12 – Mar’13

Sr. ETL QA Analyst

METTLER TOLEDO is a global manufacturer and marketer of precision instruments for laboratory, transport, utilities, industrial and food retailing applications. The SDW project is built through SSIS packages, where source data is pulled from flat files and the SAP MM and SD modules and loaded into the SQL Server target, with reports generated using BO and SSRS for external users.

Responsibilities:

Worked with Business/Technical Analysts in analyzing functional design specifications of the Data Transform and SDW ETL/data warehouse applications.

Analyzed the source-to-target mapping document for master data, transaction data, translate (lookup) tables and summary (metrics) tables, validating DDL, counts, not-nulls, duplicates, PK and FK constraints, and the business logic to be applied to target tables.

Tested loading of Source data (Customers, Devices, Manufacturer, Measures, Models, Weighing Configurations, Sensitivity, Linearity, Eccentricity, Technician, Temperature Tolerance, Service Provider) from SAP & Flat Files to SDW SQL Server target system.

Used SQL Server Management Studio for SQL Server in creating & executing SQL queries for testing ETL process based on SRC-TRG mapping document.

Involved in creation of Test Cases & Test Execution based on mapping specs to validate the platform data.

Used Quality Center 10.0 as test management tool to manage and organize Requirements Coverage, Test Cases, Test Scripts and Creation of defects and followed Defect Life Cycle.

Tested full and incremental loads based on PROC_DATE for large amounts of data.

Performed functional testing by entering data in the front end to create customers and assign devices through the FSM, MiraCal and Technician Dashboard applications, and validated the backend data flow against requirements.

Logged project-related issues in the defect tracking tool identified for the project, and created a Traceability Matrix for each entity, recording the test requirement, test headline, and actual and expected results.

Followed an iterative Agile-Scrum methodology for rapidly changing or urgent requirements, with parallel development and testing.

Environment: SQL Server 2008, SQL Server Management Studio, SSIS, Business Objects XIR2, HP Quality Center 10.0, MS Excel, MS Access 10.0 and Windows XP.

LIMITED BRANDS, Columbus, OH Mar’12 – Oct’12

ETL QA Analyst

DSW, Columbus, OH July’11 – Mar’12

ETL QA Analyst

UNIVITA HEALTH, Eden Prairie, MN June’09 – Jun’11

Data Migration QA


