
Data SQL Server

Location:
Mason, OH
Salary:
$80/hr
Posted:
May 17, 2019


SHILPA

ac9ekv@r.postjobfree.com

832-***-****

PROFESSIONAL SUMMARY:

Over 9 years of IT experience in analysis and testing of ETL/BI solutions covering data migration, the Hadoop CDS ecosystem, data warehouse/BI reports, and manual testing of client/server applications.

Experience with the Informatica 10.2.0 ETL tool, running ETL sessions and workflows through Workflow Manager and Workflow Monitor, and testing Informatica Web Services applications using SoapUI as a third-party tool.

Led activities from onsite and actively communicated across all teams, including Program Managers, PMs, development, and BA teams, to ensure smooth final user deliverables.

Worked in various domains including health care insurance, finance, tax/audit, manufacturing, retail, e-commerce, and supply chain.

Worked with heterogeneous databases, with expertise in writing complex SQL queries against Teradata, Oracle, and SQL Server and in validating data loaded into data warehouse/ETL applications.

Extensively used Informatica Data Validation Option (DVO) 9.1.2 to validate large volumes of data, file-to-table and table-to-table, using the Single Table, Table Pair, SQL View, and Join View functions.

Extensively used JIRA, breaking epics into stories and sub-tasks and tracking progress on a Scrum board.

Extensively tested the CDS Hadoop ecosystem and the file ingestion process using Ab Initio graphs, from SFTP SOR files to the Hadoop HDFS Conformed zone and from the Conformed zone to the Semantic zone.

Expertise in OLTP/OLAP system study, Slowly Changing Dimension types 1/2/3 (SCD) analysis, E-R modeling, and database schemas such as star and snowflake schemas used in relational, dimensional, and multidimensional modeling.

Strong working experience in data analysis, verification, and validation of ETL applications through backend/database testing.

Tested MSI report drilling options, prompt ordering, defaults, metric calculations, filters, export/print functionality, and formatting properties, and validated the report data at the backend against the target data marts.

Performed Data Quality (DQ) checks to confirm that data was delivered in accordance with quality rules and that all jobs worked as per business requirements.

Experience with database tools such as Teradata SQL Assistant 15.10, TOAD 12.1, and SQL Server 2014 Management Studio for creating and executing SQL queries, and with FileZilla and WinSCP as open-source FTP/SFTP tools for accessing UNIX files.

Validated sales data from POS to Financials in SAP/database, including different tender types at POS: cash, cards, gift cards, mall certificates, payment gateway processing, promotions, discounts, and return and exchange transactions.

Tested functionality of the e-commerce (ETAIL) application from POS interacting with the database.

Extensive experience in black box, functional, integration, system, regression, and performance testing of data warehouse applications.

Involved in preparing traceability matrices based on high-level/detailed requirements and test cases using ALM 12.0, producing QA sign-off documents, and maintaining weekly status reports.

Anthem, Inc. Mason, OH July 2017 – Till Date

Lead ETL QA Analyst

Anthem is one of the country's premier consumer health insurers, offering different kinds of health insurance plans. This project covers ETL data migration of Membership Enrollment for source systems such as ISG, FACETS, and MEDISYS for Medicaid, Original Medicare, and Medicare Advantage plans, modernizing the legacy system on the eHub architecture using Informatica 10.2 to load data from mainframe files to a SQL Server database and finally presenting the data to business end users for reporting in Tableau.

Responsibilities:

Worked closely with the Product Owner to analyze functional design specifications for the ETL data migration applications covering data transformation for ISG (Individual and Small Groups) and the MEDISYS federal program via the modernized AR broker portal.

Validated ISG data for Original Medicare (Part A and Part B) and Medicare Advantage (Part C "bundled" plans including Part A, Part B, and usually Part D) to make sure all mapping transformation logic was applied end to end in the eHub architecture.

Every registration processed in ISG/MEDISYS is stored as mainframe DB2 segments in raw text format, and each segment follows a specific structured layout on the mainframe. The data flows from the source of truth through the RAW -> DOMN -> SDS -> SDS_BLUE layers in eHub.

Followed an iterative Agile/Scrum methodology for rapidly changing or urgent requirements, with parallel development and testing, attending daily standup calls, sprint planning and execution, retrospectives, and sprint reviews.

Built and automated regression tests using the DVO automation tool, which reads table definitions from PowerCenter metadata repositories and checks the data for inconsistencies.

Used the Single Table, Table Pair, SQL View, and Join View objects in DVO to automate data validation across flat files and heterogeneous databases.

Analyzed source-to-target mapping documents covering master data, translate (lookup) tables, and summary (metrics) tables, and validated DDL, counts, not-null constraints, duplicates, PK/FK relationships, and the business logic applied to target tables.
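
A minimal sketch of the kind of SQL checks this involves; table and column names such as MBR_STG and MBR_TGT are hypothetical placeholders, not the actual project schema:

-- Row count reconciliation between source staging and target
SELECT (SELECT COUNT(*) FROM MBR_STG) AS stg_count,
       (SELECT COUNT(*) FROM MBR_TGT) AS tgt_count;

-- Not-null check on a mandatory column
SELECT COUNT(*) AS null_violations
FROM MBR_TGT
WHERE MBR_ID IS NULL;

-- Duplicate check on the natural key
SELECT MBR_ID, COUNT(*) AS cnt
FROM MBR_TGT
GROUP BY MBR_ID
HAVING COUNT(*) > 1;

-- Orphan (FK) check: contract rows pointing to a missing member
SELECT c.CONTRACT_ID
FROM MBR_CONTRACT_TGT c
LEFT JOIN MBR_TGT m ON m.MBR_ID = c.MBR_ID
WHERE m.MBR_ID IS NULL;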

Validated the architecture for migrating millions of records from mainframe files to the SQL Server database, and tested the reverse flow to make sure SCD Type 2 updates were correctly applied back from SQL Server to the mainframe files.
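
For the SCD Type 2 checks, a typical query verifies that each business key has exactly one open (current) row and that effective-date ranges do not overlap; the column names below (CURR_IND, EFF_START_DT, EFF_END_DT) are illustrative assumptions:

-- Each member should have exactly one current row
SELECT MBR_ID, COUNT(*) AS open_rows
FROM MBR_TGT
WHERE CURR_IND = 'Y'
GROUP BY MBR_ID
HAVING COUNT(*) <> 1;

-- History rows for the same member should not have overlapping date ranges
SELECT a.MBR_ID
FROM MBR_TGT a
JOIN MBR_TGT b
  ON a.MBR_ID = b.MBR_ID
 AND a.EFF_START_DT < b.EFF_END_DT
 AND b.EFF_START_DT < a.EFF_END_DT
 AND a.EFF_START_DT <> b.EFF_START_DT;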

Used SQL Server Management Studio to create and execute complex SQL queries for testing the ETL process against the source-to-target mapping document.

Extensively used JIRA 7.2, breaking epics into stories and sub-tasks, tracking progress on a Scrum board, and making sure sub-tasks were closed with work estimates entered.

Tested the complete ETL process for the Member domain tables (Group, GRP Contract, MBR, MBR Contract, and Product) to make sure the ETL mapping was implemented correctly across all ETL layers.

Performed data quality checks on data points to verify the validity of member data, which determines whether records are processed from the Domain layer to SDS based on the Valid Record Indicator (VLD-RCRD-IND).
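
A small sketch of that DQ gate: records flagged invalid in the Domain layer should not appear in SDS. The layer and column names follow the convention described above but are simplified assumptions:

-- Invalid-flagged Domain records that leaked into SDS
SELECT d.MBR_KEY
FROM DOMN_MBR d
JOIN SDS_MBR  s ON s.MBR_KEY = d.MBR_KEY
WHERE d.VLD_RCRD_IND = 'N';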

Tested full and incremental (delta) loads, verifying inserts, updates, and deletes for each table based on the DateLastUpdated timestamp.
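
A hedged sketch of the delta-load check: pick the source rows changed since the previous load and confirm they landed in the target. The control table ETL_LOAD_CONTROL and its LOAD_DT column are placeholders:

-- Source rows changed since the previous load that are missing from the target
SELECT s.MBR_ID
FROM MBR_SRC s
LEFT JOIN MBR_TGT t ON t.MBR_ID = s.MBR_ID
WHERE s.DateLastUpdated > (SELECT MAX(LOAD_DT) FROM ETL_LOAD_CONTROL)
  AND t.MBR_ID IS NULL;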

Environment: SQL Developer, SQL Server Management Studio, Informatica 10.2, Tableau 10.1, JIRA 7.2.3, MS Excel, MS Access 10.0, DVO tool 9.1.2 and Windows XP.

Wells Fargo Bank, N.A, Southfield, MI Jan’17 – June’17

Sr. ETL Tester

CDF DW (Commercial Distribution & Finance) is a data migration project for the North American business Wells Fargo acquired from GE Capital. CDF migrates customers from four global regions: Asia, Australia and New Zealand, Europe and the Middle East, and North America. CDF is built through Informatica, with source data pulled from flat files/DB2/Oracle, staged into Teradata, and loaded into the target access layer (CDF DW, Finance DW, and RISK DW); reports are generated using QlikView/OBIEE for external users.

Responsibilities:

Worked with business/technical analysts to analyze functional design specifications for ETL/data warehouse applications across four global regions: Asia, Australia and New Zealand, Europe and the Middle East, and North America.

Involved in creating test plans and test cases and executing tests based on mapping specs to validate data from source to staging to the target access layer.

Analyzed mapping documents for the FINANCE/RISK/ADHOC applications based on facts and dimensions, and validated DDL, counts, not-null constraints, duplicates, PK/FK relationships, and the business rules applied to target tables.

Tested SCD Type 2 scenarios on application data interfaces, making sure new inserts, updates, deletes, and no-change records were loaded correctly.

Analyzed and tested the BTEQ scripts that load data from staging to the access layer, where the final data is used by end users for reporting.
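
A minimal Teradata-style reconciliation between staging and the access layer, run in Teradata SQL Assistant; STG_INVOICE and ACC_INVOICE are hypothetical names standing in for the actual CDF tables:

-- Rows present in staging but missing from the access layer
SELECT STG.INVOICE_ID
FROM STG_INVOICE STG
WHERE NOT EXISTS (
    SELECT 1
    FROM ACC_INVOICE ACC
    WHERE ACC.INVOICE_ID = STG.INVOICE_ID
);

-- Amount totals should match after the BTEQ load (compare the two results)
SELECT SUM(INVOICE_AMT) AS stg_total FROM STG_INVOICE;
SELECT SUM(INVOICE_AMT) AS acc_total FROM ACC_INVOICE;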

Tested the complete inventory financing process for customers (manufacturers/distributors and dealers), covering invoices, transaction statements/payment terms, and ships/orders inventory, into the access layer (CDF DW, RISK DW, FINANCE DW).

Used the Informatica DVO automation tool to validate source and target data, creating DVO test cases to automate regression validation for file-to-table and table-to-table comparisons.

Involved in preparing UAT test scenarios and test cases to achieve complete requirement and testing coverage.

Wrote complex SQL queries based on mapping rules in the staging area using SQL Server Management Studio and TOAD, and used Teradata SQL to test from staging to the access layer.

Logged project-related issues in the defect tracking tool identified for the project and created a traceability matrix for each entity, recording the test requirement, test headline, and actual and expected results.

Used WinSCP SFTP file transfer to fetch target files from the archive folder on the shared drive after running the target feeds, for comparison against the source.

Followed the Agile/Scrum process, with face-to-face conversations with the project team for frequent product deliveries to end users.

Estimated project requirements, escalated requirement issues to the Project Manager/Test Manager, and escalated application issues to the client.

Used ALM 12.0 as the test management tool to manage and organize requirements coverage, test cases, test scripts, and defect creation.

Environment: Informatica Power Center 9, OBIEE, Oracle 10, SQL Server Management Studio, TOAD for Oracle, ALM 12.0, Teradata SQL Assistant, MS Excel and Windows XP.

JPMorgan Chase & Co., Columbus, OH Dec’13 – Jan’17

(Implementation Partner: TCS)

Onsite Lead, ETL/Hadoop QA Analyst

JPMorgan is a leading global financial services firm and a leader in investment banking, financial services for consumers and small businesses, commercial banking, financial transaction processing, and asset management. The project falls under the CCB LOB for the enterprise-release ICDW application, a central repository where new changes are implemented on top of existing functionality and impact different DW marts such as Customer, Channel, Deposit, CIG, Credit, Compliance, and AML-KYC; business user reports are built from the ICDW mart tables using the Tableau/Cognos reporting tools.

Responsibilities:

Analyzed the functional specification document (FSD), technical design document (TDD), physical data model (PDM), and source-to-target mapping document (STTM) to understand the business process.

Led and coordinated daily with the offshore team; responsible for planning activities, assigning tasks, status reports, and attending onsite calls.

Thoroughly followed the Agile methodology by attending daily standup calls, breakout sessions, sprint planning and execution, PRB sessions, and sprint reviews and retrospectives, and by writing user stories and story cards in Jira.

Wrote complex SQL queries in Teradata SQL Assistant across all three layers to make sure data was populated in the final mart tables as expected per the FSD.

Tested the pre- and post-processing flow in the Hadoop ecosystem using the CAIP architecture, from SOR to the HDFS Conformed zone and from the Conformed zone to the Semantic zone.

Validated the file ingestion manifest/marker file, DML/PI classification, TDQ validations, record counts, column counts, file-to-file and file-to-table data comparisons, and CDC validations across the CAIP flow in the HDFS Conformed and Semantic zones.
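
For the record-count portion, a Hive-style query over the Conformed-zone table can be compared against the count declared in the manifest file, and CDC can be spot-checked between zones; the database, table, and partition names here are assumptions for illustration:

-- Count loaded for one business date in the Conformed zone
-- (compared offline against the record count in the manifest/marker file)
SELECT COUNT(*) AS conformed_cnt
FROM conformed_db.customer_txn
WHERE load_dt = '2016-06-30';

-- Spot-check CDC: keys present in Conformed but missing from Semantic
SELECT c.txn_id
FROM conformed_db.customer_txn c
LEFT JOIN semantic_db.customer_txn s ON s.txn_id = c.txn_id
WHERE c.load_dt = '2016-06-30'
  AND s.txn_id IS NULL;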

Tested the ICDW DWH architecture in the ETL environment across all three layers (a reconciliation sketch follows this list):

1. Validated the load from the landing path to the staging layer by comparing the UNIX files against the stage tables to make sure data was loaded correctly.

2. Validated the data from the staging to the integration layer by applying the transformation logic, filtering out records per the mapping rules and loading them into the integration tables.

3. Finally, validated the data from the integration to the semantic layer, the final mart tables used by business users for analysis and for generating reports and dashboards.
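
A sketch of the layer-to-layer reconciliation referenced above; the staging, integration, and mart table names are illustrative, as is the status filter standing in for the FSD transformation rule:

-- Staging vs. integration: the integration layer should hold only the records
-- that pass the transformation filter from the mapping document
SELECT COUNT(*) AS expected_cnt
FROM STG_ACCT
WHERE ACCT_STATUS_CD <> 'X';

SELECT COUNT(*) AS integration_cnt
FROM INT_ACCT;

-- Integration vs. semantic mart: every integrated account should surface
-- in the final mart table used for reporting
SELECT i.ACCT_ID
FROM INT_ACCT i
LEFT JOIN MART_ACCT m ON m.ACCT_ID = i.ACCT_ID
WHERE m.ACCT_ID IS NULL;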

Performed DDL, negative, count, and data validations based on the ETL mapping logic across the ETL architecture.

Tested SCD Type 2 and full and delta loads of large data volumes for several interfaces, making sure inserts and updates were loaded correctly.

Validated report layout (attribute/metric/header/footer positions), naming conventions, totals and grand totals, drilling options, prompt ordering, prompt defaults, metric calculations, filters, export/print functionality, and formatting properties such as alignment and decimal places, along with performance testing.

Performed ETL testing in both IST and UAT environments, and assisted end users during UAT with how to test, document, and walk through the test results to obtain business sign-off.

Used UNIX commands via PuTTY to fetch DAT/ZIP/GZIP/token files, exported them to the local machine using WinSCP, and compared the data against staging tables.

Tested XML files and verified that the data was parsed and loaded into the staging tables correctly.

Used ALM 11.0 to enter and execute test cases, log defects, generate traceability matrices, and pull defect reports.

Performed end-to-end testing, fetching ETL test data from upstream application front-end systems for the ETL mapping rules and making sure the data loaded correctly into the ICDW tables.

Took part in Triage Meetings with the required parties after defect analysis to prioritize defect resolution.

Environment: Informatica 9.0.1, Teradata SQL Assistant 13.0, UNIX, Putty, Control-M, ALM 11.0, MS Excel, MS Access, WINSCP, MSI.

GRANT THORNTON, Oakbrook Terrace, IL Apr’13 – Nov’13

Sr. ETL QA Analyst

Grant Thornton, one of the largest accounting firms, specializes in audit, tax (managing a company's tax strategy), and advisory services for private and public companies. The SHPS/Mercer project handles the ETL HR Payroll/Lawson application, where source data is pulled from XML/CSV files and the Lawson Payroll application database, loaded into the GTODS stage (SQL Server and Oracle), and then loaded into the target web services DWH environment.

Responsibilities:

Worked with business/technical analysts to analyze T-specs and source-to-target mappings for the ETL/data warehouse data transformation applications.

Thoroughly tested data integration for modules such as Census, Wired Commute, and HR Payroll (employee deductions, partner deductions, employee wellness, employee earnings), where the source is pulled from four different Lawson CSV files, loaded into the GTODS stage database, and finally into the target database.

Ran Informatica jobs (sessions and workflows) through Workflow Manager and Workflow Monitor to test SCD Type 2 changes, checking applied and rejected rows to validate the target data.

For .com applications, validated by selecting the relevant Informatica Web Services, sending requests with the required parameters/inputs, and testing the XML responses using SoapUI.

Validated insert and update scenarios by entering test data in the Lawson Payroll front end and confirming the backend changes per the mapping rules.

Wrote complex SQL queries based on mapping rules in the staging area using SQL Server Management Studio and TOAD, and used TFS as the test management tool to log defects.

Validated full refresh and delta loads, SCD Type 2 scenarios, DDL, counts, not-null constraints, rejected/error logs, duplicates, PK/FK relationships, and the business logic applied to target tables.

Used WinSCP SFTP file transfer to fetch target files from the archive folder on the shared drive after running the target feeds, for comparison against the source.

Involved in verification and validation through functional, regression, and system testing for different PAC applications across integration modules.

Wrote and executed test cases based on source-to-target mapping specs to validate the initial and subsequent data loads.

Followed the Waterfall model, with weekly status updates and defect triage meetings for project updates, working closely with business analysts and developers.

Logged test cases against requirements and defects using Microsoft TFS and MTM as the repository tools.

Environment: Informatica 9.0.1, Web Services, SOAPUI, MS SQL Server Management Studio, TFS, TOAD, MTM, MS Excel, MS Access, WINSCP.

METTLER TOLEDO, Columbus, OH Oct’12 – Mar’13

Sr. ETL QA Analyst

METTLER TOLEDO is a global manufacturer and marketer of precision instruments for laboratory, transport, utilities, industrial, and food retailing applications. The SDW project is built through SSIS packages, with source data pulled from flat files and the SAP MM and SD modules, loaded into a SQL Server target, and reported through BO and SSRS for external users.

Responsibilities:

Worked with business/technical analysts to analyze functional design specifications for the Data Transform and SDW ETL/data warehouse applications.

Analyzed source-to-target mapping documents covering master data, transaction data, translate (lookup) tables, and summary (metrics) tables, and validated DDL, counts, not-null constraints, duplicates, PK/FK relationships, and the business logic applied to target tables.

Tested loading of Source data (Customers, Devices, Manufacturer, Measures, Models, Weighing Configurations, Sensitivity, Linearity, Eccentricity, Technician, Temperature Tolerance, Service Provider) from SAP & Flat Files to SDW SQL Server target system.

Used SQL Server Management Studio for SQL Server in creating & executing SQL queries for testing ETL process based on SRC-TRG mapping document.

Involved in creation of Test Cases & Test Execution based on mapping specs to validate the platform data.

Used Quality Center 10.0 as test management tool to manage and organize Requirements Coverage, Test Cases, Test Scripts and Creation of defects and followed Defect Life Cycle.

Tested full and incremental loads based on PROC_DATE for large data volumes.

Performed functional testing by entering data in the front end to create customers and assign devices through the FSM, MiraCal, and Technician Dashboard applications, and validated the backend data flow against requirements.

Logged project-related issues in the defect tracking tool identified for the project and created a traceability matrix for each entity, recording the test requirement, test headline, and actual and expected results.

Followed an iterative Agile/Scrum methodology for rapidly changing or urgent requirements, with parallel development and testing.

Environment: SQL Server 2008, SQL Server Management Studio, SSIS, Business Objects XIR2, HP Quality Center 10.0, MS Excel, MS Access 10.0 and Windows XP.

LIMITED BRANDS, Columbus, OH Mar’12 – Oct’12

ETL QA Analyst

Limited Brands is a retailer and the parent company of Victoria's Secret, covering VSS (retail) and VSD (etail). The EDW (enterprise DWH) structure is implemented at both the store level and Buy Online Return in Store, with source data pulled from systems such as SAP, PIPE, JDA, MANU, PeopleSoft, flat files, and Teradata, loaded into the Teradata target EDW, and reported through the MSI tool for business users for decision making, analysis, costing, purchases, sales, and dashboards.

Responsibilities:

Gathered both functional (F-specs) and technical (T-specs) specifications from business analysts and worked with developers to understand the ETL mapping applications.

Tested different retail application areas such as Vendor, Customer, Article, Sector, Category, Class, Sub-Class, SKU, Site, geographical area hierarchy (District, Region, Zone), Pricing & Promotions, and Transactions (SKU, DAY, and WEEK).

Used a centralized repository called EMR to locate objects such as tables, views, aggregates, and metrics, and tested the EDW based on the mapping documents.

Worked with heterogeneous source systems, with source data pulled from SAP, PIPE, JDA, MANU, PeopleSoft, flat files, and Teradata and loaded into the Teradata target, comparing results in Excel.

Built and executed SQL queries based on mapping logic in Teradata SQL Assistant, and used HP QC as the test management tool to execute test cases and create defects.

Daily and weekly jobs were scheduled through Control-M, integrated with applications such as EDW, SAP, and Oracle PeopleSoft, for full and partial loads into the target tables.

Validated table structure, negative cases, field-by-field data, and scenario and scheduling validations based on the ETL mapping logic.

Validated POS transaction receipts for SOOS/BORIS against EDW base and aggregate tables to reflect bar codes, location, item description, and shopping transaction details (date, sale/return, payments, unit price/sell amount, merchandise credit, shipping charges, tax, total amount, transaction type code).

Validated sales data from POS to Financials in SAP/database, including different tender types at POS: cash, cards, gift cards, mall certificates, payment gateway processing, promotions, discounts, and return and exchange transactions.
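
A hedged example of reconciling tender-type totals from POS transactions against the financial postings; the table and column names (POS_TNDR_DTL, FIN_SLS_FACT, and the sample date) are placeholders, not the actual EDW schema:

-- Daily totals by tender type: POS detail vs. financial fact, mismatches only
SELECT p.TENDER_TYPE_CD, p.pos_total, f.fin_total
FROM (SELECT TENDER_TYPE_CD, SUM(TENDER_AMT) AS pos_total
      FROM POS_TNDR_DTL
      WHERE TRAN_DT = DATE '2012-08-01'
      GROUP BY TENDER_TYPE_CD) p
JOIN (SELECT TENDER_TYPE_CD, SUM(POSTED_AMT) AS fin_total
      FROM FIN_SLS_FACT
      WHERE TRAN_DT = DATE '2012-08-01'
      GROUP BY TENDER_TYPE_CD) f
  ON f.TENDER_TYPE_CD = p.TENDER_TYPE_CD
WHERE p.pos_total <> f.fin_total;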

Validated SCD Type 2 history handling, inserting a new record and updating the prior one on each change, for base and aggregate tables based on the ETL Row_Update_DTTM stamp.

Validated MSI ad hoc reports by comparing report data with the EDW base tables and the existing MSI dev reports, and by comparing QA against PROD, exporting to Excel for the final comparison between the two environments.

Validated report layout (attribute/metric/header/footer positions), naming conventions, totals and grand totals, drilling options, prompt ordering, prompt defaults, metric calculations, filters, export/print functionality, and formatting properties such as alignment and decimal places, along with performance testing.

Tested the SAP ECC 6.0 to EDW upgrade tables end to end to make sure all functionality and data loaded completely.

Followed Defect Life Cycle and raised the defects based on priority and severity level in QC.

Implemented Agile Process for changing requirements and continuous development and worked closely with Business Analysts and developers.

Environment: DataStage 8.5, MSI Reports, Teradata 13.0, SAP R/3, Quality Center 10.0, Teradata SQL Assistant, Control-M, MS Excel, MS Access

DSW, Columbus, OH July’11 – Mar’12

ETL QA Analyst

DIGITAL RIVER, Eden Prairie, MN Feb’11 – July’11

ETL QA Analyst

UNIVITA HEALTH, Eden Prairie, MN June’09 – Feb’11

Data Migration QA


