
Data Analyst

Location:
Des Moines, IA
Salary:
98,000
Posted:
November 19, 2020


Resume:

PROFESSIONAL SUMMARY

●Total IT experience of ** years in the Data Warehouse field, across domains including Insurance (7 yrs), Investment Banking (3 yrs), and Finance (3 yrs).

●Functioning as a Sr. Data Quality Analyst (ETL) with key responsibilities including estimation, data profiling, reviewing and working on project documents (Business Requirement Document, Functional Specification Document, Mapping Documents), writing test cases and scenarios for business users (UAT), drafting complex SQL scripts, analyzing source data using scripting languages (UNIX), and identifying/logging/tracking bugs.

●10 years of experience in Data Analysis, Data Profiling, Data Validation, Data Cleansing, Data Verification, and data-mismatch reporting, maintaining data quality during User Acceptance Testing and working with stakeholders.

●10 years of experience querying databases by developing and running SQL queries that implement business transformation logic, using utilities such as Toad, Golden-32, and Aqua Data Studio.

●5 years of experience with Informatica (ETL/MDM tool): designing, developing, optimizing, and supporting MDM processes, including performance tuning of existing processes.

●Hands-on experience with the Informatica Analyst tool for data integration, mapping, maintaining standards per the business, and analyzing source data for quality and trends.

●Involved in formal reviews and walkthroughs for preparing UAT test plans and test scripts.

●Reviewing and analyzing FSDs (Functional Specification Documents): FSD review involves understanding and validating the design and technical implementation of business requirements, including verifying that all business requirements are covered in the functional design.

●Reviewing, validating, verifying, and analyzing the logic in STM (Source-to-Target Mapping) documents.

●Writing SQL queries to validate data in SQL Server and Oracle databases for back-end testing.

●Using the Informatica MDM suite of tools to design, develop, optimize, and support MDM processes, including performance tuning of existing processes.

●Manage the partnership with Data Integration Hubs around data modeling, data mappings, data validation, hierarchy management, and overall security. Provide continuous enhancement and review of MDM matching rules, data quality, and validation processes.

●Experience analyzing source data using scripting languages (Python/UNIX, and with Selenium).

●11 years of experience in agile methodologies.

●3 years of experience with reporting tools such as Qlik Sense & MicroStrategy Dashboard.

●6 years of experience in Test Case Management using HP Quality Center

●Test Case Management using IBM ClearQuest.

●7 years of experience on working as liaison between onsite and offshore resources.

●10 years of experience in different Databases like Oracle, Teradata & SQL Server.

●Expertise in Test Case Design, Test Tool Usage, Test Execution, and Defect Management.

●Expertise in Testing complex Business rules by creating mapping and various transformations

●Experience in Performance Tuning of SQL.

●Good analytical, communication, interpersonal, and written skills.

EDUCATIONAL QUALIFICATION

●Bachelor of Electronics and Telecommunication from North Maharashtra University

(Equivalent to a 4-year Bachelor’s degree in Computer Science in the USA)

●1-year course from Maples Institute covering programming languages (Java, Python, Mainframe, CICS, COBOL, SQL, UNIX shell scripting).

Certifications:

●Scrum Master certification (Alliance)

●Certified by Maples on Mainframe (JCL, COBOL, DB2, CICS and TSOTEST) – (IBM).

●LOMA 280 certification (Insurance domain) – (Assurant Health Insurance).

TECHNICAL SKILLS

Languages

SQL, Python, PL/SQL, Shell Scripting on UNIX, Mainframe, JCL, COBOL, CICS, DB2, Java for Selenium

Databases

Oracle (10g,11g), SQL Server 2008, IMS-DB (Mainframe), Teradata

Tools

Oracle - PL/SQL, Oracle SQL Developer, Toad, Informatica (MDM, Workflow, Analyst), CA Workload Control Center, HP Quality Center, ClearQuest, File-AID, SPUFI (Mainframe), TSOTEST, VISION+ (for Banking domain)

Reporting Tool

Tableau, Qlik Sense, MicroStrategy Dashboard & SSRS

PROFESSIONAL EXPERIENCE:

Client: Sammons Financial Group, Des Moines

Role: Sr. Data analyst/ETL –Data warehouse

Current Project

Project:

Line of Business - Life & Annuity (Reports for LIMRA for market evaluation)

Domain:

Insurance

Team Size:

5

Environment/Tool:

SQL Server 2016, IMS-DB (Mainframe as Source for HUB), CICS Mainframe screens, SSRS reports, XML, Informatica for ETL & Data profiling.

Project Description:

As part of this project, the data team creates reports (XML & Qlik Sense) on a monthly basis and sends them to LIMRA (a third-party vendor) for market evaluation, producing the end report for Sammons Financial Group on maintaining global insurance guidance.

This is an existing process; however, from January 2021, Sammons will send additional data in the reports: Coverage, Benefits, Sub-accounts, and Life Participant details.

Responsibilities:

●Analyzed business requirements and performed data profiling and data mining.

●Focused on Data Quality issues / problems that include completeness, conformity, consistency, accuracy, duplicates, and integrity.

●Performed data analysis to determine the content, quality, and structure of data sources, and monitored data quality trends using Informatica MDM/Analyst & SQL.

●Working closely with the Business Analysts, Source Systems Analysts, and Developers to define and validate the transformation business rules.

●Understanding of database designs such as Star Schema and Snowflake Schema and the relationship between fact and dimension tables.

●Involved in developing detailed use cases for Functional and Regression Testing.

●Wrote SQL queries for each UAT Test case to validate the data between Source system, Data Warehouse, Data Marts and Reports.

●Conducted work sessions on sample data with Business/QA.

●Validated SSRS reports, making sure data is extracted correctly from the Data Warehouse.

●Validation of Data from Source to Target.

●Designed, Developed and Executed Test Scenarios.

●Managed customer relationship with effective communication.
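The source-to-target validation described in the bullets above is commonly implemented with set-difference SQL. A minimal, runnable sketch using SQLite as a stand-in for the actual source and warehouse databases (the policy table and column names are illustrative, not the client's real schema):

```python
import sqlite3

# In-memory stand-ins for a source system table and its warehouse target.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_policy (policy_id TEXT, premium REAL);
CREATE TABLE dw_policy  (policy_id TEXT, premium REAL);
INSERT INTO src_policy VALUES ('P1', 100.0), ('P2', 250.0), ('P3', 75.0);
INSERT INTO dw_policy  VALUES ('P1', 100.0), ('P2', 999.0);  -- P2 mismatched, P3 missing
""")

def source_minus_target(con):
    """Rows present in the source but absent or different in the target."""
    return con.execute("""
        SELECT policy_id, premium FROM src_policy
        EXCEPT
        SELECT policy_id, premium FROM dw_policy
    """).fetchall()

# Each returned row is a source-to-target validation failure.
mismatches = source_minus_target(con)
print(mismatches)
```

Running the same query in both directions catches both missing and extra rows; in Oracle the equivalent operator is MINUS rather than EXCEPT.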

PROFESSIONAL EXPERIENCE:

Client: FHLB, Des Moines

Role: Sr. IT QA Data analyst/ETL –Data warehouse

Project:

STARS - HUB LEVERAGE & DWH3

Domain:

Banking – (Data warehouse – Simcorp/Advances/Investments/Borrowing/Safekeeping & Capital Stocks in HUB)

Team Size:

4

Environment:

SQL Server 2016, IMS-DB (Mainframe as Source for HUB), CICS Mainframe screens, SSRS reports

Project Description:

Project is mainly the enhancement of different modules like Advances/Investment/Borrowing/Safekeeping & Capital Stocks in HUB.

Also, this includes implementing a correct data warehouse to give the business a single point of source for SSRS reporting.

Responsibilities:

●Analyzed Business Requirements, design documents and Technical Specifications document.

●Focused on Data Quality issues / problems that include completeness, conformity, consistency, accuracy, duplicates, and integrity.

●Performed data analysis to determine the content, quality, and structure of data sources, and monitored data quality trends using SQL Profiler.

●Worked closely with the Business Analysts, Systems Analysts, Developers, and DBAs to solve the issues identified during the testing process in a timely manner.

●Involved in developing detailed test plan, test cases and test scripts using Quality Center for Functional and Regression Testing.

●Understanding of the database designs like Star Schema and Snowflake Schema and relationship between Fact and Dimension tables

●Wrote SQL queries for each Test case to validate the data between Source system, Data Warehouse, Data Marts and Reports.

●Validated SSRS reports, making sure data is extracted correctly from the Data Warehouse.

●Ran SQL queries to verify the number of records and validated the referential integrity rules as per the design specifications.

●Validation of Data from Source to Target.

●Designed, Developed and Executed Test Scenarios.

●Managed customer relationship with effective communication.
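The record-count and referential-integrity checks above reduce to two short queries. A hedged sketch with SQLite stand-ins (the member/advance tables are illustrative, not the client's actual model):

```python
import sqlite3

# Stand-in dimension and fact tables (illustrative schema).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_member (member_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_advance (advance_id INTEGER, member_id INTEGER, amount REAL);
INSERT INTO dim_member VALUES (1, 'Bank A'), (2, 'Bank B');
INSERT INTO fact_advance VALUES (10, 1, 5000), (11, 2, 7500), (12, 3, 900);  -- member 3 has no dimension row
""")

# Record-count reconciliation: compare against the count expected from the source.
fact_count = con.execute("SELECT COUNT(*) FROM fact_advance").fetchone()[0]

# Referential integrity: fact rows whose member_id has no matching dimension row.
orphans = con.execute("""
    SELECT f.advance_id, f.member_id
    FROM fact_advance f
    LEFT JOIN dim_member d ON d.member_id = f.member_id
    WHERE d.member_id IS NULL
""").fetchall()

print(fact_count, orphans)
```

The LEFT JOIN / IS NULL pattern flags orphaned fact rows even when the database does not enforce foreign keys, which is the usual situation in a warehouse load.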

Client: DuPont Pioneer, Des Moines

Role: Sr. ETL Lead – Web-based & Data warehouse

Project:

TRAID DWH

Domain:

Data warehouse – Data centralizing into warehouse from research apps.

Team Size:

12 (2 Onshore and 10 Offshore)

Environment:

Oracle, SQL, UNIX, Shell Scripting, MicroStrategy dashboard testing

Project Description:

The Double Haploid Dashboard project aims to define an end-to-end reporting solution that pulls data from different systems and tracks the progress of Double Haploids in the field and in the lab (data centralizing into the DM). The dashboard provides details to different audiences based on DH requests and performance.

Responsibilities:

●Analyzed Business Requirements, design documents and Technical Specifications document.

●Worked closely with the Business Analysts, Systems Analysts, Developers, and DBAs to solve the issues identified during the testing process in a timely manner.

●Involved in developing detailed test plan, test cases and test scripts using Quality Center for Functional and Regression Testing.

●Created the QA Solution Design Framework using shell scripting, which takes SQL and PL/SQL scripts as input and runs them against the required database, producing results and logs for each run in output and log files respectively.

●Understanding of database designs such as Star Schema and Snowflake Schema and the relationship between fact and dimension tables.

●Focused on Data Quality issues / problems that include completeness, conformity, consistency, accuracy, duplicates, and integrity.

●Wrote SQL queries for each Test case to validate the data between Source system, Data Warehouse, Data Marts and Reports.

●Worked on MicroStrategy reports: report layout, naming conventions, and matrix calculations.

●Ran SQL queries to verify the number of records and validated the referential integrity rules as per the design specifications.

●Validation of Data from Source to Target.

●Designed, Developed and Executed Test Scenarios.

●Managed customer relationship with effective communication.

Client: Assurant Health, Milwaukee, WI

Role: Sr. QA lead

Project:

HIX, Health Exchanges IM Policies

Domain:

Insurance domain

Team Size:

8 (1 Onshore and 7 Offshore)

Environment:

Mainframe, Oracle, SQL, UNIX, Shell Scripting, Informatica (job flow)

Project Description

Assurant Health, being in the insurance industry, is mandated to comply with the Patient Protection and Affordable Care Act health reforms. Assurant Health is planning to re-align its IT applications to achieve platform consolidation, so that it can get ready to implement the new health reforms and also drastically reduce IT infrastructure/maintenance spending.

A health insurance exchange is a set of government-regulated and standardized health care plans in the United States, from which individuals may purchase health insurance eligible for federal subsidies. All exchanges must be fully certified and operational by January 1, 2014, under federal law.

Responsibilities:

●This project created a new warehouse to maintain customer information. As a QA Engineer, I ran the test cases for the end-to-end process in that data warehouse.

●Worked with Business Analysts to define testing requirements to satisfy the business objectives.

●Analyzing data stage platform for source files.

●Responsible for creating complete test cases, test plans, test data, and reporting status ensuring accurate coverage of requirements and business processes.

●Created the QA Solution Design Framework using shell scripting, which takes SQL and PL/SQL scripts as input and runs them against the required database, producing results and logs for each run in output and log files respectively.

●Used online mainframe screens to post claims transactions for the new form# and capture the results in mainframe flat files as part of test data.

●Coordinated with onshore Team for Understanding the Requirement and Test Data Creation.

●Created Informatica mappings for generic validation and loading of source files from the Flexcube SOR.

●Validation of Data from Source to Target.

●Designed, Developed and Executed Test Scenarios.

●Validated conditional formatting and thresholds in MicroStrategy reports.

●Managed customer relationship with effective communication.

Client: Assurant Health, Milwaukee, WI

Role: ETL/DWH Tester

Project:1

New Business and Current Business / AH

Domain:

Insurance domain

Team Size:

14 (4 Onshore and 10 Offshore)

Environment:

Mainframe, Informatica (for running workflows), MySQL, UNIX, CICS online Mainframe system

Project Description

New Business issues policies for individuals and small groups at different metallic levels. Current Business modifies existing individual and small-group policies per the new metallic levels. Newly issued/modified policies provide essential health benefits and are suitable for purchase through health insurance exchanges.

The primary objective of this project is to construct and aggregate the data warehouse.

Responsibilities:

●Supported the extraction, transformation, and load (ETL) process for a Data Warehouse from legacy systems using Informatica, and conducted walkthroughs of the data stage process.

●Drove mapping walkthroughs for Assurant SMEs and business representatives.

●Participated in Solution Design.

●Coordinated with Offshore Team for Understanding the Requirement and Test Data Creation.

●Tested reports designed by end users (MicroStrategy testing).

●Designed, Developed and Executed Test Scenarios.

●Interacted with client to gather the requirements and implement the same in the project

●Managed customer relationship with effective communication.

●Developed UNIX scripts to validate the flat files and to automate the manual test cases.

●Distributed Integrity Manager results to any number of users, developers, or administrators.
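Flat-file validation scripts like the UNIX ones mentioned above typically check the field count of each record and reconcile a trailer count against the detail rows. A minimal sketch (the H/D/T record layout is an illustrative convention, not the actual file spec):

```python
def validate_flat_file(lines, expected_fields, delimiter="|"):
    """Validate a delimited flat file: each detail ('D') record must have the
    expected field count, and the trailer ('T|<count>') must match the number
    of detail records. The H/D/T layout is a hypothetical convention."""
    errors, detail_count = [], 0
    for lineno, line in enumerate(lines, start=1):
        record = line.rstrip("\n")
        if record.startswith("D"):
            detail_count += 1
            if record.count(delimiter) != expected_fields - 1:
                errors.append(f"line {lineno}: bad field count")
        elif record.startswith("T"):
            declared = int(record.split(delimiter)[1])
            if declared != detail_count:
                errors.append(f"trailer declares {declared}, found {detail_count}")
    return errors

# A well-formed file produces no errors; a short record or wrong trailer count is flagged.
good = ["H|20130101", "D|P1|100", "D|P2|250", "T|2"]
bad  = ["H|20130101", "D|P1", "T|2"]
print(validate_flat_file(good, expected_fields=3))  # []
print(validate_flat_file(bad, expected_fields=3))
```

The same structure automates a manual test case: feed the file, assert an empty error list.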

Project:2

PBM/ASSURANT HEALTH/PHARMACY

Domain:

Insurance Domain

Team Size:

10

Environment:

Oracle 11g, ClearQuest, Mainframe, Informatica (mapping flows), PL/SQL

Project Description

New PBM CVSC introduction letters have to be sent to the active Group Account holders. GMS, as the source system, has to create one-shot activities to identify active account holders and process the initial mailing. GMS will use an input-file extract to create the initial one-shot ID card mailing; the criteria for using the input-file extract are yet to be defined.

The one-shot activity to create the new PBM Rx ID card request to Expression should have a transaction effective date of 01/01/2013.

Responsibilities:

●Attended client calls for business understanding and clarifications.

●Created test case scenarios, executed test cases, and maintained defects in ClearQuest; validated ETL graphs from source to target.

●Extensive work in ClearQuest as a defect-tracking tool, for logging defects and defect reporting.

●Reviewed the Business Requirement Document, Functional Specification Document, migration analysis document, and Mapping Documents.

●Compared data sets from the Oracle and SQL Server databases using the SQL Developer tool for data analysis.

●Queried databases using SQL to validate the data loaded into target tables.

●Reported bugs and tracked defects using ClearQuest (CQ).

●Thorough validation of all three layers of the whole ETL flow: Landing, Integration, and Semantic. Performed the tests in the SIT, QA, and contingency/backup environments.

●Performed all aspects of verification and validation, including functional, structural, regression, and system testing.

●Executed test scripts on different environments: based on production needs, test cases and scripts are executed on each build provided by the development team; execution is driven by the release notes provided by the deployment team to cover the scope of each build in each environment. Test case execution involves both manual testing and automation; automation uses a separate automation framework.

●Analyzed the data, applied relevant transformations, and verified that the source data types are correct.

●Worked on test data and completed unit testing to check that all business rules and requirements are met.

Client: HSBC Pvt Ltd, India.

Role: ETL & Mainframe Tester (End-to-End Tester)

Project:

Fentom (E-banking ITES Service portfolio RBS)

Domain:

Banking

Team Size:

20+ (2 Onshore / 18 Offshore)

Environment:

Oracle 11g, Oracle SQL Developer, Microsoft SQL Server, HP Quality Center 9.2, Windows XP

Project Description:

HSBC, one of the leading global banks, has acquired RBS Bank's Indian cards and core-banking portfolios. This project involves data migration of card and core-banking customers from RBS Bank (SQL Server) to HSBC (Oracle Server).

This migration project had the following objectives:

1. As part of this project, the entire RBS and SB portfolio (e-banking/transactions/upgrade edits) is to be converted into HSBC format. The mandate is full integration of Fenton’s in-scope businesses into HSBC’s product and application propositions currently available in the concerned geography. The main objective of the project is to convert the acquired cards portfolio from RBS into the existing OHC portfolio of HSBC India (INM).

2. From a data migration perspective, the main objective is to check whether the source data or extract provided for migration is accurate, complete, and consistent with the business migration rules.

Responsibilities:

1. Interaction with onsite and offshore teams.

2. Reviewed the Business Requirement Document, Functional Specification Document, migration analysis document, and Mapping Documents.

3. Involved in functional, exploratory, and integration testing.

4. Performed data-validity testing for reports and feeds based on the client's requirements.

5. Validated the format of the source files.

6. Wrote SQL queries to access the data in SQL Server and Oracle databases to execute back-end testing.

7. Created special SQL script files for automation.

8. Used PL/SQL functions and stored procedures to validate complex billing logic.

9. Participated in daily QA meetings to resolve technical issues.

10. Communicated with developers and Business Analysts to discuss issues and priorities.

11. Test case management using Quality Center.

12. Interacted with senior peers and the business to learn more about the data.

13. Created test data in various required formats, up to millions of records, per banking requirements.

14. Identified and logged defects in QC against these documents and obtained clarification from the DA/Business team.
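The bulk test-data creation mentioned above (millions of records) is usually scripted with a streaming generator so memory stays flat. A hedged Python sketch (field names and formats are hypothetical, not the actual RBS/HSBC layouts):

```python
import csv
import io
import random

def generate_test_accounts(n, seed=42):
    """Yield n synthetic account rows; streaming keeps memory flat even for
    millions of records. Field names are illustrative, not a real layout."""
    rng = random.Random(seed)  # fixed seed -> reproducible test data
    for i in range(1, n + 1):
        yield {
            "account_id": f"ACC{i:09d}",
            "product": rng.choice(["CARD", "CORE"]),
            "balance": round(rng.uniform(0, 100000), 2),
        }

def write_csv(rows, fileobj):
    """Write the generated rows as CSV and return the record count."""
    writer = csv.DictWriter(fileobj, fieldnames=["account_id", "product", "balance"])
    writer.writeheader()
    count = 0
    for row in rows:
        writer.writerow(row)
        count += 1
    return count

buf = io.StringIO()
print(write_csv(generate_test_accounts(1000), buf))  # 1000
```

In practice the file object would be an open file on disk, and the generator parameters would follow the documented record layout rather than these illustrative fields.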

Client: IBM Pvt Ltd, India.

Role: ETL/Mainframe Tester

Project:

Data Centralizing (D&B data)

Team Size:

12+ (including support team)

Environment:

Informatica (ETL mapping using Power center tool), Oracle 11g, HP Quality Center 9.2, Windows XP, Mainframe (Cobol, JCL, PS and PDS flat files)

Project Description:

D&B, the leading provider of global business information, tools, and insight, has enabled customers to Decide with Confidence. D&B’s proprietary DUNS Right™ quality process provides customers with quality information whatever and wherever they need it. This quality information is the foundation of D&B’s solutions that customers rely on to make critical business decisions. Customers use D&B Risk Management Solutions to mitigate risk, increase cash flow, and drive increased profitability; D&B Sales & Marketing Solutions to increase revenue from new and existing customers; and D&B Supply Management Solutions to identify purchasing savings, manage risk, and ensure compliance within the supply base. D&B’s E-Business Solutions help customers convert prospects to clients faster.

Trade teams are involved in gathering data from certain D&B customers relating to the trade payment experiences they have with businesses in the D&B databases, and in using such data to predict the future payment habits of those businesses. The accounts-receivable files are placed in a database, the payment experience is calculated, and the payment experience information is then available to customers who buy D&B reports and products. This payment information is used by other companies to help evaluate credit risk and opportunities and to develop risk-based marketing plans.

Responsibilities:

1. Responsible for creating complete test cases, test plans, and test data, and for reporting status, ensuring accurate coverage of requirements and business processes.

2. Validated the format of the source files.

3. Wrote SQL queries to access the data in SQL Server and Oracle databases to execute back-end testing.

4. Performed database testing to verify back-end database transactions.

5. Communicated with developers and Business Analysts to discuss issues and priorities.

6. Test case management using Quality Center.

7. Identified and logged defects in QC against these documents and obtained clarification from the DA/Business team.

8. Developed a traceability matrix of business requirements mapped to test scripts to ensure any change control in requirements leads to a test-case update.

Vijay Bhagat

Sr. Data Quality Analyst

7171 Woodland Ave, Des Moines, IOWA

Email – adhzcq@r.postjobfree.com – 515-***-****

(LinkedIn - https://www.linkedin.com/in/vije-bhagat-3a8430bb/ )


