
ETL Tester

Location:
Jersey City, NJ
Posted:
September 30, 2023


Resume:

Shivani
Specialist – ETL with Automation Testing
Mobile: +1-551-***-****
Email: adz2de@r.postjobfree.com
Current location: Jersey City, New Jersey, USA
LinkedIn: https://www.linkedin.com/in/shivani-fnu-84441526a/

Professional Summary

7.5 years of relevant IT experience in ETL testing with Informatica PowerCenter, Selenium with Java, DWH, BI, manual, SharePoint, and functional testing.

Experience in mobile application testing with different versions of iOS and the RoamBI Analytics app.

1 year of experience as a part-time lecturer at an engineering college in Punjab.

Exposure to end-to-end data warehouse and ETL concepts.

Proficient in designing SQL queries for ETL and data testing, executing test cases, and testing BI reports.

Good experience in implementation of various client-server and decision support system environments with a focus on Big Data, Business Intelligence, and database applications.

Excellent Analytical skills with good technical knowledge in Data Warehouse testing projects.

Experience working across domains including Financial Services (FS), Investment Banking, Capital Markets, Trade Finance, Insurance, Retail, and Healthcare.

Experience working in Agile processes.

Good knowledge of Informatica PowerCenter and IICS.

Involved in all phases of testing, from requirement analysis through project delivery.

Extensive experience in web services and REST API testing using Rest Assured Java libraries and Postman.

Experience in creating automation frameworks using Cucumber (BDD) and Selenium.

Thorough experience with the test management tools HP Quality Center v9.0, ALM, JIRA, and Remedy.

Experience in functional, GUI, and usability testing of web applications.

Strong analytical and communication skills, with the ability to adapt to change and work well in a team.

Configuration Management and Version Control using SVN (Tortoise).

Good knowledge of AWS and Microsoft Azure, including Azure Data Factory, Databricks, and Azure Data Lake.

Ability to collaborate with testers, developers, project managers, and other stakeholders to resolve project delivery issues.

Open to any new technology, domain, or tool based on project requirements.

Skills Summary

Testing type: ETL Testing, Big Data Testing

Testing types performed: Automation Testing, Functionality Testing, System Testing, GUI Testing, Regression Testing, Web Service Testing

Domain: Financial Services (FS), Investment Banking, Capital Markets, Trade Finance, Insurance, Retail, Healthcare

ETL Tools: DWH, SSIS, Informatica PowerCenter, IICS

BI Tools: SCA (Strategic Companion Analyzer) / IAM, Cognos

DB Tools: SSMS (SQL Server Management Studio), WinSQL

File Management Tools: FileZilla, PuTTY, WinSCP

Databases: SQL Server, Oracle, Teradata, Netezza

Test Management Tools: Quality Center 10, QC, HP ALM, JIRA

Operating Systems: UNIX, Windows (XP Professional, Win 7, Win 10, Win 11)

API Testing: Postman

Automation Tools: Selenium WebDriver, TestNG, Cucumber

Languages: Core Java, PL/SQL, Python

Employment

Total Experience: 7.5 Years

Organization: Capgemini India Private Limited, Pune, India
Role/Designation: Associate Consultant
Duration: Mar 2015 to Mar 2018

Organization: L&T Infotech, Mumbai, India
Role/Designation: Senior Software Engineer
Duration: Sep 2010 to Feb 2015

Capgemini, India (Barclays Bank)

Domain: AFPG (Asset Finance Platform Growth) [Banking], Oct 2015 to Mar 2018

Client (Barclays Bank):

Barclays is a major global financial services provider engaged in retail banking, credit cards, corporate and investment banking, and wealth management. Barclays moves, lends, invests, and protects money for 48 million customers and clients worldwide.

Project Description:

Asset financing refers to the use of a company's balance sheet assets, including short-term investments, inventory and accounts receivable, in order to borrow money or get a loan. The company borrowing the funds must provide the lender with security interest in the assets.

MI reporting, or management information reporting, is a communication method that lets business managers convey ideas and concepts to an audience, understand market trends, and ensure efficient business operations. This includes the provision of Management Information (MI) reports on a periodic basis (monthly, or as agreed) and regular service review meetings. QA's standard MI reports combine quantitative data and qualitative service assessment in a format that allows data to be reviewed effectively and consistently.

Roles and Responsibilities:

Requirement analysis and preparing the query log for the data warehouse requirements.

Conducted Technology and application training sessions for new joiners to the project.

Create, design and execute test cases.

Running the UNIX commands to load the data.

Running the batch jobs to load data from the source (ALFA UI) to downstream systems (GCA) using Informatica PowerCenter, following DWH concepts.

Validating the output data by running SQL queries and performing data testing.

Generating the BI reports from the Cognos Portal

Validating the BI reports data with front end.

Analyzing the test cases for Regression testing.

Implemented a complete big data pipeline with real-time processing.

Actively performed web services and REST API testing.

Experience in creating automation frameworks using Cucumber (BDD) with Selenium + Java and Selenium + Python.

Developed automation scripts for REST API and web service testing of microservices (POST, GET, and PUT methods) using UFT API and Postman, integrated into the framework for functional and regression testing.
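Response checks of the kind described above can be sketched as a small helper; the status codes and field names below are illustrative, not the project's actual API contract.

```python
import json

def validate_response(status_code, body_text, expected_status=200, required_fields=()):
    """Collect validation failures for an API response (illustrative check,
    similar in spirit to Rest Assured / Postman test assertions)."""
    failures = []
    if status_code != expected_status:
        failures.append(f"expected HTTP {expected_status}, got {status_code}")
    try:
        body = json.loads(body_text)
    except json.JSONDecodeError:
        failures.append("response body is not valid JSON")
        return failures
    for field in required_fields:
        if field not in body:
            failures.append(f"missing field: {field}")
    return failures

# Example: a hypothetical GET /accounts/42 response
print(validate_response(200, '{"id": 42, "status": "active"}',
                        required_fields=("id", "status")))  # → []
```

In a real suite each failure list would be asserted empty per test case, so a single run reports every broken field rather than stopping at the first.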

Capgemini, India (Nordea Bank)

Domain: FATCA CCE [Banking], Mar 2015 to Sep 2015

Client (Nordea Bank):

Nordea Bank Abp, commonly referred to as Nordea, is a Nordic financial services group operating in northern Europe with headquarters in Helsinki, Finland. The name is a blend of the words "Nordic" and "idea".

Project Description:

The Foreign Account Tax Compliance Act (FATCA) is a U.S. law dated March 2010. The purpose of FATCA is to provide the U.S. tax authority, the IRS, with information on the accounts that US persons (subject to U.S. tax) hold with financial institutions outside the USA, in order to prevent U.S. offshore tax evasion. The CCE engine classifies customers using data sourced through FDW and stores the classification file back to the legacy system. CCE handles automatic classification, where data sourcing is secured via FDW; manual remediation is handled jointly by BPM and the back office department.

Roles and Responsibilities:

Requirement Analysis and prepared query log.

Conducted Technology and application training sessions for new joiners to the project.

Prepared user manuals and knowledge capture documents to help new joiners understand the project better.

Creating, designing, and executing test plans, test harnesses, and test cases.

Test execution of flat file data transfer to Teradata.

Running the UNIX commands to load the data.

Understanding DWH concepts, Azure Data Lake, and Azure Data Factory.

Running the jobs/workflows for the ETL process in Informatica PowerCenter.

Verified column mapping between source and target by running SQL queries and performing data testing.

Designed, Reviewed & Executed Test cases.

Identified Test cases for Regression testing

Defect analysis and reporting in QC.

Writing SQL queries for various scenarios such as count tests, primary key tests, duplicate tests, attribute tests, default checks, technical data quality, and business data quality.

Business rule validation on loaded tables and generating BI reports.
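As an illustration of the query patterns above, the sketch below runs a count test and a duplicate test against an in-memory source/target pair; the table and column names are hypothetical, and SQLite stands in for Teradata.

```python
import sqlite3

# Minimal in-memory source/target pair; names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_customer (cust_id INTEGER, country TEXT);
    CREATE TABLE tgt_customer (cust_id INTEGER, country TEXT);
    INSERT INTO src_customer VALUES (1, 'FI'), (2, 'SE'), (3, 'DK');
    INSERT INTO tgt_customer VALUES (1, 'FI'), (2, 'SE'), (2, 'SE');
""")

# Count test: row counts must match between source and target.
src_count = conn.execute("SELECT COUNT(*) FROM src_customer").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_customer").fetchone()[0]

# Duplicate test: the primary-key column must be unique in the target.
dupes = conn.execute("""
    SELECT cust_id, COUNT(*) FROM tgt_customer
    GROUP BY cust_id HAVING COUNT(*) > 1
""").fetchall()

print("count test:", "PASS" if src_count == tgt_count else "FAIL")
print("duplicate test:", "PASS" if not dupes else f"FAIL {dupes}")
```

Note how the count test alone passes here while the duplicate test catches the defect, which is why the scenarios are run together rather than in isolation.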

LTI, India (Barclays Bank)

Domain: ABSA_EFB_SIT [Banking], Jan 2014 to Feb 2015

Client (Barclays Bank):

Barclays is a major global financial services provider engaged in retail banking, credit cards, corporate and investment banking, and wealth management. Barclays moves, lends, invests, and protects money for 48 million customers and clients worldwide.

Project Description:

ABSA and Barclays are embarking on a journey to enhance and replace their current ETL and DWH environments. This will entail a new implementation of Informatica, which will replace the current ETL toolsets such as Ab Initio, Oracle Pure Extract, and Oracle Warehouse Builder. The current data warehouses are built on Oracle and will be replaced with Teradata.

The main scope of the Data Integration stream for the Credit Risk project includes the acquisition of new and existing credit risk data from the primary source system into the One Africa Data Warehouse for Mortgage Home Loans, and the exposure of this data for credit consumption.

Roles and Responsibilities:

Preparing the major test artifacts like High Level estimates, Test Plan and Test Strategy.

Requirement Analysis and prepared query log.

Conducted Technology and application training sessions for new joiners to the project.

Prepared user manuals and knowledge capture documents to help new joiners understand the project better.

Creating, designing, and executing test plans, test harnesses, and test cases.

Test execution of flat file data transfer to Teradata.

Running the jobs/workflows for the ETL process in Informatica PowerCenter.

Verified column mapping between source and target.

Designed, Reviewed & Executed Test cases.

Identified Test cases for Regression testing

Defect analysis and reporting in QC.

Writing SQL queries for various scenarios such as count tests, primary key tests, duplicate tests, attribute tests, default checks, technical data quality, and business data quality.

Business rule validation on loaded table

Compared source to target data using DTF tool.

Performed Data migration testing from Oracle to Teradata.
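Source-to-target comparison of the kind performed with the DTF tool can be sketched with minus-style queries; SQLite's EXCEPT stands in for Oracle's MINUS here, and the table names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE oracle_src (loan_id INTEGER, balance REAL);
    CREATE TABLE teradata_tgt (loan_id INTEGER, balance REAL);
    INSERT INTO oracle_src VALUES (101, 2500.0), (102, 910.5), (103, 77.0);
    INSERT INTO teradata_tgt VALUES (101, 2500.0), (102, 910.5), (103, 99.0);
""")

# Rows present in source but not in target (and vice versa) flag
# migration defects; EXCEPT plays the role of Oracle's MINUS.
missing_in_tgt = conn.execute(
    "SELECT * FROM oracle_src EXCEPT SELECT * FROM teradata_tgt").fetchall()
extra_in_tgt = conn.execute(
    "SELECT * FROM teradata_tgt EXCEPT SELECT * FROM oracle_src").fetchall()

print("mismatched source rows:", missing_in_tgt)
print("mismatched target rows:", extra_in_tgt)
```

Running the minus query in both directions matters: one direction catches dropped or altered rows, the other catches rows the load invented.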

LTI, India (IMS Health)

Domain: RoamBI Upgrade and IMI v1.2, Aug 2013 to Dec 2013

Client (IMS Health):

IMS Health, Inc. is the leading provider of global market information to the pharmaceuticals and healthcare industries. Compiling information into more than 10,000 reports, available on a regularly updated basis, IMS Health provides a wide variety of market knowledge to support strategic decision-making in all aspects of pharmaceutical company operations.

Project Description:

This project deals with upgrading the RoamBI version from 4.4 to 4.5.1 and checking whether there is any impact on the existing production cube reports after the upgrade. It also covers checking the cube reports after republishing them with and without changes, and testing the different functionalities of the Mobile Insights App version 1.2.

Roles and Responsibilities:

Preparing the major test artifacts like High Level estimates, Test Plan and Test Strategy.

Preparing the test scenarios based on the Functional Requirement Specification.

Preparing and reviewing the test cases based on the Low Level Design document.

Perform the test environment readiness check before the start of the test execution.

Upgrading the RoamBI version from 4.4 to 4.5.1 and checking the impact on existing production cube reports.

Testing the different functionalities of the Mobile Insights App version 1.2.

Updating the QC i.e. maintaining RTM, uploading test cases and walkthrough of QC to dev team, if required.

LTI, India (IMS Health)

Domain: SMP (Sales Management Partner), Oct 2010 to Jul 2013

Client (IMS Health):

IMS Health, Inc. is the leading provider of global market information to the pharmaceuticals and healthcare industries. Compiling information into more than 10,000 reports, available on a regularly updated basis, IMS Health provides a wide variety of market knowledge to support strategic decision-making in all aspects of pharmaceutical company operations.

Project Description:

This project deals with ETL testing and cube testing. The source files are taken from the FTP server and loaded from the source systems into the staging area by running batch files. Once loaded into the staging area, ETL is performed: transformation rules are applied and the data is loaded into the data mart using SSIS packages. The data mart extracts are taken from the GLA, and the cube is built using GCB (Global Cube Builder).

Earlier, we hosted the cubes manually once a cube was built successfully, and rehosted whenever the cube data was refreshed. To mitigate this tedious task, the client introduced a new system called Hosting Uplift, which hosts the cubes automatically. If the data is refreshed, the user does not have to host it again; they just place the .abf file in the FLA (File Landing Area), and hosting starts automatically because the cube template already exists in the system. Each template is guarded by the File Watcher, which continuously watches the FLA for the arrival of the template's .abf file to start the process.

Once the cube is hosted, data validation and structural validation are done using SCA (Strategic Companion Analyzer), a BI tool. Once validation is done, the reports are generated and shared with the clients.
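The File Watcher behaviour described above (polling the FLA until the template's .abf file arrives, then triggering hosting) can be sketched as a simple polling loop; the paths, timeout, and hosting hook are all illustrative, and a production watcher would typically use OS file-system notifications instead of polling.

```python
import os
import tempfile
import time

def watch_for_abf(fla_dir, template_name, on_arrival, timeout=5.0, poll_interval=0.1):
    """Poll the File Landing Area until <template_name>.abf appears,
    then trigger the hosting callback. Returns True if the file arrived."""
    target = os.path.join(fla_dir, template_name + ".abf")
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if os.path.exists(target):
            on_arrival(target)
            return True
        time.sleep(poll_interval)
    return False

# Usage sketch: drop a cube file into a temporary FLA and watch it arrive.
fla = tempfile.mkdtemp()
open(os.path.join(fla, "sales_cube.abf"), "w").close()
hosted = []
arrived = watch_for_abf(fla, "sales_cube", on_arrival=hosted.append)
print(arrived, hosted)
```

The callback keeps the watcher decoupled from the hosting step, mirroring how the template-guarded watcher only starts the process rather than performing the hosting itself.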

Roles and Responsibilities:

Preparing the major test artifacts like High Level estimates, Test Plan and Test Strategy.

Preparing the test scenarios based on the Functional Requirement Specification and the mapping rules.

Preparing and reviewing the test cases based on the Low Level Design document and the mapping rules.

Perform the test environment readiness check before the start of the test execution.

Updating the QC i.e. maintaining RTM, uploading test cases and walkthrough of QC to dev team, if required.

Building the cubes via GCB(Global Cube Builder)

Hosting the cubes using the Hosting uplift app(ADH – Automated Data Hosting)

Assigning user privileges through the Hosting Uplift app to access the cubes in SCA and testing the BI reports.

Executing the test cases, logging defects in MQC, and thereby linking them with the failed test cases.

Preparing the test execution document or test report for the future iterations.

Maintaining the project artifacts in the specified SharePoint locations, i.e. SVN and ARC.

Updating the project status to the onsite manager and stakeholders in the daily status call and through the DSR (Daily Status Report) and WSR (Weekly Status Report).

Coordinate defect triage calls with the developers and client managers to report the risks found during the testing.

Education

Graduation – B.Tech in Electronics and Communication from Guru Nanak Dev University, Amritsar in 2008

Certifications

ISTQB Foundation Level certified.

NCFM Mutual Funds Certified.

NCFM Financial Markets Certified.

Place:

Date: (Shivani)


