
Data ETL

Location:
Apex, NC
Posted:
October 01, 2020


Resume:

Narendra Chaudhari

919-***-****

adgksl@r.postjobfree.com

Permanent address: Morrisville, NC 27560

Work authorization: US Citizen

Experience Summary:

Over 8 years of IT experience in various phases of IT projects, such as development, testing, deployment, and application support.

Hands on experience with all phases of Software Testing Life Cycle (STLC)

Have the ability to elicit, understand and articulate business requirements and perform detailed analysis to map them to functional QA requirements.

Experience in testing Data Marts, Data Warehouse/ETL Applications developed in Data stage and Informatica using Oracle, Teradata and UNIX.

Strong experience in testing Data Warehouse (ETL) and Business Intelligence (BI) applications.

Experience in all phases of the Software Testing Life Cycle (STLC); experienced in developing test cases, test plans, and automation test scripts using Selenium with Java, BDD/Gherkin, UI testing, and REST API testing with Postman in a CI/CD environment. Diversified experience in automation testing, manual testing, and business analysis.

Expertise in Selenium automation using Selenium WebDriver, Selenium Grid

Solid experience working with the Agile (Scrum) methodology, including the Scrum Master, Product Owner, and development team roles.

Expertise in creating Test plan documents and Test strategy documents. Expertise in designing the test scenarios and scripting the test cases to test the application.

Good exposure to databases such as Oracle 11g, Teradata, and SQL Server 2008.

Strong experience in Data Analysis, Data Validation, Data Profiling, Data Verification, Data Loading.

Good knowledge of working within the Data Access Layer with Object-Relational Mapping (ORM) frameworks such as Entity Framework, and an understanding of RDBMS modeling and design.

Experience in dimensional data modeling using Star and Snowflake schemas.

Strong working experience in the Data Analysis and Testing of Data Warehousing using Data Conversions, Data Extraction, Data Transformation and Data Loading (ETL)

Experience in maintaining Full and Incremental loads through ETL tools on different environments.

Experienced in Defect Management using Mercury Test Director, HP Quality Center

Good experience writing SQL for data validation during migration as part of backend testing.

Worked with different data sources ranging from VSAM files, CSV files, and flat files to Teradata, Oracle, and SQL Server databases.

Expertise in Informatica Power Center 9.x/8.x/7.x/6.x extracting data from Oracle, SQL Server and DB2 databases.

Extensive experience in Extraction, Transformation and Loading of data using Informatica from heterogeneous sources.

Experienced in testing performance in an Agile methodology

Experience with Apache JMeter, LoadRunner, Performance Analysis, and Web Performance Monitoring/Tuning

Good knowledge of different types of testing, such as load testing, stress testing, and database testing.

Solid experience working with UNIX/Linux scripts, executing them and validating logs to investigate issues.

Extensively used SQL queries, Unix\Linux commands for data verification and backend testing.

Expertise in Data Warehouse/Data mart, ODS, OLTP and OLAP implementations teamed with project scope, Analysis, requirements gathering, data modeling, Effort Estimation, ETL Design, development, System testing, Implementation and production support.

Experience testing different components such as customer data, agreement data, deals, excesses, violations, assets, organizations, applications, and securities.

Hands on experience in writing SQL queries, Joins, Functions in Oracle and SQL Server databases.

Good knowledge of healthcare domains such as pharmacy and blood donation centers.

Good expertise in using TOAD, SQL*Plus and Query Analyzer.

Strong experience in UNIX shell scripting and monitoring the UNIX logs to check for any errors.

Created the test instances for the project in the Test Lab and Test Plan module of Quality Center.

Strong experience in backend testing on relational databases (RDBMS) like Oracle, SQL Server 2012, and Teradata using complex SQL queries.

End-to-end responsibility for closing defects and verifying fixes.
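Several of the bullets above describe source-to-target data validation with SQL. A minimal sketch of that kind of check, using Python's bundled sqlite3 in place of Oracle/Teradata; the table and column names (src_customer, tgt_customer) are hypothetical:

```python
import sqlite3

# Minimal sketch of source-vs-target ETL validation; table and column
# names (src_customer, tgt_customer) are hypothetical stand-ins.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_customer (id INTEGER, name TEXT);
    CREATE TABLE tgt_customer (id INTEGER, name TEXT);
    INSERT INTO src_customer VALUES (1, 'Ann'), (2, 'Bob');
    INSERT INTO tgt_customer VALUES (1, 'Ann'), (2, 'Bob');
""")

# Row-count check: source and target should match after a full load.
src_count = cur.execute("SELECT COUNT(*) FROM src_customer").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_customer").fetchone()[0]
assert src_count == tgt_count

# MINUS-style check: rows in source that never reached the target.
missing = cur.execute(
    "SELECT id, name FROM src_customer EXCEPT SELECT id, name FROM tgt_customer"
).fetchall()
assert missing == []  # an empty result means the load is complete
```

Against a real warehouse the same COUNT(*) and MINUS/EXCEPT queries would run through the vendor client (TOAD, SQL*Plus, Teradata SQL Assistant) rather than sqlite3.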

TECHNICAL EXPERTISE:

ETL Tools: Informatica PowerCenter 8.6, DataStage 8.5, DVO

Reporting Tools: Excel, SSRS, excel pivot

Databases: Oracle, SQL Server, DB2, Teradata, MS Access, Hive, MySQL Workbench

Utilities: Toad, Oracle SQL Developer, SQL Advantage, SSIS, SSRS, Putty, WINSCP

Testing Tools: HP Quality Center 9.2/Test Director, HP ALM 11.0, Selenium, JIRA, DVO, iCEDQ

Programming Languages: SQL, PL/SQL, VBScript, Java, JSON

Automation Tools: Eclipse IDE, Cucumber, Karate, Postman, Rest API, UI, API, Jenkins, GIT, TestNG, JUnits etc.

Bachelor's degree, Kathmandu, Nepal

WORK EXPERIENCE:

Wells Fargo, Raleigh NC

SDET/Scrum Role July’2019 – Present

The Brokerage Data Warehouse (BDW) project is a very large project in which Wells Fargo deals with different kinds of wealth- and investment-related products. Wells Fargo handles enterprise trading data, financial-industry regulatory data, customer accounts, household accounts, and various types of equities and securities all over the world. In this project, Wells Fargo is moving data from the UDM stage to the ODS brokerage data warehouse.

Involved in the entire software development Life Cycle of the projects from Initiation to Implementation.

Involved in Waterfall and Agile projects; worked closely with Product Owners in Agile and attended meetings such as Sprint Planning, daily Scrum, Backlog Grooming, Sprint Review, and Retrospective.

Assessed and analyzed user stories, Business Requirements (BRD), Functional Requirements (FRD), Technical design documents (TDD) and participated in Sprint planning, Review Sessions and Scrum Meetings.

Expertise in writing and executing Test plans, Test Suites, Test Cases, Test summary reports, for automation and manual testing as well, based on User requirements, System requirements and Use Case documents.

Actively using JIRA/ALM to manage stories, test cases, defects, and Sprint-planning activities.

Organizing weekly meetings with offshore managers, the onsite coordinator, and the onsite manager to discuss project goals and report overall project status.

Actively using Linux/UNIX servers in the Dev, CTE, and QA/UAT environments for script runs, job loads, file transfers, and validation of flat and delimited files.

Using Informatica PowerCenter Designer to create metadata based on mappings, and Workflow Manager and Workflow Monitor for job updates.

Using the DVO ETL automation tool for file-to-file, file-to-table, table-to-file, and table-to-table validation, creating table-pair and single-table tests.

Using SQL queries in Oracle SQL Developer to validate data in the UDM and ODS stages.

Creating control files and DDL files to load extracted files into temp tables in order to validate file vs. table data.

Organizing and attending defect triage meetings, code review meetings, and test execution review meetings with the full SDLC team.

Preparing defect reports, test run reports, RTM reports, and sign-off documents.
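The file-vs-table validation described above (a control file and DDL loading an extract into a temp table, then comparing against the target) can be sketched as follows. The pipe-delimited layout and the stg_account/tgt_account tables are hypothetical, and sqlite3 stands in for Oracle:

```python
import csv, io, sqlite3

# Hedged sketch of file-vs-table validation: load a delimited extract
# into a temp (staging) table, then compare it to the target table.
extract = "acct_id|balance\n101|250.00\n102|99.50\n"

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TEMP TABLE stg_account (acct_id INTEGER, balance REAL)")
cur.execute("CREATE TABLE tgt_account (acct_id INTEGER, balance REAL)")
cur.executemany("INSERT INTO tgt_account VALUES (?, ?)",
                [(101, 250.00), (102, 99.50)])

# Stand-in for a SQL*Loader control file: parse the delimited extract
# and bulk-insert it into the staging table.
reader = csv.DictReader(io.StringIO(extract), delimiter="|")
cur.executemany("INSERT INTO stg_account VALUES (?, ?)",
                [(int(r["acct_id"]), float(r["balance"])) for r in reader])

# File-vs-table check: the difference should be empty after the load.
diff = cur.execute(
    "SELECT * FROM stg_account EXCEPT SELECT * FROM tgt_account"
).fetchall()
assert diff == []
```

In the real pipeline the "parse and insert" step is done by the loader driven from the control file; the comparison query is the part the tester owns.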

Environment: ETL Informatica, DVO, Oracle, SQL Server, Quality Center/ALM, SQL, Apache, Teradata SQL Assistant, SQL Server, UNIX/Linux, SSRS, MS Access, Flat Files, JIRA, Reporting database, Microsoft Word, Excel, Excel Pivot, Macro, PPT

Fidelity Investment, Durham NC

Automation Engineer/Scrum Master Role Oct’2019 – Jun’ 2020

Fidelity is developing an application called the Nortek Application. In this project, the Fidelity Benefits Markets (FBM) offers employers and their employees an extensive network of national and regional medical, dental, vision, life and disability, and HSA benefits, in addition to tax-savings options and access to wellness tools and programs, all in an integrated view with retirement benefits and payroll. In this project we validated employee enrollment, plans, plan details, plan selection, etc., together with their dependents.

Assessed and analyzed user stories, Business Requirements (BRD), Functional Requirements (FRD) and participated in Sprint planning, Review Sessions and Scrum Meetings.

Expertise in writing and executing Test plans, Test Suites, Test Cases, Test summary reports, for automation and manual testing as well, based on User requirements, System requirements and Use Case documents.

Actively used JIRA to manage stories, test cases, defects, and Sprint-planning activities, and used the HWA matrix to display test result reports.

Designed and implemented a test framework based on a data-driven approach with the Cucumber and Karate frameworks.

Strong experience with software development approaches like Agile and Iterative.

Implemented TestNG and JUnit automation frameworks for regression testing.

Performed functional testing per user stories, as well as integration and system testing, using Selenium WebDriver and Postman for UI and API testing.

Used Git Bash to clone repositories from the Git repository and push changed code back to the remote.

Scripted the Test cases and managed the framework dependency jars using Maven. Also used Maven to perform build from Jenkins Continuous Integration.

Performed BDD (Behavior Driven Development) using Cucumber/Karate Features, Scenarios, and Step Definitions in Gherkin format.

Performed manual and automation testing of the Fidelity Benefits Markets (FBM) Nortek Application using Selenium WebDriver, Eclipse IDE, and Postman for UI and API testing.

Performed cross-browser testing, running all tests on different browsers at the same time using TestNG.

Working knowledge of cloud-based code repository systems like GitHub and SourceTree.

Verified data flow through the front end and back end, using MySQL Workbench queries to extract data.

Performed testing in the Dev, QA, SIT, and UAT environments.

Production validation and production support during the implementation of the project.

Recreated client issues in test environment and gave demo to developers.
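A toy illustration of the data-driven testing described in this section, written in Python for brevity rather than the Java/Cucumber stack named above; the select_plan rule and the plan names are invented for the example:

```python
# Minimal data-driven test sketch in the spirit of a Cucumber/Karate
# Examples table; the system under test and its rule are hypothetical.

def select_plan(employee):
    """Toy system under test: pick a benefits plan from employee data."""
    if employee["dependents"] > 0:
        return "family"
    return "individual"

# Each row plays the role of a Gherkin Examples table:
# | dependents | expected plan |
cases = [
    {"dependents": 0, "expected": "individual"},
    {"dependents": 2, "expected": "family"},
]

results = [select_plan(c) == c["expected"] for c in cases]
assert all(results)
```

In the real framework each row would become a Scenario Outline example driving a Step Definition, with the assertions reported through TestNG/JUnit.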

Environment: Selenium WebDriver, Selenium IDE, JIRA, Eclipse IDE, HP Quality Center, Java, JSON, HTML, UI, API, Postman, Cucumber, Karate, Maven, TestNG, JUnit, MySQL Workbench, Oracle, Jenkins, Git Bash, MS Excel, MS Word, PPT, etc.

Wells Fargo Bank, NC Nov’2018 – Oct’2019

Sr. QA System Analyst /Tech Test lead

The EPP ITforIT project deals with software, hardware, and operating systems. Wells Fargo has distributed millions of devices, and the main purpose of this project is to analyze whether the applications (software) installed on each device are being used actively. In this project the QA team analyzes data flowing from the raw and sanitized layers into the conformed and curated layers and prepares reports to support business-team decisions.

Responsibilities:

Documented the business requirements and developed test plans and test cases for database backend testing and database functionality testing.

Involved with the Agile team and worked on different sprint modules.

Performed backend testing of the database by writing SQL queries and PL/SQL scripts to test the integrity of the application in Teradata databases and cloud-based Hive.

Used advanced SQL functions and clauses such as JOIN, UNION, ALTER, CREATE, and DISTINCT to combine data from different databases in a single query while executing scripts in Hive.

Expertise in writing SQL statements to make sure data is populated in the Data Mart/Data Warehouse according to business rules.

Worked through all stages of the SDLC for this project and designed and executed functional, integration, regression, system (end-to-end), and backend (database) testing.

Responsible for validating target data in the Data Warehouse, Data Marts, and Hadoop/cloud layers, transformed and loaded using Informatica.

Involved in investigating the errors by tracking the root cause for the error.

Used Informatica PowerCenter Designer, Workflow Manager, and Workflow Monitor.

Created metadata for the TDD in PowerCenter Designer.

Used the DVO ETL automation tool for table-pair and single-table test validation.

Collected evidence for each step of the process to ensure that any errors are captured in time and resolved immediately.

Experience working in an Agile environment.

Created QA test scenarios, test cases and test case walk-through in partnership with Business SMEs

Participate in QA test case execution

Attend Daily Defect triage and execution meetings

Created QA Traceability Matrix for the work stream

Participate in identifying Regression test cases

Identified and escalated QA risks/issues for the line of business to the UAT Manager, in partnership with the Business SMEs.

Used JIRA for QA defect tracking and logged multiple defects
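One way to picture the layered validation this section describes is to recompute a curated-layer aggregate from the raw layer and compare. This is a hedged sketch: sqlite3 stands in for Hive, and the raw_usage/curated_app_usage tables and their columns are hypothetical:

```python
import sqlite3

# Sketch of raw-vs-curated layer validation: recompute an aggregate
# from the raw layer and join it against the curated layer.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE raw_usage (device_id INTEGER, app TEXT);
    CREATE TABLE curated_app_usage (app TEXT, active_devices INTEGER);
    INSERT INTO raw_usage VALUES (1,'excel'), (1,'excel'), (2,'excel'), (2,'word');
    INSERT INTO curated_app_usage VALUES ('excel', 2), ('word', 1);
""")

# COUNT(DISTINCT ...) + GROUP BY rebuilds active-device counts from raw;
# the JOIN surfaces any app whose curated count disagrees.
mismatches = cur.execute("""
    SELECT c.app
    FROM curated_app_usage c
    JOIN (SELECT app, COUNT(DISTINCT device_id) AS n
          FROM raw_usage GROUP BY app) r ON r.app = c.app
    WHERE r.n <> c.active_devices
""").fetchall()
assert mismatches == []  # empty means the curated layer is consistent
```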

Environment: ETL Informatica, Oracle, SQL Server, Quality Center/ALM, SQL, Apache Hadoop, Hive, Teradata SQL Assistant, SQL Server, UNIX/Linux, SSRS, MS Access, Flat Files, JIRA, Reporting database, Microsoft Visio, Excel, Excel Pivot, Macro, Word, PPT

Anthem Inc. Norfolk VA Aug 2018– Oct’ 2018

Sr. ETL Automation Engineer

Anthem is a leading health benefits company dedicated to improving lives and communities and to making healthcare simpler for more than 74 million people, including nearly 40 million within its family of health plans. The Anthem Encounter project covers the claim process for Medicaid, Medicare, and other third-party insurance such as pharmacy, dental, and vision.

Responsibilities:

Design, develop and implement automated tests in an ATDD (Acceptance Test Driven Development) setting.

Write and execute automated tests using various test automation tools (e.g., Informatica DVO, iCEDQ).

Involved with the Agile team and worked on different sprint modules.

Deliver test automation solutions in accordance with enterprise standards and within development and operational guidelines.

Used advanced SQL functions and clauses such as JOIN, UNION, ALTER, CREATE, and DISTINCT to combine data from different databases in a single query while executing scripts.

Mentor and direct other testers in automation and software engineering principles.

Perform reviews of automation and application code and present test results to project teams.

Engineer automated test and metric reporting solutions.

Develop in-depth system and application knowledge to provide higher test quality and coverage.

Constantly evaluate and enhance the test automation strategy and approach.

Collaborate as part of the scrum team in grooming user stories and development of acceptance criteria for the user stories.

Share and communicate ideas both verbally and in writing to staff, business sponsors, managers, and technical resources in clear concise language that is appropriate to the target audience.

Participate in communities of practice to share knowledge, learn, and innovate.

Research and implement tools that support faster delivery with high quality.

Identified and escalated QA risks/issues for the line of business to the Manager, in partnership with the Business SMEs.

Used JIRA for QA defect tracking and logged multiple defects
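The ATDD approach described above (acceptance criteria written first, code made to pass them) can be sketched with a toy rule. Everything here is hypothetical: the adjudicate function, its fields, and the accept/reject rule are invented for illustration, not Anthem's actual claim logic:

```python
# Hedged ATDD-style sketch: executable acceptance criteria for a toy
# claim-adjudication rule. The rule and field names are hypothetical.

def adjudicate(claim):
    """Accept a claim only if the member is covered and the amount is positive."""
    if not claim["covered"]:
        return "rejected"
    if claim["amount"] <= 0:
        return "rejected"
    return "accepted"

# Acceptance criteria, written before the implementation in ATDD:
assert adjudicate({"covered": True, "amount": 120.0}) == "accepted"
assert adjudicate({"covered": False, "amount": 120.0}) == "rejected"
assert adjudicate({"covered": True, "amount": 0}) == "rejected"
```

In practice these criteria would live in a shared, business-readable format and run in CI, which is the point of ATDD: the tests are the agreed definition of done.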

Environment: ETL Informatica, Oracle, SQL Server, Quality Center/ALM, SQL, Teradata SQL Assistant 6.0, Teradata, SQL Server, MS Access, Flat Files, JIRA, Reporting database, DVO, iCEDQ, Excel, PPT, Word, etc.

Wells Fargo Bank, NC Dec’2016 – July’2018

Sr. QA Analyst / Backend/UAT Tester

The CoRS project serves one of the largest independent securities firms doing business in the United States. Counterparty Credit Risk Management (CCRM) is responsible for ensuring that an appropriate risk management framework exists for the primary trading and lending desks and the products they offer, with particular emphasis on counterparty credit risk. These tasks include the identification, measurement and aggregation, approval, pricing, and reporting of counterparty credit risk.

Responsibilities:

Documented the business requirements and developed test plans and test cases for database backend testing and database functionality testing.

Involved with the Agile team and worked on different sprint modules.

Performed backend testing of the database by writing SQL queries and PL/SQL scripts to test the integrity of the application against Teradata databases.

Used advanced SQL functions and clauses such as JOIN, UNION, ALTER, CREATE, and DISTINCT to combine data from different databases in a single query while executing scripts.

Expertise in writing SQL statements to make sure data is populated in the Data Mart/Data Warehouse according to business rules.

Worked through all stages of the SDLC for this project and designed and executed functional, integration, regression, system (end-to-end), and backend (database) testing.

Experience testing different components such as customer data, agreement data, deals, excesses, violations, assets, organizations, applications, and securities.

Responsible for validating target data in the Data Warehouse and Data Marts, transformed and loaded using Informatica.

Involved in investigating the errors by tracking the root cause for the error.

Collected evidence for each step of the process to ensure that any errors are captured in time and resolved immediately.

Experience working in an Agile environment.

Created QA test scenarios, test cases and test case walk-through in partnership with Business SMEs

Participate in QA test case execution

Tested data between CoRS and Adaptive, including in Production, to make sure data coming from Adaptive populates correctly in CoRS.

Attend Daily Defect triage and execution meetings

Created QA Traceability Matrix for the workstream

Participate in identifying Regression test cases

Identified and escalated QA risks/issues for the line of business to the UAT Manager, in partnership with the Business SMEs.

Used JIRA for QA defect tracking and logged multiple defects

Environment: ETL Informatica, Oracle, SQL Server, Quality Center/ALM, SQL, Teradata SQL Assistant 6.0, Teradata, PL/SQL, SQL Server, UNIX, SSRS, MS Access, Flat Files, JIRA, Reporting database, Selenium

FINRA, NYC, NY Nov ’15 – Oct ’16

Sr. ETL/ Backend Tester

The Financial Industry Regulatory Authority, Inc. (FINRA) is the largest independent regulator of securities firms doing business with the public in the United States. Our core mission is to pursue investor protection and market integrity, and we carry it out by overseeing virtually every aspect of the brokerage industry.

Responsibilities:

Documented the business requirements and developed test plans and test cases for database backend testing and database functionality testing.

Involved with the Agile team and worked on different sprint modules.

Assisted in gathering the business requirements, ETL Analysis, ETL test and design of the flow and the logic for the Data warehouse project

Tested the different Informatica mappings that are used to develop and maintain the database.

Developed ETL test cases for various lines of businesses based on ETL mapping document.

Performed backend testing of the database by writing SQL queries and PL/SQL scripts to test the integrity of the application against Teradata databases.

Expertise in writing SQL statements to make sure data is populated in the Data Mart/Data Warehouse according to business rules.

Worked through all stages of the SDLC for this project and designed and executed functional, integration, regression, system (end-to-end), and backend (database) testing.

Experience testing different components such as customer data, agreement data, deals, excesses, violations, assets, organizations, applications, and securities.

Responsible for validating target data in the Data Warehouse and Data Marts, transformed and loaded using Informatica.

Involved in investigating the errors by tracking the root cause for the error.

Collected evidence for each step of the process to ensure that any errors are captured in time and resolved immediately.

Used Informatica Power Center to create mappings, sessions and workflows for populating the data into dimension, fact, and lookup tables simultaneously from different source systems (SQL server, Oracle, Flat files)

Created mappings using various Transformations like Source Qualifier, Aggregator, Expression, Filter, Router, Joiner, Stored Procedure, Lookup, Update Strategy, Sequence Generator and Normalizer.

Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing development time.

Extensively used ETL methodology for testing and supporting data extraction, transformations and loading processing in a corporate-wide-ETL Solution using Informatica

Programming skills in SQL, with experience in Oracle and Teradata databases on UNIX and Windows platforms.

Written Complex SQL Queries to do manual testing for every release and check the configurations.

Tested several different types of reports including Report Layout, Naming Conventions, Totals, Sub-Totals, Drilling options, prompts, metric calculations, drill maps and security filters using Business Objects.

Used Business Objects for testing the scheduled reports and viewing the history.

Wrote SQL and PL/SQL scripts to validate the database systems and for backend database testing.

Raised defects in HP Quality Center/HP ALM /JIRA defect tracking system.

Designed Traceability Matrix to match the test scripts with the Functional design document.

Good experience writing SQL for data validation during migration as part of backend testing.

Worked with the ETL group to understand mappings for dimensions and facts.

Assisted in promotion of DataStage code and UNIX scripts from UAT to Production.

Tested data from various sources like Teradata, Oracle, flat files, and SQL Server. Worked on issues with migration from development to testing.
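The Update Strategy mappings mentioned above apply inserts and updates to a dimension in one pass. Validating such an incremental (upsert) load can be sketched as follows; sqlite3 stands in for the warehouse, and the dim_firm table and its contents are hypothetical:

```python
import sqlite3

# Hedged sketch of validating an incremental (upsert) load, in the
# spirit of an Informatica Update Strategy; dim_firm is hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_firm (firm_id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO dim_firm VALUES (1, 'Alpha'), (2, 'Beta');
""")

# Incremental feed: firm 2 is renamed, firm 3 is new.
delta = [(2, "Beta Corp"), (3, "Gamma")]
cur.executemany("INSERT OR REPLACE INTO dim_firm VALUES (?, ?)", delta)

# Post-load check: existing rows updated, new rows inserted, nothing lost.
rows = cur.execute(
    "SELECT firm_id, name FROM dim_firm ORDER BY firm_id"
).fetchall()
assert rows == [(1, "Alpha"), (2, "Beta Corp"), (3, "Gamma")]
```

The same check against Oracle/Teradata would compare the delta feed against the post-load dimension with keyed joins rather than re-selecting the whole table.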

Environment: ETL Informatica, Oracle, SQL Server, Quality Center/ALM, SQL, TOAD, Teradata SQL Assistant 6.0, Teradata, PL/SQL, SQL Server, UNIX, WinSQL, SSRS, MS Access, Flat Files.

PLS Financial, Chicago, IL April 13 – Sept 15

Sr. ETL/ DWH Tester

PLS is reshaping the consumer retail financial services industry through its foresighted development of innovative financial products and services, and its commitment to exceptional customer service.

Responsibilities:

Involved in the entire software Testing Life Cycle of the projects from Initiation to Implementation.

Created test plans and test cases for manual and automated testing from the business requirements to match the project's initiatives.

Tested various ETL transformation rules based on log files, data movement and with help of SQL.

Created and executed detailed test cases with step by step procedure and expected result. Maintained the test logs, test reports, test issues, defect tracking using Quality center.

Extensively used Data Stage for extraction, transformation and loading process.

Tested to verify that all data were synchronized after troubleshooting, and used SQL to verify test cases.

Used Informatica PowerCenter to create mappings, sessions, and workflows for populating data into dimension, fact, and lookup tables simultaneously from different source systems (SQL Server, Oracle, flat files).

Created mappings using various Transformations like Source Qualifier, Aggregator, Expression, Filter, Router, Joiner, Stored Procedure, Lookup, Update Strategy, Sequence Generator and Normalizer.

Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.

Creating and executing SQL queries to perform Data Integrity testing on an Oracle Database to validate and test data using TOAD/Exceed.

Tested several complex reports generated by Cognos including Dashboard, Summary Reports, Master Detailed, Drill Down and Score Cards

Worked with ETL group for understanding Data Stage graphs for dimensions and facts.

Worked as ETL Tester responsible for the requirements / ETL Analysis, ETL Testing and designing of the flow and the logic for the Data warehouse project.

Facilitated the sharing of structured data across different information systems using XML. Created and executed automation scripts using QTP.

Tested DataStage hashed files used to extract and write data as an intermediate file in a job.

Ran workflows created in Informatica by developers then compared before and after transformation of data generated to ensure that transformation was successful.

Good experience validating ad-hoc reports and OLAP cubes. Used XSLT to transform XML documents into other XML documents using the formatting vocabulary. Wrote several complex SQL queries to validate Cognos reports.

Validated Cognos Standard Reports created based on Frame work manager and Cube.

Worked with UNIX shell scripts; released scripts live and tested them.

Tracked bugs using Quality Center and generated the defect reports for review by the client and the management teams.

Performed ETL testing and extensively used SQL functionalities. Used TOAD to connect to Oracle Database.

Performed defect tracking and reporting with a strong emphasis on root-cause analysis, to determine where and why defects were introduced in the development process.

Tested different master detail, summary reports, ad-hoc reports and on demand reports using Cognos Report Studio.

Strong in writing UNIX shell scripts; automated and scheduled DataStage jobs using UNIX shell scripting.

Participated in weekly meetings and walkthroughs with the management team.

Environment: DataStage 8.1.1, Oracle 10g, Cognos 8.1, Windows NT/2000/XP, SQL Server, SSRS, TOAD, Quality Center 10, UNIX, JIRA, HP ALM

VISA Inc, Foster City, CA March ’12 – Feb ’13

ETL QA Tester

Responsibilities:

Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.

Assisted for the business requirements, ETL Analysis, ETL test and design of the flow and the logic for the Data warehouse project

Promoted Unix/Data Stage application releases from development to QA and to UAT environments as required.

Wrote Complex SQL queries and PL/SQL subprograms to support the test case results, SQL performance tuning.

Extensively used SQL programming in backend and front-end functions, procedures, packages to test business rules and security.

Involved in user training sessions and assisting in UAT (User Acceptance Testing).

Assisted in promotion of Data Stage code and UNIX Shell scripts from UAT to Production.

Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.

Created the test environment for Staging area, loading the Staging area with data from multiple sources.

Assisted in System Test and UAT testing scenarios as required

Worked with thin client and thick client (Business objects)

Extraction of test data from tables and loading of data into SQL tables.

Created ETL test data for all ETL mapping rules to test the functionality of the Data Stage Mapping

Test and evaluate the video codec and processing technologies and new features with respect to reliability, accuracy and usability

Extensive experience in writing complex SQL and PL/SQL scripts.

Tested the ETL Data Stage mappings and other ETL Processes (Data Warehouse Testing)

Written several complex SQL queries for validating Cognos Reports.

Tested the reports using Business Objects functionalities like Queries, Slice and Dice, Drill Down, Cross Tab, Master Detail and Formulae etc.

Used Maven and Selenium Grid to execute Selenium automation suites on different platform and browser combinations in parallel.

Identified weaknesses in QA processes, web testing, and Selenium automation, and suggested improvements.

Tested the ETL process both before and after data validation. Tested the messages published by the ETL tool and the data loaded into various databases.

Responsible for Data mapping testing by writing complex SQL Queries using WINSQL

Experience in creating UNIX scripts for file transfer and file manipulation.

Validating the data passed to downstream systems.

Worked with business team to test the reports developed in Cognos

Used Quality Center to track and report system defects

Involved in testing the XML files and checked whether data was parsed and loaded into staging tables.
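The XML-to-staging check above can be sketched briefly: parse a small payload and confirm each record lands in the staging table. The payload shape and the stg_txn table are hypothetical, with sqlite3 standing in for the real database:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hedged sketch of validating that XML records are parsed and loaded
# into a staging table; payload and stg_txn are hypothetical.
payload = """
<transactions>
  <txn id="t1" amount="12.50"/>
  <txn id="t2" amount="7.25"/>
</transactions>
"""

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_txn (txn_id TEXT, amount REAL)")

# Parse the XML and insert one staging row per <txn> element.
root = ET.fromstring(payload)
cur.executemany(
    "INSERT INTO stg_txn VALUES (?, ?)",
    [(t.get("id"), float(t.get("amount"))) for t in root.iter("txn")],
)

# Check: every record in the payload is present in staging.
loaded = cur.execute(
    "SELECT txn_id, amount FROM stg_txn ORDER BY txn_id"
).fetchall()
assert loaded == [("t1", 12.5), ("t2", 7.25)]
```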

Environment: DataStage, MVS, UNIX, Oracle, SQL Server 2005, DB2, Business Objects, Windows 2000 Pro, XML, XSLT, XSD, Rapid SQL, PL/SQL, TOAD, DAT files, XML files, Java, Selenium


