
Data Manager

Location:
Westborough, MA
Salary:
Market
Posted:
February 18, 2018


Resume:

GURPREET K. BHATIA

QA Automation / ETL Tester

Westborough, MA, USA

615-***-****

ac4ipw@r.postjobfree.com

TECHNOLOGICAL PROFICIENCIES

Manual Testing

Informatica PowerCenter

Oracle

MS SQL Server

Selenium

TestNG

Microsoft Office Suite

Jira

JAVA

WordPress

OTHER SKILLS

HTML

CSS

Unix Shell scripting

JavaScript

Adobe Dreamweaver

Adobe Photoshop

COMMUNITY INVOLVEMENT

Volunteer – Westborough Public Schools

ACTIVITIES

Dance & Music – participated in many competitions

SUMMARY

Over 5 years of experience in software testing and ETL development. Performed GUI, system, functional, black-box, and white-box testing on web and desktop applications. Strong 3+ years of experience in data warehousing and ETL using Informatica PowerCenter 9.5 and 8.6.1 (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor, and Informatica Server), along with ETL, data marts, OLAP, and OLTP.

EDUCATION

2001 - 2005 Bachelor of Engineering (B.E.) in Computer Science from Jawaharlal Institute of Technology, India

2012 - 2013 Web design certification, BHCC, Boston, MA

PROFESSIONAL EXPERIENCE

2016 - Current QA Automation Tester

Freelancer

Worked on many web and desktop applications for several freelance companies. Tested the following applications: DirecTV Now, Restore, Enigma Recovery, Open Any File, TrueMotion, Barnes & Noble, OpenTable, TOMO (Touch of Modern), RBdigital, Crackle, Vevo, Media Complete, DailyBreak.

Websites: Canada Goose, BJ's, ModCloth, shoes, Ikea soft toy, examplefly, vast communication, etc.

Involved in functional, beta, UI, integration, system, security, performance, load, and regression testing.

Implemented testing methodologies (Waterfall and Agile) and the associated testing tools.

Analyzed business team requirements and developed Test cases.

Designed, coded, tested, and maintained automation test scripts for applications across test and production environments.

Established and maintained a detailed test automation framework and methodology.

Executed Test cases to verify actual results against expected results.

Performed navigational testing of the hyperlinks in web pages across different browser and operating system combinations.
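The navigational testing described above can be sketched in two steps: extract every hyperlink from a page, then fetch each one in every browser/OS combination under test. The sketch below shows only the extraction step, using Python's stdlib html.parser; the sample page content is illustrative.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# In a real run, each extracted link would then be fetched (e.g. with
# urllib.request or a Selenium-driven browser) and its response verified.
page = '<a href="/home">Home</a><p>text</p><a href="/cart">Cart</a>'
```

In practice the fetch step would be repeated per browser/OS combination, with failures logged as defects.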

Reported and tracked defects.

2017-2017 ETL Tester

The data warehousing development for Guardian involved testing a Data Mart that feeds downstream reports, and testing a User Access Tool with which users can create ad-hoc reports and run queries to analyze data in the proposed Data Mart. The project replicated the functionality of the existing Underwriting Measures, but with daily updates, a more robust architecture, and enhanced reliability.

Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text).

Developed inline-view queries and complex SQL queries, and improved their performance.

Performed extensive ETL testing, including data completeness, data transformation, and data quality checks for various data feeds coming from source systems.
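A minimal sketch of a data-completeness check of the kind described: compare source and target row counts, then diff source against target (Oracle's MINUS is EXCEPT in the SQLite used here in place of the production database). Table and column names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Illustrative staging (source) and warehouse (target) tables.
cur.execute("CREATE TABLE src_feed (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_fact (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_feed VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])
cur.executemany("INSERT INTO tgt_fact VALUES (?, ?)",
                [(1, 10.0), (2, 20.0)])

# Completeness: row counts must match between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_feed").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_fact").fetchone()[0]

# Rows present in the source feed but missing from the target.
missing = cur.execute(
    "SELECT id, amount FROM src_feed EXCEPT SELECT id, amount FROM tgt_fact"
).fetchall()
```

A transformation check would extend the same pattern: apply the mapping's transformation logic in the query on the source side and diff against the loaded target values.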

Executed workflows based on customer requirements.

Wrote UNIX shell scripts and used the pmcmd command-line utility to interact with the Informatica server from the command line.
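The pmcmd calls mentioned above can be sketched as a small wrapper that assembles a startworkflow command. The service, domain, folder, and workflow names below are placeholders, and the exact flag set should be verified against the PowerCenter version in use.

```python
import subprocess

def start_workflow(service, domain, user, password, folder, workflow):
    """Build a pmcmd startworkflow command as an argument list.

    The flags mirror common pmcmd usage (-sv Integration Service,
    -d domain, -u/-p credentials, -f repository folder, -wait to
    block until completion); check them against your PowerCenter
    documentation before relying on this sketch.
    """
    return [
        "pmcmd", "startworkflow",
        "-sv", service,
        "-d", domain,
        "-u", user,
        "-p", password,
        "-f", folder,
        "-wait",
        workflow,
    ]

# To actually launch the workflow on a machine with pmcmd installed:
# subprocess.run(start_workflow(...), check=True)
```

Keeping the command as a list (rather than a shell string) avoids quoting problems when passwords or names contain special characters.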

2007 - 2008 Informatica Developer

KSD, Noida

Worked as an ETL developer. Around 85 feeds arrived daily as flat files, of which I worked on 13. Each feed had around 15 different validations to implement while bringing the data from flat file to staging. Each mapping had three target tables: ETL_DATA, INVALID_DATA_LOG, and UNMAPPED_DATA. All valid data went to ETL_DATA; any record that failed at least one validation went to the INVALID_DATA_LOG table; and any record that failed the lookup from legacy segments to current data segments went to the UNMAPPED_DATA table. I created a couple of Mapplets for the validations. When a record failed a validation, the respective message was attached to it, so a record could end up with no messages or several. I added a Normalizer transformation to split each record into multiple records based on the number of validations it failed, and moved this data to the INVALID_DATA_LOG table.
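The routing logic described above can be sketched as follows: each record is run through its feed's validations; valid records go to ETL_DATA, failing records are split (as the Normalizer did) into one INVALID_DATA_LOG row per failed validation with its message attached, and records whose legacy-segment lookup fails go to UNMAPPED_DATA. The validation rules, field names, and lookup table here are illustrative stand-ins.

```python
# Hypothetical validations for one feed; a real feed had around 15.
VALIDATIONS = [
    ("missing id", lambda r: bool(r.get("id"))),
    ("negative amount", lambda r: r.get("amount", 0) >= 0),
]

# Illustrative legacy-segment -> current-segment lookup.
LEGACY_SEGMENTS = {"A1": "SEG_NORTH"}

def route(record):
    """Return (target_table, rows) for a single staged record."""
    failures = [msg for msg, ok in VALIDATIONS if not ok(record)]
    if failures:
        # Normalizer-style split: one log row per failed validation,
        # each carrying its respective error message.
        rows = [dict(record, error=msg) for msg in failures]
        return "INVALID_DATA_LOG", rows
    segment = LEGACY_SEGMENTS.get(record.get("legacy_segment"))
    if segment is None:
        # Lookup failed: no current segment for this legacy segment.
        return "UNMAPPED_DATA", [record]
    return "ETL_DATA", [dict(record, segment=segment)]
```

A record failing two validations thus becomes two log rows, which is exactly the fan-out the Normalizer transformation provided in the mapping.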

Analyzed business requirements and worked closely with the various application and business teams to develop ETL procedures consistent across all applications and systems.

Involved in extraction, transformation, loading, and implementation.

Worked with Informatica tools – Source Analyzer, Mapping Designer, and Transformations.

Developed mappings using Informatica PowerCenter Designer to transform and load data from various source systems, such as flat files and Oracle, into an Oracle target database.

Extensively used various transformations, such as Source Qualifier, Expression, Joiner, Update Strategy, Aggregator, and Lookup, to load the data.

Used session parameters and mapping variables/parameters, and created parameter files to allow flexible workflow runs based on changing variable values.
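The parameter files mentioned above follow PowerCenter's sectioned text format, with a header naming the folder, workflow, and session, followed by mapping parameters/variables and connection overrides. The folder, workflow, session, and variable names below are illustrative; verify the exact header syntax against the PowerCenter documentation for your version.

```
[SALES_FOLDER.WF:wf_daily_sales.ST:s_m_load_sales]
$$LOAD_DATE=2007-06-30
$$REGION_CODE=NORTH
$DBConnection_Target=Oracle_DWH
```

Swapping in a different parameter file at run time (e.g. via pmcmd's parameter-file option) lets the same workflow run against different dates, regions, or environments without editing the mapping.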

Created sessions to run the mappings and set the session parameters to improve the load performance.

Created complex joins, transformations of all types to pass data through ETL maps.

Involved in fixing invalid mappings; testing stored procedures and functions; and unit and integration testing of Informatica sessions, batches, and the target data.

Used Debugger to test the data flow and fix the mappings.

2006 - 2007 Informatica Developer

KSD, Noida

Worked on two modules: Sales and Sample History. The Sales module tracked sales at each vendor level. We received the data as flat files on a daily basis, processed it, and loaded it into our Oracle data warehouse, with separate mappings for sales, returns, and incentives. Once the data was loaded to the target, incentives were calculated with aggregations for each quarter and sent to the respective vendors. Sales data was reported at the sales-region, product, and year level and further drilled into different hierarchies by the report developers. The Sample History module used Slowly Changing Dimensions (Type I & II).
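The Slowly Changing Dimension handling mentioned above can be sketched for Type II, where history is preserved by end-dating the current row and appending a new current version (Type I would simply overwrite in place). The column names and the open-ended end date are illustrative.

```python
from datetime import date

def scd_type2_upsert(dimension, key, attrs, effective):
    """Apply a Type II change to an in-memory dimension table:
    end-date the current row for `key` and append a new current
    row carrying the changed attributes."""
    for row in dimension:
        if row["key"] == key and row["current"]:
            if row["attrs"] == attrs:
                return  # no change: keep history as-is
            row["current"] = False
            row["end_date"] = effective  # close out the old version
    dimension.append({
        "key": key,
        "attrs": attrs,
        "start_date": effective,
        "end_date": None,   # open-ended: this is the current version
        "current": True,
    })

dim = []
scd_type2_upsert(dim, "V001", {"vendor_region": "EAST"}, date(2006, 1, 1))
scd_type2_upsert(dim, "V001", {"vendor_region": "WEST"}, date(2006, 7, 1))
```

After the second call the dimension holds both versions of vendor V001, with the older row end-dated, which is what lets Sample History be reported as of any point in time.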

Interacted with the Business users to identify the process metrics and various key dimensions and measures.

Analyzed the source data and performed impact analysis to see how the data would fit into the designed data model and how it would help feed the reports.

Developed PL/SQL procedures/packages for loading data from stage tables to facts/dimensions with complex transformation logic.

Created complex mappings using various transformations, such as Source Qualifier, Filter, Sorter, Aggregator, Expression, Union, Joiner, Router, SQL, and Lookup.

Designed the ETL process and customized templates around Informatica and Oracle.

Created Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.

Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.

Dealt with data issues in the staging flat files; after the data was cleaned up, it was sent to the targets.

2005-2006 SQL Developer

Wrote complex SQL queries to validate data against different kinds of reports.

Worked with Excel Pivot tables.

Performed data management projects and fulfilled ad-hoc requests to user specifications using data management software and tools such as Excel and SQL.

Performed extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
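A sketch of the kind of back-end validation query described above, run here against SQLite rather than the production database. The table and the two rules checked (duplicate business keys and NULL required fields) are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id TEXT, customer TEXT, total REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    ("O1", "alice", 10.0),
    ("O1", "alice", 10.0),   # duplicate business key
    ("O2", None,    5.0),    # missing required field
    ("O3", "bob",   7.5),
])

# Duplicate check: business keys appearing more than once.
dupes = cur.execute(
    "SELECT order_id, COUNT(*) FROM orders "
    "GROUP BY order_id HAVING COUNT(*) > 1"
).fetchall()

# Null check: rows missing a required attribute.
null_rows = cur.execute(
    "SELECT order_id FROM orders WHERE customer IS NULL"
).fetchall()
```

The same GROUP BY/HAVING and IS NULL patterns transfer directly to Oracle or SQL Server when validating report data against the underlying tables.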

