Bharath
Phone: 732-***-**** Ext : ***
Email: *******.*@***********.***
PROFESSIONAL SUMMARY:
Over 9 years of experience in data modeling, data warehouse design, development, and testing across the ETL and data migration life cycle using IBM DataStage and QualityStage 8.5/9.1.
Knowledge of Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
Expertise in building Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using Star and Snowflake schema design (data modeling).
Experience in analyzing data generated by business processes, defining granularity, mapping data elements from source to target, and creating indexes and aggregate tables for data warehouse design and development.
Experience in designing, developing, documenting, and testing ETL jobs and mappings as parallel jobs in DataStage to populate tables in data warehouses and data marts.
Expert in designing parallel jobs using stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Data Set, Lookup File Set, Modify, Aggregator, and XML parsing.
Skilled in developing strategies for Extraction, Transformation and Loading (ETL).
Created job sequencers to automate jobs with failure and exception handling.
Experience with RDBMS platforms including Oracle, Netezza, and DB2.
Experience in developing UNIX shell scripts.
Expert in scheduling and monitoring jobs using Control-M and Autosys.
Experience with the Data Validation Option (DVO) tool for performing ETL validations.
Knowledge of Big Data concepts (HDFS, MapReduce, HiveQL, HBase, and Pig).
Good experience working on projects with an onsite-offshore team model.
Experience using testing tools such as QC, Test Director, Bugzilla, and QTP.
Scheduled test cycles, tested the product effectively, documented reported bugs, and tracked them until fixed.
Well versed in designing and executing test cases for GUI testing, Functionality, Regression and Integration Testing.
Strong knowledge of Unit Testing, Functional Testing, End-to-End (E2E) Testing, System Integration Testing (SIT), User Acceptance Testing (UAT), and Risk-Based Testing.
Understanding of grey-box, white-box, black-box, functional, and customer-oriented testing.
Extensively used SQL queries for data validation and interaction with database.
Excellent knowledge of logical and physical data models in Erwin Data Modeler.
Knowledge and experience working with Agile, Scrum, and Waterfall methodologies.
Excellent communication, interpersonal, analytical and leadership skills, self-motivated, quick learner and a team player.
TECHNICAL SKILLS:
ETL Tools: IBM InfoSphere DataStage, Informatica, DVO
Programming/Scripting Languages: C, C#, VBScript, PL/SQL
Databases: MS Access, Oracle, DB2, SQL Server, Teradata
Scheduling Tools: Control-M, Autosys
Testing and Bug Tracking Tools: HP ALM, QTP, Test Director, Bugzilla, Borland StarTeam
Methodologies: Agile, Waterfall
Operating Systems: Windows XP/Vista/7, UNIX, Linux
WORK EXPERIENCES:
Client: eBay Enterprise Oct 2014 – Present
Project Title: BI Retooling Location: King of Prussia, PA
Role: Lead ETL Datastage Developer
Description:
This project enhances the existing e-commerce data warehouse platform to include Vertex (tax) application data so that the data is available to build reports in Business Objects, and implements load balancing and job scheduling according to client needs so that the feeds are delivered as early as possible.
Responsibilities:
Worked in an Agile methodology and participated in story-writing sessions to capture the requirements targeted for each release.
Performed thorough impact analysis, then created detailed designs and source-to-target mappings.
Modified the data model based on the requirements and worked with DBAs to implement the changes.
Designed and developed ETL processes using DataStage Designer to load data from XML files into the staging database and from staging into the target data warehouse database.
Used DataStage stages such as Sequential File, Transformer, Aggregator, Sort, Data Set, Join, Lookup, Change Data Capture, Funnel, Merge, Peek, and Row Generator in accomplishing the ETL coding.
Developed job sequencer with proper job dependencies, job control stages, triggers.
Involved in performance tuning and optimization of Datastage mappings.
Scheduled the jobs in Autosys in the Dev and Prod environments.
Documented ETL test plans, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data; and performed error handling and analysis.
Participated in Daily Stand ups and provided demo at the end of each iteration.
Involved in the meetings with the production support team for any hard fixes at the product level.
Environment: IBM DataStage & Quality Stage 9.1 (Designer, Director, Admin), Netezza, Oracle 11g, DB2, Unix, Autosys, Business Objects Reporting.
Client: United Services Automobile Association Nov 2013 – Sep 2014
Project Title: Assign Peer Groups Location: San Antonio, TX
Role: Lead ETL Datastage Developer
Description:
The project creates an analytical repository, assigns peer groups to member accounts, and classifies them based on business rules so that the business can track activity and build adaptive transactional models for detecting suspicious financial activity. This project was implemented in Agile.
Responsibilities:
Thorough understanding of the implementation of Agile Scrum methodology.
Responsible for creating detailed design and source to target mappings.
Responsible for communicating with business users and project management to gather business requirements and translate them into ETL specifications.
Involved in meetings with Data modelers to create logical and physical model in Erwin Data Modeler.
Responsible for creating detailed analysis and design documentation.
Extensively worked on building datastage parallel jobs using various stages like Oracle Connector, Funnel, Transformer stage, Sequential file stage, Lookup, Join and Peek Stages.
Extensively worked with Join, Look up (Normal and Sparse) and Merge stages.
Used Runtime Column Propagation (RCP) to achieve reusability.
Developed Job Sequences, job activities, execute commands, user variable activity and implemented logic for job sequence loops, recovery and restart.
Developed UNIX Shell scripts and scheduled the jobs using Control M.
Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, functional, and regression testing; prepared test data; and performed error handling and analysis.
Used DVO as an Automation Tool to validate Source to Target transformation Logic on different databases.
Prepared status summary reports with details of executed, passed and failed test cases.
Involved in Release plan and had discussions on Exit Criteria with the project team frequently.
Attended daily and weekly status meetings and stand-ups.
Environment: IBM WebSphere DataStage 9.1 (Designer, Director, Admin), IBM DB2 Database, SQL Server, Oracle 11g, Netezza, Unix, Windows.
Client: United Services Automobile Association Feb 2013 – Oct 2013
Project Title: Transactional Profiling Location: San Antonio, TX
Role: Lead ETL Datastage Developer
Description:
The project created an analytical repository of aggregated high-risk transactions to be used for building a transactional behavioral profile of each customer, enabling business users to write business rules and build adaptive transactional models for detecting suspicious financial activity. This project was implemented in Agile. The project objective was to collect, organize, and store data from different operational data sources to provide a single source of integrated and historical data for analysis and decision support, in order to improve client services.
Responsibilities:
Involved in understanding business processes and coordinated with business analysts to obtain user stories and acceptance criteria.
Worked closely with DBA's to develop dimensional model using Erwin and created the physical model.
Prepared data mapping documents (DMD) and designed the ETL jobs based on the DMD with the required tables in the Dev environment.
Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data, and loaded it into the data warehouse databases.
Created DataStage jobs using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, Row Generator, etc.
Extensively worked with Join, Look up (Normal and Sparse) and Merge stages.
Extensively worked with sequential file, dataset, file set and look up file set stages.
Created job sequences.
Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, functional, and regression testing; prepared test data; and performed error handling and analysis.
Created and executed SQL queries to perform source-to-target testing on the DB2 database (a representative query sketch follows this list). Performed the following tests:
Data Extraction
Validity
Uniqueness
Data Integrity
Data Transformation
Data Quality
Initial and Incremental Load Test
Tested the entire data reconciliation process for multiple source and target systems.
Attended daily and weekly status meetings and stand-ups.
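The following is a minimal sketch of the kind of source-to-target validation queries used for the row-count, uniqueness, and integrity checks listed above; the table and column names (STG_HIGH_RISK_TXN, DW_HIGH_RISK_TXN, TXN_ID) are hypothetical placeholders rather than the project's actual schema.

-- Row-count reconciliation between the staging source and the warehouse target
SELECT (SELECT COUNT(*) FROM STG_HIGH_RISK_TXN) AS SRC_COUNT,
       (SELECT COUNT(*) FROM DW_HIGH_RISK_TXN)  AS TGT_COUNT
FROM SYSIBM.SYSDUMMY1;

-- Uniqueness check: the business key must not be duplicated in the target
SELECT TXN_ID, COUNT(*) AS DUP_COUNT
FROM DW_HIGH_RISK_TXN
GROUP BY TXN_ID
HAVING COUNT(*) > 1;

-- Data integrity check: rows present in the source but missing from the target
SELECT S.TXN_ID
FROM STG_HIGH_RISK_TXN S
LEFT JOIN DW_HIGH_RISK_TXN T ON S.TXN_ID = T.TXN_ID
WHERE T.TXN_ID IS NULL;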
Environment: IBM WebSphere DataStage and QualityStage 8.5 (Designer, Director, Admin), IBM DB2 Database, SQL Server, Oracle 11g, Teradata, Unix, Windows.
Client: United Services Automobile Association Jan 2011 – Jan 2013
Project Title: Data Quality and Reporting Location: San Antonio, TX
Role: ETL Datastage Developer
Description:
The project involved developing a dashboard to show data quality discrepancies in the daily load jobs across all sources, in order to fine-tune domain values from the front end and ensure the data is appropriate for writing business rules and reducing money laundering activity.
Responsibilities:
Interacted with the end-user community to understand the business requirements and identify data sources.
Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement. This required a detailed understanding of the data sources and researching possible solutions.
Helped in preparing the mapping document for source to target.
Designed and developed ETL processes using DataStage designer to load data from Oracle, MS SQL, Flat Files (Fixed Width) to staging database and from staging to the target Data Warehouse database.
Used DataStage stages such as Hash File, Sequential File, Transformer, Aggregator, Sort, Data Set, Join, Lookup, Change Capture, Funnel, Peek, and Row Generator in accomplishing the ETL coding.
Developed job sequencer with proper job dependencies, job control stages, triggers.
Involved in performance tuning and optimization of DataStage mappings using features like Pipeline and Partition Parallelism and data/index cache to manage very large volume of data.
Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data; and performed error handling and analysis.
Participated in weekly status meetings.
Environment: IBM DataStage 8.1 (Designer, Director, Admin), Oracle 10g, SQL Server 2008, DB2 UDB, Flat files, Sequential files.
Client: United Services Automobile Association Apr 2010 – Dec 2010
Project Title: Customer Risk Score Location: Hyderabad, India
Role: ETL Datastage Developer
Description:
This project identifies customers who might be at high risk for money laundering activity. A vendor tool is used to track the alerts and take the necessary actions. The project interacted with heterogeneous sources, and the data loads were validated against the S2T mapping. The business rules were validated for the alert generation process.
Responsibilities:
Developed the source to target process and mapping documentation.
Designed and developed jobs for extraction of data from different data feeds into IBM DB2 database.
Developed jobs for handling different data transformations as per specified requirements using stages like Join, Merge, Lookup, Transformer and Aggregator etc.
Used Change Capture Stage, Sort Merge and Funnel for developing the Delta process.
Designed and developed Shared Containers that can be reused by other parallel jobs.
Developed jobs for loading Data into DB2 target database.
Designed the Unit testing and integrated testing process and necessary documentation.
Involved in performance tuning to reduce the time consumption.
Used DataStage Manager for Importing and exporting jobs into different projects.
Used UNIX and Perl scripts to execute jobs, and used DataStage Director for scheduling, executing, and monitoring jobs.
Environment: IBM WebSphere DataStage 7.5 (Designer, Director, Manager), IBM DB2 Database, SQL Server, Oracle, Teradata, Unix.
Client: AXA – Insurance Aug 2008 – Mar 2010
Project Title: Data Migration TATV Location: Chennai, India
Role: ETL Developer
Description:
This is a data migration project. The heterogeneous sources were analyzed and the source-to-target (S2T) mapping was documented. ETL mappings were implemented based on the S2T. Performance tuning was done by adding partitions and changing the indexes on the tables. All mappings were unit tested and handed over for SIT.
Responsibilities:
Understood the business requirements based on the functional specification to design the ETL methodology in the technical specifications.
Responsible for development, support, and maintenance of the ETL (Extract, Transform and Load) processes using Informatica PowerCenter 8.5.
Wrote SQL overrides and used filter conditions in the Source Qualifier, thereby improving mapping performance (see the sketch after this list).
Designed and developed mappings using Source Qualifier, Expression, Lookup, Aggregator, Filter, Sequence Generator, Update Strategy, joiner and Rank transformations.
Worked with Functional team to make sure required data has been extracted and loaded and performed the Unit Testing and fixed the errors to meet the requirements.
Monitored scheduled, running, completed, and failed sessions using the Workflow Monitor, and debugged mappings for failed sessions.
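A simplified sketch of the kind of Source Qualifier SQL override with a filter condition described above; the table and column names (POLICY, POLICY_STATUS, EFFECTIVE_DATE) and the filter values are hypothetical placeholders used for illustration only.

-- Hypothetical Source Qualifier SQL override: the filter is pushed to the
-- source database so only the rows needed for the migration are extracted.
SELECT P.POLICY_ID,
       P.CUSTOMER_ID,
       P.POLICY_STATUS,
       P.EFFECTIVE_DATE
FROM   POLICY P
WHERE  P.POLICY_STATUS = 'ACTIVE'
  AND  P.EFFECTIVE_DATE >= TO_DATE('2008-01-01', 'YYYY-MM-DD');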
Environment: Informatica PowerCenter 8.5.1, RDBMS, Oracle 9i, Toad, Windows XP, Flat files, SQL
Client: AXA – Insurance Jul 2006 – Jul 2008
Project Title: Data Migration LAR Location: Chennai, India
Role: ETL Developer
Description:
This is a data conversion project undertaken after AXA's acquisition of a small firm, to migrate the acquired customers to the new insurance plans and benefits.
Responsibilities:
Worked with the business analysts to gather the requirements.
Designed logical and physical data models for star and snowflake schemas using Erwin.
Wrote sequences for automatic generation of unique keys to support primary and foreign key constraints in data conversions (see the PL/SQL sketch after this list).
Created and modified SQL*Plus and PL/SQL scripts for data conversions.
Developed and modified triggers, packages, functions, and stored procedures for data conversions.
Wrote SQL, PL/SQL, and SQL*Plus programs to retrieve data using cursors and exception handling.
Prepared data models and design specifications and analyzed dependencies.
Created indexes on tables to improve performance by eliminating full table scans, and created views to hide the actual tables and reduce the complexity of large queries.
Fine-tuned procedures and SQL queries for maximum efficiency in various databases using Oracle hints for rule-based optimization.
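A minimal PL/SQL sketch of the sequence-based key generation and cursor/exception-handling pattern referenced above; the object names (customer_seq, legacy_customer, customer) are hypothetical, and the block illustrates the general approach rather than the project's actual conversion code.

-- Hypothetical sequence used to generate surrogate primary keys during conversion
CREATE SEQUENCE customer_seq START WITH 1 INCREMENT BY 1 NOCACHE;

DECLARE
  -- Cursor over the legacy rows to be converted (hypothetical source table)
  CURSOR c_legacy IS
    SELECT legacy_id, cust_name FROM legacy_customer;
BEGIN
  FOR r IN c_legacy LOOP
    BEGIN
      INSERT INTO customer (customer_id, legacy_id, cust_name)
      VALUES (customer_seq.NEXTVAL, r.legacy_id, r.cust_name);
    EXCEPTION
      WHEN DUP_VAL_ON_INDEX THEN
        NULL;  -- skip rows that would violate the unique/primary key constraint
    END;
  END LOOP;
  COMMIT;
END;
/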
Environment: Oracle 9i, SQL*Plus, PL/SQL, Erwin, Toad.