Data Manager

Location:
Central Islip, New York, United States
Posted:
February 14, 2018

JAYA BHARATHI RAMASUBBU

Sr. ETL/Informatica Developer

ac4gzr@r.postjobfree.com 631-***-****

Summary:

**+ years of IT experience as an ETL Developer in requirements gathering, analysis, design, coding, documentation, implementation, testing, and support of Data Warehouse applications using Informatica PowerCenter.

Implemented data warehousing methodologies for Extraction, Transformation, and Loading using Informatica Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Worked extensively with Informatica ETL transformations, including Joiner, Aggregator, Lookup, Filter, Router, Normalizer, Update Strategy, Stored Procedure, Sorter, SQL Transformation, Union, Source Qualifier, Java, and XML Generator, and created complex mappings.

Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in Teradata, Oracle, SQL Server, and DB2.

Highly proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, optimizer hints, and SQL (a brief illustration follows this summary).

Extensive experience with Informatica IDQ for performing data analysis, data profiling, and data governance, and for implementing IDQ plans for standardization and matching of user data.

Strong experience working with XML and flat files, and with loading and retrieving data from different sources using SCD Type 1/Type 2/Type 3 in PowerCenter.

Extensive experience in Oracle SQL development, creating tables, views, triggers, indexes, and stored procedures.

Strong knowledge of XML, XML Schema, XSD.

Experience using the Pitney Bowes tool.

Worked effectively in a version-controlled Informatica environment and used deployment groups to migrate objects.

Good experience in creating complex SQL queries and complex joins for data analysis.

Strong knowledge of software development life cycle methodologies such as Agile and Waterfall.

Experience in Production Support and change management.

Good knowledge of the Microsoft Visio suite.

Worked on web services to pull data.

Involved in source profiling, data analysis and performance tuning of existing SQL queries.

Experience in object-oriented programming and query analysis.

Expertise in data modeling techniques such as dimensional modeling, star schema modeling, snowflake modeling, and fact and dimension tables.

Skilled in creating test plans from functional specifications and detailed design documents, and thorough with the deployment process from DEV to QA to UAT to PROD.

Extensive hands-on experience in Informatica mapping performance tuning, identifying and removing performance bottlenecks, and coordinating with source system owners for day-to-day ETL progress monitoring, ETL technical documentation, and maintenance.

Experienced in working with an onshore-offshore communication model.

Good Knowledge of working with various sources like Oracle, DB2, SQL Server, Teradata and Flat Files.

Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning mappings and sessions.

Independently perform complex troubleshooting, root-cause analysis and solution development.

Strong business understanding of verticals like Insurance and Finance.

Created UNIX shell scripts to run Informatica workflows and control the ETL flow.

Understand business rules completely based on high-level specification documents and implement the corresponding data transformation methodologies.

Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.

Motivated team player, able to grasp things quickly, with strong analytical and problem-solving skills.
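To illustrate the EXPLAIN PLAN tuning step mentioned above, here is a minimal sketch of the kind of check involved, run from a UNIX shell; the connect string, table, and column names are hypothetical placeholders, not taken from any project described here:

#!/bin/sh
# Minimal sketch: capture an Oracle execution plan for a candidate query.
# "user/password@ORCL" and the CLAIMS table are illustrative placeholders.
sqlplus -s user/password@ORCL <<'SQL'
EXPLAIN PLAN FOR
  SELECT policy_id, SUM(claim_amt)
  FROM   claims
  WHERE  claim_date >= DATE '2017-01-01'
  GROUP BY policy_id;

-- Display the optimizer's plan to spot full scans or poor join orders.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
SQL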

Technical Skills:

ETL : Informatica PowerCenter 8.x/9.x/10.x, Informatica Data Quality 9.6.1/9.1.1

Databases : Oracle 11g/12c, DB2, MS SQL Server 2008, Teradata

RDBMS Tools : TOAD 8.x, SQL Developer

Methodologies : Dimensional Modeling, Physical Data Modeling, Logical Data Modeling, Conceptual Data Modeling, Star Schema, Snowflake Schema

Languages : XML, VB

Scripting : UNIX Scripting

Operating Systems : Windows XP, Windows 7/8/10, Linux

Scheduler : Autosys, Informatica Scheduler, ZENA

Other Tools : Pitney Bowes Tool, HP Quality Center

Certification:

PowerCenter Data Integration 10: Developer, Specialist Certification on 11/22/2017

Education:

Qualification : Bachelor of Engineering

Institute : Anna University, India

Year of passing : 2002-2006

Percentage : 76

Professional Experience:

Project – RBC Policy Migration

Employer: Tata Consultancy Services Limited

Client: AVIVA Canada Insurance (May 2016 - Jan 2018), Canada

Role: ETL Developer and Designer

Aviva Canada is one of the leading property and casualty insurance groups in Canada, providing home, auto, and business insurance to more than three million customers. The company is a wholly owned subsidiary of UK-based Aviva plc and has more than 3,000 employees, 25 locations, and approximately 1,500 independent broker partners. Aviva Canada is the second largest general insurance business in the Aviva Group. General insurance is a key growth area for Aviva and a core component of the Group's customer composite strategy, providing customers with life insurance, general insurance, health insurance, and asset management. This project migrated data from RBC Insurance to AVIVA: data was extracted from different sources, such as Oracle and flat files, and loaded into the target tables. Actively involved as a developer in preparing design documents, and interacted with data modelers to understand the data model and ETL logic.

Responsibilities:

Involved in gathering requirements with business users.

Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using Informatica PowerCenter.

Coordinated with the source application team and developed all ETL mappings using various transformations such as XML Generator, Normalizer, Filter, Source Qualifier, Router, Joiner, Union, Java, Update Strategy, Expression, SQL Transformation, Aggregator, Lookup, and Sorter.

Extensively used all the features of Informatica versions 9.x and 10.x, including Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Extracted data from flat files and Oracle and loaded it into the target database.

Developed reusable transformations and mapplets, which can be used for multiple mappings.

Developed complex mappings using the corresponding sources, targets, and transformations such as Filter, Router, Update Strategy, Lookup, and Stored Procedure, extracting data in compliance with the business logic.

Wrote Oracle stored procedures and UNIX scripts for automated execution of jobs.

Involved in Unit and System Testing of ETL Code (Mappings and Workflows).

Created profiles and performed complete initial data profiling and ad hoc profiling using Informatica Data Quality (IDQ).

Understood the different types of sources involved and the relationships between the tables.

Created technical documents, tracked user acceptance and updated the user guide and operational manuals.

Developed ETL stage mappings to pull the data from the source system to the staging area.

Created scorecards in IDQ Developer and Analyst to identify trends in data quality.

Designed and developed complex mappings using Informatica PowerCenter and Informatica Developer (IDQ).

Participated in special projects and performed other duties as assigned.

Participated in design, code, and test inspections throughout the life cycle to identify issues.

Extensively worked on data profiling and data quality rules development.

Created mappings in Informatica Developer (IDQ) using Parser, Standardizer, Labeler, Exception, and Merge transformations.

Extensively used the pmcmd command to invoke workflows from UNIX shell scripts (a sketch of this pattern follows this list).

Automated and scheduled UNIX shell scripts and Informatica sessions and batches using Autosys.

Created sessions and sequential and concurrent batches for proper execution of mappings using Server Manager.

Migrated development mappings to QA and Production environment.

Developed error tables and an audit table for loading bad records.

Maintained good quality in deliveries, even on complex tasks; moved the code to functional testing without any defects and helped the testing team understand the functionality.

Prepared Knowledge Management Articles and Non-Functional Requirement documents for scheduling the code in Production and for the L1, L2 and L3 support.
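As a concrete illustration of the pmcmd usage and Autosys scheduling noted above, the following is a minimal sketch of a wrapper script; the domain, integration service, folder, workflow, and credential names are hypothetical placeholders, not taken from the project:

#!/bin/sh
# Hedged sketch: start an Informatica workflow from a shell wrapper via pmcmd.
# Domain, service, folder, and workflow names below are illustrative only.
INFA_USER=etl_user
INFA_PWD="$ETL_PWD"    # password supplied via the environment, not hardcoded

pmcmd startworkflow -sv IS_DEV -d Domain_Dev \
    -u "$INFA_USER" -p "$INFA_PWD" \
    -f RBC_MIGRATION -wait wf_load_policy

# pmcmd exits non-zero on failure; propagate that to the scheduler (e.g. Autosys).
rc=$?
if [ $rc -ne 0 ]; then
    echo "wf_load_policy failed with exit code $rc" >&2
    exit $rc
fi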

Environment: Informatica PowerCenter 9.x/10.x, Informatica Data Quality (IDQ) 9.6.1/9.1.1, Oracle 11g, Oracle Designer, UNIX Shell Scripting, PuTTY, WinSCP, HP QC, ZENA Scheduler.

Project – spRIGHT Data Migration

Employer: Tata Consultancy Services Limited

Client: Superpartners Insurance (Dec 2011 - May 2014), Melbourne, Australia; (May 2014 - Apr 2016), Bangalore, India

Role: ETL Developer and Designer

As Australia's largest superannuation service provider, Superpartners collaborates exclusively with not-for-profit industry superannuation funds to create a better future for members. For 30 years, Superpartners has supported some of the country's largest and most respected industry funds, providing tailored solutions and a seamless service experience to their members and employers.

As part of this project, we migrated data from Oracle to SQL Server, since the Oracle-based application was being replaced by a new application called spRIGHT.

Roles & Responsibilities:

Worked closely with the Project Manager to develop and update the task plan for ETL work and kept the manager aware of any critical task issues and dependencies on other teams.

Coordinated with the source data team and developed all ETL mappings using various transformations such as Normalizer, Filter, Source Qualifier, Router, Joiner, Union, Java, Update Strategy, Expression, SQL Transformation, Aggregator, Lookup, and Sorter.

Ensured the delivered ETL code ran correctly and conformed to specifications and design guidelines.

Developed reusable transformations and mapplets that can be used across multiple mappings.

Supported development teams with performance tuning and troubleshooting of issues.

Developed ETL stage mappings to pull the data from the source system to the staging area.

Performed root-cause analysis on processes, resolved production issues, validated data, performed routine tests on databases, and provided support to all ETL applications.

Monitored business requirements, validated designs, scheduled ETL processes, and prepared data flow diagram documents.

Analyzed and interpreted complex data on the target systems, provided resolutions to data issues, and coordinated with data analysts to validate requirements and conduct interviews with users and developers.

Developed error tables and an audit table for loading bad records.

Provided observations of the overall system and suggested automated steps to fulfill the business operations and requirements.

Performed file-level verification tasks via UNIX shell scripts and command-line utilities (a sketch follows this list).

Involved in internal and external code review.

Participated in weekly status meetings, conducted internal and external reviews among various teams, and documented the proceedings.

Supported QA and UAT testing.

Maintained good quality in deliveries, even on complex tasks; moved the code to functional testing without any defects and helped the testing team understand the functionality.

Migrated objects to different environments and managed the releases.

Good understanding of data mapping, data validation, data manipulation, and data analysis use cases.
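The file-level verification mentioned above can be sketched roughly as follows; the directory, file naming pattern, and pipe-delimited trailer layout are assumptions for illustration, not details taken from the project:

#!/bin/sh
# Rough sketch: verify a daily extract file arrived and is internally consistent.
# Path, naming pattern, and trailer layout below are assumed for illustration.
FILE=/data/inbound/members_$(date +%Y%m%d).dat

# Fail fast if the file is missing or empty.
if [ ! -s "$FILE" ]; then
    echo "Source file $FILE missing or empty" >&2
    exit 1
fi

# Assume the last line is a trailer holding the record count in field 2.
expected=$(tail -1 "$FILE" | cut -d'|' -f2)
actual=$(( $(wc -l < "$FILE") - 1 ))
if [ "$expected" -ne "$actual" ]; then
    echo "Record count mismatch: trailer=$expected, data lines=$actual" >&2
    exit 1
fi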

Environment: Informatica PowerCenter 9.x, Informatica Data Quality (IDQ) 9.6.1/9.1.1, SQL Server, Oracle 11g, Oracle Designer, UNIX Shell Scripting, PuTTY, WinSCP, HP QC, ZENA Scheduler.

Employer: Tata Consultancy Services Limited

Client: AVIVA (Oct 2011 - Nov 2011), Bangalore, India

Project – EDW Landing zone

Role: ETL Developer

The key objective of this Teradata landing zone initiative is to build an Enterprise Data Warehouse that will establish a single version of the truth.

Roles & Responsibilities:

Involved in creating COBOL files, Teradata target tables, and views, importing them into a reusable Informatica folder, and standardizing them as per DSN standard rules.

Collected statistics for all the Teradata target tables to make data loads and retrieval faster (a sketch follows this list).

Coordinated with the source data team and developed all ETL mappings using various transformations such as Normalizer, Filter, Source Qualifier, Router, Joiner, Union, Java, Update Strategy, Expression, SQL Transformation, Aggregator, Lookup, and Sorter.

Developed staging and SCD Type 2 mappings using various transformations such as Source Qualifier, Expression, Connected/Unconnected Lookup, Router, Filter, Update Strategy, and Sequence Generator.

Developed reusable transformations and mapplets that can be used across multiple mappings.

Developed unit test cases and was involved in unit testing to check for consistency.

Created the parameter file for extracting the data on a daily basis.

Created reconciliation mappings for all project interfaces.

Involved in writing Shell scripts to check file availability and to take backup of files daily.

Developed error tables and an audit table for loading bad records.

Prepared ETL and Teradata DDL deployment guides and added the required workflows to the deployment group for smooth migration of code to higher environments.

Coordinated with the project team at various levels of development and testing.

Maintained good quality in deliveries, even on complex tasks; moved the code to functional testing without any defects and helped the testing team understand the functionality.

Participated in weekly status meetings.

Involved in internal and external code reviews.
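As a sketch of the statistics collection described above, a post-load step might look like the following BTEQ call; the logon string, database, table, column, and index names are hypothetical placeholders:

#!/bin/sh
# Sketch: refresh Teradata optimizer statistics after a load, via BTEQ.
# The logon string and object names below are illustrative placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,password;

/* Keep column and index demographics current so the optimizer
   chooses efficient plans for loads and downstream queries. */
COLLECT STATISTICS ON EDW_T.MEMBER_DIM COLUMN (MEMBER_ID);
COLLECT STATISTICS ON EDW_T.MEMBER_DIM INDEX (MEMBER_NK);

.LOGOFF;
.QUIT;
EOF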

Environment: Informatica PowerCenter 9.x, Teradata, UNIX Shell Scripting, PuTTY, WinSCP, HP QC.

Employer: iTWINE Technology

Astral Jet (June 2007 - May 2011), Bangalore, India

Project – Astral Jet Data Migration

Role: ETL Developer

This application is used for hospital management, which includes patient records maintenance, medication for patients, a scheduling component for doctors, drug interaction reports, etc. As part of this project, we migrated data from Oracle and various file sources into SQL Server.

Roles & Responsibilities:

Prepared ETL deployment guides and added the required workflows to the deployment group for smooth migration of code to higher environments.

Coordinated with the source data team and developed all ETL mappings using various transformations such as Normalizer, Filter, Source Qualifier, Router, Joiner, Union, Update Strategy, Expression, SQL Transformation, Aggregator, Lookup, and Sorter.

Developed mappings based on the mapping document.

Developed reusable transformations.

Developed ETL stage mappings to pull the data from the source system to the staging area.

Collected statistics to make data loads and retrieval faster.

Involved in internal code review.

Involved in unit testing and documenting the test results.

Participated in weekly status meetings.

Developed error tables and an audit table for loading bad records.

Created source and target definitions and imported them into the reusable folder.

Maintained good quality in deliveries, even on complex tasks; moved the code to functional testing without any defects and helped the testing team understand the functionality.

Identified and prepared data for test execution.

Environment: Informatica PowerCenter 8.5.1, SQL Server, UNIX scripts, PuTTY, Windows XP.


