
Data Manager

Location:
Bay Shore, NY
Posted:
January 20, 2021


JAYA BHARATHI RAMASUBBU

Sr. Data Warehouse/Guidewire Migration/Data Migration/ETL/Informatica Developer

adjkul@r.postjobfree.com 631-***-****

Summary:

** + years of IT experience as a Sr. Data Warehouse/Data Migration/ETL Developer, spanning requirements gathering, analysis, design, coding, documentation, implementation, testing, and support of data warehouse applications using Informatica PowerCenter.

Implemented data warehousing methodologies for extraction, transformation, and loading using Informatica Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Worked extensively with Informatica ETL transformations, including Joiner, Aggregator, Lookup, Filter, Router, Normalizer, Update Strategy, Stored Procedure, Sorter, SQL, Union, Source Qualifier, Java, and XML Generator, and created complex mappings.

Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, Packages, joins and hash indexes in Teradata database, Oracle, SQL Server and DB2.

Highly proficient in performance analysis, monitoring, and SQL/PostgreSQL query tuning using EXPLAIN PLAN, collected statistics, and optimizer hints.

Extensive experience with Informatica IDQ for data analysis, data profiling, and data governance, and for implementing IDQ plans for standardization and matching of user data.

Strong experience working with XML and flat files, and in loading and retrieving data from different sources using SCD Type 1/Type 2/Type 3 in PowerCenter.

Extensive experience in Oracle SQL development, creating tables, views, triggers, indexes, and stored procedures.

Strong knowledge of XML, XML Schema, XSD.

Working knowledge of Python.

Worked effectively in a version-controlled Informatica environment and used deployment groups to migrate objects.

Good experience creating complex SQL queries and complex joins for data analysis.

Strong knowledge of software development life cycle methodologies such as Agile and Waterfall.

Experience in Production Support and change management.

Involved in source profiling, data analysis and performance tuning of existing SQL queries.

Expertise in data modeling techniques such as dimensional modeling, star schema modeling, snowflake modeling, and fact and dimension tables.

Skilled in creating test plans from functional specifications and detailed design documents; thorough knowledge of the deployment process from DEV to QA to UAT to PROD.

Extensive hands-on experience in Informatica mapping performance tuning, identifying and removing performance bottlenecks, and coordinating with source system owners on day-to-day ETL progress monitoring, ETL technical documentation, and maintenance.

Experienced in working with an Onshore-Offshore communication model.

Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.

Independently perform complex troubleshooting, root-cause analysis, and solution development.

Created UNIX shell scripts to run Informatica workflows and control the ETL flow.

Understand business rules thoroughly from high-level document specifications and implement the corresponding data transformation logic.
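
The workflow-launch scripting described above can be sketched as a small wrapper around Informatica's pmcmd utility. All names below (domain, integration service, user, folder, workflow) are hypothetical placeholders, and the script only prints the command it would run rather than executing it:

```shell
#!/bin/sh
# Minimal sketch of a shell wrapper that launches an Informatica workflow
# via pmcmd. Domain, service, user, folder, and workflow names are all
# hypothetical examples; the password is expected in the INFA_PWD
# environment variable (pmcmd's -pv option).
INFA_DOMAIN="Domain_Dev"
INFA_SERVICE="IS_Dev"
INFA_USER="etl_user"
FOLDER="FIN_LOAD"
WORKFLOW="wf_stg_load"

# -wait blocks until the workflow finishes, so the script's exit code
# can drive downstream steps in the ETL control flow.
CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -u $INFA_USER -pv INFA_PWD -f $FOLDER -wait $WORKFLOW"

# Dry run for illustration: print the command instead of executing it.
echo "$CMD"
```

In real use, the `echo` would be replaced by executing the command and checking `$?` to decide whether dependent workflows may start.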

Technical Skills:

ETL : Informatica Power Center 8.X/9.X/10.X

Databases : Oracle 11g/12c, DB2, MS SQL Server 2008, Teradata, Greenplum

RDBMS Load Tools : TOAD 8.x, SQL Developer

Languages : XML, SQL

Scripting : UNIX Scripting

Operating Systems : Windows XP, Windows 7/8/10, Linux

Scheduler : Autosys, Informatica Scheduler, ZENA, TIVOLI

Other Tools : HP Quality Center

Certification:

PowerCenter Data Integration 10: Developer, Specialist Certification on 11/22/2017

Education:

Bachelor of Engineering, Anna University, India, 2002-2006, 76%

Professional Experience:

Project – One Data

Client: Argo Group / Donyati, Texas (June 2020 – Sep 2020).

Role: Data Governance Analyst

The purpose of the project was to add enrichment columns in the staging area based on business logic and to send the data to the data warehouse using Informatica and PostgreSQL.

Project – Preprocessor

Client: Broadridge Financial Services, Edgewood, NY (Nov 2018 – Apr 2020).

Role: Sr Informatica ETL Developer and Designer

In this project, we receive clients' investment data from various sources. Using Informatica, we cleanse the data and perform calculations for 1099 and 1042-S tax form printing. After cleansing and computation, the data is stored in a Greenplum database; due to the data volume, three years of customer data are kept in the current production environment. Cleansing and calculations are performed on client name details, address, income, movement, and cost basis. The project supports multiple clients through parameters; currently Broadridge supports the JP Morgan Prime Retail, JP Morgan Custody, UBS, and Wells Fargo clients.

Responsibilities:

Involved in gathering requirements with business users.

Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using Informatica PowerCenter and database objects in PostgreSQL.

Coordinated with source application team.

Involved in customers form correction activities.

Extensively used all the features of Informatica 10.x, including Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Extracted data from flat files and VSAM and loaded it into the target database (Greenplum).

Developed reusable transformations and mapplets, which can be used for multiple mappings.

Developed complex mappings using the corresponding sources, targets, and transformations (Filter, Router, Update Strategy, Lookup, Stored Procedure, Normalizer, SQL, XML Generator), extracting data in compliance with the business logic.

Wrote Greenplum functions and UNIX scripts for automated execution of jobs.

Involved in Unit and System Testing of ETL Code (Mappings and Workflows).

Understood the different types of sources involved and the relationships between the tables.

Scheduled jobs using TIVOLI.

Project – Application Migration

Employer: Wipro Limited

Client: UBS Financial Services Inc, Weehawken, New Jersey (Apr 2018- Oct 2018).

Role: ETL Developer and Designer

As part of the Application Adoption Management project, a few Informatica applications were migrated to a cloud environment. Involved in code migration and in testing outputs between the new and old environments.

Responsibilities:

Involved in gathering requirements with business users.

Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using Informatica PowerCenter.

Coordinated with source application team.

Extensively used all the features of Informatica 10.x, including Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Extracted data from flat files and Oracle and loaded it into the target database.

Developed reusable transformations and mapplets, which can be used for multiple mappings.

Developed complex mappings using the corresponding sources, targets, and transformations (Filter, Router, Update Strategy, Lookup, Stored Procedure), extracting data in compliance with the business logic.

Wrote Oracle stored procedures and UNIX scripts for automated execution of jobs.

Involved in Unit and System Testing of ETL Code (Mappings and Workflows).

Understood the different types of sources involved and the relationships between the tables.

Updated application users on project progress on a weekly/daily basis.

Project – Guidewire Policy/Billing/Claim Migration

Employer: Tata Consultancy Services Limited

Client: AVIVA Canada Insurance, Canada (May 2016 – Mar 2018).

Role: ETL Developer and Designer

Aviva Canada is one of the leading property and casualty insurance groups in Canada, providing home, auto, and business insurance to more than three million customers. The company is a wholly owned subsidiary of UK-based Aviva plc and has more than 3,000 employees, 25 locations, and approximately 1,500 independent broker partners. Aviva Canada is the second largest general insurance business in the Aviva Group. General insurance is a key growth area for Aviva and a core component of the Group's customer composite strategy, providing customers with life insurance, general insurance, health insurance, and asset management. This project migrated data from the RBC Insurance provider to Aviva: data was extracted from different sources such as Oracle and flat files and loaded into the Guidewire system, which has three components, namely PolicyCenter, ClaimCenter, and BillingCenter. Actively involved as a developer in preparing design documents and interacted with data modelers to understand the data model and ETL logic.

Responsibilities:

Involved in gathering requirements with business users.

Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica power center.

Coordinated with the source application team and developed all ETL mappings using various transformations such as XML Generator, Normalizer, Filter, Source Qualifier, Router, Joiner, Union, Java, Update Strategy, Expression, SQL, Aggregator, Lookup, and Sorter.

Extensively used all the features of Informatica 9.x and 10.x, including Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Extracted data from flat files and Oracle and loaded it into the target database.

Wrote Oracle stored procedures and UNIX scripts for automated execution of jobs.

Involved in Unit and System Testing of ETL Code (Mappings and Workflows).

Understood the different types of sources involved and the relationships between the tables.

Created technical documents, tracked user acceptance, and updated the user guide and operational manuals.

Expertise in SQL and PL/SQL programming; created PL/SQL packages.

Developed ETL stage mappings to pull the data from the source system to the staging area.

Created Scorecards in IDQ Developer and Analyst to identify the trend of the quality of the data.

Designed and developed complex mappings using Informatica power center and Informatica developer (IDQ).

Participated in special projects and performed other duties as assigned.

Participated in design, code, and test inspections throughout the life cycle to identify issues.

Extensively worked on data profiling and data quality rules development.

Created mappings in Informatica Developer (IDQ) using Parser, Standardizer and Labeler, Exception, Merge Transformations.

Extensively used the pmcmd command to invoke workflows from UNIX shell scripts.

Automated and scheduled UNIX shell scripts and Informatica sessions and batches using Autosys.

Created sessions and sequential and concurrent batches for proper execution of mappings using Server Manager.

Migrated development mappings to QA and Production environment.

Developed error tables and audit table for loading bad records.

Maintained high quality in deliverables, even on complex tasks. Moved the code to functional testing without any defects and helped the testing team understand the functionality.

Prepared Knowledge Management articles and Non-Functional Requirement documents for scheduling the code in production and for L1, L2, and L3 support.

Project – spRIGHT Data Migration

Employer: Tata Consultancy Services Limited

Client: Superpartners Insurance, Melbourne, Australia (Dec 2011 – May 2014) and Bangalore, India (May 2014 – Apr 2016).

Role: ETL Developer and Designer

As Australia's largest superannuation service provider, Superpartners collaborates exclusively with not-for-profit industry superannuation funds to create a better future for members. For 30 years, Superpartners has supported some of the country's largest and most respected industry funds, providing tailored solutions and a seamless service experience to their members and employers.

As part of this project, we migrated data from Oracle to SQL Server, since the Oracle-based application was replaced by a new application called spRIGHT.

Roles & Responsibilities:

Work closely with Project Manager to develop and update the task plan for ETL work and to keep the manager aware of any critical task issues and dependencies on other teams.

Coordinated with the source data team and developed all ETL mappings using various transformations such as Normalizer, Filter, Source Qualifier, Router, Joiner, Union, Java, Update Strategy, Expression, SQL, Aggregator, Lookup, and Sorter.

Ensure the ETL code delivered is running, conforms to specifications and design guidelines.

Developed reusable transformations and mapplets that can be used across multiple mappings.

Support development teams with performance tuning and troubleshooting issues.

Developed ETL stage mappings to pull the data from the source system to the staging area.

Perform root cause analysis on all processes, resolve all production issues, validate all data, perform routine tests on databases, and provide support to all ETL applications.

Monitor all business requirements, validate all designs, schedule all ETL processes, and prepare documents for all data flow diagrams.

Analyze and interpret all complex data on all target systems and analyze and provide resolutions to all data issues and coordinate with data analyst to validate all requirements, perform interviews with all users and developers.

Developed error tables and audit table for loading bad records.

Provided observations of the overall system and suggested automated steps to fulfill the business operations and requirements.

Good understanding of performing file-level verification tasks via UNIX shell scripts and command-line utilities.

Involved in internal and external code review.

Participated in weekly status meetings, conducted internal and external reviews among various teams, and documented the proceedings.

Supported QA and UAT testing. Maintained high quality in deliverables, even on complex tasks. Moved the code to functional testing without any defects and helped the testing team understand the functionality.

Migrated objects to different environments and managed the releases.

Good understanding of data mapping, data validation, data manipulation, data analysis use cases.
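
The file-level verification mentioned above can be illustrated with a minimal, self-contained sketch; the feed file name and the trailer-record layout ("T|&lt;row count&gt;") are hypothetical assumptions, and the script creates its own sample input:

```shell
#!/bin/sh
# Sketch: verify a delimited feed file by comparing its detail-row count
# against the expected count carried in a trailer record ("T|<count>").
# The file name and record layout are hypothetical examples.
FEED="member_feed.dat"

# Create a small sample feed so the sketch is self-contained:
# header, two detail rows, and a trailer claiming 2 rows.
printf 'H|20140501\nD|1001|SMITH\nD|1002|JONES\nT|2\n' > "$FEED"

expected=$(grep '^T|' "$FEED" | cut -d'|' -f2)   # count stated in trailer
actual=$(grep -c '^D|' "$FEED")                  # detail rows actually present

if [ "$actual" -eq "$expected" ]; then
  STATUS="OK"
else
  STATUS="MISMATCH"
fi
echo "$FEED: $STATUS (expected=$expected actual=$actual)"
```

A mismatch would typically abort the load and move the file to a reject area before any ETL session starts.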

Employer: Tata Consultancy Services Limited

Client: AVIVA, Bangalore, India (Oct 2011 – Nov 2011).

Project – EDW Landing zone

Role: ETL Developer

The key objective of this Teradata landing zone initiative is to build an Enterprise Data Warehouse that establishes a single version of the truth.

Roles & Responsibilities:

Involved in creating COBOL files and Teradata target tables and views, importing them into the Informatica reusable folder, and standardizing them as per DSN standard rules.

Collected statistics for all Teradata target tables to speed up data loads and retrieval.

Coordinated with the source data team and developed all ETL mappings using various transformations such as Normalizer, Filter, Source Qualifier, Router, Joiner, Union, Java, Update Strategy, Expression, SQL, Aggregator, Lookup, and Sorter.

Developed staging and SCD Type 2 mappings using various transformations such as Source Qualifier, Expression, Connected/Unconnected Lookup, Router, Filter, Update Strategy, and Sequence Generator.

Developed reusable transformations and mapplets that can be used across multiple mappings.

Developed Unit test cases and involved in unit test to check for consistency.

Involved in writing Shell scripts to check file availability and to take backup of files daily.

Developed error tables and audit table for loading bad records.

Coordination with project team in various levels of development and testing.

Maintained high quality in deliverables, even on complex tasks. Moved the code to functional testing without any defects and helped the testing team understand the functionality.

Participated in weekly status meetings.
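
The daily file-availability check and backup described above can be sketched as follows; the directory layout and file name are hypothetical, and the script creates a sample file so it runs standalone:

```shell
#!/bin/sh
# Sketch: check that an expected source file has arrived (non-empty) and
# take a dated backup copy before ETL processing. Paths and the file
# name are hypothetical examples.
SRC_DIR="./incoming"
BKP_DIR="./backup"
FILE="policy_extract.dat"

mkdir -p "$SRC_DIR" "$BKP_DIR"
printf 'sample record\n' > "$SRC_DIR/$FILE"   # sample file for the demo

if [ -s "$SRC_DIR/$FILE" ]; then
  # File present and non-empty: keep a dated copy for reruns and audit.
  cp "$SRC_DIR/$FILE" "$BKP_DIR/$FILE.$(date +%Y%m%d)"
  RESULT="available"
else
  RESULT="missing"
fi
echo "$FILE is $RESULT"
```

In production the "missing" branch would raise an alert instead of merely reporting, so the scheduler can retry or page support.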

Employer: iTWINE Technology

Client: Astral Jet, Bangalore, India (Jun 2007 – May 2011).

Project – Astral Jet Data Migration

Role: ETL Developer

This application is used for hospital maintenance, including patient records maintenance, medication for patients, a scheduling component for doctors, and drug interaction reports. As part of this project, we migrated data from Oracle and various file sources into SQL Server.

Roles & Responsibilities:

Preparing ETL Deployment guides and adding the required workflows to Deployment group for smooth migration of code to higher environments.

Coordinated with the source data team and developed all ETL mappings using various transformations such as Normalizer, Filter, Source Qualifier, Router, Joiner, Union, Update Strategy, Expression, SQL, Aggregator, Lookup, and Sorter.

Developed Mappings based on mapping document.

Developed reusable transformations.

Developed ETL stage mappings to pull the data from the source system to the staging area.

Collecting statistics to make faster data load and retrieval.

Involved in internal code review.

Involved in Unit test and documenting the Test results.

Participated in weekly status meetings.

Developed error tables and audit table for loading bad records.

Created source and target definitions and imported them into the reusable folder.

Maintained high quality in deliverables, even on complex tasks. Moved the code to functional testing without any defects and helped the testing team understand the functionality.

Identifying and Preparing Data for Test Execution.
