
Sr. ETL Consultant

Location:
Santa Ana, CA
Posted:
May 17, 2018

Contact this candidate

Resume:

Ramya Chavvakula

Email: **.*****.*@*****.***

Phone: 979-***-****

Summary:

Over 8 years of experience in the full software development life cycle (SDLC), including business requirement gathering and analysis, system study, application design, development, testing, implementation, system maintenance, support, and documentation

Expert in all stages of the software development life cycle (SDLC) and Agile methodologies

Enterprise Data Warehousing: Experience and expertise in the ETL process (Informatica PowerCenter 9.6, 9.5.1, 9.0.1)

Expertise in implementing the XML and Web Services Transformations

Experience and expertise in the ETL process (Informatica PowerCenter mappings, mapplets, transformations, Workflow Manager, and Workflow Monitor)

Experience and expertise in Operational Data Store (ODS) design and in data warehouse and data mart design methodologies such as star schema and snowflake, including designing slowly changing dimensions and fact tables. Also expert in extracting data from different source systems such as Oracle, XML, flat files, and SQL Server, and loading it into data warehouse, data mart, and ODS systems

Extensively worked on Web services consumer transformations & Error handling methodologies

Implemented Various Performance Tuning techniques on Mappings

Implemented Slowly Changing Dimension Type 1, Type 2, and Type 3 methodologies to preserve the full history of account and transaction information
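The core of the Type 2 approach above is to expire the current dimension row and insert a new version instead of overwriting. A minimal sketch of that logic, using Python's built-in sqlite3 so it runs anywhere; the table and column names (dim_customer, is_current, etc.) are illustrative placeholders, not taken from any actual project:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER)""")

def scd2_upsert(cur, customer_id, city, today):
    """Expire the current row if the tracked attribute changed, then insert a new version."""
    cur.execute("SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
                (customer_id,))
    row = cur.fetchone()
    if row is not None and row[0] == city:
        return  # no change: keep the current version
    if row is not None:
        # Type 2: close out the old version instead of overwriting it
        cur.execute("""UPDATE dim_customer SET end_date=?, is_current=0
                       WHERE customer_id=? AND is_current=1""", (today, customer_id))
    cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
                (customer_id, city, today))

scd2_upsert(cur, 1, "Phoenix", "2018-01-01")
scd2_upsert(cur, 1, "Santa Ana", "2018-06-01")  # change: old row expired, new row added
cur.execute("SELECT COUNT(*) FROM dim_customer WHERE customer_id=1")
print(cur.fetchone()[0])  # -> 2, both versions are preserved
```

A Type 1 load would simply UPDATE the attribute in place, losing history; Type 2 trades extra rows for a full audit trail.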

Good experience in maintaining the versioning and deployment instruction documents while moving the code to various environments

Good understanding of Data warehousing, ER Modeling and Dimensional Modeling concepts

Expertise in Job Scheduling using Autosys and Tidal

Good working knowledge on RDBMS (Oracle, MS SQL Server)

Good knowledge on writing the PL/SQL Scripts like Procedures, Triggers and Functions

Involved in Unit Testing and Data Validation for Developed Informatica Mappings

Proven ability to quickly learn and apply new technologies; creative, innovative, and able to work in a fast-paced environment

Team player with excellent communication and problem-solving skills

Good Working Experience in UNIX Shell Scripting

Strong communication and interpersonal skills, good at multi-tasking, and flexible with work schedules

Eager and keen on developing robust and reliable reports

Built ad hoc reports in Tableau for data validation

Reporting

o Understanding user requirements

o Robust design for slicing and dicing, filtering, drill-down, and drill-through

o Scalable and adaptable to business requirements

o Dimensional modeling

ETL

o Designed for data integrity

o Optimized loads for faster loading

o Least user impact

o Scalable and manageable

Oracle

o Understanding of page-level architecture of the DB

o Automated scripts for reusability

o QA scripts for data validation

o Design table for

Technical Skills:

Reporting tools

Tableau, Looker, Cognos

ETL Tools

Informatica 9.x, OWB 10r2

Data Modeling

Lucid Charts, Dimensional Data Modeling, Star Schema, Snowflake Modeling, Fact and Dimension Tables

Data Warehouses

Jaros

Databases

ORACLE, SQL Server, MS Access

Scheduling Tools

Tidal, Autosys

Languages

C, SQL, PL/SQL

Scripting

Batch, Shell scripts

Bachelor of Technology: ANU, Andhra Pradesh, India.

Visa Status: GC

Professional Experience:

L.A Care

Dec 2017 – Feb 2018 Team Size: 8

Los Angeles, CA

Environment: Informatica Power Center 9.6, Autosys, Tableau, Oracle 10g, SQL server

ETL Developer

L.A. Care provides health insurance for low-income individuals in Los Angeles County through various health coverage programs. We tracked our work through JIRA tickets, worked round the clock for on-time delivery, and organized the work into sprints

Used Informatica PowerCenter as the ETL tool to extract data from sources such as MS SQL Server, flat files, and Oracle, and load it into the target

Extensively worked with the various client components of Informatica, such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Repository Manager, Workflow Manager, and Workflow Monitor

Extensively used different transformations such as Lookup, Joiner, Aggregator, Filter, Sorter, Expression, Update Strategy, Source Qualifier, Rank, and Router to create several mappings and mapplets

Implemented Slowly Changing Dimensions (Type 1 and Type 2) using Informatica ETL mappings

Created mapplets and reusable sessions for performance tuning

Worked with Informatica workflow monitor in running and debugging its components and monitoring the resulting executable version

Involved in fine-tuning of sources, targets, mappings and sessions for performance optimization

Daily monitoring of the mappings that ran the day before and fixing the issues

Involved in unit, system and end-to-end testing of the design

Wrote SQL, PL/SQL, stored procedures for implementing business rules and transformations

Extensively worked on the performance tuning of the Mappings as well as the sessions

Involved in writing UNIX shell scripts for the Informatica ETL tool to run the sessions
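Shell wrappers like those above typically start workflows via Informatica's pmcmd command-line tool. A hedged sketch of a helper that assembles such a pmcmd invocation; the service, domain, folder, and workflow names are hypothetical placeholders, not values from any actual environment:

```python
# Sketch only: builds the argument list a UNIX wrapper script would exec.
def build_pmcmd(service, domain, user, folder, workflow, wait=True):
    cmd = ["pmcmd", "startworkflow",
           "-sv", service,          # Integration Service name
           "-d", domain,            # Informatica domain
           "-u", user,
           "-pv", "PM_PASSWORD",    # read the password from an env variable, not the command line
           "-f", folder]
    if wait:
        cmd.append("-wait")         # block until the workflow finishes so the caller sees the exit code
    cmd.append(workflow)
    return cmd

cmd = build_pmcmd("IS_DEV", "Domain_Dev", "etl_user", "DWH_LOADS", "wf_daily_load")
print(" ".join(cmd))
```

In a real wrapper the list would be passed to the shell (or subprocess), and the scheduler would key success or failure off pmcmd's exit code.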

First American

Oct 2015 – Nov 2017 Team Size: 10

Santa Ana, CA

Informatica Consultant

First American is a financial services company and a leading provider of title insurance and settlement services to the real estate and mortgage industries. It provides title insurance protection and professional settlement services for homebuyers and sellers, real estate agents, homebuilders and developers, title agencies, and legal professionals to facilitate real estate purchases, construction, refinances, or equity loans. We worked on tickets using Issue Viewer; client needs had to be addressed as soon as possible, and we worked round the clock for the best results. I played a vital role in the on-time delivery of sprint, QA, and reporting requirements in the project.

Environment: Informatica Power Center 9.6, Autosys, SQL server

Responsibilities:

Involved in requirements gathering, functional/technical specification, Designing and development of end-to-end ETL process for Data Warehouse

Studied the existing OLTP system(s) and created facts, dimensions, and a star schema representation for the data mart

Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems

Imported Source/Target Tables from the respective databases and created reusable transformations (Joiner, Routers, Lookups, Rank, Filter, Expression, and Aggregator) in a Mapplet and created new mappings using Designer module of Informatica

Worked with SQL Transformation and Java Transformation

Created Tasks, Workflows, Sessions to move the data at specific intervals on demand using Workflow Manager and Workflow Monitor

Extensively worked on the performance tuning of the Mappings as well as the sessions

Involved in writing UNIX shell scripts for the Informatica ETL tool to run the sessions

Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, and parsing of objects and hierarchies

Wrote SQL, PL/SQL, stored procedures for implementing business rules and transformations

Coordinated with DBA for tuning sources and targets and calculating table space and growth of database

Migrated the code from the DEV repository to QA and then to PROD repositories during our monthly releases

Prepared technical documentation and SOPs for the existing system

Enabled/disabled ETL loads through the Informatica Admin Console during all monthly releases

Created test cases and completed unit, integration and system tests for Data Warehouse

Involved in Production support and ON Call support

Used Autosys to schedule jobs
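Autosys jobs of the kind scheduled above are defined in JIL (Job Information Language). A minimal, purely illustrative JIL sketch; the job, machine, path, and owner names are hypothetical, not taken from any actual environment:

```
/* Hypothetical Autosys JIL sketch -- all names and paths are illustrative */
insert_job: dwh_daily_load   job_type: cmd
command: /apps/etl/scripts/run_wf_daily_load.sh
machine: etlhost01
owner: etl_user
start_times: "02:00"
std_out_file: /apps/etl/logs/dwh_daily_load.out
std_err_file: /apps/etl/logs/dwh_daily_load.err
alarm_if_fail: 1
```

When a job like this fails, alarm_if_fail raises an alert, and the operator traces the cause through the std_out_file/std_err_file logs, which is the troubleshooting loop the support duties below describe.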

Assisted with migrating jobs into production

Monitored production jobs to ensure successful execution

Responsible for problem identification and resolution of failed processes or jobs, and assisted customers in troubleshooting issues with their applications in Autosys

Proactive in finding issues and in seeking and implementing solutions

Performed daily production support recurring tasks and monitoring of the ticket queue: assigned tickets, resolved tickets, and updated root cause and remediation

Verizon

Oct 2013 – Oct 2015 Team Size: 8

Phoenix, AZ

Informatica Consultant

We worked closely with the client, a multi-billion-dollar investment company, to build a versatile data warehouse that also acts as a data hub addressing many other systems' data needs. With our understanding of the Informatica tool, data warehousing, and the business, we were able to help the customer in the best way with end-to-end development

Environment: Informatica Power Center 9.6, Autosys, SVN, Tableau, Oracle 10g

Responsibilities:

Used Jira to track and maintain project progress

Responsible for requirement gathering and user meetings, discussing the issues to be resolved, and translating the user inputs into ETL design documents

Extracted data from various heterogeneous sources like Oracle, Flat files

Used Informatica Power Center for extraction, transformation and load (ETL) of data in the data warehouse

Worked with the Informatica PowerCenter server, which executes tasks based on workflows; the workflows can be monitored using the Workflow Monitor. Jobs are designed in the Mapping Designer, which creates a mapping between source and target; a mapping is a pictorial representation of the flow of data from source to target

Worked with Source Analyzer, Warehouse Designer, Transformation designer, mapping designer and Workflow Manager to develop new Mappings & implement data warehouse

Worked on several transformations in Informatica, such as Filter, Joiner, Sequence Generator, Aggregator, Source Qualifier, Expression, Lookup (connected and unconnected), Router, Web Services, XML, and Normalizer

Created shell script to pass database connections, parameter entries for source and target

Involved in Production support and ON Call support

Used Autosys to schedule jobs

GE Aero – GRNI

Jun 2012 – Sep 2013 Team Size: 6

Houston, TX

Informatica Consultant

GRNI stands for Goods Received and Not Invoiced. This delta amount between PO and AP piles up in the accrual account, and the business's target was to clear it by directly targeting the buyers and suppliers. The business had an urgent need for an efficient reporting structure to understand the existing accrual and to track and resolve the issue. I led the project to create a system in which reports helped the business make $XX million in profit in the first quarter after we implemented the system. The dashboard is for CXO-level users to track performance and helps find the accrual amounts across various other parameters as well. It also shows the Top 10 (buyers, suppliers, and POs) for each OU to resolve the accrual issues more effectively. Drill-through/detail reports help managers understand the system easily. These reports also help a lot during the closing of books and reduced man-hours from days to minutes.

Environment: Informatica Power Center 9.1, Oracle 10g, PL/SQL, Tableau

Responsibilities:

Built ad hoc reports in Tableau for data validation

Communicated with business customers to discuss requirements, issues, and feedback

Developed ETL components using Informatica to cater to business requirements

Used Informatica file-watch events to poll the FTP sites for the external mainframe files

Optimally used relational SQL wherever possible to minimize the data transfer over the network

Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations

Fixed invalid mappings and troubleshot technical problems with the database

Performed unit testing at various levels of the ETL and actively involved in team code reviews

Performance tuning was done at the functional level and map level

Effectively worked in Informatica version based environment and used deployment groups to migrate the objects

Pre and post session assignment variables were used to pass the variable values from one session to other

Designed workflows with many sessions with decision, assignment task, event wait, and event raise tasks, used Informatica scheduler to schedule jobs

Created shell scripts to trigger Informatica workflows

Effectively used Informatica parameter files for defining session, workflow, FTP connections and relational connections
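Informatica parameter files group values under headings that scope them to a workflow or session. A minimal sketch of the format; the folder, workflow, session, and connection names are hypothetical placeholders, not from any actual project:

```
[Global]
$PMSessionLogCount=5

[DWH_LOADS.WF:wf_daily_load.ST:s_m_load_customers]
$DBConnection_Source=ORA_SRC_DEV
$DBConnection_Target=ORA_TGT_DEV
$$LOAD_DATE=2018-05-17
```

Keeping connection names and load dates in a file like this lets the same mapping run unchanged across DEV, QA, and PROD by swapping the parameter file rather than editing the session.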

Reviewed and analyzed functional requirements, mapping documents, problem solving and trouble shooting

Production Support has been done to resolve the ongoing issues and troubleshoot the problems

Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements

GE Aero – DWH Sync

Jan 2012 – March 2012 Team Size: 6

Hyderabad

Informatica Consultant

GE Aero has a shared (Jaros - Noetix) DWH. About 400 reports run on the DWH; the source is Oracle ERP 11i. Noetix has trigger-based logic with a staging area to pump data into the DWH on a regular basis. For multiple reasons, some records don't flow through, and there was no standard reconciliation logic to check whether the data was up to date. We designed robust logic to validate the data and pump the delta records to the DWH. This required reverse-engineering the existing data model to find loopholes. The impact on report delivery was phenomenal, and many month-end book closings went through without any glitches. We automated the process and were also able to provide some data quality checks (e.g., a row is available but the data in a column varies). Made a million-dollar impact through timely delivery and a reliable solution.

Environment: Informatica Power Center, Repository Manager, Oracle 10g, PL/SQL.

Responsibilities:

Worked extensively with the ERP functional team to gather the data mappings for ETL specifications and document the ETL process design

Used Informatica Designer to create complex mappings using different transformations, mapplets, and reusable transformations to cleanse and move data to a data warehouse

Worked with the PowerCenter Designer tool to develop mappings to extract and load the data from stage to pre-load and later to the data warehouse

Responsible for monitoring all the sessions that were scheduled, running, completed, or failed, using the Workflow Monitor

Involved in debugging the mappings that failed, using the Debugger to validate the mappings and gain troubleshooting information about data and error conditions

Modified all the existing Informatica mappings by applying performance tuning techniques for better performance

Fixed bugs relating to daily load of data into Data warehouse for Business Intelligence reports

Extensive use of SQL Query, Joining tables, SQL functions in Queries, PL/SQL with the use of Cursors

Migrated code/objects from the development environment to the QA/testing environment to facilitate the testing of all objects developed and check their end-to-end consistency in the new environment

Performed Unit Testing and Integration Testing for both Full and Incremental Loads

Participated in weekly end-user meetings to discuss data quality, performance issues, ways to improve data accuracy, new requirements, etc.

GE Aero Kronos DWH Integration

Oct 2011 – Dec 2011

Hyderabad

Kronos is a non-ERP, third-party time-and-attendance (T&A) system. GE Aero selected workforce management solutions from Kronos Incorporated to control labor costs, minimize compliance risk, and improve workforce productivity. This data had to be merged with the existing DWH to track each project's health as well as employee performance.

Environment: Informatica Power Center Designer 9.1, Informatica Repository Manager, Oracle 10g, PL/SQL, Cognos 8

Responsibilities:

Organized multiple requirement discussion meetings with Business Users and Source system experts and formed functional and technical specs

Coordinated with business users, source system experts, and DBAs regarding business functionality definitions and ETL design

Developed ETL with Oracle PL/SQL procedures and UNIX shell scripts for history data migration

Using Informatica PowerCenter Designer, analyzed the source data to extract and transform it from various source systems (Oracle 10g, SQL Server, and flat files), incorporating business rules using the different objects and functions that the tool supports

Using Informatica Power Center created mappings and mapplets to transform the data according to the business rules

Wrote PL/SQL stored procedures, triggers, and cursors for implementing business rules and transformations

Monitored daily, weekly, and monthly data warehouse load processes and other DW-related processes

Effectively used Source Analyzer and Target Designer to extract and load data from the corresponding databases

Developed Mappings, Sessions, Workflows as per designed document by using client components of Informatica Power Center

Created ODBC connections to source and targets

Imported the metadata and relational data based on the source and target systems

Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement

Developed stored procedures; Cognos was used for reporting purposes

Constantly interacted with business users to discuss requirements

SolPro

Jul 2010 – Sep 2011 Team Size: 12

Hyderabad

Management Trainee

SolPro is a unifying procurement solution that helps capture and reconcile all spend in an organization. It also helps businesses successfully manage and leverage all spending categories. I was involved in all aspects of designing and developing a complete solution for automating the entire purchasing process, from purchase order creation on the front end, to decision-making and reporting tools that provide management control, to integration with back-office systems.

Environment: Cognos 8, Informatica 8.6.1, Oracle 10g, PL/SQL.

Responsibilities:

Involved in understanding requirements and analyzing new and current systems to quickly identify required sources and targets

Involved in preparing technical documentation, designing transformations consistent with the goals of the existing data warehouse, and working with the development team to implement the solution

Experience in designing, developing, and testing Informatica extract/transform/load processes in a Data Warehouse

Used the Workflow Manager to create Workflows, Worklets and Tasks

Wrote MINUS queries to make sure all data was loaded into the target tables
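The MINUS-query validation above finds source rows that never reached the target: source MINUS target should return zero rows. A runnable sketch using Python's built-in sqlite3, where the same set operator is spelled EXCEPT (Oracle's MINUS); the table names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dwh_orders (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO dwh_orders VALUES (1, 10.0), (2, 20.0);
""")
# Any row returned here is missing (or different) in the target
cur.execute("""
    SELECT order_id, amount FROM stg_orders
    EXCEPT
    SELECT order_id, amount FROM dwh_orders
""")
missing = cur.fetchall()
print(missing)  # -> [(3, 30.0)]: one row failed to load
```

Because the set difference compares whole rows, the same query also catches rows that loaded with altered column values, not just rows that are absent.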

Used Cognos to report out any mismatches

Used design and development techniques that resulted in efficient, maintainable, and high-quality ETL processes

Used problem-solving skills and quickly understood relationships between data with little documentation


