
ETL/Informatica Developer

Location:
Edison, NJ
Posted:
October 18, 2017

Contact this candidate

Resume:

AVINASH BABU GAJULA

Phone: 732-***-****

Email: *******@******.***

Professional Summary

* ***** ** ****-***********/ETL/Teradata experience spanning business requirements, analysis, design, development, data exchange, implementation, and testing of data warehouse/data mart systems.

Expert knowledge in using PowerCenter tools: Mapping Designer, Workflow Manager, Workflow Monitor, Repository Manager, and Admin Console.

Excellent knowledge of Data warehousing and Business Intelligence Concepts like Star Schema, Snowflake Schema and slowly changing dimensions, with emphasis on ETL implementations, making data readily available for reporting needs of end users.

Extensively worked on Data migration, Data cleansing, Data Staging of operational sources, CDC using ETL.

Experience in Performance tuning and Error Handling techniques and debugging ETL components using Informatica debugger.

Proficient in implementing complex business rules through transformations (Expression, Aggregator, Filter, Lookup, Router, Joiner, Update Strategy, Normalizer) and reusable components (transformations, mapplets, sessions, worklets) in Informatica.

Expertise in working with Teradata V2R12.x systems and Teradata SQL utilities.

Expert in data extraction and integration of heterogeneous data sources such as Oracle, DB2, Teradata, MS SQL Server, Salesforce.com, Siebel CRM, Mainframes, Web Services, and flat files.

Extensive knowledge of MongoDB concepts.

Excellent problem solving, communication, leadership, analytical and interpersonal skills working independently or as part of a team, highly effective at communicating with all levels of management and coworkers.

Experienced in SQL scripting, T-SQL, indexes, functions, procedures, triggers, and cursors.

Outstanding skills in analyzing business process and end-user needs, detail oriented and committed to delivering superior quality work.

Worked with SQL, PL/SQL procedures and functions, stored procedures and packages within the mappings.

Implemented PL/SQL requirements and rules in the ETL tool.

Experience in Performance Tuning of sources, targets, mappings and sessions.

Identified gaps and risks in the process and automated reports using VBA, resulting in 100% accuracy, time savings, and risk reduction.

Strong communication skills and excellent experience in working with small and big teams under stringent timelines.

Involved in unit level, integration and system testing.

Familiar with Waterfall and Agile methodologies and with all the stages of SDLC & STLC.

Certifications

Certified ISTQB Tester (Foundation level) from BCS (The Chartered Institute for IT)

Certification in ITIL v3 Foundation from EXIN in 2011

Technical Skills & Soft Skills

Data Warehousing : Informatica PowerCenter 9.x/8.x/7.x, Informatica PowerExchange 8.6.1

Databases : Oracle 11g/10g, Teradata R13/R12/V2R6, SQL Server, DB2

Languages : SQL, PL/SQL, Teradata SQL, Java, VBA

Teradata Tools & Utilities : SQL Assistant, Visual Explain, Index Wizard, Viewpoint

Testing Concepts : SDLC, STLC, Testing Levels, Testing Types

Scheduling Tools : Autosys, Control M

Automation Tools : QTP10.0

Tools : HP Quality Center, ALM

Hadoop/Big Data : HDFS, MapReduce, Pig, Hive, Sqoop, Oozie, Zookeeper

Microsoft Office : Word, Excel, PowerPoint, Access, SharePoint 07

Operating Systems : Windows XP/2000/Vista/7, UNIX

Achievements

Achieved "Just Do It" certification (using Six Sigma methodology) for developing and testing the Trend Analysis Reporting Tool, which saved 99% of the line of business's report-generation time with 100% accuracy.

Received Platinum award for being part of a Limca Book of Records national record for Toastmasters impromptu speech.

Received Gold award for stepping up to support the Technical Change Governance team in fixing a VBA issue when the team was at risk of missing delivery of required controls.

Received Silver award for completing automation for the Service Level Management team, which saved the process approximately 20 man-hours per month.

Recognized by the stateside management team with 2 Bronze awards and 3 Silver awards for providing seamless support and regular improvements and automation.

Academic Qualification

Bachelor of Technology from Jawaharlal Nehru Technological University.

Domain Expertise

Banking

Telecom

Insurance

Retail

Health care

ITIL Service Management

Quality Assurance (ISO, Six Sigma)

Work Experience

Client: PepsiCo (Texas) March 16 – present

ETL-Informatica Developer

Description: PepsiCo is a world leader in convenient snacks, foods, and beverages, with products found in nearly 200 countries around the globe. The project involved developing an operational data store (ODS) and data marts to support business intelligence needs, with Informatica as the ETL tool processing data from different sources into an Oracle data warehouse.

Responsibilities:

Involved in requirement analysis and ETL design and development for extracting data from source systems like Oracle and flat files and loading into the data mart.

Worked as an onsite coordinator managing the offshore team.

Developed complex mapping logic using various transformations like Expression, Lookup, Joiner, Filter, Sorter, Router, Update Strategy, Normalizer, and Sequence Generator.

Converted functional specifications into technical specifications (design of mapping documents).

Created complex Informatica mappings to implement Change Data Capture mechanism by using Type-2 effective date and time logic.

Extensively worked with Repository Manager, Designer, Workflow Manager and Workflow Monitor.

Worked on creating PL/SQL stored procedures, database triggers, and views, along with performance tuning and optimization.

Wrote SQL-Overrides and used filter conditions in source qualifier thereby improving the performance of the mapping.

Extensively used various functions such as LTRIM, RTRIM, ISNULL, IS_DATE, TO_DATE, DECODE, and IIF.

Effectively communicated problems and expected resolution times to business partners and team members.

Responsible for providing timely feedback and the necessary help/cooperation to meet client expectations.

Created UNIX scripts from scratch, as this was a new application.

Documented existing mappings as per standards and developed template for mapping specification document.

Prepared implementation plans for moving code to production, covering pre-implementation, post-implementation, and back-out plans.

Prepared rollout plans for knowledge transfer to the production support team and ongoing application support.

Performed unit and integration testing; reported and fixed bugs.
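The Type-2 effective-date Change Data Capture logic mentioned above can be sketched outside Informatica as well. The following is a minimal illustration using Python and SQLite; the dim_customer table and its columns are hypothetical, not from the actual project:

```python
import sqlite3

# Hypothetical dimension table illustrating Type-2 SCD with effective dates;
# table and column names are illustrative, not from the original project.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        eff_start   TEXT,
        eff_end     TEXT  -- NULL means this is the current version
    )
""")
conn.execute("INSERT INTO dim_customer VALUES (101, 'Edison', '2016-01-01', NULL)")

def apply_scd2_change(conn, customer_id, new_city, change_date):
    """Close the current row and open a new one (Type-2 effective-date logic)."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND eff_end IS NULL",
        (customer_id,))
    row = cur.fetchone()
    if row and row[0] != new_city:  # change detected (CDC)
        # Expire the current version as of the change date...
        conn.execute(
            "UPDATE dim_customer SET eff_end = ? "
            "WHERE customer_id = ? AND eff_end IS NULL",
            (change_date, customer_id))
        # ...and insert the new current version.
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL)",
            (customer_id, new_city, change_date))

apply_scd2_change(conn, 101, 'Plano', '2017-06-15')

rows = conn.execute(
    "SELECT city, eff_start, eff_end FROM dim_customer ORDER BY eff_start"
).fetchall()
# rows -> [('Edison', '2016-01-01', '2017-06-15'), ('Plano', '2017-06-15', None)]
```

Both the old and new versions of the row survive, so history is preserved and point-in-time queries remain possible.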

Client: Bank of America (New Jersey) Aug 13 – March 16

ETL-Informatica Developer

Description: GRIFFIN was initiated as a solution to the existing EMEA regulatory reporting workflow, flexible enough to efficiently and effectively accommodate further entities and likely future amendments to the rules. It aims to deliver an efficient and robust regulatory reporting system that enriches data at the lowest required level of granularity and ensures positions/transactions can be reconciled back to the General Ledger.

Responsibilities:

Understood and analyzed requirement documents and explained them to the team whenever necessary.

Designed and created detailed technical mapping documents with information on the implementation of business logic.

Wrote several Teradata SQL Queries using Teradata SQL Assistant for Ad Hoc Data Pull request.

Prepared unit and integration test cases.

Worked on Informatica pushdown optimization against Teradata.

Familiar with Embedded and Normalized data models in MongoDB.

Extracted data from various source systems like Oracle and flat files as per the requirements and loaded it to Teradata.

Designed Informatica mappings using basic transformations like Filter, Router, Source Qualifier, and Lookup, and advanced transformations like Aggregator, XML Source Qualifier, and Sorter.

Built reusable components, including reusable expressions, mapplets, sessions, and workflows, to facilitate flexible portfolios.

Created workflows and set session dependencies.

Wrote services to store and retrieve user data from MongoDB for the application on devices.

Worked extensively with aggregate functions like Min, Max, First, Last, and Count in the Aggregator Transformation.

Extensively used SQL Override, Sorter, and Filter in the Source Qualifier Transformation.

Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner Transformation.

Worked with various re-usable tasks, workflows, mapplets, and re-usable transformations.

Worked with Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Designer.

Performed unit and integration testing; reported and fixed bugs.

Used session logs, the Informatica Debugger, and performance logs for error handling when workflow and session failures occurred.

Worked with a third party to automate job processing using the Autosys scheduler, establishing automatic email notifications to the concerned persons by creating email tasks in Workflow Manager.

Facilitated QA testing and created the high-level QA test document.

Acted as an onsite coordinator, working with teams in different locations.
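The embedded and normalized MongoDB data models referenced above trade off read convenience against independent updates. This is an illustrative sketch using plain Python dicts to stand in for BSON documents; the trade/leg document shapes are hypothetical, not from the GRIFFIN project:

```python
# Embedded model: child data lives inside the parent document, so a single
# read returns everything -- a good fit when children are always fetched
# together with the parent.
trade_embedded = {
    "_id": "T-1001",
    "desk": "EMEA-Rates",
    "legs": [
        {"leg_no": 1, "notional": 5_000_000, "ccy": "EUR"},
        {"leg_no": 2, "notional": 5_000_000, "ccy": "USD"},
    ],
}

# Normalized model: children are separate documents holding a reference to
# the parent, resolved with a second query (or an aggregation $lookup) --
# a good fit when children are large, shared, or updated independently.
trade_normalized = {"_id": "T-1001", "desk": "EMEA-Rates"}
legs = [
    {"_id": "L-1", "trade_id": "T-1001", "leg_no": 1, "notional": 5_000_000, "ccy": "EUR"},
    {"_id": "L-2", "trade_id": "T-1001", "leg_no": 2, "notional": 5_000_000, "ccy": "USD"},
]

def resolve_legs(trade, leg_collection):
    """Application-side 'join' for the normalized model."""
    return [leg for leg in leg_collection if leg["trade_id"] == trade["_id"]]

resolved = resolve_legs(trade_normalized, legs)
# Both models expose the same two legs; they differ only in document layout.
```

The embedded form answers a whole-trade read with one document fetch; the normalized form keeps leg documents small and independently updatable at the cost of the extra lookup.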

Client: Bank of America (New Jersey) March 11 – Aug 13

ETL-Informatica Developer

Description: Managed Assets is one of the biggest groups in Merrill Lynch, generating 25% of GWM revenue. The group wants the data for all asset classes in flat file format. The strategic service, DATASOA (TIBCO BusinessWorks), already cleanses and transforms the data; hence it was decided to build reusable ETL code that consumes messages from DATASOA and supplies the data to various clients as flat files. The design has been very flexible and currently caters to 5 different clients.

Responsibilities:

As an ETL developer, gathered business requirements and implemented business needs.

Designed Informatica mappings using basic transformations like Filter, Router, Source Qualifier, and Lookup, and advanced transformations like Aggregator, Normalizer, XML Source Qualifier, and Sorter.

Created workflows and set session dependencies.

Worked on procedures and functions.

Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another.

Developed UNIX shell scripts to perform file cleansing, file archiving, parameter file creation, data exchange, etc.

Tuning the transformations, mappings and sessions for performance enhancements.

Worked on coding and fine-tuning SQL scripts and PL/SQL stored procedures.

Migrated data from MySQL to SQL Server and MongoDB.

Developed workflows using the Task Developer, Worklet Designer, and Workflow Designer in Workflow Manager, and monitored the results using Workflow Monitor.

Developed mappings/sessions using Informatica Power Center for data loading.

Extensively developed ETL programs supporting data extraction, transformation, and loading using Informatica PowerCenter.

Adept at mapping source-to-target data using IBM DataStage 8.x.

Prepared implementation plans for moving code to production, covering pre-implementation, post-implementation, and back-out plans.

Prepared rollout plans for knowledge transfer to the production support team and ongoing application support.

Performed unit and integration testing; reported and fixed bugs.
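The file archiving and parameter-file steps that those shell scripts handled can be sketched as follows. This Python version is illustrative only; the directory layout, workflow/session names, and parameter names are assumptions, not the project's actual values:

```python
import shutil
from datetime import datetime
from pathlib import Path

def archive_and_build_params(inbound: Path, archive_dir: Path, param_file: Path):
    """Move a processed source file into a timestamped archive and emit a
    parameter file for the next Informatica session run (assumed format)."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    archived = archive_dir / f"{inbound.stem}_{stamp}{inbound.suffix}"
    shutil.move(str(inbound), str(archived))  # archive the consumed file

    # Minimal Informatica-style parameter file (section header + $$ variables);
    # the folder/workflow/session names here are hypothetical.
    param_file.write_text(
        "[DataSOA.WF:wf_asset_extract.ST:s_load_assets]\n"
        f"$$SRC_FILE={archived.name}\n"
        f"$$LOAD_DATE={stamp[:8]}\n"
    )
    return archived

# Example run in a scratch directory:
work = Path("scratch_dir")
work.mkdir(exist_ok=True)
src = work / "assets.dat"
src.write_text("ACCT|CLASS|VALUE\n")
out = archive_and_build_params(src, work / "archive", work / "session.param")
```

Timestamping the archived file name keeps every delivered extract recoverable, and regenerating the parameter file each run lets the session pick up the correct source file and load date without manual edits.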

Client: Cisco (Charlotte, NC) July 10 – Mar 11

ETL-Informatica Developer

Description: Cisco is the worldwide leader in networking that transforms how people connect, communicate, and collaborate. At Cisco, customers come first, and an integral part of its DNA is creating long-lasting customer partnerships, working with customers to identify their needs and provide solutions that support their success.

To support this objective, an application called Kinaxis brings in statistics on existing demand and planning. Data needs to be fetched from different source systems and loaded into the target database for Kinaxis.

Responsibilities:

Designed mappings to populate data from different sources into the Oracle database.

Tested and debugged data loading for full and incremental loads.

Optimizing/Tuning mappings for better performance and efficiency.

Worked with various Informatica transformation types like Lookup, Update Strategy, Stored Procedure, Joiner, Filter, Aggregator, Rank, Router, etc.

Involved in writing the test cases for Unit Testing.

Reviewed documentation and code created by other team members.

Responsible for the high-level technical design and the detailed ETL design.

Defined data movement standards for the data warehouse across different applications.

Resolved issues in transformations and mappings during development, data exchange, and bug fixing.

Created change requests, documented additional requirements against them, and validated test scripts.

Responsible for facilitating business users during integration testing and UAT.

Worked on issues identified during post-production issue-tracking meetings with business users, monitoring and tracking those issues to closure.

Participated in project status meetings and provided up-to-date tracking status.
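The full vs. incremental load distinction tested above can be illustrated with a timestamp high-water mark: a full load copies everything, while an incremental load pulls only rows changed since the last successful run. The following sketch uses Python and SQLite with hypothetical table and column names:

```python
import sqlite3

# Hypothetical source and target tables; an incremental load copies only
# rows newer than the previous run's high-water mark (timestamp-based CDC).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, updated_at TEXT);
    CREATE TABLE tgt_orders (order_id INTEGER, updated_at TEXT);
    INSERT INTO src_orders VALUES
        (1, '2010-07-01'), (2, '2010-07-15'), (3, '2010-08-02');
""")

def incremental_load(conn, watermark):
    """Copy only rows newer than the last successful load."""
    conn.execute(
        "INSERT INTO tgt_orders "
        "SELECT order_id, updated_at FROM src_orders WHERE updated_at > ?",
        (watermark,))
    # The new watermark is the max timestamp now present in the target.
    return conn.execute("SELECT MAX(updated_at) FROM tgt_orders").fetchone()[0]

wm = incremental_load(conn, '2010-07-10')  # run with last watermark 2010-07-10
loaded = conn.execute(
    "SELECT order_id FROM tgt_orders ORDER BY order_id").fetchall()
# loaded -> [(2,), (3,)]; wm -> '2010-08-02'
```

Persisting the returned watermark between runs (e.g., in a control table or parameter file) is what keeps each incremental run from reprocessing already-loaded rows.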

Client: HDFC Insurance (Hyderabad, India) May 09 – July10

ETL Consultant

Description: HDFC is an insurance provider focused on life, home, and auto insurance; it has grown to provide insurance to more than 700,000 policyholders living across India.

Responsibilities:

Worked with various Informatica client tools such as Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Workflow Manager.

Created mappings using different transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator etc.

Implemented weekly error tracking and correction process using Informatica.

Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.

Involved in unit testing and user acceptance testing to verify that data extracted from different source systems and loaded into the target was accurate per user requirements.

Developed DDLs, DMLs, PL/SQL stored procedures, and indexes for ETL operations on Oracle and SQL Server databases.

Involved in developing test data/cases to verify accuracy and completeness of Data Exchange and ETL process.

Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.


