
ETL Lead Consultant

Location:
Dallas, TX
Salary:
$75/hour
Posted:
June 09, 2022


Bhanu Sheetal Kodam

E: adq8vz@r.postjobfree.com

P: +1-469-***-****

L: Dallas, Texas

Professional Summary:

11 years of experience in the analysis, design, and development of ETL methodologies and technologies.

6+ Years of Experience in Informatica Data Quality and Agile Methodology.

Well versed in designing and developing the extraction, transformation, and loading of structured data into a data warehouse.

4.5 Years of Experience in IICS Informatica Cloud.

Created IICS connections using various cloud connectors in the IICS Administrator.

Designed, developed, and implemented ETL processes using IICS Data Integration.

Experience working with Data Quality Assistant and PowerCenter integration.

Experience importing/exporting objects and projects using both Basic and Advanced methods.

Data quality scorecarding: building scorecards in the Analyst Tool.

Hands-on experience with data profiling, standardization, matching, and consolidation techniques.

Experience creating reference tables in the Analyst Tool, designing and developing mapplets, validating addresses, and identifying duplicate records.

Experience in Data Extraction, Data Profiling, Data Cleansing, Data Standardization, and Data Deduplication using Informatica Data Quality.

Extensively worked with Mapping Parameters and Session Parameters.

Developed Slowly Changing Dimension mappings of Type 1 and Type 2.

Worked with different data sources such as flat files and Oracle database definitions.

Knowledge of UNIX shell scripting and SQL coding.

Proficient in using PMCMD (Informatica command line utility) to create, schedule and control workflows, tasks, and sessions.

Involved in Code Migration.

Implemented Error Handling Strategies in all dimension loads.

Optimized Informatica mappings and sessions to improve performance.

Analytical and technical aptitude with the ability to solve complex problems; work well both in a team and independently under minimal supervision.

Worked on Data Profiling using Informatica Analyst Tool.

Testing using the PowerCenter Data Validation tool.

Experience with the Zena job scheduling tool.
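The pmcmd usage noted in the summary can be sketched as follows. This is a minimal illustration of assembling a `pmcmd startworkflow` command line from a script; the service, domain, folder, and workflow names are hypothetical placeholders, not values from any actual project, and in practice the repository password would be supplied via a password environment variable rather than on the command line.

```python
# Sketch: build the argument list for `pmcmd startworkflow`.
# All names below are hypothetical placeholders.

def build_pmcmd_start(service, domain, user, folder, workflow, param_file=None):
    """Assemble the command line to start an Informatica workflow via pmcmd."""
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", service,     # Integration Service name
        "-d", domain,       # Informatica domain name
        "-u", user,         # repository user (password via env var in practice)
        "-f", folder,       # repository folder containing the workflow
        "-wait",            # block until the workflow completes
    ]
    if param_file:
        cmd += ["-paramfile", param_file]
    cmd.append(workflow)    # the workflow name comes last
    return cmd

cmd = build_pmcmd_start("IS_DEV", "Domain_Dev", "etl_user",
                        "CUST_DQ", "wf_load_stg", "/params/daily.parm")
print(" ".join(cmd))
```

The same builder can target DEV, SIT, UAT, or PROD by swapping the service and domain arguments, which is typically driven from an environment-specific wrapper script.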

Technical Skills:

ETL/Data Quality Tools: Informatica PowerCenter 9.5/9.6/10.1/10.2, Informatica IDQ, IICS

Databases: SQL Server 2012, Oracle 11g

Scheduling Tools: Zena job scheduling tool

Operating Systems: Windows 2000/XP

Other Tools: Toad, SQL Developer, PowerCenter Data Validation tool

Code Migration Tools: Jenkins

Education: Bachelor of Technology in Information Technology from Jyothishmathi College of Engineering & Technology, JNTU Hyderabad 2010.

Achievements: Won the APEX award from Accenture as a trendsetter at work, using the latest technology to deliver high performance.

Professional Experience:

Engaged with Ekodus Inc for Cox Communications (client), Oct’2020 to present.

Worked with Genpact, Hyderabad from Sep’2017 to Sep’2020

Worked with Tech Mahindra, Hyderabad from Nov’2015 to Aug’2017.

Worked with Accenture, Hyderabad from Mar’2014 to Nov’2015.

Worked with IBM India Pvt Ltd from May’2011 to Feb’2014.

Project Summary:

Ekodus Inc., Remote (Dallas, Texas)

Client: Cox Communications Oct’20 – Present

Cox Communications is a data services holding company based in Atlanta. We capture customer data and process customer accounts through to MDM.

Roles and Responsibilities:

Working with business teams and production teams on requirements gathering and project coordination.

Analyzing business requirement documents and working on Jira and ALM tickets for Informatica code changes and bug fixes to load correct business LOB data into STG tables.

Worked on integrating various heterogeneous source systems such as Oracle, Teradata, XML files, and HDFS flat files into target databases (Oracle and Teradata).

Wrote a Python script to download files from the partner system and checked the volume counts between the data files and control files.

Wrote shell scripts to extract files from the Hadoop Distributed File System (HDFS) and handled delimiter changes to import the files into the Informatica source directory.

Extracted the source data, loaded it into metadata tables, and generated XML files to import the mappings into Informatica PowerCenter.

Designed and developed simple and complex ETL Informatica Data Quality (IDQ) mappings using various transformations (Decision, Merge, Match, Address Validator, Parser, Labeler, Standardizer, Case Converter, Comparison, Duplicate Record Exception, Key Generator, etc.).

Deployed the Data Quality mappings into the PowerCenter Designer client.


Designed and developed simple and complex ETL Informatica PowerCenter mappings and workflows using various transformations and tasks; loaded data into STG and BASE tables.

Created Informatica mappings with SCD Type 1 and Type 2 methodology to track historical data.

Managed the metadata associated with the ETL processes used to populate the data warehouse.

Improved session performance by identifying bottlenecks at the Source Qualifier and transformation level, using the Debugger to debug the Informatica mappings and fix them.

Used pmcmd commands to run Informatica jobs in the DEV, SIT, UAT, and PROD environments.

Created parameter files and Autosys jobs to run the pre-processing scripts and Informatica workflows dynamically for daily and monthly loads.
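The file-landing checks described above (comparing a data file's record count against its control file, and normalizing the field delimiter before dropping the file into the Informatica source directory) can be sketched in plain Python. The file layouts here, a bare record count in the control file and Ctrl-A delimited data, are assumptions for illustration, not the project's actual formats.

```python
# Sketch of two landing-zone checks, under assumed file layouts:
#  1. volume check: data-file record count must match the control-file count
#  2. re-delimiting: convert Ctrl-A delimited rows to pipe-delimited rows
import csv
import io

def volume_matches(data_lines, expected_count):
    """True when the data file's record count equals the control-file count."""
    return len(data_lines) == expected_count

def redelimit(text, src_delim="\x01", dst_delim="|"):
    """Rewrite rows from one single-character delimiter to another."""
    out = io.StringIO()
    writer = csv.writer(out, delimiter=dst_delim, lineterminator="\n")
    for row in csv.reader(io.StringIO(text), delimiter=src_delim):
        writer.writerow(row)
    return out.getvalue()

raw = "1\x01Alice\n2\x01Bob\n"
if volume_matches(raw.splitlines(), 2):       # count from the control file
    print(redelimit(raw), end="")             # 1|Alice / 2|Bob, pipe-delimited
```

Using the `csv` module rather than a plain `str.replace` keeps embedded delimiter characters inside fields from corrupting the row structure.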

Technologies: Informatica PowerCenter 10.1/10.2, SQL Server, Informatica Analyst, and Informatica Data Quality (IDQ).

Genpact, Hyderabad IN

Client: Cox Communications Sep’17 – Sep’20

Cox Communications is a data services holding company based in Atlanta. We capture customer data and process customer accounts through to MDM.

Roles and Responsibilities:

Mainly involved in ETL development.

Worked on Data Profiling using Informatica Analyst Tool.

Created Profile, Scorecard, Rules and DQ Mappings.

Created test cases using the PowerCenter Data Validation Option.

Extracted, scrubbed, and transformed data from flat files and Oracle, then loaded it into an Oracle database using Informatica.

Identified different patterns in the data and prepared sample templates.

Created DQ rules to cleanse and standardize the data.

Prepared name parsing templates for name parsing.

Developed several complex Informatica mappings using transformations like Lookup, Router, Update Strategy, Aggregator, Filter, Joiner, and Sorter to incorporate business rules in transformation.

Implemented Error Handling and validation rules.

Performance tuned the mappings by optimizing the Transformations, Informatica functions, filtering source data at the source qualifier.

Created reusable Transformations and Mapplets to use in multiple mappings.

Extensive knowledge on Target Load Order Plan for loading data correctly into different Target Tables.

Created Sessions and configured Workflows in the Informatica Workflow Manager.

Extensively involved in tuning the Mapping, Sessions and Stored Procedures to obtain optimal performance.

Extensively involved in Unit Testing.
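A cleansing/standardization rule of the kind described above can be sketched in plain Python. The real rules were built as IDQ mappings; the phone-number format and name-parsing convention below are assumed examples for illustration, not the project's actual DQ rules.

```python
# Sketch of two assumed DQ standardization rules:
#  - normalize US phone numbers to XXX-XXX-XXXX (or flag as an exception)
#  - parse "Last, First" or "First Last" into standardized name parts
import re

def standardize_phone(raw):
    """Strip punctuation and format a 10-digit US number as XXX-XXX-XXXX."""
    digits = re.sub(r"\D", "", raw)
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]          # drop the country code
    if len(digits) != 10:
        return None                  # route to the exception/bad-record flow
    return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}"

def parse_name(full):
    """Naive name parser: handles 'Last, First' and 'First Last' shapes."""
    if "," in full:
        last, first = [p.strip() for p in full.split(",", 1)]
    else:
        parts = full.strip().split()
        first, last = parts[0], parts[-1]
    return {"first": first.title(), "last": last.title()}

print(standardize_phone("(469) 555 0100"))   # 469-555-0100
print(parse_name("KODAM, bhanu"))            # {'first': 'Bhanu', 'last': 'Kodam'}
```

In IDQ the same logic would typically live in Standardizer and Parser transformations, with the `None` branch feeding a Duplicate Record Exception or bad-records table.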

Technologies: Informatica PowerCenter 10.1/10.2, SQL Server, Informatica Analyst, and Informatica Data Quality (IDQ).

Tech Mahindra, Hyderabad IN

Client: Selective Insurance Nov’15-Aug’17

Selective Insurance Group, Inc. is a regional insurance holding company based in Branchville, New Jersey, and provides property and casualty insurance products and insurance services to customers in the United States through its subsidiaries. Selective provides insurance, alternative risk management products and related services to businesses and individuals, and administers flood insurance policies for the National Flood Insurance Program. The Insurance Operations segment writes commercial lines and personal lines property and casualty insurance through independent insurance agents in 22 states in the Eastern and Midwestern regions of the United States.

Roles and Responsibilities:

Mainly involved in ETL development.

Worked on Data Profiling using Informatica Analyst Tool.

Created Profile, Scorecard, Rules and DQ Mappings.

Created test cases using the PowerCenter Data Validation Option.

Extracted, scrubbed, and transformed data from flat files and Oracle, then loaded it into an Oracle database using Informatica.

Identified different patterns in the data and prepared sample templates.

Created DQ rules to cleanse and standardize the data.

Prepared name parsing templates for name parsing.

Developed several complex Informatica mappings using transformations like Lookup, Router, Update Strategy, Aggregator, Filter, Joiner, and Sorter to incorporate business rules in transformation.

Implemented Error Handling and validation rules.

Performance tuned the mappings by optimizing the Transformations, Informatica functions, filtering source data at the source qualifier.

Created reusable Transformations and Mapplets to use in multiple mappings.

Extensive Knowledge on Target Load Order Plan for loading data correctly into different Target Tables.

Created Sessions and configured Workflows in the Informatica Workflow Manager.

Extensively involved in tuning the Mapping, Sessions and Stored Procedures to obtain optimal performance.

Extensively involved in Unit Testing.

Technologies: Informatica PowerCenter 9.1/10.2, SQL Server, and Informatica Analyst.

Accenture, Hyderabad IN

Client: Level3 Communications Mar’14-Nov’15

Level 3 Communications is a communications company that provides bandwidth, Internet, and IPBUS services to clients across North America. Level 3 also acquired many smaller broadband companies. It collects orders from different clients and provides bandwidth and other services in America.

Roles and Responsibilities:

Mainly involved in ETL development.

Extracted, scrubbed, and transformed data from flat files and Oracle, then loaded it into an Oracle database using Informatica.

Created source and target definitions in the repository using Informatica Source Analyzer, Warehouse Designer, Transformation Developer, and Mapping Designer.

Developed several complex Informatica mappings using transformations like Lookup, Router, Update Strategy, Aggregator, Filter, Joiner, and Sorter to incorporate business rules in transformation.

Implemented Error Handling and validation rules.

Performance tuned the mappings by optimizing the Transformations, Informatica functions, filtering source data at the source qualifier.

Created reusable Transformations and Mapplets to use in multiple mappings.

Extensive Knowledge on Target Load Order Plan for loading data correctly into different Target Tables.

Created Sessions and configured Workflows in the Informatica Workflow Manager.

Extensively involved in tuning the Mapping, Sessions and Stored Procedures to obtain optimal performance.

Extensively involved in Unit Testing.

Technologies: Informatica PowerCenter 9.1, Oracle 11g, and UNIX.

IBM, Hyderabad IN

Client: Bank of America Feb’12-Feb’14

Bank of America Merrill Lynch is a bank formed when Bank of America acquired Merrill Lynch. All customer information is stored in a data warehouse, the Atomic Data Repository. Data arrives from about 12 regions and is processed on a daily basis. Type 2 mappings are used to store the history of customer data.

Roles and Responsibilities:

Mainly involved in ETL development.

Extracted, scrubbed, and transformed data from flat files and Oracle, then loaded it into an Oracle database using Informatica.

Developed several complex Informatica mappings using transformations like Lookup, Router, Update Strategy, Aggregator, Filter, Joiner, and Sorter to incorporate business rules in transformation.

Implemented Error Handling and validation rules.

Performance tuned the mappings by optimizing the Transformations, Informatica functions, filtering source data at the source qualifier.

Created reusable Transformations and Mapplets to use in multiple mappings.

Extensive Knowledge on Target Load Order Plan for loading data correctly into different Target Tables.

Created Sessions and configured Workflows in the Informatica Workflow Manager.

Extensively involved in tuning the Mapping, Sessions and Stored Procedures to obtain optimal performance.

Extensively involved in Unit Testing.

Technologies: Informatica PowerCenter 9.1, Oracle 11g, and UNIX.

IBM, Hyderabad IN

Client: CISCO May’11-Feb’12

Cisco is a manufacturer of VoIP phones with customers across the world. Cisco takes orders for VoIP phones from its customers and delivers them within the agreed period; the invoiced number of phones, services provided, warranty, etc. are all captured in the data warehouse.

Roles and Responsibilities:

Mainly involved in ETL development.

Extracted, scrubbed, and transformed data from flat files and Oracle, then loaded it into an Oracle database using Informatica.

Created source and target definitions in the repository using Informatica Source Analyzer, Warehouse Designer, Transformation Developer, and Mapping Designer.

Developed several complex Informatica mappings using transformations like Lookup, Router, Update Strategy, Aggregator, Filter, Joiner, and Sorter to incorporate business rules in transformation.

Implemented Error Handling and validation rules.

Performance tuned the mappings by optimizing the Transformations, Informatica functions, filtering source data at the source qualifier.

Created reusable Transformations and Mapplets to use in multiple mappings.

Extensive Knowledge on Target Load Order Plan for loading data correctly into different Target Tables.

Created Sessions and configured Workflows in the Informatica Workflow Manager.

Extensively involved in tuning the Mapping, Sessions and Stored Procedures to obtain optimal performance.

Extensively involved in Unit Testing.

Developed Informatica mappings and mapplets by using various transformations like Lookup, Joiner, Expression, Aggregator, Update Strategy, etc.

Worked with Mapping Parameters and session parameters to pass the values dynamically.

Implemented Type 2 Slowly Changing Dimensions with effective date ranges.

Implemented Error handling strategy in mappings.

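The Type 2 SCD pattern with effective date ranges mentioned above can be sketched in plain Python. The actual implementation used Informatica mappings; the column names here (eff_start, eff_end) and the high-date sentinel are illustrative conventions, not the project's actual schema.

```python
# Sketch of a Type 2 SCD upsert against an in-memory dimension:
# expire the current row and insert a new version when attributes change,
# insert a first version for keys that are new to the dimension.
from datetime import date

HIGH_DATE = date(9999, 12, 31)   # sentinel end date for the current version

def scd2_apply(dim, key, attrs, load_date):
    """Apply one incoming record to the dimension with Type 2 history."""
    current = [r for r in dim
               if r["key"] == key and r["eff_end"] == HIGH_DATE]
    if current and current[0]["attrs"] == attrs:
        return dim                            # no change: keep the current row
    if current:
        current[0]["eff_end"] = load_date     # expire the old version
    dim.append({"key": key, "attrs": attrs,
                "eff_start": load_date, "eff_end": HIGH_DATE})
    return dim

dim = []
scd2_apply(dim, 101, {"city": "Atlanta"}, date(2011, 6, 1))
scd2_apply(dim, 101, {"city": "Dallas"}, date(2012, 1, 15))
print(len(dim))            # 2 versions kept for key 101
print(dim[0]["eff_end"])   # 2012-01-15: the old version was expired
```

In PowerCenter the same flow is typically a Lookup on the current row, an Update Strategy marking the expiry as an update and the new version as an insert, and an Expression supplying the effective dates.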

Technologies: Informatica PowerCenter 8.6 and Oracle 10g.
