Abhishek Kotagiri
Senior Informatica Developer
Cell: +1-408-***-****
Email: ********.******@*****.***
Objective:
Certified Informatica Developer with 8+ years of experience in requirement analysis, development,
testing, support, performance tuning, and deployment of code to QA and Production environments.
Experienced in the health insurance, finance, pharmaceutical and banking industries. Actively seeking a
position as an Informatica Developer in a fast-growing company.
Technical Skills:
ETL&BI Tools : Informatica PowerCenter 9.6/9.5.1/9.1.0, SSIS and Cognos.
Databases : SQL Server 2008, DB2, Oracle 11gR2/10g/9i, Teradata V2R5, Netezza.
Development Tools : Toad, SQL Developer, SSMS, IBM Data Studio
Operating Systems : Windows 2007/2003, Windows XP, UNIX, Mainframe
Programming/Languages : SQL, PL/SQL, T-SQL, UNIX shell scripting, Perl Scripting
Scheduling Tools : Control M, Tidal and Autosys.
Methodologies : Logical/Physical/Dimensional data modeling, ER modeling, Star/Snowflake schemas, ETL, OLAP, complete software development life cycle.
Professional Summary:
Expertise in the software development life cycle and in designing and implementing solutions for data
warehouses and data marts using Extraction, Transformation and Loading (ETL) with Star and
Snowflake schemas.
Experienced in ETL processes using Informatica PowerCenter tools such as Repository Manager,
Designer, Workflow Manager and Workflow Monitor to design data conversions from a variety of
relational, XML and flat file source systems to data warehouses/data marts.
Experience in implementing business rules by creating transformations (Expression, Aggregator, Source Qualifier, connected and unconnected Lookup, Router, Rank, Update Strategy, Normalizer) and by developing mapplets, worklets and mappings.
Experience in Performance tuning mappings, SQL Optimization, identifying and resolving performance bottlenecks at various levels like sources, targets, mappings, and sessions.
Experience in using Informatica tools, SQL Scripts, Stored Procedures and execution of test plans for loading data successfully into the target Systems.
Experience in developing SQL and PL/SQL scripts for relational databases such as Oracle, Microsoft SQL Server and IBM DB2. Experience in UNIX programming and shell scripting for Informatica pre- and post-session operations and database administration activities.
Worked on change data capture/control for insert and update operations.
Performed research and development on running existing jobs in Informatica Cloud.
Extensive interaction with business users, IT Managers in analyzing business requirements and translating requirements into functional and technical specifications.
24x7 Production Support for business continuity.
Knowledge of FACETS healthcare application and HIPAA X12 274 file format.
Experienced in claims data processing: enrolling, processing, declining, accepting, and dispatching claims, and converting ICD-9 codes to ICD-10 through lookups/crosswalks.
Experience in preparing documentation like HLD, LLD, TTD and Test case documentation.
Good Team Player with excellent communication, analytical and documentation skills.
Willingness to learn new concepts, technologies and ability to articulate alternative solutions.
Professional Experience:
Client : Anthem Inc. Norfolk, Virginia.
Duration : Feb 2016 to date.
Role : Senior Informatica/SQL Developer
Project : CA 274 Provider Healthcare directory for California State DHCS
Environments : Informatica PowerCenter 9.5.1, SQL Server, IBM DB2 on Mainframe, SSIS, SQL, PL/SQL, T-SQL, TIDAL for scheduling and JIRA for defect/issue tracking.
Description:
Anthem Inc. is a health insurance company providing health insurance provider, claims and enterprise reporting to business owners across the US. The CA 274 Provider Healthcare Directory project delivers provider data files to the business for 29 counties in California State DHCS on a monthly basis. The project also provides CA state Medicaid and MMP information to the DHCS.
Worked with the business on the requirements, understanding them thoroughly to ensure the complete project outcome.
Analyzed the business requirements, technical specification and physical data models for ETL mapping and process flow.
Implemented the business rules, extracted provider, vendor and claims data from various sources such as SQL Server (CA Secure) and EPDS R6 (DB2 on Mainframe), and loaded the required provider and claims data into flat files as per the specifications.
Applied the change data capture on the incoming source data.
Converted ICD-9 codes to ICD-10 through crosswalks.
Worked on the FACETS application and created the jobs between FACETS and the ODS data mart.
Coded SQL scripts and T-SQL scripts based on the business requirement and tested the data set.
Implemented transformations including, but not limited to, Expression, connected/unconnected Lookup, Router, Sorter, Joiner, Union, Transaction Control, Aggregator, Filter, Stored Procedure and Sequence Generator.
Created Workflows, tasks, database connections, FTP connections using workflow manager.
Used different tasks (Session, Assignment, Command, Decision, Email, Event-Raise, Event-Wait, Control).
Executed sessions, sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.
Implemented and documented all the best practices used for the data warehouse.
Improved ETL performance through indexing and caching.
Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
Used third party scheduling tool TIDAL for scheduling the jobs.
Interacted daily with the End-to-End (E2E) team to define project requirements based on an understanding of business needs, and assisted them in analysis sessions.
Coordinated tasks with the Enterprise Reporting team on a daily basis during the project life cycle.
Coordinated with architects, advisors, managers, DBAs, Informatica administrators and other developers during the project life cycle.
Reviewed code developed by team developers and provided input on fixes.
Prepared design documents and provided test data characteristics.
Client : Affinion Group, Westerville, Ohio.
Duration : July 2015 to Jan 2016.
Role : Senior Informatica Developer/ Production Support
Project : Affinion Helix- Steady state ( MasterCard, Rainbow, ADVO card)
Environments : Informatica PowerCenter 9.1.0, Oracle, SQL Server, UNIX, SQL, PL/SQL, TIDAL for scheduling (file watcher) and HP Quality Center for issue tracking.
Description:
Affinion Group is the global leader in designing, marketing and servicing comprehensive customer engagement and loyalty solutions that enhance and extend the relationships of millions of consumers with many of the largest and most respected companies in the world. With a global reach, Affinion Group has offices throughout the United States, the United Kingdom, Europe, South America and South Africa.
Responsible for design and development of rating data mart for financial Data Warehouse.
Analyzed business requirements, technical specifications, source repositories and physical data models for ETL mapping and process flow.
Developed mapping to load Fact and Dimension tables, for type 1 and type 2 dimensions and incremental loading and unit tested the mappings.
Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
Executed sessions, sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.
Prepared Functional Specifications, System Architecture/Design, Implementation Strategy, Test Plans and Test Cases.
Implemented and documented all the best practices used for the data warehouse.
Improved ETL performance through indexing and caching.
Created Workflows, tasks, database connections, FTP connections using workflow manager.
Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations and fixing bugs.
Used third party scheduling tool TIDAL for scheduling the jobs.
Created UNIX shell scripting for automation of ETL processes.
Interacted daily with the End-to-End (E2E) team to define project requirements based on an understanding of business needs, and assisted them in analysis sessions.
Coordinated tasks between offshore and onsite teams on a daily basis during the project life cycle.
Reviewed code developed by the offshore team and provided input on fixes.
Prepared design documents and provided test data characteristics.
Played a key role in resolving ETL issues faced during System Integration Testing (SIT).
Client : Fannie Mae, Herndon, VA
Duration : Dec 2013 to July 2015.
Role : Senior Informatica Developer
Project : Enterprise Data mart
Environment : Informatica PowerCenter 9.5.1/9.1.0, Oracle, SQL Server, UNIX, SQL, PL/SQL, T-SQL and TIDAL for scheduling.
Description:
Loan Accounting Initiative (LAI) was implemented at Fannie Mae to handle the end-to-end loan accounting system. It was implemented inside the existing Sub Ledger Systems (SLS), which breaks accounting events into individual debits and credits, maintains sub-ledger balances and integrates with the General Ledger.
Responsibilities:
Involved in design, development and maintenance of database for Data warehouse project.
Involved in Business Users Meetings to understand their requirements.
Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration with Informatica 8.x
Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
Wrote PL/SQL and T-SQL scripts for complex stored procedure requirements.
Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
Worked extensively with the connected lookup Transformations using dynamic cache.
Worked with complex mappings having an average of 15 transformations.
Created and scheduled sessions and jobs to run on demand, on schedule, or only once.
Monitored Workflows and Sessions using Workflow Monitor.
Performed Unit testing, Integration testing and System testing of Informatica mappings
Wrote UNIX scripts for the business needs.
Coded UNIX scripts to capture data from different relational systems into flat files for use as source files for the ETL process.
Client : Eli Lilly pharmaceutical, Indianapolis, IN
Duration : Sep 2011 to Nov 2013
Role : Informatica Developer
Project : Financial Reporting Warehouse
Environment: Informatica PowerCenter 8.6.0/9.1, Oracle 10g, SQL Server 2005, Netezza 4.5, Teradata V2R5, Toad, PL/SQL, BTEQ, FastExport, FastLoad, MultiLoad, Linux, Windows 2000/XP, and Autosys for scheduling.
Description:
Eli Lilly and Company (NYSE: LLY) is a global pharmaceutical company and one of the world's largest corporations. The Answers V10 Sales and Marketing Data Mart project builds a data mart from a data warehouse. Development of the data mart is divided into three phases: Extraction, Loading and Aggregation. In the Extraction phase, data is read from the source tables of the data warehouse and stored in a staging area as flat files.
This phase carries most of the complexity, since most of the business rules are implemented and all possible constraints are considered here, which keeps the loading process simple. In the Loading phase, the flat files created during Extraction are read from the staging area and loaded into the data mart. After loading, the data is aggregated for the Reference, Dimension and Fact tables.
Responsibilities:
Prepared a detailed Technical Design Document after analyzing the Functional Specifications and the Architectural Diagram. The Technical Design Document captured all functional and technical requirements.
Worked with different sources like Oracle, Teradata, SQL Server, DB2 UDB and flat files.
Worked on complex SQL queries.
Developed numerous Complex Informatica Mappings, Mapplets and reusable Transformations.
Designed and created complex source-to-target mappings using various transformations including, but not limited to, Aggregator, Joiner, Filter, Source Qualifier, Expression and Router.
Extensively used Lookup Transformation and Update Strategy Transformation while working with Slowly Changing Dimensions.
Developed Workflows with various Tasks and Parameter files.
Used different tasks (Session, Assignment, Command, Decision, Email, Event-Raise, Event-Wait, Control).
Defined Target Load Plan and Constraint based loading for loading data appropriately into multiple Target Tables.
Performed Code review to ensure that ETL development was done according to the company’s ETL standard and that ETL best practices were followed.
Documented technical specifications, business requirements and functional specifications for the Informatica mappings created in building the entire data warehouse.
Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
Client : US Bank, Minnesota, USA
Duration : Jun 2010 – Aug 2011
Role : Informatica Developer
Project : Drill Accounts and Cost Analysis
Environment : Informatica PowerCenter 8.1.1/8.6.0, Oracle 10g, SQL Server 2005, Netezza, Teradata, Toad, PL/SQL, Linux and Windows 2000/XP, Control-M for scheduling.
Description:
The project involved the design of a data warehouse for US Bank. The main objectives were to improve data access and reporting and to improve data integrity and consistency across the organization; the scope also covered maintaining present and historical data to support the client's business processes and help senior management make better decisions. The entire ETL process is in place to produce reports such as monthly financial reports, detailed budget status reports, expenditure displays, non-grant expenditure details, balances for the current fiscal year, detailed budget status reports for revenue display, and details of all revenue transactions with budget invoice amount and balance display.
At US Bank, the Informatica infrastructure is one of the largest data warehousing infrastructures, supporting tier 1 business-critical applications. We supported the Dev, QA and PROD environments, which contain around 100+ applications, multiple source and target systems, and ASCII and Unicode services.
Responsibilities:
Implemented the business rules by creating transformations (Expression, Aggregator, Source Qualifier, connected and unconnected Lookup, Router, Rank, Update Strategy) and developing mapplets and mappings.
Developed numerous Complex Informatica Mappings, Mapplets and reusable Transformations.
Developed Lookup Transformation and Update Strategy Transformation while working with Slowly Changing Dimensions.
Developed Workflows with various Tasks and Parameter files.
Created different tasks (Session, Assignment, Command, Decision, Email, Event-Raise, Event- Wait, and Control).
Performed performance tuning, identifying and resolving bottlenecks at various levels such as sources, targets, mappings and sessions.
Used Informatica tools, SQL scripts and stored procedures, and executed test plans for loading data successfully into the target systems.
Developed SQL and PL/SQL scripts for relational databases such as Oracle, Microsoft SQL Server, IBM DB2, Netezza and Teradata.
Performed UNIX programming and shell scripting for Informatica pre- and post-session operations and database administration activities.
Client : BMC Healthcare, USA.
Duration : Oct 2008 to May 2010.
Role : Informatica Developer
Project : BMC Analytics Data store
Environment: Informatica PowerCenter 8.1, Oracle 10g, Teradata, PL/SQL, Windows XP, Control-M for scheduling.
Description:
BMC Healthcare is one of the largest non-profit healthcare organizations providing healthcare services. It provides a comprehensive range of inpatient, clinical and diagnostic services in more than 70 areas of medical specialties and subspecialties. During this project I have worked closely with the data warehouse development team, customers, business analysts and other colleagues in the ITS department to analyze operational data sources, determine data availability, define the data warehouse schema and develop ETL processes for the creation, maintenance, administration and overall support of the data warehouse.
Responsibilities:
Involved in design, development and maintenance of database for Data warehouse project.
Involved in Business Users Meetings to understand their requirements.
Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration with Informatica 8.x
Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
Created the jobs from FACETS to Staging and from Staging to the data warehouse.
Worked extensively with the connected lookup Transformations using dynamic cache.
Worked with complex mappings having an average of 15 transformations.
Created and scheduled sessions and jobs to run on demand, on schedule, or only once.
Monitored Workflows and Sessions using Workflow Monitor.
Performed Unit testing, Integration testing and System testing of Informatica mappings
Coded PL/SQL scripts.
Wrote UNIX scripts for the business needs.
Coded Unix Scripts to capture data from different relational systems to flat files to use them as source file for ETL process
Created Universes and generated reports using the Star schema.
Education:
B.Sc. (MPCS) in Computer Science from Osmania University, Hyderabad, India, 2008
References:
Available upon request.