Email id: firstname.lastname@example.org Mobile: 636-***-****
Over 10 years of ETL and data integration experience in developing ETL mappings and scripts using Informatica PowerCenter 10.0/9.x/8.x/7.x, Oracle and Informatica Data Quality.
Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.
Experience working with business analysts to identify, study, and understand requirements and translate them into ETL code during the Requirement Analysis phase.
Proficient in analyzing and translating business requirements to technical requirements.
Expertise in Business Model development with Dimensions, Hierarchies, Measures, Partitioning, Aggregation Rules, Time Series, Cache Management.
Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
Well versed in OLTP Data Modeling, Data warehousing concepts.
Strong knowledge of Entity-Relationship concept, Facts and dimensions tables, slowly changing dimensions and Dimensional Modeling (Star Schema and Snow Flake Schema).
Experience in integration of various data sources like Oracle, DB2, Sybase, SQL server and MS access and non-relational sources like flat files into staging area.
Experience in creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
Tuned mappings using PowerCenter Designer and applied different logic to maximize efficiency and performance.
Experienced in UNIX work environment, file transfers, job scheduling and error handling.
Knowledge of developing and debugging Informatica mappings, mapplets, sessions and workflows.
Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad and Informatica.
Worked on Performance Tuning, identifying and resolving performance bottleneck in various levels like sources, targets, mappings and sessions.
Experience in writing, testing and implementation of the PL/SQL triggers, stored procedures, functions, packages.
Involved in Unit testing, System testing to check whether the data loads into target are accurate.
Experience in support and knowledge transfer to the production team.
Proficient in interaction with the business users by conducting meetings with the clients in Requirements Analysis phase.
Extensive functional and technical exposure; experience working on high-visibility projects.
Assigned work and provided technical oversight to onshore and offshore developers.
Excellent analytical and communication skills; a good team player.
ETL Tools: Informatica Power Center 9.6, Informatica Data Quality (IDQ), SSIS
Languages: SQL, PLSQL, UNIX Shell Scripting
Methodology: Agile SCRUM, Waterfall
Databases: Teradata 14/13/V2R12/V2R6/V2R5, Oracle 11g/10g/9i, SQL Server 2005/2008
Operating Systems: Windows, UNIX, Linux
IDEs and Tools: PL/SQL Developer, TOAD, Teradata SQL Assistant, SQL*Loader, Erwin 3.5
BI Reporting Tools: Tableau, SSRS, OBIEE
Scheduling Tools: Control-M, Autosys
Tracking Tools: JIRA, Rally
Master’s in Computer Applications, Osmania University, Hyderabad, India.
Client: Anthem Inc April 2017 – Current
Informatica Lead, Tetrasoft Inc, St Louis, US
Project: EDWard
EDWard (Enterprise Data Warehouse and Research Depot) allows the company to take data from many different source systems, both internal and external to the company, populate that data into EDWard, and then transform it into value-added information, including items such as:
• Consumer Profile
• Cost / Quality of Care
• Client Reporting
• Clinical Analytics
Involved in Business Users Meetings, JAD sessions to understand client requirements.
Translated business requirements into technical requirements; converted designs into flow charts and workflows and explained them to the developers.
Worked on heterogeneous source and target systems for movement of data.
Worked with FTP teams and the vendor to establish connectivity and the transmission protocols.
Peer-reviewed each design and offered suggestions for optimal design and code based on performance and coding standards.
Designed and developed ETL routines using Informatica PowerCenter. Within the mappings, made extensive use of Lookups, Aggregator and Rank transformations, mapplets, connected and unconnected stored procedures/functions, SQL overrides in Lookups, source filters in Source Qualifiers, and Routers for managing data flow into multiple targets.
Profiled the data, analyzed join conditions, developed mappings, and wrote the Informatica code and UNIX scripts.
Worked on code merge between two different development teams.
Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
Wrote UNIX shell scripts to extract data from different relational systems into flat files used as source files for the ETL process; also wrote FTP scripts and post-process scripts for data validation.
Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
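A PowerCenter parameter file of the kind described here is a plain-text file with a header scoping each block to a folder, workflow, and session; all folder, workflow, variable, and connection names below are illustrative:

```ini
[ProjectFolder.WF:wf_daily_load.ST:s_m_load_claims]
; $$ prefixes mapping parameters/variables; $ prefixes session-level parameters
$$LOAD_DATE=2017-04-01
$$SOURCE_SYSTEM=FACETS
$DBConnection_Src=REL_ORACLE_SRC
$DBConnection_Tgt=REL_TERADATA_TGT
$FTPConnection_Vendor=FTP_VENDOR_DROP
```

Keeping connections and variables in the parameter file lets the same workflow run against different environments without editing the mapping.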
Fixed invalid mappings and troubleshot technical problems in the database.
Coded Teradata BTEQ scripts to extract data and send it to the vendor.
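A vendor extract of this shape is typically a BTEQ script that logs on, exports a query result to a flat file, and checks the error code before logging off; the server, table, and file names below are illustrative:

```sql
.LOGON tdprod/etl_user,password;

.EXPORT REPORT FILE = /data/outbound/vendor_claims.txt;

SELECT claim_id, member_id, paid_amt
FROM   crdw.claims
WHERE  load_dt = CURRENT_DATE;

.EXPORT RESET;

-- Fail the job with a non-zero code if the extract errored
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

The exported file is then handed to the FTP/post-process scripts for transmission to the vendor.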
Worked effectively in an onsite/offshore model; conducted project health and status meetings via Scrum calls.
Created and scheduled sessions for different job setups such as on demand, run on time, and run only once.
Worked with Scheduling Teams to setup the Job flows and Processes.
Developed implementation and test plans, built software acceptance criteria, coordinated and worked with clients to oversee the acceptance and dissemination process.
Performed Unit, Integration, System testing of Informatica mappings and Workflows.
Created test plans and test cases, discussed scope with the stakeholders, and documented them in the Rational tools.
Developed Error and Validation reports as per the User requirements
Created Universes and generated reports in Business Objects using Star Schema.
Maintained quality procedures for the code and continuously monitored and audited it to ensure it met quality goals.
Environment: Informatica Power Center 9.5 (Repository Manager, Mapping Designer, Workflow Manager and Workflow Monitor), Teradata 13.0, FastLoad, MultiLoad, TPT, Teradata SQL Assistant, PL/SQL, UNIX.
Client: Anthem Inc Apr 2012 – May 2017
Informatica Offshore Lead, Tetrasoft Ind Pvt Ltd, Hyderabad, India
CRDW (Central Region Data Warehouse).
The Central Region Data Warehouse (CRDW) houses all Membership (members and groups), Premium, Claims (Medical and Pharmacy), Capitation, and Provider data for Facets, Nasco, FEP, and WGS business. The warehouse contains remitted claims, both approved and denied, for end-user reporting for Actuarial. CRDW loads run on daily, weekly, and monthly schedules.
The CRDW project covered all parts of data extraction, data validation, and subsequent support, with the extraction implemented on the CRDW side. The team also handled bugs found in production and supported the regular production data loads. All programs were developed on mainframes, with ETL objects in Informatica and Teradata.
Participated in user meetings, gathered Business requirements & specifications for the Data-warehouse design. Translated the user inputs into ETL design docs.
Created Informatica mappings to extract data from flat files and load it into the staging area.
Designed and created validation and external loading scripts (MultiLoad, FastLoad) for the Teradata warehouse using the Informatica ETL tool.
Involved in error handling, performance tuning of mappings, Testing of Informatica Sessions, and the Target Data.
Loaded data into the data warehouse and data marts using Informatica and the Teradata MultiLoad, FastLoad and BTEQ utilities.
Extensively worked on indexes in Teradata, creating proper Primary Indexes (PI) taking into consideration both the planned access paths and even distribution of data across all available AMPs.
Designed the ETL to collect statistics on a regular basis (whenever data volume grows by more than 3%).
Loaded data using the Teradata loader connection, wrote Teradata utility scripts (FastLoad, MultiLoad), and worked with the loader logs.
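A minimal FastLoad script of the kind mentioned above defines the input layout, loads into an empty staging table, and names two error tables for rejected rows; all server, table, column, and file names are illustrative:

```sql
LOGON tdprod/etl_user,password;

-- Error tables must not exist before the load
DROP TABLE stg.claims_err1;
DROP TABLE stg.claims_err2;

-- Pipe-delimited input file
SET RECORD VARTEXT "|";

DEFINE claim_id  (VARCHAR(18)),
       member_id (VARCHAR(18)),
       paid_amt  (VARCHAR(18))
FILE = /data/inbound/claims.dat;

BEGIN LOADING stg.claims ERRORFILES stg.claims_err1, stg.claims_err2;

INSERT INTO stg.claims VALUES (:claim_id, :member_id, :paid_amt);

END LOADING;
LOGOFF;
```

The error tables are then inspected as part of load validation before downstream processing continues.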
Scheduled workflows, BTEQ Scripts, UNIX shell scripts using WLM Scheduling tool.
Dropped join indexes before the ETL load and rebuilt them afterwards to further enhance performance.
Environment: Informatica Power Center 9.1, Teradata 13.10, IBM DB2, WLM Scheduling Tool and Windows 7
Client: XL Insurance Oct 2011 – Apr 2012
Informatica Developer, Synova Soft Technologies, India
XL Group is a leading insurance PLC in the UK and worldwide; XL Insurance is the global brand used by XL Group’s underwriting division. The Solvency II QRT Reporting project extracts life and non-life insurance data from the XL data warehouse and loads it into the QRT data mart, and the Cognos reports are built on the QRT data mart. The requirements were derived from the fall 2010 QRT document itself, based on the CP 58 consultation outputs, and enriched by the QIS5 business and technical descriptions. The XL Solvency II QRT Reporting Solution employs the Cognos QRT data mart and 63 designed QRT reports.
Used Informatica Power Center for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.
Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
Designed and developed Informatica mappings, including Type 1, Type 2, and Type 3 slowly changing dimensions (SCDs).
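The Type 2 pattern referred to above is commonly expressed as expire-and-insert: close the current dimension row when a tracked attribute changes, then insert a new current version. A SQL sketch of that logic, with illustrative table and column names:

```sql
-- Expire the current version when a tracked attribute changed in staging
UPDATE dim_policy
SET    eff_end_dt   = CURRENT_DATE - 1,
       current_flag = 'N'
WHERE  current_flag = 'Y'
AND EXISTS (SELECT 1
            FROM   stg_policy s
            WHERE  s.policy_id = dim_policy.policy_id
            AND    s.policy_status <> dim_policy.policy_status);

-- Insert the new version as the current row
INSERT INTO dim_policy (policy_id, policy_status, eff_start_dt, eff_end_dt, current_flag)
SELECT s.policy_id, s.policy_status, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_policy s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_policy d
                   WHERE  d.policy_id = s.policy_id
                   AND    d.current_flag = 'Y'
                   AND    d.policy_status = s.policy_status);
```

In Informatica the same logic is typically built with a Lookup on the dimension, an Expression to compare attributes, and an Update Strategy to route expire vs. insert rows.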
Involved in extracting the data from the Flat Files and Relational databases into staging area.
Migrated mappings, sessions, and workflows from Development to Test and then to the UAT environment.
Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.
Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers, and data flow management into multiple targets using Router.
Created sessions, extracted data from various sources, transformed it according to the requirements, and loaded it into the data warehouse.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
Developed several reusable transformations and mapplets that were used in other mappings.
Prepared Technical Design documents and Test cases.
Involved in unit testing and the resolution of various bottlenecks encountered.
Implemented various Performance Tuning techniques.
Environment: Informatica Power Center 8.6.1, Teradata v2r6, SQL Server 2005, Oracle 9i, PL/SQL, SQL Developer, Toad, UNIX.
Client: Zyme-Kore Mar 2010 – Oct 2011
Informatica Developer, Capgemini, India
ZymeKore is the application platform that collects channel data from various sources of the channel and processes them to standardize, cleanse and store the data to produce several analytical reports that are critical to all the tiers of the channel.
As an ETL developer, provided the optimal ETL load strategy and technical design documents.
Developed ETL jobs using Informatica Power Center to implement business requirements.
Created new mapping designs using Informatica Designer tools such as Source Analyzer, Warehouse Designer, Mapplet Designer, and Mapping Designer.
Communicated with business customers to discuss issues and requirements.
Effectively used Informatica parameter files for defining mapping variables, workflow variables, and relational connections. Reviewed and analyzed customer requirements and designed HLD and LLD documents.
Wrote stored procedures for updating and deleting required table data.
Created SQL scripts for truncating tables and for creating and dropping indexes.
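A typical load-window script of this kind truncates the staging table and drops its index before the bulk load, then recreates the index afterwards; the table and index names are illustrative:

```sql
-- Pre-load: clear staging and drop the index so the bulk insert runs faster
TRUNCATE TABLE stg_channel_sales;
DROP INDEX idx_sales_partner;

-- ... bulk load runs here ...

-- Post-load: recreate the index for downstream query performance
CREATE INDEX idx_sales_partner ON stg_channel_sales (partner_id, period_dt);
```

These scripts are commonly wired into the session's pre-SQL and post-SQL, or called from the pre/post-session shell scripts.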
Effectively used UNIX shell scripts for pre-session and post-session tasks for client-specific loads.
Extensively used the pmcmd command in UNIX shell scripts to invoke client-specific loads.
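A pmcmd wrapper of the kind described above can be sketched as follows; the service, domain, folder, and workflow names are illustrative, and PMCMD defaults to `echo` so the sketch can be dry-run without an Informatica installation (set PMCMD=pmcmd on a real Integration Service host):

```shell
#!/bin/sh
# Start a client-specific workflow via pmcmd and fail on a non-zero return code.
PMCMD="${PMCMD:-echo}"          # dry-run by default; use PMCMD=pmcmd for real runs
INFA_SVC="INT_SVC_DEV"
INFA_DOMAIN="DOM_DEV"
INFA_USER="etl_user"
INFA_PASS="${INFA_PASS:-changeme}"

run_workflow() {
    folder="$1"
    workflow="$2"
    # -wait blocks until the workflow completes so the exit code is meaningful
    "$PMCMD" startworkflow -sv "$INFA_SVC" -d "$INFA_DOMAIN" \
        -u "$INFA_USER" -p "$INFA_PASS" -f "$folder" -wait "$workflow"
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "Workflow $workflow failed with return code $rc" >&2
        exit "$rc"
    fi
}

run_workflow "CLIENT_A" "wf_client_a_daily_load"
```

Wrapping pmcmd this way lets the scheduler call one script per client load and rely on the shell exit code for failure alerts.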
Performed unit testing at various levels of the ETL and actively involved in team code reviews.
Identified and resolved problems in existing production jobs.