Email Id: email@example.com
Over 6.5 years of IT experience in Data Warehousing, Documentation, Data Analysis, Reporting, ETL, Data Modeling, Development, Maintenance and Testing.
Extensively worked on extraction, transformation and loading of legacy data to the data warehouse using Informatica PowerCenter 9.6/9.1, including Designer, Repository Manager, Workflow Manager and Workflow Monitor.
Experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Router, Filter, Java Expression, Aggregator, Joiner, Update Strategy and Normalizer.
Experience in performance tuning ETL code, mappings, sessions, workflows, stored procedures, functions and database.
Excellent capabilities in integration mappings, including dynamic cache lookup, shared lookup, and persistent lookup mappings for Type I, Type II, and Type III slowly changing dimensions.
Implemented partitioning techniques to improve performance.
Experience in preparing source-to-target mapping documents for workflows.
Experience in extracting data from Oracle, MS SQL Server, XML, Flat Files, Teradata and DB2.
Knowledge in SQL & PL/SQL and expertise in writing triggers, stored procedures, functions and packages.
Experience in configuring Database and ODBC connectivity to various source/target databases.
Worked in Agile and waterfall methodologies.
Experience in working with different databases.
Worked on data conversion projects.
Experience in using Teradata load utilities (FastLoad, MultiLoad and TPump) to load large volumes of data into Teradata.
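For illustration, a minimal FastLoad control-script fragment of the kind used for such bulk loads (server, credentials, table and file names here are hypothetical placeholders, not from an actual project):

```
LOGON tdprod/etl_user,etl_pwd;
DATABASE stg;
SET RECORD VARTEXT "|";                          /* pipe-delimited feed file */
BEGIN LOADING stg.claims_stg
  ERRORFILES stg.claims_e1, stg.claims_e2;       /* FastLoad error tables    */
DEFINE
  claim_id  (VARCHAR(18)),
  member_id (VARCHAR(18)),
  paid_amt  (VARCHAR(18))
FILE = /data/inbound/claims.dat;
INSERT INTO stg.claims_stg VALUES (:claim_id, :member_id, :paid_amt);
END LOADING;
LOGOFF;
```

FastLoad targets empty tables and loads in large blocks, which is why it suits the high-volume staging loads described above; MultiLoad and TPump cover incremental and trickle-feed cases.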
Experience in SQL, UNIX, Data Relationships, Data extraction & validation.
Implemented performance tuning techniques to improve the efficiency and performance of the workflows.
Worked extensively on SQL Query tuning.
Resolved data issues, completed unit testing and produced full system documentation for ETL processes.
Comprehensive knowledge of dimensional modeling concepts such as Star Schemas, Snowflake Schemas, Fact tables and Dimension tables.
Experience in reporting tools such as Crystal Reports, SAP BO and Cognos.
Experience in Informatica Data Quality (IDQ).
Used various data quality transformations to provide quality data.
Excellent communication, problem-solving and analytical skills, with an aptitude for quickly learning cutting-edge technologies.
Knowledge in AWS.
Proven ability to multi-task while delivering expected results.
ETL Tools: Informatica PowerCenter 9.5/9.1, PowerMart 6.2, Informatica Developer
Languages: SQL, C, Shell Scripting, PL/SQL
Databases: Oracle 11g/10g/9i/8i, SQL Server 2012/2008/2005, DB2, MySQL 5.0/4.1, MS Access, Teradata
Operating Systems: Windows, Windows NT, UNIX
Data Modeling: Dimensional Data Modeling, Physical and Logical Modeling, Relational Modeling, Snowflake Schema, Star Schema, Dimension Tables, Fact Tables, Normalization, Denormalization, Erwin 7.2/4.0, MS Visio
Other Tools: Tidal, Autosys, Data Analyzer, Metadata Manager, SQL*Loader, ALM, TOAD, Cognos, Crystal Reports, SAP BO, Jira
State of Illinois
Informatica Developer Nov 2014 – Present
The State of Illinois is undertaking a significant effort to consolidate and modernize eligibility functions for several programs operated by the Department of Healthcare and Family Services (HFS) and the Department of Human Services (DHS). The project's goal is to improve access to programs serving economically disadvantaged people by providing a simple, efficient, seamless, and traceable system through which people can access and manage their health coverage, insurance, or aid. These goals will be accomplished through the development of a new Integrated Eligibility System (IES) focused on medical programs, including Medicaid, CHIP, and various state-specific programs, as well as two key human services programs: the Supplemental Nutrition Assistance Program (SNAP, formerly known as Food Stamps) and the cash programs (Temporary Assistance for Needy Families and Aid to the Aged, Blind and Disabled, among others). This is a conversion project in which data is converted from the legacy system (HIS) to IES.
Roles and Responsibilities:
Extracted, transformed and loaded data between different databases using Informatica.
Created design documents from client requirements.
Captured data quality metrics of the source data, such as distinct and null counts per field, to understand the raw source data.
Worked with Informatica Data Quality tools such as Informatica Developer.
Worked extensively with the functional team, gathering requirements and analyzing the source data before development.
Designed the target load process based on the requirements documents.
Developed complex Informatica mappings.
Involved in extracting data from SQL Server and DB2 databases and XML files.
Developed mappings and workflows to load data into DB2 tables.
Widely used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer and Workflow Manager.
Extensively used the Joiner, Aggregator, Sorter, Expression, Filter, Router, Rank, Lookup, Source Qualifier, SQL, Java, Stored Procedure, Union and XML transformations.
Used various Data quality transformations.
Created functions and stored procedures to handle requirements.
Experience in Data mining.
Performed data profiling.
Created Tasks and database connections using Workflow Manager.
Used SOAP for address validation checks.
Implemented Slowly Changing Dimensions to capture changed data.
Created sessions and batches to move data at specific intervals & on demand.
Used different partitioning techniques to improve the performance.
Created and scheduled sessions to run at specific times.
Implemented performance tuning techniques by identifying and resolving the bottlenecks in source, target, transformations, mappings and sessions to improve performance.
Worked with DBAs to resolve database issues and with the Informatica administrator to create needed connections and fix Informatica issues.
Implemented Exception handling scenarios.
Responsible for identifying the missed records in different stages from source to target and resolving the issues, data validation, data cleansing and correcting the source data.
Developed a job sequencer to execute jobs in the proper order, and implemented automated email notifications through the sequencer to alert the operations team of data load issues such as job failures, dropped rows and rejected rows.
Tested and debugged the enhanced mappings.
Created playbooks covering all activities.
Prepared documentation for test data loading.
Supported the QA team by providing required information and executing test scenarios.
Participated in meetings to discuss the activities to be completed by the team.
Created module specific reports for the client.
Worked on the Maintenance and Operations production support team supporting the client.
Environment: Informatica PowerCenter 9.6, PowerExchange, SQL Developer, DB2, PL/SQL, TOAD, Flat Files, XML files, Crystal Reports, SAP BO, Jira.
Cigna, Bloomfield, CT April 2013 – Oct 2014
ETL System Analysis Programmer
Cigna purchased a SAS software package intended to provide an enhanced method of creating Episodes of Care for the internal Cigna East and Cigna West member populations. The purpose of this project is to create the input data required by the software to perform the analytics that build these Episodes of Care. This information will in turn support better management of the overall healthcare of the customer base.
Roles and Responsibilities:
Conducted user interviews for requirements gathering and evaluated client needs.
Involved in creating functional and scope documentation for data cleansing, conversion, and integration processes.
Analyzed the functional specs provided by the data architect and created technical specs documents for all the mappings.
Created various documents including high-level design documents, mapping documents and knowledge transfer documents.
Involved in identification of data needs.
Coordinated with source system owners, monitored day-to-day ETL progress, and designed and maintained the data warehouse target schema (star schema).
Extensively worked on relational data modeling for OLTP and OLAP systems.
Performed data analysis with respect to extracting and staging data for transformations.
Developed complex mapping as per the business rules.
Extracted data from multiple source systems and performed cleansing and modification of data.
Loaded data into the Teradata database, which in turn feeds the SAS analysis tool.
Extensively utilized the Debugger utility to test the mappings.
Performed data cleansing by writing queries to identify and analyze data anomalies, patterns, and inconsistencies.
Worked on performance tuning SQL queries.
Used UNIX scripting and scheduled pmcmd commands to interact with the Informatica server from the command line.
Performed File validations on the server using UNIX.
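A minimal sketch of the UNIX-side file validation and pmcmd workflow trigger described above. The feed path, integration service, folder and workflow names are hypothetical placeholders; the pmcmd flags follow the standard PowerCenter command-line syntax.

```shell
#!/bin/sh
# Validate an inbound feed file before triggering the Informatica workflow.

validate_feed() {
  feed="$1"
  # Fail if the feed file is missing or empty.
  [ -s "$feed" ] || { echo "FAIL: $feed missing or empty"; return 1; }
  # Expect a header line plus at least one data row.
  rows=$(wc -l < "$feed")
  [ "$rows" -gt 1 ] || { echo "FAIL: $feed has no data rows"; return 1; }
  echo "OK: $feed ($rows lines)"
}

# Trigger the workflow only when the feed validates and the pmcmd
# client is available on this host (names below are illustrative).
if validate_feed /data/inbound/claims_feed.dat && command -v pmcmd >/dev/null 2>&1; then
  pmcmd startworkflow -sv INT_SVC -d DOM_DEV -u "$INFA_USER" -p "$INFA_PWD" \
    -f CONVERSION -wait wf_m_load_claims
fi
```

Running the validation in the shell script, rather than inside the mapping, stops bad feeds before a session ever starts, which keeps failed-load cleanup out of the warehouse.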
Used versioning and labeling options in PowerCenter to keep track of mapping and workflow changes.
Responsible for conducting weekly status meetings with the client in the manager's absence.
Performed knowledge transfer to the production support team.
Ensured test scripts were documented, reported and tracked in the appropriate tools.
Environment: Windows 7, Informatica 9.1, Oracle 11g, XML, Teradata v6/R13/R14, UNIX/AIX, PL/SQL, and Control-M 8.0.0.