Manager Data

Location:
Baltimore, MD
Posted:
August 21, 2017

SAI KUMAR G

443-***-****

ac1xmp@r.postjobfree.com

Baltimore, MD

Summary:

•Over 8 years of IT experience in data modeling, data integration, and data migration, designing, implementing, developing, testing, and maintaining ETL components for building data warehouses and data marts across the Health Care, Manufacturing, Retail, Finance, Insurance, and Banking domains.

•Experienced in all stages of the software development lifecycle (Waterfall and Agile models) for building a data warehouse.

•Well versed in Informatica PowerCenter 9.x/8.x Designer tools (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer), Workflow Manager tools (Task Developer, Worklet Designer, and Workflow Designer), Repository Manager, and the Admin Console.

•Involved in troubleshooting data warehouse bottlenecks and performance tuning: session and mapping tuning, session partitioning, and implementing pushdown optimization.

•Extensively worked on various transformations like Lookup, Joiner, Router, Rank, Sorter, Aggregator, Expression, etc.

•Strong understanding of Data Warehousing, Data Architecture, Data Modeling, Data Analysis, SDLC methods, and GUI applications.

•Good understanding of and experience with the Ralph Kimball and Bill Inmon methodologies, creating entity-relationship and dimensional models using data modeling concepts like Star Schema and Snowflake Schema modeling.

•Expertise in business model development with dimensions, hierarchies, measures, partitioning, aggregation rules, time series, and cache management.

•Extensive experience with data Extraction, Transformation, and Loading (ETL) from heterogeneous sources across multiple relational databases (Oracle, Netezza, Teradata, DB2, SQL Server, MS Access), and with integrating data from flat files (fixed-width and delimited), CSV, and XML into a common reporting and analytical data model using Informatica.

•Worked with Teradata utilities like FastLoad, FastExport, MultiLoad, TPump, and TPT.

•Experienced in creating BTEQ scripts.

•Experience with Oracle tools like SQL*Loader and TOAD.

•Extensively used SQL and PL/SQL for development of Procedures, Functions, Packages and Triggers.

•Experience in using the Informatica command-line utilities pmcmd and pmrep to execute workflows in a UNIX environment.

•Good programming skills in SQL, PL/SQL, and T-SQL.
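The pmcmd usage mentioned above can be sketched as a small wrapper that assembles the `startworkflow` command line; the service, domain, folder, and workflow names below are hypothetical placeholders, and this is an illustrative sketch rather than a production script.

```python
# Sketch: build a pmcmd "startworkflow" command line for an Informatica workflow.
# All names (service, domain, folder, workflow) are hypothetical placeholders.

def build_pmcmd_start(workflow, folder, service, domain,
                      user_env="INFA_USER", pwd_env="INFA_PASS"):
    """Return the pmcmd argument list to start a workflow and wait for completion."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service,    # Integration Service name
        "-d", domain,      # Informatica domain
        "-uv", user_env,   # environment variable holding the user name
        "-pv", pwd_env,    # environment variable holding the password
        "-f", folder,      # repository folder
        "-wait",           # block until the workflow finishes
        workflow,
    ]

cmd = build_pmcmd_start("wf_load_customers", "DWH_FOLDER", "IS_PROD", "Domain_Prod")
print(" ".join(cmd))
```

In a UNIX scheduler, the returned list would typically be handed to a shell script or `subprocess.run`, with the exit code checked to decide success or failure.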

Educational Details

Bachelor of Engineering-2008 in Computers from Visvesvaraya Technical University (VTU), Karnataka, India

Technical Summary

ETL TOOL: Informatica 8.1, 8.6, 9.1, 9.5.1

DATABASES: Oracle 9i/10g, Microsoft SQL Server 2008, Teradata

PROGRAMMING LANGUAGES: UNIX Shell Scripting, SQL

SCHEDULING TOOLS: Control M

ENVIRONMENT: Windows XP/2000, UNIX

OTHER SOFTWARE: Toad, Putty, WinScp, SQL Developer

OTHER APPLICATION: HP Quality Center 9/10, Citrix

Professional Experience:

CSRA, Inc, Baltimore, MD Jan 2016 – Present

Senior ETL Developer

CSRA (formed by the merger of CSC's North American Public Sector business and SRA International, formerly Systems Research and Applications Corporation) is a major provider of technology to the government in the areas of national defense and health care. The Administrative Simplification provisions of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) mandated the adoption of standard unique identifiers for health care providers and health plans. The purpose of the provisions is to improve the efficiency and effectiveness of the electronic transmission of health information. The Centers for Medicare & Medicaid Services (CMS) has developed the National Plan and Provider Enumeration System (NPPES) to assign these unique identifiers, also known as National Provider Identifiers (NPIs).

Responsibilities:

•Participated in requirement gathering, business analysis, user meetings, discussing the issues to be resolved and translating user inputs into ETL design documents.

•Created ER diagram of the data model using Erwin data modeler to transform business rules into logical model.

•Used Informatica PowerCenter 9.x/8.x for extraction, transformation, and loading (ETL) of data into the data warehouse, from source flat files and RDBMS tables to target tables.

•Created reusable transformations and mapplets and used them in mappings.

•Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, SQL, Union, Lookup, Joiner, and XML Source Qualifier transformations.

•Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 to maintain current and historical information in the dimension tables.

•Involved in client interaction, analyzing issues with the existing requirements, proposing solutions and implementing the same.

•Optimized the performance of the mappings through various tests on sources, targets, and transformations, identifying and removing bottlenecks and implementing performance-tuning logic on targets, sources, mappings, and sessions with pipeline partitioning and increased block size, data cache size, sequence buffer length, and target-based commit interval to provide maximum efficiency and performance.

•Involved in production support, monitoring jobs and fixing failures without missing SLAs.

•Debugged code and tested and validated data after processes ran in development/testing, according to business rules.

•Prepared unit test plans and maintained defect logs to resolve issues. Worked with the QA team to determine the data validation approach and performed data validation at the source and target database levels.

•Hands-on experience as an administrator: maintained the Repository Manager, created repositories, user groups, and folders, and migrated code from Dev to Test and Test to Prod environments.

•Worked on data request tickets and assisted non-technical business users in understanding the quality of the data.
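The SCD Type 1 and Type 2 handling mentioned above can be sketched in plain Python: Type 1 overwrites attributes in place (no history), while Type 2 expires the current row and appends a new current version. The column names (cust_id, city, is_current, start_date, end_date) are illustrative, not from the actual dimension schema.

```python
from datetime import date

# Sketch of SCD handling on an in-memory dimension table; column names are
# illustrative placeholders, not an actual schema.

def scd_type1(dim_rows, key, updates):
    """Type 1: overwrite attributes in place, losing history."""
    for row in dim_rows:
        if row["cust_id"] == key:
            row.update(updates)
    return dim_rows

def scd_type2(dim_rows, key, updates, today=None):
    """Type 2: expire the current row and append a new current version."""
    today = today or date.today()
    for row in dim_rows:
        if row["cust_id"] == key and row["is_current"]:
            row["is_current"] = False        # expire the old version
            row["end_date"] = today
            new_row = {**row, **updates,     # carry forward, apply changes
                       "is_current": True, "start_date": today, "end_date": None}
            dim_rows.append(new_row)
            break
    return dim_rows

dim = [{"cust_id": 1, "city": "Baltimore", "is_current": True,
        "start_date": date(2015, 1, 1), "end_date": None}]
scd_type2(dim, 1, {"city": "Clayton"}, today=date(2016, 6, 1))
print(len(dim))  # 2: the expired row plus the new current row
```

In PowerCenter this same logic is typically realized with a Lookup on the dimension plus an Update Strategy transformation routing rows to insert or update.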

Environment: Informatica PowerCenter 10.1/9.6/9.5.1, Informatica PowerExchange, Oracle 11g, SQL Server 2005/2008, IBM Mainframe, T-SQL, MS Excel, CA Scheduler.

Centene Corporation, Clayton, MO Feb 2015 – Dec 2015

ETL/Informatica Developer

Centene Corporation, a Fortune 500 company, is a diversified, multi-national healthcare enterprise that provides a portfolio of services to government-sponsored healthcare programs, focusing on under-insured and uninsured individuals. Founded as a single health plan in 1984, Centene has established itself as a national leader in the healthcare services field and has since expanded to 24 health plans.

Responsibilities:

Interacted with Business Analysts to understand the requirements.

Maintained Technical specification documents according to the Source to Target mapping documents.

Prepared ETL Specifications to help in developing mappings.

Used Informatica PowerCenter 9.6 for extraction, transformation, and loading (ETL) of data into the target systems.

Worked on Informatica Power Center tools like Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.

Created mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, Source Qualifier and Stored procedure transformations.

Developed Informatica workflows and sessions associated with the mappings using Workflow Manager.

Tuned the Sources, Targets, Transformations and Mapping to remove bottlenecks for better performance.

Monitored jobs in the QA environment for shakeout purposes.

Migrated mappings, sessions, and workflows from development to testing and then to production environments.

Documented the process for further maintenance and support.

Environment:

Informatica PowerCenter 9.6/9.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Repository Administration Console), Teradata 14, Teradata SQL Assistant 14, delimited flat files, Oracle, SQL Developer, UNIX, WinSCP.

Ahold Inc, Canton, MA May 2012 – Jan 2015

ETL/Informatica Developer

Ahold is a major international supermarket operator based in the Netherlands. The company expanded internationally, acquiring US supermarket chains such as Stop & Shop and Giant. Ahold's data migration project migrates data from legacy systems into ORMS (Oracle Retail Management System).

Responsibilities:

Worked closely with the client to understand business requirements, perform data analysis, and deliver on client expectations.

Used Informatica PowerCenter 9.1 and all its features extensively to migrate data from legacy systems into ORMS systems.

Used Informatica PowerCenter 9.1 for extraction, transformation, and loading (ETL) of data into the target systems.

Extracted data from different sources, like COBOL copybooks and flat files, and loaded it into ORMS.

Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, Source Qualifier and Stored procedure transformations.

Developed mappings, reusable objects, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter Designer, since similar datatype conversions had to be handled multiple times and most of the lookups were similar.

Worked extensively with different caches, such as the index cache, data cache, and lookup cache (static, dynamic, persistent, and shared).

Developed error handling & data quality checks in Informatica mappings.

Used the Informatica PowerCenter Workflow Manager to create sessions and batches to run with the logic embedded in the mappings.

Performed unit testing on the Informatica code by running it in the Debugger and writing simple test scripts in the database, then tuned it by identifying and eliminating bottlenecks for optimum performance.

Developed UNIX Shell Scripts for scheduling the sessions in Informatica.

Involved in scheduling the UNIX shell scripts that run Informatica workflows using Autosys.

Involved in Performance tuning for sources, targets, mappings and sessions.

Created Test Plans and Test Scripts to support the testing team.

Migrated mappings, sessions, and workflows from development to testing and then to Production environments.

Environment: Informatica PowerCenter 9.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Repository Administration Console), Oracle 10g, COBOL copybooks, delimited flat files, SQL Developer, UNIX shell programming.

Valuelabs, India June 2009 – Apr 2012

ETL Consultant

The Sample Accountability System is a software application used to perform sample reconciliation and to audit representative activity surrounding the distribution of prescription drug samples, using Excel files. The reports produced monthly by the Sample Accountability System are important tools for Lilly to audit and secure the distribution of samples.

Responsibilities

Worked with business analysts for requirement gathering, business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse.

Extensively worked on Power Center 9.1 Designer client tools like Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer.

Developed and executed load scripts using the Teradata client utilities MultiLoad, FastLoad, and BTEQ.

Extensively worked on transformations like Lookup, Joiner, SQL, and Source Qualifier in the Informatica Designer.

Implemented Slowly Changing Dimensions (Type 1, Type 2 and Type 3).

Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations to populate target tables efficiently.

Modified and developed new ETL programs, transformations, indexes, data staging areas, summary tables, and data quality routines based upon redesign activities.

Worked extensively with Teradata SQL Assistant to analyze the existing data and implemented new business rules to handle various source data anomalies.

Worked on performance tuning of the ETL processes. Optimized/tuned mappings for better performance and efficiency.

Defined the Target Load Order Plan to load data correctly into different target tables.

Used Informatica debugger to test the data flow and fix the mappings.

Loaded data to and from Flat files and databases like Oracle, Teradata and DB2.

Administered Exadata databases in production environments.

Moved mappings, sessions, workflows, and mapplets from one environment to another.

Worked on UNIX shell scripting and called several shell scripts using the Command task in Workflow Manager.

Automated UNIX shell scripts to verify the count of records added every day by the incremental data load for a few of the base tables, in order to check for consistency.

Developed UNIX Shell scripts to archive files after extracting and loading data to Warehouse.

Used Informatica PowerExchange 9.1 for Change Data Capture (CDC).

Monitored scheduled, running, completed, and failed sessions using the Workflow Monitor, and debugged mappings for failed sessions.

Involved in unit and integration testing of Informatica sessions, batches, and the target data.
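The daily record-count verification mentioned above reduces to a simple invariant: today's row count should equal yesterday's count plus the rows added by the incremental load. A minimal sketch, with all counts as illustrative placeholders:

```python
# Sketch: verify that an incremental load added the expected number of rows.
# The counts below are illustrative placeholders, not real table statistics.

def check_incremental_load(prev_count, incremental_count, current_count):
    """Return (ok, delta), where delta is the number of rows unaccounted for."""
    expected = prev_count + incremental_count
    return current_count == expected, current_count - expected

ok, delta = check_incremental_load(prev_count=1_000_000,
                                   incremental_count=5_230,
                                   current_count=1_005_230)
print(ok, delta)  # True 0
```

In practice the three counts would come from `SELECT COUNT(*)` queries against the base table and the load's audit log, and a nonzero delta would fail the job and alert support.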

Environment: Informatica PowerCenter 8.x, PowerExchange, Oracle 8i/9i/10g, Teradata 13, Windows 7, SQL*Plus, TOAD, AS/400, UNIX


