
Data Manager

Location:
Voorhees Township, NJ, 08043
Posted:
September 12, 2016


Resume:

Suman Jonnepally

626-***-****

*****.**********@*****.***

**** ***** ** ******** ** 08043

Summary:

* + years of IT experience in Analysis, Design, Development and Deployment of technical solutions to business requirements.

Extensively worked on Data Warehouse Full Life Cycle Projects for Health Care, Financial, Retail and Insurance industries.

7+ years of strong Data Warehousing ETL experience using Informatica PowerCenter 9.6.1/9.5/9.1/9.0/8.6/8.1/8.0/7.1 (Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, Informatica Server, Repository Manager).

Experience in all the phases of ETL such as Extraction, Cleaning, Transformation and Loading using Informatica.

Experience in using Oracle 11g for performing ETL functions using stored procedures.

Extensive experience in and exposure to several areas of data warehousing and business intelligence, including requirements gathering, data warehouse architecture design, development, quality planning and assurance, data analysis, and implementation.

Working knowledge with Informatica CDC for daily loads.

Excellent understanding of the System Development Life Cycle. Involved in analysis, design, development, testing, implementation, and maintenance of various applications.

Extracted data from various sources, including RDBMS such as Oracle, MS SQL Server, DB2, Teradata and Sybase, as well as flat files, to load data marts and data warehouses.

Expertise in working with TOAD for creation of Tables, Materialized views and normal views.

Proficiency in working with PL/SQL stored procedures, packages, and triggers.

Actively involved in Performance Tuning, maintaining Staging area, Backup and Recovery process, product support on various platforms.

Effective implementations of change data capture using Power Exchange for daily loads.

Performed data quality checks using data quality tools.

Good experience working with large databases and performance tuning.

Efficient in Optimizing, Debugging and testing SQL queries and stored procedures.

Excellent working knowledge of Shell Scripting and job scheduling on platforms like UNIX.

Able to interact effectively with other members of the Business, Quality Assurance, Users and other teams involved with the System Development Life cycle.

Strong analytical, functional, design and development skills, excellent communication skills, and the ability to interface with executives and to work independently as well as in team environments.

Worked closely with the business analysts' team to resolve problem tickets and service requests. Helped the 24/7 production support team.

Resolved production issues for scheduled job failures and reported issues to the concerned teams.

Documented cutover plans, design documents and runbooks for the production support teams.

Technical Skills:

Data Warehousing / ETL : Informatica PowerCenter 9.6/9.5.1/9.X/8.X/7.1, Teradata 14.X/13.X/12.X.

Database : Oracle 11g/10g/9i/8i, MS SQL Server, Netezza, Teradata.

Database Tools : TOAD, SQL Navigator.

Languages : SQL, PL/SQL, HTML, XML

Operating Systems : IBM z/OS, Windows 95/98/NT/2000/XP and UNIX

Professional experience:

Bank of America, New York, NY, USA Mar 2015 – Jan 2016

Registration and Insurance Licensing System (RAILS)

Senior ETL Developer

The Registration and Licensing team uses a number of different applications to support its processes. Some of these applications are built on older technologies that no longer comply with the bank's standards and require the R&L team to manually enter the same data more than once.

BoA is building a new system called RAILS so it can move to a consolidated platform and enhance and simplify user experience.

The system will be implemented in stages with Insurance Continuing Education in Phase I and Insurance Licensing in Phase II.

Responsibilities:

Responsible for definition, development and testing of the processes/programs necessary to extract data from operational databases, transform and cleanse the data, and load it into the data warehouse using Informatica PowerCenter.

Involved in ETL technical design discussions and prepared the ETL high-level technical design document.

Involved in the analysis of source-to-target mappings provided by data analysts and prepared functional and technical design documents.

Extracted/loaded data from/into diverse source/target systems like Oracle, SQL Server, XML and Flat Files.

Created standard and reusable Informatica mappings/mapplets using Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, and Router transformations to extract, transform and load data mart.

Developed complex mappings to implement Change Data Capture (CDC).
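As an illustration of the CDC pattern above, a daily delta pull can be sketched as a watermark-driven extract in shell. The file names and the id|name|last_modified layout below are hypothetical, not the actual RAILS feeds; the real loads used Informatica mappings.

```shell
#!/bin/sh
# Hypothetical sketch of a watermark-driven daily CDC extract.
# Sample source extract (id|name|last_modified) and last-run watermark.
cat > registrations.dat <<'EOF'
101|ALICE|2016-01-10 09:15:00
102|BOB|2016-02-01 12:00:00
103|CAROL|2016-02-20 08:30:00
EOF
echo "2016-01-31 00:00:00" > last_run.wm

LAST_RUN=$(cat last_run.wm)

# Keep only rows modified after the previous successful run
# (ISO-8601 timestamps compare correctly as plain strings).
awk -F'|' -v wm="$LAST_RUN" '$3 > wm' registrations.dat > registrations_delta.dat

# Advance the watermark to the newest timestamp just processed.
NEW_WM=$(awk -F'|' '$3 > max { max = $3 } END { if (max != "") print max }' registrations_delta.dat)
[ -n "$NEW_WM" ] && echo "$NEW_WM" > last_run.wm
```

On each run only the changed rows land in the delta file, and the watermark moves forward so the next run starts where this one left off.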

Created and executed test cases/scripts and completed unit, integration and system tests successfully.

Deployment of Informatica Objects from Dev to QA, UAT, and Production environments.

Involved in performance tuning of Informatica code, SQL and PL/SQL scripts, and the database, using standard performance tuning steps to improve load times.

Actively involved in production support. Implemented fixes/solutions to issues/tickets raised by user community.

Created SQL, PL/SQL, stored procedures for implementing business rules and transformations.

Automated ETL process flow using batch scripting.
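A minimal sketch of such a batch driver follows; the step names are made up, and in the real flow each step would invoke pmcmd or a database client rather than a placeholder command.

```shell
#!/bin/sh
# Illustrative sketch of a batch driver that runs ETL steps in order
# and aborts the flow on the first failure.

LOG=etl_run.log
: > "$LOG"

run_step() {
    name=$1; shift
    if "$@" >> "$LOG" 2>&1; then
        echo "OK $name" >> "$LOG"
    else
        echo "FAIL $name" >> "$LOG"
        exit 1   # stop here so downstream loads never run on bad data
    fi
}

# Placeholder commands stand in for pmcmd/sqlplus invocations.
run_step extract   true
run_step load      true
run_step reconcile true
echo "FLOW COMPLETE" >> "$LOG"
```

Failing fast keeps a bad extract from being loaded, and the log gives the on-call rotation a single place to see which step broke.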

Supported nightly production jobs on a rotation basis.

Loaded inbound Web EFT XML files from FINRA into database tables.

Created mappings to load changes in registration information to history tables.

Developed mappings to send registration information to FINRA in XML format.

Development of Informatica Mappings, Sessions and Workflows.

Development of Autosys jobs.

Environment: Informatica PowerCenter 9.5, MS SQL Server, Oracle 11g, Autosys, Linux.

Deloitte, Camp Hill, PA May 2014 – Jan 2015

Sr. ETL Developer

Under Healthy PA, Pennsylvania will extend health care coverage to adults age 21 through 64 with income up to 133 percent of the federal poverty level (FPL) who do not currently qualify for Medicaid. Rather than simply enrolling these individuals in the existing Medicaid program, the state intends to use Medicaid dollars to purchase coverage for them through a Private Coverage Option (PCO).

Responsibilities:

Participated in requirement gathering, Business Analysis, user meetings, discussing the issues to be resolved and translating user inputs into ETL design documents.

Created ER diagram of the data model using Erwin data modeler to transform business rules into logical model.

Involved in the extraction, transformation and loading of data from source flat files and RDBMS tables to target tables. Created reusable transformations and mapplets and used them in mappings.

Used Informatica Power Center 9.1/9.01/8.6.1 for extraction, loading and transformation (ETL) of data in the data warehouse.

Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, SQL, Union, Lookup, Joiner, XML Source Qualifier and Unconnected Lookup transformations.

Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 to maintain current information and history information in the dimension tables.
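The change-detection step behind an SCD load can be sketched as a file comparison; the key|attribute layout and file names below are illustrative only, not the project's actual dimension tables.

```shell
#!/bin/sh
# Minimal sketch of SCD change detection: compare today's extract with
# the current dimension snapshot and flag each key as NEW, CHANGED or
# UNCHANGED.

cat > dim_current.dat <<'EOF'
1|GOLD
2|SILVER
EOF
cat > extract_today.dat <<'EOF'
1|GOLD
2|PLATINUM
3|BRONZE
EOF

awk -F'|' '
    NR == FNR     { cur[$1] = $2; next }    # first file: load snapshot
    !($1 in cur)  { print $1 "|NEW"; next }
    cur[$1] != $2 { print $1 "|CHANGED"; next }
                  { print $1 "|UNCHANGED" }
' dim_current.dat extract_today.dat > scd_flags.dat
```

A Type 2 mapping would then end-date the current row for each CHANGED key and insert a fresh version, preserving history, while a Type 1 mapping would simply overwrite the attribute in place.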

Involved in client interaction, analyzing issues with the existing requirements, proposing solutions and implementing the same.

Optimized the performance of the mappings by various tests on sources, targets and transformations. Identified the Bottlenecks, removed them and implemented performance tuning logic on targets, sources, mapping, sessions to provide maximum efficiency and performance.

Was also involved in production support, monitoring jobs and fixing failures without missing the SLA.

Tuned performance of Informatica sessions for large data files by implementing pipeline Partitioning and increasing block size, data cache size, sequence buffer length, and target based commit interval and resolved bottlenecks.

Debugged code, and tested and validated data after processes were run in development/testing, according to business rules.

Prepared unit test plans and maintained defect logs to resolve issues. Worked with the QA team in order to determine the data validation and performed the data validating at the source and the target database level.

Hands-on experience as an administrator, maintaining Repository Manager for creating repositories, user groups and folders, and migrating code from Dev to Test and Test to Prod environments.

Worked on data request tickets and assisted non-technical business users in understanding the quality of the data.

Environment: Informatica PowerCenter 8.6.1/9.1, Informatica PowerExchange, Oracle 11g/10g, SQL Server 2005/2008, IBM Mainframe, T-SQL, MS Excel, Windows XP/2003/2008, CA Scheduler.

Centene Corporation, Clayton, MO Feb 14 – Apr 14

ETL Developer

The IT_FRAUD_WASTE_ABUSE project pulls claims from the EDW (Enterprise Data Warehouse) for particular members based on their business unit and plan.

The project acquires and integrates member, eligibility, claims, benefits and paid-claims data into HMS and submits reports to the standard vendor outbound paid-claims feed.

Responsibilities:

Worked with power center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Created metadata queries which would help in restart and to find out any missing link conditions or disabled tasks in a workflow.

Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.

Used Informatica data services to profile and document the structure and quality of all data.

Extensively used Informatica transformations such as Source Qualifier, Rank, SQL, Router, Filter, Lookup, Joiner, Aggregator, Normalizer and Sorter, along with their transformation properties.

Extensively used Various Data Cleansing and Data Conversion Functions in various transformations.

Experience working with the Teradata utilities MultiLoad (MLOAD), FastLoad (FLOAD), FastExport (FEXP) and BTEQ scripts.
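For illustration, a BTEQ import script of the kind referenced above can be generated as below. The logon string, staging table and file names are placeholders, and the BTEQ syntax is sketched from general documentation rather than taken from the project; the script is only written out, never executed here.

```shell
#!/bin/sh
# Generates (but does not run) a small BTEQ import script.
# All object names and the logon string are placeholders.

cat > load_claims.bteq <<'EOF'
.LOGON tdprod/etl_user,********;
DELETE FROM stg.claims_stage;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.IMPORT VARTEXT '|' FILE = claims.dat
.REPEAT *
USING claim_id (VARCHAR(18)), paid_amt (VARCHAR(12))
INSERT INTO stg.claims_stage (claim_id, paid_amt)
VALUES (:claim_id, :paid_amt);
.LOGOFF;
.QUIT 0;
EOF
```

In practice the generated file would be submitted with `bteq < load_claims.bteq`, with the exit code checked by the calling scheduler.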

Translated Business processes into Informatica mappings for building Data marts by using Informatica Designer which populated the Data into the Target Star Schema on Oracle 9i Instance.

Developed ETL mappings, transformations using Informatica Power center 8.6

Deployed the Informatica code and worked on code merges between two different development teams.

Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.

Created automated scripts to perform data cleansing and data loading.

Attended daily status calls with the internal team and weekly calls with the client, and updated the status report.

Created and maintained mappings as needed.

Developed various shell scripts using Korn Shell to Integrate various components

Created test cases and pseudo test data to verify accuracy and completeness of ETL process

Involved in Unit testing, System Integration testing, User Acceptance Testing

Tested application and implemented into production

Modified mappings, sessions, workflows as needed to get them to work as designed in upgraded environment.

Environment: Informatica PowerCenter 9.1.0, PowerExchange 9.1.0, Oracle 11g, Toad, HP Quality Center, Teradata, DB2, flat files, UNIX shell scripting, Netezza, Tidal Scheduler.

Kaiser Permanente, Portland, OR Jun 13 - Jan 14

BIDW ETL Developer

The purpose of the INPATIENT project is to integrate patients' demographic, hospital and treatment data associated with each facility, along with clinical data from the NW Data Warehouse, CBS and Clarity, into a new KPNW data mart that feeds reporting. The purpose of the Finance project is to integrate all employees' demographic and payroll data, associated cost center data, and yearly budget data from the NW Data Warehouse, CBS, Clarity and OneLink into a new, central KPNW data mart that feeds reporting.

Responsibilities:

Responsible for reviewing Business Requirement document

Translated user requirements into system solutions.

Involved in Logical Data Modeling and Database design.

Designed and developed Oracle Star Schema Datamart

Involved in data analysis of the OLTP system to identify source data elements

Worked with Oracle, flat file, Teradata and SQL Server source systems

Prepared Technical Specification Document

Prepared Source to Target ETL mapping Documents

Designed and developed Informatica Mappings, Reusable Sessions, Worklets, Workflows, Dynamic Parameter files

Designed and implemented Audit control and Exception control strategies

Provided solutions for various performance bottlenecks in Informatica mappings

Designed and developed Autosys JIL scripts to schedule Informatica workflows
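A JIL definition of the kind described can be sketched as follows; the job, host and path names are invented, and the real definitions would be loaded into Autosys with the `jil` utility.

```shell
#!/bin/sh
# Writes an Autosys JIL definition for an Informatica workflow job.
# All names/paths are illustrative placeholders.

cat > wf_inpatient.jil <<'EOF'
insert_job: kpnw_wf_inpatient_load
job_type: c
command: /opt/etl/bin/run_wf.sh wf_inpatient_load
machine: etlhost01
owner: etladm
condition: s(kpnw_wf_stage_extract)
std_out_file: /var/log/etl/wf_inpatient_load.out
std_err_file: /var/log/etl/wf_inpatient_load.err
alarm_if_fail: 1
EOF
```

The `condition: s(...)` attribute chains the workflow behind a successful upstream stage extract, which is how load ordering is typically enforced in Autosys.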

Developed various shell scripts using Korn Shell to Integrate various components

Created test cases and pseudo test data to verify accuracy and completeness of ETL process

Involved in Unit testing, System Integration testing, User Acceptance Testing

Involved in Informatica Code Migration across various Environments

Involved in Production Support and Knowledge Transfer sessions

Environment: Informatica Power Center 9.5.1, Informatica power exchange 9.5, Oracle 11G, SQL Server, Flat files, PL/SQL, SQL*Plus, TOAD 9.1, UNIX, Shell Scripting, Tivoli, Erwin, Microsoft Visio

Cigna Healthcare, Greenwood Village, CO Jan 12 – May 13

ETL Developer

The PDI (Proclaim Data Integration) project was built as part of expanding the scope of the Cigna Information Factory. The project acquires and integrates member, eligibility, claims, benefits and accumulations data from Proclaim into the Information Factory. The effort included staging, integrating and positioning the data, making it available to downstream data consumers.

Responsibilities:

Especially worked on the benefits subject area of PDI project

Staged data from the legacy Proclaim IMS system into Oracle 11g master tables, which involved extracting data from Proclaim mainframe JCL and COBOL.

Performed CDC capture registrations.

Assisted in building the ETL source to Target specification documents by understanding the business requirements.

Developed mappings that perform extraction, transformation and load of source data into the Derived Masters schema, using various PowerCenter transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer and Update Strategy to implement the business logic.

Created High Level Design document and Low level design document for the projects worked on.

Worked on code enhancements for the assigned projects.

Built reusable transformations and mapplets wherever reuse was needed, to avoid redundancy.

Implemented Slowly Changing Dimension Type 2 to maintain the last 3 years of historical data.

Performed performance tuning at both the mapping level and the database level to increase data throughput.

Designed the Process Control Table that would maintain the status of all the CDC jobs and thereby drive the load of Derived Master Tables.

Used tasks like Command, Decision, Event Wait, Event Raise, Assignment and Timer in the workflows and worklets to meet business requirements logic effectively.

Created promotion packages in Harvest for code deployment to higher environments.

Environment: Informatica PowerCenter 9.1.0, PowerExchange 9.1.0, Oracle 11g, Toad, HP Quality Center, Teradata 13.X/12.X, DB2, Control-M Scheduler.


Procter and Gamble, Cincinnati, Ohio Jul 10 – Dec 11

ETL Developer

The Atomic Data Warehouse (ADW) captures data from integrated heterogeneous sources (SAP, Siebel, manufacturing data, consumer data, etc.) and physically stores Procter and Gamble's key business measures once, in their most "atomic" form, then makes the data available via many logical business views (logical, regional and global) to create near-real-time data feeds that address Business Intelligence needs. The data is presented via the integrated Procter & Gamble Business Intelligence Portal and is architected to utilize the same atomic data across business levels and areas. This architecture presents Procter and Gamble's atomic data to more areas of the business in less time, and makes the ADW scalable.

Responsibilities:

Involved in the process of extracting data from integrated heterogeneous sources and sorting extracted data.

Prepared technical design documents and source-to-target mappings (field-to-field matrix). Prepared strategies for performance tuning and reusable components in Informatica.

Performed data validation, data reconciliation on the target DB2 Server and source data.

Created mappings using Informatica, prepared the test cases based on the business requirements and documented them in a specified manner.

Involved in migration of the Informatica objects between the repositories with deployment groups.

Designed and developed complex Informatica mappings using expressions, aggregators, filters, lookup and stored procedures to ensure movement of the data between various applications.

Tuned all the mappings for better performance.

Performed error checking and testing of the ETL procedures and programs using Informatica session logs and various tracing options.

Implemented the exception handling for trapping various exceptions and reported the various errors.

Designed and developed Oracle PL/SQL procedures and wrote SQL and PL/SQL scripts for extracting data to the system.

Environment: DB2, Oracle 9i, Informatica 7.2, UNIX.

Selective Insurance Co. of America, Branchville, NJ Oct 09 - Jun 10

Data Warehouse Developer

Selective is an insurance and reinsurance company specializing in underwriting property and casualty insurance. Businesses and organizations can cover losses and share the costs of losses among all insured parties. Digital Cockpit, a web-based application, displays analytical data for Selective, including account details, pricing impact details, and claim and expense details. The Digital Cockpit application gets its data from a specialty data warehouse; Informatica interfaces were developed to load data from various source databases into the data warehouse.

Responsibilities:

Worked with business analysts for requirement gathering, business analysis, testing and project coordination.

Prepared technical specifications to develop Informatica ETL mappings to load data into various tables confirming to the business rules.

Created Entity Relationship (ER) diagrams and maintained corresponding documentation for the corporate data dictionary with all attributes, table names and constraints.

Researched sources and identified necessary business components for analysis.

Coordinated with source system owners, monitored day-to-day ETL progress, and designed and maintained the data warehouse target schema (star schema).

Created Informatica mappings as per technical specifications.

Created different transformations for loading the data into target database.

Performance tuning of mappings, transformations and sessions to optimize session performance.

Created snapshots for the transactional tables in distributed databases, and created triggers, procedures and functions for backend development.

Developed UNIX shell scripts and stored procedures, run from Informatica pre- and post-sessions, to drop and recreate indexes and improve session execution speed.
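The pre/post-session index handling can be sketched as two generated SQL files; the table and index names below are invented, and in the real job the generated SQL would be fed to the database client.

```shell
#!/bin/sh
# Sketch of pre/post-session scripts: drop indexes before a bulk load,
# rebuild them afterwards. Object names are illustrative placeholders.

TABLE=claims_fact

# Pre-session: drop indexes so the bulk insert skips index maintenance.
cat > pre_session.sql <<EOF
DROP INDEX idx_${TABLE}_member;
DROP INDEX idx_${TABLE}_svcdate;
EOF

# Post-session: rebuild the indexes once the load has committed.
cat > post_session.sql <<EOF
CREATE INDEX idx_${TABLE}_member ON ${TABLE} (member_id);
CREATE INDEX idx_${TABLE}_svcdate ON ${TABLE} (service_date);
EOF
```

Dropping indexes up front avoids per-row index maintenance during the load, which is usually where the session time goes; rebuilding once at the end is far cheaper for large volumes.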

Worked on issues with migration from development to testing.

Carried out ETL Testing using Informatica and PL/SQL.

Used SQL, PL/SQL to validate the Data going in to the Data Warehouse.

Environment: Oracle 8i, 9i, 10g, Informatica 8.X, Erwin, Windows, UNIX.

Bank of America (Infosys), Mysore, India Nov 08 - Sep 09

Database Developer

CSAR is a maintenance and development project in which we developed Actuate reports based on requirements given by the client. The database used is Oracle, where we write queries and procedures as per the requirements and extract the data using Actuate.

Responsibilities:

Extensively involved in requirements gathering and data gathering to support developers in handling the design specification

Involved in designing and coding of functional specifications for the development of user interfaces

Created tables, indexes, sequences, constraints and snapshots

Developed packages, procedures, functions and PL/SQL blocks for data validation

Developed PL/SQL scripts to validate and load data into interface tables

Fixed software bugs and interacted with developers to resolve technical issues

Designed and developed a Generic Billing system for a Telecommunication company

Responsible for all pre-ETL tasks upon which the warehouse depends, including managing

Environment: Oracle 8i, 9i, SQL, PL/SQL, UNIX and Windows, Korn Shell.

Education: Bachelor of Technology, Computer Science

Jawaharlal Nehru Technological University, India


