
Data Manager

Location:
Columbus, OH
Posted:
June 07, 2015


Over **+ years of experience in the IT industry in the design, development, implementation, testing and maintenance of applications in IBM Mainframe, Data Warehouse and Client/Server environments.

6+ years of experience using Informatica Power Center (9.x/8.x/7.x), DataStage 8.7 and Informatica Big Data Edition.

Worked extensively in the healthcare industry and on its applications.

Working knowledge of HIPAA and all of its transaction formats.

Extensive experience developing ETL programs to support data extraction, transformation and loading using Informatica Power Center.

Experience in developing Informatica mappings, mapplets, sessions, workflows.

Worked on tuning of Informatica mappings/sessions.

Proficiency in developing SQL with various relational databases like Oracle, SQL Server, Netezza and Teradata.

Worked extensively on data migration, data cleansing, and the extraction, transformation and loading of data from multiple sources into Oracle and Teradata data warehouses.

Experience developing UNIX shell scripts.

Worked on all phases of Software Development Life Cycle (SDLC).

Strong working experience in data analysis, design, development, implementation and testing of data warehouses using the ETL tool Informatica Power Center.

Worked with heterogeneous databases like Teradata, Netezza, Oracle, DB2, SQL Server, MS-Access etc.

Worked closely with Business Users on Requirements

Experienced in working with business users to analyze business processes and make the necessary changes to schema objects to meet their reporting needs.

Motivated team player able to grasp new concepts quickly, with strong analytical and problem-solving skills.

Good communication and interpersonal skills.

TECHNICAL AREAS OF EXPERTISE:

Hardware

IBM 3090, PC Compatibles

Operating Systems

IBM MVS/ESA, Windows NT/2000/XP, HP-UX

Languages

OS/390 COBOL, JCL, VB, COM, SQL

Databases

IMS-DB, DB2, Sybase, MS Access, Oracle, MS SQL Server, Teradata (v13), Netezza

Case Tools

UML, VISIO

Web Related

XML, ASP, HTML

Tools & Utilities

Client Server: ChangeMan-DS, SQL Navigator, SQL Advantage, Service Center, Teradata SQL Assistant, Toad for Oracle, Toad for Data Analyst

Mainframe: TSO/ISPF, VIASOFT, COMPAREX, XPEDITOR, FILE-AID, Endevor, ChangeMan-ZMF

Middle Ware/Translator

Mercator 6.5, MQSeries

Domain Knowledge

Health Insurance

Management

Microsoft Project, team building, onsite coordination

GUI

Visual Basic, Power Builder 4.0, and Developer 2000

ETL

Informatica Power Center 7.x/8.x/9.x, DataStage 8.7, Informatica Big Data Edition, Hadoop, Hive

OLAP

Business Objects, Cognos

WORK EXPERIENCE:

CHS/Enterprise Data Warehouse Project ETL Developer Nov ’14 to Present

Nashville, TN

CHS is the largest non-urban provider of general hospital healthcare services in the United States in terms of number of acute care facilities. The goal of the CHS Enterprise Data Warehouse is to provide a single, trusted source of data for both current and future business needs, yielding accurate insights and actionable discoveries. The intent is to enable more time to be spent analyzing data and less time merely collecting and understanding it.

Analyzed the business requirements and functional specifications.

Extracted HDFS files from Hadoop and loaded them into the Hive staging area.

Used the Informatica Developer toolset to profile sources, create workflows and configure Hadoop connections.

Experience working with Informatica Big Data Edition.

Experience pulling data from HDFS/Hive sources and writing to Hive targets.

Experience querying Hive tables and working with HiveQL.
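As a sketch of the kind of HiveQL work described above (the database, table and column names here are invented for illustration and not taken from any actual project), a typical staging-table query might look like:

```sql
-- Hypothetical example: count records per facility in a Hive staging table
-- for one load date. All object names are illustrative only.
SELECT facility_id,
       COUNT(*) AS claim_count
FROM   stage.claims_raw
WHERE  load_date = '2015-06-01'
GROUP BY facility_id;
```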

Used parameter files to define values for parameters and variables used in mappings, sessions, worklets and workflows.
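An Informatica parameter file is a plain-text file of section headers and name=value pairs scoped to a folder, workflow or session. A minimal sketch (folder, workflow, session and connection names are hypothetical) looks like:

```
[DWH_FOLDER.WF:wf_load_claims.ST:s_m_load_claims]
$$LOAD_DATE=2015-06-01
$DBConnection_Src=CONN_ORACLE_SRC
$InputFile_claims=/data/in/claims.dat
```

Mapping parameters conventionally use the `$$` prefix, while service variables such as connections use a single `$`.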

Worked in Development, Unit and Integration testing of mappings/workflows.

Experience preparing Mock Data for testing based on Business Requirements.

Writing test cases, preparation of test data and testing of mappings.

TECHNOLOGIES USED:

Informatica Big Data Edition, HDFS Files, Hive, Teradata, Windows, UNIX.

RADS/ Nationwide Financials ETL Developer Feb’14 to Jul’14

Columbus, OH

Nationwide Financials RADS (Reporting and Data Services) provides IT solutions and day-to-day operations support for the business segments and distribution capabilities through the Business Solution Areas (BSA) and supporting teams.

Analyzed the business requirements and functional specifications.

Extracted data from flat files and relational databases, staged it in a single location, and applied business logic to load it into the central Netezza database.

Used Informatica Power Center 9.5.1 to extract, transform and load data into the Netezza data warehouse from sources such as Oracle and flat files.

Involved in extracting the data from the Flat Files and Relational databases into staging area.

Developed mappings using the needed transformations in Informatica according to technical specifications.

Used Informatica Power Center Workflow Manager to create sessions, workflows and batches to run with the logic embedded in the mappings.

Performed unit testing at various levels of the ETL and actively involved in team code reviews.

Fixed invalid mappings and troubleshot technical problems with the database.

Used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.

Worked effectively in a versioned Informatica environment and used deployment groups to migrate objects.

Wrote UNIX shell scripts for Informatica ETL tool to run the Sessions.
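A wrapper script of the kind described usually launches a workflow through the pmcmd command-line client. The sketch below builds (rather than executes) the pmcmd invocation so it can run anywhere; the Integration Service, domain, folder and workflow names are invented for illustration.

```shell
#!/bin/sh
# Sketch of a wrapper for launching an Informatica workflow via pmcmd.
# Service, domain, folder and workflow names below are illustrative only.

build_pmcmd_cmd() {
    # $1 = repository folder, $2 = workflow name.
    # pmcmd startworkflow: -sv names the Integration Service, -d the domain,
    # -uv/-pv name environment variables holding the credentials,
    # -wait blocks until the workflow finishes so the exit code is usable.
    printf 'pmcmd startworkflow -sv INT_SVC -d DOM_DEV -uv INFA_USER -pv INFA_PASS -f %s -wait %s\n' "$1" "$2"
}

# Print the command instead of executing it, so the sketch runs anywhere.
build_pmcmd_cmd DWH_STAGING wf_load_claims
```

In a real scheduler job the printed command would be executed and the script's exit status checked to decide whether downstream jobs may run.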

Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.

Used transformations like Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression and Update Strategy in the mappings to meet business needs.

Worked on Mapping parameters, Mapping variables and passing parameters between sessions.

Extensively used Informatica debugger to figure out the problems in mapping.

Performed development and testing as per requirements.

Writing test cases, preparation of test data and testing of mappings.

Involved in unit test case preparation.

Involved in performance tuning in the ETL processes.

Designed and developed UNIX scripts and used Maestro for job scheduling.

Used SQL tools like Toad and WinSQL to run SQL queries and validate the data in warehouse.

TECHNOLOGIES USED:

Informatica, Oracle, flat files, Netezza, Windows, UNIX, Maestro and ESP.

Incentives/Huntington Bank ETL Developer Jul’12 to Mar’13

Columbus, OH

The purpose of the project was to implement an Incentive Management System. This common system provides easy management and administration of incentive plans for HR and business users, along with corporate-wide performance and financial reporting when needed.

Roles and Responsibilities:

Involved in designing source-to-target mappings from sources to operational staging targets using DataStage Designer.

Used DataStage as an ETL tool to extract data from different source systems and load it into the SQL Server database.

Imported table/file definitions into the DataStage repository.

Extensively used DataStage tools such as InfoSphere DataStage Designer and InfoSphere DataStage Director to develop jobs and to view log files for execution errors.

Designed and developed ETL processes using DataStage designer to load data from Oracle, MS SQL, and Flat Files (Fixed Width) to staging database and from staging to the target Data Warehouse database.

Used various stages to construct Parallel Jobs.

Executed jobs through sequencer for better performance and easy maintenance.

Used the Lookup stage against SQL Server staging tables for the insert/update strategy and for updating slowly changing dimensions.

Involved in unit, performance and integration testing of DataStage jobs.

Used DataStage Director to run and monitor jobs for performance statistics.

Involved in performance tuning of the jobs.

Controlled jobs execution using sequencer, used notification activity to send email alerts.

Reviewed code for jobs developed by other DataStage developers on the team.

Prepared documentation for the production support team.

TECHNOLOGIES USED:

DataStage 8.7, Oracle, flat files, MS SQL Server, Windows XP, UNIX

Appliance Migration/Abbott Nutrition ETL Developer Jun’10 to Jun’12

Abbott Nutrition is a division of Abbott, the global, broad-based health care company.

As part of the new enterprise architecture, Abbott Laboratories chose to move from Oracle to Teradata’s MPP architecture, factoring in Teradata’s reporting performance benefits over Oracle. It was a massive undertaking, as most of the ETL was written in Oracle PL/SQL.

Roles and Responsibilities:

Worked on seamlessly adapting to a global infrastructure system within Abbott. This involved moving code from the department-level repository to a global repository while ensuring no functionality was impacted.

Involved in creating Mapping Specification documents.

Created ETL processes to transfer and cleanse data from various sources into the data warehouse and data marts.

Developed the Informatica code for various individual datamarts that used Oracle.

Developed ETL mappings, transformations and mapplets using Informatica Power Center 9.1.

Extensively used ETL to load data from mainframes to the target database, using PowerExchange for mainframe sources.

Used Parameter files to define values for parameters and variables used in mappings, sessions and workflows.

Developed Oracle PL/SQL packages, procedures and functions.

Studied the PL/SQL code developed to relate the source and target mappings.

Worked extensively on PL/SQL as part of the process to develop several scripts to handle different scenarios.

Used FTP services to retrieve Flat Files from the external sources.

Worked on analysis and conversion of Oracle PL/SQL Procedures/Packages into Informatica standalone code.

Used Power Center Workflow Manager to create workflows and sessions, and used tasks such as Command, Event Wait, Event Raise and Email to run with the logic embedded in the mappings.

Involved in migration of Informatica from 8.6 to 9.1.

Worked on Informatica 9.1 Upgrade.

Worked on bug fixes to existing Informatica mappings to accommodate new user requirements.

Involved in performance tuning for efficient loading of data from various sources.

Used Teradata loaders to enhance performance.

Used debugger to test the data flow and debug issues with the mappings.

Prepared documents to track the code migration to Production. Actively used MKS to create Code promotion requests to various environments.

Prepared Unit Test Plans/System Test Protocols, which captures the test conditions and scripts, expected/actual results.

Developed Informatica mappings to replace the existing PL/SQL code.

Extracted data from Mainframe databases using Power Exchange and loaded into SQL Server/Oracle Database tables.

Worked on Migration Strategies between Development, Test and Production repositories.

Used SQL tools such as TOAD to run queries and validate the data.

Wrote complex queries in Teradata SQL Assistant to check the data between source and target.
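Source-to-target checks of the kind mentioned above often start with a row-count reconciliation. A hypothetical sketch (database and table names are invented for illustration, not from any actual migration):

```sql
-- Compare row counts between a source staging table and its EDW target.
-- Object names are illustrative only.
SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt FROM src_db.members
UNION ALL
SELECT 'TARGET' AS side, COUNT(*) AS row_cnt FROM edw_db.member_dim;
```

Mismatched counts then prompt finer-grained checks, such as column-level aggregates or key-by-key comparisons.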

Participated actively in the migration from Oracle to Teradata: analysis, Informatica code changes, test protocols and documentation.

Created new Mainframe jobs, FTP and file watcher jobs to integrate with the overall job flow.

Designed and created AutoSys processes.

Prepared AutoSys JIL code for the new workflows created as part of business requirements.
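An AutoSys JIL definition is a short attribute list per job. A minimal command-job sketch of the kind described (job, host, user and script names are hypothetical):

```
/* Hypothetical AutoSys command job that runs a workflow wrapper script
   once an upstream file-watcher job has succeeded. */
insert_job: DWH_LOAD_CLAIMS   job_type: c
command: /apps/infa/scripts/run_wf.sh wf_load_claims
machine: etlhost01
owner: etluser
condition: s(DWH_FILE_WATCH)
std_out_file: /logs/DWH_LOAD_CLAIMS.out
std_err_file: /logs/DWH_LOAD_CLAIMS.err
```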

TECHNOLOGIES USED:

Informatica Power Center 8.x/9.x, Oracle 10g/11g, flat files, SQL*Loader, TOAD, MS SQL Server, Teradata 13.1, Windows XP, UNIX, mainframe, AutoSys

EDW Process/WellPoint ETL Developer Jan’08 to May’10

WellPoint is the nation's leading health benefits company serving the needs of approximately 34 million medical members nationwide. It is offering a broad range of medical and specialty products. WellPoint is a Blue Cross or Blue Cross Blue Shield licensee in 14 states: California, Colorado, Connecticut, Georgia, Indiana, Kentucky, Maine, Missouri, Nevada, New Hampshire, New York, Ohio, Virginia, and Wisconsin.

The goal of the Enterprise Data Warehouse (EDW) program is to create a single, authoritative source for analytical reporting. The project loads data into the EDW using the Persistent Data Source (PDS) as input: taking the PDS as input and applying all business rules and transformations, the data is loaded into the target database, which runs on Universal Database (UDB). Loading of data into the EDW is done using the ETL tool Informatica and is scheduled using the Informatica scheduler.

Roles and Responsibilities:

Involved in analyzing source data coming from different data sources such as flat files, Oracle and SQL Server, and identifying data anomalies in operational data.

Replicated operational tables into staging tables, transformed and loaded data from legacy systems into enterprise data warehouse tables using Informatica, and loaded the data into targets through the ETL process by scheduling workflows.

Extensively used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Repository Manager and Workflow Manager.

Involved in developing the SQL queries used to apply all business rules to the data before loading into the target tables.

Implemented optimization techniques for performance tuning and wrote the necessary pre- and post-session SQL and shell scripts.

Involved in debugging Informatica mappings and validating the data in the target tables once loaded.

Used transformations including Filter, Joiner, Expression, Aggregator, Router, Rank, Normalizer, Union, Lookup and Update Strategy.

Used mapping variables for incremental extraction of operational data.

Created new Mainframe jobs, FTP and file watcher jobs to integrate with the overall job flow.

Used parameter files to define values for parameters and variables used in mappings, sessions, worklets and workflows.

Developed finely tuned transformations, mappings and mapplets between source systems and warehouse components.

Worked on Migration Strategies between Development, Test and Production repositories

Created and scheduled Workflows and Sessions using Informatica workflow manager

Coordinated tasks and issues with team lead and project manager on daily basis

Prepared Unit test documents, which captures the test conditions and scripts, expected/actual results

Used pmcmd commands in non-Windows environments. Optimized and tuned mappings for better performance and efficiency, including performance tuning of SQL queries, sources, targets and sessions.

Responsible for ETL process under development, test and production environments.

Maintained warehouse metadata, naming standards and warehouse standards for future application development. Parsed high-level design specs into simple ETL coding and mapping standards.

TECHNOLOGIES USED:

Informatica Power Center 7.x, Oracle, SQL*Plus, flat files, SQL*Loader, TOAD, MS SQL Server, T-SQL, Windows XP, UNIX shell scripting

EDI/HIPAA EDI Developer/QA May’01 to Mar’04

Due to HIPAA regulations, BCBS MO decided to use ANSI X12 formats for EDI transactions to send and receive data from clearinghouses. This project converted the existing inbound/outbound transactions from UB92, NSF, 3041 ANSI standards and proprietary formats to HIPAA-compliant 4010 ANSI X12 transactions (837, 835, 834, 270, 271, 276, 277 and 997).

Roles and Responsibilities:

Analysis of the specifications provided by the clients.

Involved in requirements gathering, getting the business rules written from business users, and defining the methodologies.

Played a key role in the full HIPAA implementation and test/verification life cycle, including analysis of inbound and outbound files and formats.

Played a key role in EDI/ANSI X12 mapping for HIPAA transactions such as 270, 271, 834, 835 and 837.

Prepared business requirements specifications, design documents and HIPAA test documents.

Responsible for creating and executing test plans (including test data & scripts preparation/Unit, system and partner testing / results verification, etc.), implementation of HIPAA Batch Transactions, and schedules for testing of multiple HIPAA releases.

Defects were tracked using Mercury Test Director.

Extensively used Mercator Design Studio tools: Type Tree Designer, Mapping Designer, Database Designer and Integration Flow Designer.

Developed Mercator maps to interface with Sybase and mainframe environments, extract data based on business rules, process the data and convert it into ANSI X12 formats.

Responsible for FTP scripts.

Developed shell scripts to transfer files from different systems.

TECHNOLOGIES USED:

MVS/ESA, UNIX, Windows 2000, Informatica 6.x/7.x, Mercator Design Studio, Commerce Manager, COBOL, JCL and Sybase

Interplan Teleprocessing System Software Engineer Mar’00 to Apr’01

The Interplan Teleprocessing System markets the Blues as a single system, meaning a member of one Blue plan can use the provider network of another Blue plan. ITS was developed by the BCBS Association to communicate information back and forth between Blue plans over a unified network, incorporating common data formats (SF, DF, RF and NF), software and procedures with tools to access, send/receive and control data. ITS centralizes multi-state accounts using electronic network standard formats for data.

Roles and Responsibilities:

Played a key role in preparation of design documents in accordance with the task specifications.

Implemented changes to the ITS HOST mainframe project schedule due to the installation of the FACETS host client/server application.

Involved in reposting and resubmission of SF (claims submission format), DF, NF and RF formats into the formats database.

Modifications of the programs as required.

Creation of new programs and procs if required.

On call for the test systems.

Resolving test problems and re-promoting the software.

Executed test file requests from testers for testing purposes.

Creation of new reports.

Preparation of test data for Unit testing.

Unit & system testing

TECHNOLOGIES USED:

MVS/ESA, Windows, COBOL, JCL, IMS, DB2, File-Aid, VIASOFT, Sybase, Test Director, SQL Navigator.

Year 2000 Conversion/BCBS MO Programmer Analyst Jan’99 to Feb’00

Performed the role of a Team Member during the Year 2000 Conversion of the modules: Finance System, Disbursement, Membership Eligibility & Broker System, Group & Subscriber, and Billing & Reconciliation.

Roles and Responsibilities:

Software Quality Management Authority (SCMA) related activities for auditing.

Maintained the modules inventory and the regular changes taken place based on Service Requests.

Involved in preparation of high-level design documents and freezing of elements to be converted.

Test data preparation

Before and After Image testing of the converted modules

Promotion of elements to the test environments and to production.

Development of bridge programs

TECHNOLOGIES USED:

JCL, MVS COBOL, OS/390, IMS-DB/DC, VIASOFT, FILE-AID, Endevor, Sybase, Oracle, WinRunner, Test Director, SQL Navigator, Windows XP and UNIX.

CSX Transportation, USA Software Engineer Jan’97 to Nov’98

Gladstone Date Package (GDP) Software Engineer Apr’98 – Nov’98

The Gladstone Date Package (GDP) provides tools that simplify the tasks associated with making a client’s application Year 2000 compliant. It contains analysis and aging processes for IMS databases, DB2 and VSAM files. Gladstone’s software is systematic and thorough, providing the user with field-to-field comparisons throughout the conversion process to ensure the integrity of the data.

Roles and Responsibilities:

Played a key role in analysis of IMS database/ DB2 tables / VSAM files

Aging of date fields to create test data to test the Y2K compliance of the systems.

TECHNOLOGIES USED:

Mainframes, Windows 95, Cobol, JCL, IMS, DB2, File-Aid, Changeman & CICS

CSX – Maintenance Software Engineer Jan '97 to Mar '98

The offshore maintenance support team for the Transportation Management System maintained online and batch applications such as the Automatic Waybilling System (AWS), Bill of Lading (BOL) and Finance by resolving Netman tickets and problem logs.

Roles and Responsibilities:

Played a key role in production support of the daily batch jobs and online transactions.

Took care of job abends, backouts and data fixes.

Respond to Netman tickets promptly and verify the fix before closing the ticket.

Updating the maintenance logs to keep track of the problems.

TECHNOLOGIES USED:

Mainframes, Windows 95, Cobol, JCL, IMS, DB2, File-Aid, Changeman & CICS

EMS Programmer Jun'96 to Dec'96

This project, the Equipment Maintenance System, helps maintain motor equipment through coordination between the stores and purchase departments at different locations. The application also maintains the history of the equipment (including maintenance, servicing and breakdowns), wear and tear of the vehicle, miles traveled, etc.

Roles and Responsibilities:

Played a key role in analysis of requirements

Played key role in creating the forms

Unit & System testing

Created reports

TECHNOLOGIES USED:

Oracle, Power Builder, Windows 95


