
Data Manager

Location:
Col, 5273, Slovenia
Posted:
March 29, 2017


SUMMARY

Most of my experience comes from my 10 years with Computer Sciences Corporation (CSC) as a Technical Analyst, where I gained a wide array of experience.

** ***** ** hands-on experience in database, ETL, and data warehousing technology, and in ETL tools such as IBM WebSphere/Ascential DataStage 8.1/7.5/7.1/6.0/5.2, Informatica PowerCenter 9.1/9.6, and Talend Studio for Big Data Integration.

Data modeling tool Erwin 4.0/3.5 and OLAP tools such as Business Objects 5.1/5.0 and Crystal Reports.

Well versed in full software life cycle analysis and development, data warehousing, and application development. Highly knowledgeable in database management systems, UNIX, client/server application development, and software testing.

Experience in developing architecture for data warehouse builds and data migration; extensive experience in design and implementation of Star and Snowflake schemas, multidimensional modeling, and Data Vault modeling.

Adept in writing, testing, and implementing triggers, procedures, and functions at the database level using PL/SQL; experienced with various flavors of UNIX, shell scripting, the DataStage command language, and the use of AutoLoader and SQL*Loader.

SKILLS

ETL Tools

IBM DataStage 7.5/8.1, Informatica PowerCenter 9.6.1/8.x/7.x, Talend

OLAP Tools

Business Objects, Crystal Reports, WebI Reports

Data Modeling Tools

Erwin 4.1/4.0/3.5, Designer 2000/6i

RDBMS

Oracle 11g/10g/9i, Sybase 15/12.5, Teradata, Netezza, MS SQL Server 2012/2000/7.0/6.5, MS Access 97/2000

Domain

HealthCare, Supply Chain, Retail & Marketing

Languages

SQL, PL/SQL, UNIX shell scripting, DataStage/Informatica command language

GUI

Crystal Reports 7.0, Visio, Nexus, Putty, HP QC

Other tools

SAP BI 7.0, ECC 6.0, SAP modules: SD, MM/WM, PP, FI/CO

Query Designer, BEx Analyzer, BEx Web, WAD

Sybase Central, Oracle Utilities, SQL*Loader, TOAD, SQL Navigator, Facets, Teradata Assistant

Operating Systems

IBM AIX 5/4.3/4.0, SOLARIS, HP-UX, WINDOWS NT/95/98/2000/XP

WORK EXPERIENCE

Oct 2016 – Present

Centene Corp, St Louis, MO

Lead ETL Developer

Centene Corp is a health insurance company located in St. Louis, MO. Centene provides health plans through Medicaid, Medicare, and the Health Insurance Marketplace, and other health solutions through its specialty services companies.

Role:

Worked as a Lead ETL developer for the CMS DataMart project.

Worked diligently on developing ETL mappings and data quality applications to support a vital CMS auditing process.

Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.

Created mapping documents to outline data flow from sources to targets.

Extensively worked on loading a data mart; worked closely with the Data Warehouse analyst and successfully designed the overall ETL solution using Informatica PowerCenter 9.6.

Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.

Maintained stored definitions, transformation rules, and target definitions using Informatica Repository Manager.

Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.

Developed mapping parameters and variables to support SQL override.

Created mapplets for reuse across different mappings.

Developed mappings to load into staging tables and then to Dimensions and Facts.
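
The staging-to-dimension-and-fact pattern mentioned above can be sketched in plain Python (this is an illustrative sketch, not the actual Informatica mappings; the table layouts, `member_id` natural key, and column names are all invented for the example):

```python
# Hypothetical sketch of a staging -> dimension -> fact load:
# the dimension load assigns surrogate keys, and the fact load
# resolves natural keys to those surrogate keys.

def load_dimension(staging_rows, dim, next_key):
    """Upsert staging rows into a dimension keyed by the natural key."""
    for row in staging_rows:
        nk = row["member_id"]                 # natural key from the source
        if nk in dim:
            dim[nk]["name"] = row["name"]     # update changed attributes
        else:
            dim[nk] = {"sk": next_key, "name": row["name"]}
            next_key += 1                     # generate the next surrogate key
    return next_key

def load_fact(staging_facts, dim, fact):
    """Replace natural keys with dimension surrogate keys in the fact load."""
    for row in staging_facts:
        sk = dim[row["member_id"]]["sk"]      # surrogate-key lookup
        fact.append({"member_sk": sk, "claim_amt": row["claim_amt"]})

dim, fact = {}, []
next_sk = load_dimension([{"member_id": "M1", "name": "Ann"}], dim, next_key=1)
load_fact([{"member_id": "M1", "claim_amt": 250.0}], dim, fact)
```

In a real tool the surrogate-key generation and lookup would be handled by a Sequence Generator and Lookup transformation; the sketch only shows the data flow.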

Used existing ETL standards to develop these mappings.

Worked on different Workflow tasks: sessions, Event-Raise, Event-Wait, Decision, E-mail, Command, worklets, Assignment, Timer, and scheduling of the workflow.
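
As a rough illustration of that control flow (not Informatica's actual API; the task names and runner are invented), a workflow of session, decision, and e-mail tasks might behave like this:

```python
# Hypothetical sketch: a tiny workflow runner that executes tasks in
# order and lets a "decision" task choose the branch that runs next.

def run_workflow(tasks, context):
    """Execute (kind, name, action) tasks; decisions return a branch to run."""
    log = []
    for kind, name, action in tasks:
        if kind == "decision":
            branch = action(context)              # decision picks the branch
            log.append(f"decision:{name}->{branch[0][1]}")
            log.extend(run_workflow(branch, context))
        else:                                     # session / command / email
            action(context)
            log.append(f"{kind}:{name}")
    return log

ctx = {"rows_loaded": 0}
wf = [
    ("session", "s_load_stage", lambda c: c.update(rows_loaded=100)),
    ("decision", "check_rows", lambda c: (
        [("email", "notify_ok", lambda c: None)] if c["rows_loaded"] > 0
        else [("command", "abort", lambda c: None)])),
]
result = run_workflow(wf, ctx)
```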

Responsible for design, development, testing, maintenance, and support of the end-to-end ETL solution for loading the data mart.

Utilized best practices to analyze data sources and target data models. Developed scalable, reusable ETL jobs using Informatica PowerCenter.

Demonstrated analytical and business domain knowledge.

Oct 2015 – August 2016

Anheuser-Busch InBev Company, St Louis, MO

Application Developer

Anheuser-Busch InBev operates a brewery complex in St. Louis, MO. The project covered implementation of the Infor Scheduling application for materials management.

Role:

Worked as a senior database developer for the largest brewing company, in St. Louis, MO.

Worked with the data management team and developed complex stored procedures, triggers, functions, indexes, views, joins, and T-SQL code for material scheduling applications.

Managed indexes, statistics and optimized queries by using execution plan for tuning the database.

Responsible for performance tuning and Optimization of stored procedures using SQL Profiler and Database tuning wizard.

Worked closely with team members and end users to understand and analyze requirements.

Worked extensively on defect management using HPQC in writing and debugging complex stored procedures, triggers, Inner Joins, Outer Joins, views and user-defined functions.

Maintained SQL scripts, indexes, and complex queries for analysis and extraction on a regular basis.

Extensively worked with WebSphere DataStage Job Sequences to Control and Execute DataStage Jobs and Job Sequences using various Activities and Triggers.

Used WebSphere DataStage Designer for importing metadata from repository, Flat files, new job categories and creating new data elements.

Used the WebSphere DataStage Designer to develop processes for extracting from SAP Sales module, cleansing, transforming, integrating, and loading data into the data warehouse.

Designed jobs to change the source database to reflect to new database and loaded the data in to the target database according to the transformations needed.

Designed WebSphere DataStage jobs for the insertion, update, and deletion processes.

Demonstrated analytical and business domain knowledge.

Sep 2013 – Oct 2015

MD Anderson Cancer Center, Houston, TX

Lead ETL Developer/Analyst

The University of Texas MD Anderson Cancer Center, located in Houston on the campus of the Texas Medical Center, is one of the world’s largest and most respected centers devoted exclusively to cancer patient care, research, education and prevention.

Role:

Worked as a Lead ETL developer for the Cancer Center in Houston, Texas, leading a team of six developers, including offshore developers in India.

Worked on several projects within the eResearch department, which manages heterogeneous clinical trial management systems and applications.

Worked diligently on developing and integrating heterogeneous applications to support vital clinical research efforts.

Most recently, worked on loading a data mart; worked closely with the Data Warehouse analyst and successfully designed the overall ETL solution using Informatica PowerCenter 9.6.

Responsible for design, development, testing, maintenance, and support of ETL for loading the data mart.

Worked closely with team members and end users to understand and analyze requirements and develop BI solutions by demonstrating a clear understanding of BI and Data Warehouse operating environments and related technologies.

Utilized best practices to analyze data sources and target data models. Developed scalable, reusable ETL jobs using Informatica PowerCenter.


Regularly performed ongoing technical assessment of ETL and database environments for operational considerations, including performance, capacity, redundancy, stability, and security. Partnered with operational teams on implementations that proactively address issues.

Worked closely with the business, other architecture team members and global project teams to understand, document and design data warehouse processes and needs.

Monitored and resolved Data Warehouse production issues

Participated in strategic planning, design and delivery of next generation of Data Mart using Talend Big Data Technologies.

Created complex mappings in Talend 6.0.1/5.5 using tMap, tJoin, tReplicate, tParallelize, tFixedFlowInput, tAggregateRow, tFilterRow, tIterateToFlow, tFlowToIterate, tDie, tWarn, tLogCatcher, etc.
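
The core of a tMap-plus-tAggregateRow flow (a lookup join, a filter, then a grouped sum) can be approximated in plain Python; this is a rough analogue with invented data, not Talend's actual API:

```python
# Hypothetical sketch: claims are the main flow, members the lookup flow.
claims = [
    {"member_id": "M1", "amount": 120.0},
    {"member_id": "M2", "amount": 80.0},
    {"member_id": "M1", "amount": 40.0},
]
members = {"M1": "TX", "M2": "RI"}   # lookup input, like a tMap lookup

# tMap-style join + filter: keep only claims whose member resolves to TX
joined = [
    {**c, "state": members[c["member_id"]]}
    for c in claims
    if members.get(c["member_id"]) == "TX"
]

# tAggregateRow-style group-by with a sum over the surviving rows
totals = {}
for row in joined:
    totals[row["member_id"]] = totals.get(row["member_id"], 0.0) + row["amount"]
```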

Involved in creating Amazon S3 buckets using tS3BucketCreate, tS3BucketDelete, tS3Get, tS3List, tS3Put.

Created Talend mappings to populate the data into dimension and fact tables.

Involved in using File components such as tAdvancedFileOutputXML, tFileInputExcel, tFileInputJSON, tFileInputMSXML, tFileInputXML, tFileOutputARFF, tFileInputDelimited, tFileOutputDelimited, tFileOutputExcel, tFileOutputJSON

Created joblets in Talend for processes reused across most jobs in a project, such as Start Job and Commit Job.

Extracted data from heterogeneous source systems running on Oracle and Lotus Notes databases and loaded it into a target system running on an Oracle Exadata machine.

Demonstrated analytical and business domain knowledge.

September 2008 – July 2013

BCBS of Rhode Island, Providence, RI

Senior ETL Developer

BCBSRI is a for-profit corporation that has served its members since 1939. Blue Cross & Blue Shield of Rhode Island has been dedicated to improving the health of its members, strengthening relations with providers, and simplifying its business processes. BCBSRI's current core system is Long Range System Planning (LRSP), running on IBM mainframes. Blue Cross Blue Shield engaged CSC to replace LRSP with Facets. The multi-million-dollar Facets implementation will be done in collaboration with TriZetto.

Role:

Provide WebSphere DataStage ETL tool expertise to the team.

Work with several teams and projects in the BCBSRI EDR environment

Work with FACETS configuration team and FACETS customized solution for BCBSRI business needs

Work closely with Director, data warehouse analyst to understand the business requirements

Design the specifications for the transformations of the data from FACETS to EDR

Analyze data models of the Facets application and target systems (Netezza) to develop comprehensive mapping specifications for the Data Warehouse

Create and run internal and external customer-specific Crystal Reports

Extract data from FACETS source systems running on a Sybase database and load it into a target system running on Netezza, utilizing Oracle and flat files

Design, develop, unit test the DataStage ETL programs and assist other team members with DataStage Tool

Develop simple to very complex ETL jobs to extract Provider, Member and Claims data from FACETS system and transform to fit into EDR needs.

Design solutions and work products in moving data through source to staging and target systems.

Work on small and large releases through SDLC, assist and support the code migration and testing efforts through UAT.

Continue to improve the design of work products and provide production support on the Facets extracts.

Offer suggestions where process improvement may be applied to current or future functionality

Assist in development of system test data, identify and raise issues, and communicate status to the CSC lead and client on a weekly basis

Ensure the quality and added value of the work products, demonstrate an awareness of the Catalyst approach

Create technical specification and unit test cases documents for non-claims and medical claims subject area

Prioritize work, manage own time and accurately estimate the time remaining to complete tasks

Serve as billing manager for a team of 30, onshore and offshore; verify and approve time and expenses

Responsible for managing the billing and invoicing process: generate invoices, verify them, and submit to the client

Provide leadership to facilitate the completion of payments for the invoices billed to the client

Environment: IBM WebSphere DataStage 8.1/7.5.2 (Server version and Parallel Extender), Sybase 15, Oracle 11g and 10g, Netezza, UNIX AIX, FACETS, Crystal Reports.

Sept 2006 – August 2008

Member Health, Baltimore

Data Analyst/PL/SQL developer

In the Medicare Part D program, CSC functions as a subcontractor to Member Health, which serves as the prime prescription drug provider under contract to the federal Department of Health and Human Services' CMS organization. CSC functions as the operations element of the program, with responsibility for IT software and operations, beneficiary call center support, and federal compliance, including fraud and abuse. The program initiated operations in November 2005, one month after final award from CMS. The program goal for FY2006 was 1.5M enrollments, yet by March 2006 enrollment had already approached 1M.

Role:

Design and develop programs for assigned components using several technologies.

Develop programs to reconcile Facets (Claims Database) and Pharmacy (SXC database) databases

Provide expertise of Facets Data Model and supported data structures, with the ability to query tables as part of the configuration validation and auditing process

Developed stored procedures using Oracle Analytical functionality and utilized them as needed to query, report, and/or update the reconciliation process using PL/SQL and SQL plus

Develop complex ETL programs using Access and stored procedures to validate input data files and produce Data Quality Reports.

Performed data analysis as needed to track, research, and account for any data discrepancies and exceptions that may be encountered
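
The reconciliation between the claims (Facets) and pharmacy (SXC) databases described above amounts to a two-way comparison; here is a minimal Python sketch of that idea, with invented record layouts (claim ID mapped to amount) standing in for the real schemas:

```python
# Hypothetical sketch of a database reconciliation: report claim IDs
# missing from either side, plus amount mismatches on shared IDs.

def reconcile(claims_db, pharmacy_db):
    """Compare two {claim_id: amount} views and return a discrepancy report."""
    report = {"missing_in_pharmacy": [], "missing_in_claims": [],
              "amount_mismatch": []}
    for cid, amt in claims_db.items():
        if cid not in pharmacy_db:
            report["missing_in_pharmacy"].append(cid)
        elif pharmacy_db[cid] != amt:
            report["amount_mismatch"].append((cid, amt, pharmacy_db[cid]))
    for cid in pharmacy_db:
        if cid not in claims_db:
            report["missing_in_claims"].append(cid)
    return report

rpt = reconcile({"C1": 10.0, "C2": 25.0}, {"C1": 10.0, "C3": 5.0})
```

In the actual work this comparison was done in PL/SQL with analytical functions; the sketch only shows the shape of the discrepancy report.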

Worked with various managers to define reporting requirements and created reports to meet their specific business requirements, for example ad hoc reports, Web Intelligence reports, and Crystal Reports.

Accountable for the collection and analysis of business requirements during the Facets conversion

Worked with the testing team for FACETS implementation for the Medicare Part D plan

Responsible for all aspects of testing including all the FACETS batch jobs that have been unit tested by the developers

Responsible for producing test plans, document test scenarios, and write manual test scripts for System/Integration Testing and User Acceptance Testing.

Responsible for logging the defects identified during Facets testing into Quality Center

Designed and developed system test plans, test cases, and test scripts from documented business system requirements

Responsible for testing batch jobs for Performance Engineering for Facets Batch Jobs and worked closely with developers to optimize the queries and jobs

Responsible for optimizing SQL queries for better performance using Hints, Indexes and Explain Plan.

Develop custom data extract programs for client deliverables.

Develop necessary documentation.

Assist business team in Data Analysis and ad-hoc reporting.

Generate weekly project status reports

Develop and test programs to support invoicing sub-systems in Oracle 10g

Responsible for developing, testing, and executing programs as needed to query, report on, and update Medicare Part D data using SQL*Plus, PL/SQL, and shell scripts

Responsible for developing programs as needed to support ad hoc reporting needs

Responsible for maintaining documentation of data issue resolutions and audit trails

Guiding, teaching, and mentoring team members whenever necessary

Environment: Oracle 9i, 10g and 11g, Sun Solaris, Crystal Reports, SQL*Loader, UNIX shell.

Jan 2006 – July 2006

Johnson and Johnson, Piscataway, NJ

Informatica Integration Developer

Johnson & Johnson Health Care Systems Inc (J&J HCS) is a central organization providing various services, including Contract Administration, Sales & Distribution, Logistics and Supply Chain Management to 25 J&J Operating Companies. Within J&J HCS there exists a Data Management stream that is responsible for defining and maintaining data standards and Core Master Data, including the governance of said Master Data across the organization. The Data Management stream has defined the establishment of Core Masters as an essential component of their Data Management Strategy for supporting the key functional areas of Order-to-Cash, Contract Excellence and Warehousing/Analytics. The targeted Core Masters are:

Customer & Customer Membership Master

Product & Product Hierarchy Master

Pricing Master

Sales Alignment Master

To achieve its goal of establishing Core Masters, J&J HCS sanctioned the Data Excellence project. The Data Excellence project initiated with an effort to define the overall program scope, Core Masters maintenance process maps and associated data elements.

Role:

Master Data Management (MDM) project with a Fortune 500 company

Worked as integration developer using Informatica

Identified controls and created functional and technical specifications for data transformation to ensure proper delivery of the data from heterogeneous source systems

Assisted in data analysis efforts necessary to resolve issues uncovered during the testing of ETL process related to source data

Updated logical and physical data models for the data warehouse using ERWIN and created process flow diagrams

Developed ETL process to handle daily feeds of transactional and analytical data from several source systems

Designed, developed, and deployed common ETL components and integration solutions using Informatica

Extensively used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, and Repository Manager. The objective was to extract data stored in different databases, such as Oracle, and load it into a single Oracle data warehouse.

Tested Informatica mappings and adjusted them when necessary to provide optimal performance

Ensured efficiency, as well as quality, to time sensitive project deadlines

Data Modeling using Erwin.

Environment: Informatica Power Center, SAP R/3 5.0, Oracle9i, IBM AIX 5.1, Cognos Reporting, TOAD 8.0, Putty, Windows XP Professional SP2.

July 2005 – Dec 2005

GE Aircraft Engines, Cincinnati, OH

ETL Developer

GE Aircraft Engines supplies aircraft engines to different airlines and countries. The project was initiated to support QA development and support of a sales data warehouse, ensuring data quality using Informatica. This data warehouse supports analytical processing of historical data, helping the business groups monitor trends and develop new marketing and sales plans.

Role:

Worked as an ETL developer for a Fortune 500 company

Supported the Sales Data Warehouse to ensure data quality, and delivery using Informatica

Identified controls and created design documents to validate the proper delivery of the data

Extracted data from different sources such as SAP SD module, Oracle and flat files, and loaded them onto Oracle target table for further processing

Created and supported numerous ETL mappings using Informatica designer, warehouse designer, transformation developer, Informatica repository manager to extract data from multiple source systems

Designed and created target tables using SQL*Plus in the Oracle 9i database

Created detailed functional and technical specification for the complete development

Environment: Informatica PowerCenter 7.1, SAP ECC (SD module), Cognos PowerPlay 6.6, Transformer 6.6, Cognos Impromptu, Oracle 10i, flat files, vi editor, TOAD, Windows 2000/NT.

June 2004 – July 2005

WMC Mortgage Corporation, Oklahoma City, OK

Data Warehouse Developer

WMC is a wholesale mortgage lender that specializes in flexible financing alternatives for borrowers who have less-than-perfect credit scores, high debt ratios, foreclosures, or bankruptcies. The project involved development and support of business intelligence systems to satisfy reporting needs.

Role:

Worked closely with the project manager to identify the technical tasks needed to meet business requirements and development requests

Used various tools in Informatica Designer, such as Source Analyzer, Warehouse Designer, and Transformation Developer.

Designed and developed various mapplets using mapplet designer.

Used Repository Manager to grant permissions to users and to create new users and repositories

Designed and created target tables using SQL*Plus in the Oracle database

Created mappings using Informatica Designer to implement business rules for loading data, and tuned them to enhance performance. Used most of the common transformations, such as Source Qualifier, Stored Procedure, Aggregator, Lookup, Filter, Update Strategy, Sequence Generator, and summation.

Used Informatica PowerCenter Workflow Manager to create sessions with the Task Developer and to build workflows that run with the logic embedded in the mappings

Tuned and tested the mappings to perform better using different logics to provide maximum efficiency

Scheduled Informatica batches and sessions to synchronize multiple source systems

Prepared test plans, executed test cases, and wrote documentation and test scripts.

Assisted in developing the project run book to help the production support team.

Environment: Informatica PowerCenter 5.1/6.1, Cognos PowerPlay 6.6, Transformer 6.6, Cognos Impromptu, Oracle 8i, DB2, UNIX vi editor, Erwin 4.0, Windows 2000/NT 4.0.

Academic Affiliations:

Honor Roll, Fall ‘02 and Spring ‘03

Member of Association of Information Technology Professionals (AITP), Spring ‘03

EDUCATIONAL QUALIFICATIONS:

Bachelor’s in Information Systems, University of Central Oklahoma

Second Bachelor’s in Business Administration, University of Central Oklahoma; Major: Management
