
Data Project

Location:
Germantown, MD, 20874
Salary:
108,000
Posted:
September 15, 2012


Mack Dai

Senior Informatica ETL / Data Warehouse Developer

240-***-**** (Cell), Email: abr6ic@r.postjobfree.com

SUMMARY

• Over eight years of Informatica 9.0/8.x/7.x experience in data warehouse, ODS, and OLTP database environments.

• Over ten years of ETL experience using Teradata FastLoad/FastExport, Oracle SQL*Loader, Perl DBI/DBD, UNIX Korn shell/sed/awk/Perl, Oracle PL/SQL stored procedures, and Java/JDBC.

• Ten years of experience developing UNIX shell applications using Korn shell, Bourne shell, awk, sed, and Perl.

• Ten years of experience developing large numbers of SQL statements and stored procedures/functions using Oracle SQL (PL/SQL) and Teradata SQL.

• Four+ years of experience using Cognos 7/8 Framework Manager, Report Studio, and Query Studio to build complex reports and dashboards.

• Over 15 years of experience in object-oriented application design and development using design patterns and UML.

• Over eight years of experience in Java/J2EE (Java, JSP, Servlet, Struts, EJB, JDBC, JNDI, XML, Web Services).

• Seven years of experience with XML, XML Schema, DTD, DOM, Xerces 2.0, XSLT, XPath, XForms, and JAXP 1.2.

• One year of experience as an Oracle DBA administering Oracle databases and backing up data using RMAN and Veritas NetBackup 5.1.

• Ten years of database experience with Oracle, SQL Server, DB2, MySQL, and Teradata 12.

• Extensive experience developing complex applications using C/C++, VB, and VC++ (MFC and ATL).

• Extensive knowledge of and experience with directory services (Sun ONE LDAP) and JCA/JCE.

PROFESSIONAL EXPERIENCE

SRA International July 2012 – Present

Senior Informatica ETL Developer on the FDA MARCS development team

This project is funded by the FDA. I am participating in designing and developing ETL applications that migrate and integrate FDA data from the legacy system (FACTS) into the current online system (MARCS) using Informatica 9.1. I am also participating in developing distributed database synchronization using Sybase MobiLink 12.

Quality Software Inc. (QSSI) April 2010 – July 2012

Senior Informatica ETL Developer on the CMS IDR development team

Developing DST/1a (Delivery System Tracking) project

This project is funded by CMS. I am designing and developing the ETL architecture and applications. CMS wants to merge all 2007–2010 data from the CMS Chronic Condition Data Warehouse (CCW) into the IDR data warehouse. The source data files are SAS-generated files, including beneficiary summary and condition files, Part D service-level files, Part A and Part B service-level analytical files, and readmission and quality files. Because the data files are very large (nearly 1,000 GB), we follow the Teradata best-practice approach: the Teradata FastLoad utility loads the source data into IDR stage tables, and Informatica full pushdown optimization extracts, transforms (normalizes), and loads the data from the stage tables into the target tables. All source data file loading is automated: the source files are dropped into a folder, and a master script is run to start loading them automatically (a minimal sketch of the load step follows).
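As an illustration of the FastLoad step only, here is a minimal Korn shell sketch; the file, table, column, and logon names are hypothetical placeholders rather than the project's real objects, and the actual scripts handle many more columns and error checks.

#!/bin/ksh
# Load one pipe-delimited source file into an IDR stage table with Teradata FastLoad.
# All object names below are illustrative placeholders.
SRC_FILE=/data/inbox/bene_summary.dat

fastload <<EOF
LOGON tdprod/etl_user,etl_password;
BEGIN LOADING STG.BENE_SUMMARY
   ERRORFILES STG.BENE_SUMMARY_ET, STG.BENE_SUMMARY_UV
   CHECKPOINT 100000;
SET RECORD VARTEXT "|";
DEFINE bene_id  (VARCHAR(15)),
       state_cd (VARCHAR(2)),
       bene_dob (VARCHAR(10))
FILE = $SRC_FILE;
INSERT INTO STG.BENE_SUMMARY (bene_id, state_cd, bene_dob)
VALUES (:bene_id, :state_cd, :bene_dob);
END LOADING;
LOGOFF;
EOF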

Developed the HITECH NLR (National Level Repository) project

This project is funded by CMS. As one of the key persons, I participated in the whole life cycle of this project. I participated in the NLR project's design and development, including Hospital-Based Determination, Allowable Charge calculation, and HPSA determination. The source data for the project come from IDR CLMNCH Part B tables and VSAM flat files (HCPCS codes and HPSA ZIP codes). I also developed Teradata FastExport scripts to export the Hospital-Based, Allowable Charge, and HPSA V2 views into three flat files for CMS end users. The development environment included Solaris 10, Teradata 12/13, and Informatica 8.6/9.0.
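The FastExport piece can be illustrated with a minimal Korn shell sketch; the log table, view, and column names are hypothetical stand-ins for the real NLR objects.

#!/bin/ksh
# Export one NLR view to a pipe-delimited flat file with Teradata FastExport
# for delivery to CMS end users. Object names are illustrative placeholders.
OUT_FILE=/data/outbox/hpsa_v2.dat

fexp <<EOF
.LOGTABLE ETL_WRK.HPSA_V2_LOG;
.LOGON tdprod/etl_user,etl_password;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE $OUT_FILE MODE RECORD FORMAT TEXT;
SELECT TRIM(prvdr_id) || '|' || TRIM(hpsa_zip) || '|' || TRIM(hpsa_ind)
FROM NLR_VIEWS.HPSA_V2;
.END EXPORT;
.LOGOFF;
EOF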

Developed NPI Crosswalk Project

This project is funded by CMS and was an existing application. Because the source data changed from snapshot feeds to delta feeds and the data layout also changed, I redesigned and redeveloped the source-data-loading ETL application to load the source data into stage tables, and modified the existing ETL application to load historical legacy NPI data and the legacy/NPI mapping data into IDR tables.

Z-Tech Corporation, an ICF International company Dec. 2004 – April 2010

Senior Informatica ETL Engineer, Cognos 8 BI Developer

Developed the Chronic Care Initiative-Information Management System (CCI-IMS), 2004 – 2010

This project is funded by the Centers for Medicare & Medicaid Services (CMS). CCI-IMS is a large regional pilot program supporting up to 30 regional groups funded by Medicare and Medicaid. The project has three parts: a data warehouse, Cognos 7.x/8.x reporting, and a Java/J2EE web management system.

• Participated in the design of the CCI-IMS data warehouse (ODS) and data mart.

The CCI-IMS data warehouse (ODS) stores beneficiary and claim history data (intervention and control groups), beneficiary daily eligibility data, beneficiary participation data, PDE data, and netted claim data. The source data files are VSAM sequential flat files from the CMS mainframe; some files have variable-length records (OCCURS DEPENDING ON clauses in the COBOL copybooks).

• Designed and developed the CCI-IMS automated ETL data-loading framework using Informatica PowerCenter 7.x/8.1, PowerExchange 8.1.1, and UNIX Korn shell.

- Designed PowerExchange 8.1.1 data maps and transferred them to the remote node using the PowerExchange 8.1.1 Navigator.

- Imported data maps and created source definitions using PWXC (PowerExchange Client).

- Developed transformations, mapplets, mappings, sessions, and workflows to implement complex data transformation and loading logic.

- Developed parameter files for sessions to pass database connections, source file names, log file names, and other parameters.

- Developed UNIX Korn shell scripts to automate data loading, including starting workflows and dynamically passing session parameters to parameter files (see the sketch after this list).
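A minimal Korn shell sketch of that pattern (PowerCenter 8.x pmcmd syntax), with hypothetical domain, service, folder, workflow, and parameter names standing in for the real ones:

#!/bin/ksh
# Build a session parameter file for this run, then start the workflow with pmcmd.
# Domain/service/folder/workflow/parameter names are illustrative placeholders.
RUN_DATE=$(date +%Y%m%d)
SRC_FILE=$1                                   # source file passed in by the caller
PARAM_FILE=/infa/params/wf_load_claims_${RUN_DATE}.par

cat > $PARAM_FILE <<EOF
[IMS_Folder.WF:wf_load_claims.ST:s_m_load_claims]
\$DBConnection_Target=IMS_ODS
\$InputFile_Claims=$SRC_FILE
\$PMSessionLogFile=s_m_load_claims_${RUN_DATE}.log
EOF

pmcmd startworkflow -sv INT_SVC_IMS -d DOM_IMS -u etl_user -p etl_password \
      -f IMS_Folder -paramfile $PARAM_FILE -wait wf_load_claims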

• Designed and developed audit ETL/Informatica PowerCenter processes, including mappings, sessions, and workflows. The IMS audit source data (daily eligibility, participation, etc.) are pipe-delimited flat files, fixed-length flat files, and report-formatted text files.

• Designed and developed a data de-identification framework using Informatica PowerCenter 8.1.1.

This application de-identifies CMS beneficiary data, netted claim data (original and refresh), daily eligibility data, baseline daily eligibility data, and randomization bucket data. All data files are VSAM flat files from the CMS mainframe. The de-identification business logic is implemented in Informatica PowerCenter 8.1.1 mappings/sessions/workflows, and the process automation is implemented in Korn shell scripts. For the variable-length netted claim data files, we use a Java custom transformation to process the OCCURS DEPENDING ON data structure. To improve performance, I use multiple partitions to process the original baseline Carr data (one file is larger than 8 GB).

- Designed algorithms to de-identify beneficiary, netted claim, and other data.

- Developed mappings, sessions, and workflows to de-identify beneficiary data, including the three HICANs, BOD, etc.

- Developed mappings, sessions, and workflows to de-identify claim, baseline claim, randomization bucket, and daily eligibility data.

- Developed UNIX Korn shell scripts to start workflows and call the PGP application to encrypt output files.

• Designed and developed the monthly beneficiary and claim data process using Informatica PowerCenter 8.1.1.

The source, target, and lookup files are pipe-delimited flat files. The source files include intervention and control group data. The output files are split by MHSO organization and by intervention/control group.

- Imported variable-length claim data and used a Java transformation to process each record.

- Developed mappings, sessions, and workflows to split control group data for each MHSO.

- Developed UNIX Korn shell scripts to automatically create initial data-file cycle directories, call Informatica workflows, pass parameters dynamically, and call the PGP application to encrypt output files.

• Installed and administered Informatica PowerCenter 7.1.2 and PowerCenter 8.1.1.

- Installed PowerCenter 7.1.2 and PowerCenter 7.1.5 (PowerCenter Server, Repository Server, and repository).

- Installed PowerCenter 8.1.1 (Integration Service, Repository Service).

- Installed Informatica 7.1.2.

- Upgraded the repository from version 7.1.5 to version 8.1.1.

- As administrator, managed Informatica PowerCenter security (user groups and privileges), backed up the repository, and shut down and started the server.

- Installed/upgraded Informatica Service Pack 5.0 for Informatica 8.1.1.

• Participated in modeling the CCI-IMS data warehouse (ODS) and the web application database using ERwin 4.1.2.

- Designed logical models.

- Designed physical models.

- Generated DDL and schemas in the Oracle database.

• Developed the Ineligible Beneficiaries report using Cognos 7.1.

- Developed Oracle stored procedures to aggregate data and implement complex logic.

- Modeled the ineligible beneficiary data using Cognos 7.1 Framework Manager and published the package to the ReportNet server.

- In Cognos Connection, used Query Studio and Report Studio to design the Ineligible Beneficiary report specification and generate HTML, PDF, and Excel format reports for CMS.

• Developed the McKesson original/refresh cohort spillover reports using Cognos 7.1.

- Developed Oracle stored procedures.

- Created a model for the Oracle stored procedure in Cognos 7.1 Framework Manager and published it to the Cognos ReportNet server.

• Developed the randomization bucket and flag reports using Cognos 7.1.

- Created the data model using Cognos Framework Manager.

- Created the reports using Cognos Report Studio.

• Developed yearly and quarterly historic cohort refresh reimbursement reports.

- Developed Oracle stored procedures.

- Created the model using Framework Manager.

- Created the reports using Report Studio.

• Designed and developed the IMS data processing framework/application using UNIX Korn shell and Perl scripts for the monthly beneficiary and claim data split. The source files come from the mainframe and are EBCDIC text files with OCCURS clauses. The scripts first remove control characters from the source files, normalize the claim data, and split it into one parent file and several child files; they then remove disqualified beneficiary data from the beneficiary files, check for and delete orphan records from the claim files of seven claim types, split the beneficiary and claim files by MHSO, create a report showing the differences between two beneficiary files, and encrypt the split files before they are sent to the MHSOs (a sketch of the cleanup-and-split step follows).
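A minimal Korn shell sketch of the cleanup-and-split step, assuming (hypothetically) that the converted claim file is pipe-delimited with a parent/child indicator in field 1 and a claim-type code in field 2; the real layouts, the seven claim types, and the EBCDIC conversion are not reproduced here.

#!/bin/ksh
# Strip stray control characters, then split claim records into a parent file
# and per-claim-type child files. Field positions and paths are placeholders.
IN_FILE=$1
CLEAN_FILE=${IN_FILE}.clean
OUT_DIR=/data/work

# Keep only printable characters and newlines (drops mainframe control bytes).
tr -cd '[:print:]\n' < $IN_FILE > $CLEAN_FILE

# Route each record: parent records to one file, child records to a file per claim type.
awk -F'|' -v out=$OUT_DIR '{
    if ($1 == "P")
        print > (out "/claims_parent.dat")
    else
        print > (out "/claims_child_" $2 ".dat")
}' $CLEAN_FILE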

• Designed and developed the de-identification application using Korn shell and Perl scripts for the quarterly netted claim data; it also produces a HICAN map file through which the de-identified claim data can be related back to a specific beneficiary. IMS control-group claim data are sent to the MHSOs for analysis and must be de-identified before they are sent. I designed the de-identification algorithm and developed the application (the crosswalk idea is sketched below).
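Sketched below in Korn shell/awk with hypothetical file layouts: each HICAN seen in the claim file is replaced by a surrogate ID, and the HICAN-to-surrogate pairs are written to a map file so de-identified claims can later be tied back to a beneficiary. The real algorithm, field positions, and key handling are not shown.

#!/bin/ksh
# De-identify field 1 (HICAN) of a pipe-delimited claim file and emit a HICAN map file.
# Field positions, file names, and the surrogate scheme are illustrative only.
CLAIM_IN=$1
CLAIM_OUT=${CLAIM_IN%.dat}_deid.dat
MAP_FILE=/data/secure/hican_map.dat

awk -F'|' -v OFS='|' -v mapfile=$MAP_FILE '{
    hican = $1
    if (!(hican in surrogate)) {          # first time this HICAN is seen
        surrogate[hican] = sprintf("D%09d", ++n)
        print hican, surrogate[hican] > mapfile
    }
    $1 = surrogate[hican]                 # replace the HICAN in the output record
    print
}' $CLAIM_IN > $CLAIM_OUT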

• Designed and developed the PDE application using Korn shell and Perl scripts for the monthly PDE data.

• Designed and developed UNIX Korn shell scripts and Oracle SQL*Loader control files to process and load disqualified beneficiary/PDE data into the database. The loading is automated: the script scans all files in the inbox and generates a SQL*Loader control file dynamically for each one (see the sketch below).
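A minimal Korn shell sketch of the inbox-scanning loader; the table, columns, and date mask are hypothetical placeholders for the real control-file layout.

#!/bin/ksh
# Scan the inbox, generate a SQL*Loader control file for each data file, and load it.
# Table and column names are illustrative placeholders.
INBOX=/data/inbox

for DATA_FILE in $INBOX/*.dat; do
    [ -f "$DATA_FILE" ] || continue
    CTL_FILE=${DATA_FILE%.dat}.ctl

    cat > $CTL_FILE <<EOF
LOAD DATA
INFILE '$DATA_FILE'
APPEND
INTO TABLE ODS.DISQUAL_BENE
FIELDS TERMINATED BY '|' TRAILING NULLCOLS
( bene_id, disqual_dt DATE 'YYYYMMDD', disqual_rsn_cd )
EOF

    sqlldr userid=etl_user/etl_password control=$CTL_FILE \
           log=${DATA_FILE%.dat}.log bad=${DATA_FILE%.dat}.bad
done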

• Designed and developed the CCI-IMS MHSO file messaging ETL application. I developed UNIX Korn shell/awk scripts and Oracle stored procedures to create and update download file information in the CCI-IMS repository and message-center tables.

• Designed and analyzed system requirements and functions for the CCI-IMS project based on client requirements. As a senior software engineer and J2EE technical leader, I participated in CCI-IMS system and functional design and documented them.

• Participated in developing CCI-IMS web application.

• Designed and developed the NDM (Connect:Direct 3600) Data Transfer Agent using Java and JDBC to transfer data remotely and securely. Through this agent, the CCI-IMS web application fully integrates with NDM (Connect:Direct) for uploading and downloading. The download/upload web pages display the list of files available for transfer; when the user clicks the execute button, the web application uses the Data Transfer Agent to pass the transfer scripts to the NDM server, and the agent checks the status and creates/updates file information in the CCI-IMS database. This Java agent implements the entire upload and download flow between the MHSOs and MPR and creates messages notifying the MHSO or MPR that a file is available for download or was successfully downloaded.

• Installed, configured, and maintained PGP encryption software in the DEV, test, and production environments.

• Installed, configured, and maintained the NDM (Connect:Direct 3600) server in the DEV, test, and production environments.

• Participated in installing the Cognos 7.x report server and the Cognos 8.1/8.3 application and web servers.

• As a database administrator, participated in administering the database (ODS) and backing up data using RMAN and Veritas NetBackup 5.1.

Sun Microsystems July 2001 – October 2004

ETL and ETL Tool developer

Designed and developed a data migration application using Informatica PowerCenter 6.5

This is a subproject of ELP; the tool is provided to Sun's clients to migrate or create their new databases. The sources are relational database tables, pipe-delimited flat files, and XML files. We designed mappings, sessions, and workflows to insert and update data.

Designed and developed an ETL tool for ELP 4.1 and SLP 3.0 using Java/J2EE

This ETL tool is a sub-product of the Sun Microsystems ELP 4.1 and SLP 3.0 systems. Its purpose is to create or update massive numbers of records for eleven data types in the online Oracle transaction database. The data created in the database are compatible with the business logic of the ELP system. The source files are pipe-delimited flat files and XML files.

• Designed the business logic, functions, architecture, and objects using UML, object-oriented methodology, and related patterns.

• Developed Java classes. In this project, I used XML and flat files as input, JAXP 1.2 and Xerces 2.0 as the client-side parser APIs to process the XML input files, and bean-managed-transaction (TX_BEAN_MANAGED) EJBs as the server-side business components. I used UserTransaction to manage server-side transactions. I also implemented internationalization of Chinese, Korean, and Western European characters in this ETL project.

Senior Java/J2EE Software Engineer, Oracle Database Engineer, and Technical Leader

Developed a Client Agent for the Sun 9980 Managed Storage Platform

• Designed and implemented Java classes/objects using UML, object-oriented methodology, and related design patterns. I used DOM/XML to manipulate data coming from the storage center and JCE to encrypt the XML output file.

• Developed UNIX Bourne shell, awk, and Perl scripts to manipulate the XML file (adding a checksum, copying files, and deleting files on specific dates); see the sketch below.
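A small Korn shell sketch of the checksum-and-housekeeping idea, with placeholder paths and a placeholder 30-day retention period:

#!/bin/ksh
# Append a checksum comment to the generated XML file, archive a dated copy,
# and purge archived copies older than 30 days. Paths and retention are placeholders.
XML_FILE=/var/opt/provider/report.xml
ARCHIVE_DIR=/var/opt/provider/archive

# cksum prints "<checksum> <size> <file>"; keep only the checksum value.
SUM=$(cksum $XML_FILE | awk '{print $1}')
print "<!-- checksum: $SUM -->" >> $XML_FILE

cp $XML_FILE $ARCHIVE_DIR/report_$(date +%Y%m%d).xml

# Delete archived files not modified in the last 30 days.
find $ARCHIVE_DIR -name 'report_*.xml' -mtime +30 -exec rm -f {} \;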

• Set up a cron task to run the provider automatically on the Solaris server.

Developed a Managed Service System for the Sun Data Center using SOA and Business Process Management (BPM) Architecture

• Participated in designing the managed service framework using J2EE architecture and the ITIL framework.

• Installed, evaluated, and integrated third-party software products such as Peregrine, Remedy, Centauri, and OSS/J.

• Developed Intalio workflow/BPM processes to centrally handle business logic.

• Designed and developed web services, EJBs, and Java classes to provide specific business functionality and services.

• Developed portlets for WebLogic Portal Server.

• Installed, managed and deployed Sun ONE LDAP Server 5.1.

Developed ILMS/ELP 3.0/ELP 3.5/ELP 4.0/ELP 4.1/SLP 3.0 Based on J2EE

• Participated in designing Sun ONE LDAP schema object classes and developed a Sun ONE LDAP Java client application using the Sun ONE LDAP SDK to authenticate users.

• Designed and developed student and admin UI using JSP/Servlet/JavaBean/Struts.

• Designed and developed business components using session and entity beans.

• Designed and developed messaging client using JMS/MDB for asynchronous business logic (sending group emails).

• Developed AICC and SCORM package using XML/JAXP/SAX/DOM/XSLT for student course description.

• Designed and developed the third-party agent remote interface using JWSDP/Web Services.

• Participated in designing the framework for globalization and internationalization of Japanese, Chinese, Korean, and Western European characters.

• Participated in bug fixing and enhancement from version 3.0 to 4.1

• Developed Oracle Stored Procedures to query data from database.

• Created massive test data sets using Oracle SQL*Loader and PL/SQL stored procedures.

Isopia, Toronto, Ontario, Canada 01/01 - 06/01

Senior Java/J2EE Software Developer

Developed an E-Learning System with a J2EE Solution

• Developed stateless and stateful session beans to implement core business logic. I used SQL and JDBC, or entity beans, to communicate with the Oracle database, and used CMT for transaction management.

• Responsible for integration testing and some feature testing.

Cendec System Corporation, Calgary, Alberta, Canada 10/00 - 12/00

Senior Java/J2EE Software Developer

Developed Material Integration and Maintenance Project Based on J2EE

• Developed server-side components. I was responsible for coding EJBs (including session beans and entity beans) and some JSP/Servlet web components.

• Deployed and managed BEA WebLogic Commerce Server 3.1 and BEA WebLogic Application Server 5.1.

EDUCATION

Chongqing University, P.R.China, 1997

Ph.D. in Engineering

Training

• Architecting and Designing J2EE Applications (SL-425)

Jun. 16 – Jun. 19, 2003, Sun Training Center, 27 Markham, Toronto, ON, Canada

• Object-Oriented Analysis and Design Using UML (OO-226)

May 5 – May 9, 2003, Sun Training Center, 27 Markham, Toronto, ON, Canada

Certificate

• Informatica Certified PowerCenter Designer.

• Sun Certified Java Programmer, Sun Certified Web Component Developer, Sun Certified EJB Developer, and IBM Certified XML and Related Technologies Developer.


