
PENG YEW LIM

Senior EDW/BI Architect

BMW Financial Services Spot Award holder

Address: **** ******** ****, Dublin OH 43016-8421

Email: abhqpc@r.postjobfree.com

Tel: 614-***-**** (Cell), 614-***-**** (Home)

Employment Objective

A position that utilizes my skill set to achieve the company's business and technical goals.

Education

Master of Engineering (Electrical and Computer Systems) - RMIT University, Australia, 1988

Legal status

Green card holder

Skills

* Over twenty (20) years of IT experience.

* Fourteen (14) years of solid data architecture experience in banking, health care, manufacturing, retail, telecommunications, retirement and software services.

* Implemented data flows among ERP, EPM, data governance, data stewardship, data cleansing, data processing, data standardization and data de-duplication systems.

* Implemented data hubs, ODS, data warehouses and Decision Support Systems (DSS).

* Lead data architect/manager. Provided expertise in data processing, conversions, logical and physical modeling, and building data hubs, ODS, data warehouses and data marts.

* Extensive MS SQL Server 2000 and 2005 experience. Coded DTS packages.

* Expert in PL/SQL, Informatica and DTS coding for ETL processing.

* Expert in converting legacy data to new applications.

* Ensure data integrity and data quality.

* Ensure streamlined data flow among systems with data tracking and data traceability.

* Created a single version of the truth. Distributed the cleansed data to downstream systems via flat files, Excel, XML, web services, MQ Series or message brokers.

* Solved complicated problems with simple solutions, working out the best application to handle the processing of data.

* Expert in ETL coding, reporting and implementations.

* Mastered a set of strategies for delivering projects on time and on budget.

* BI guru. A go-to person for getting your queries answered.

* Provided expertise on high performance Informatica coding strategy and

session tuning.

* Solid Cognos Framework Manager, Report Studio, Analysis Studio, Query Studio, ReportNet, Transformer and PowerPlay skills.

* In-depth knowledge of data modeling, high-performance database and SQL tuning, Extract-Transform-Load (ETL), ROLAP, MOLAP and DSS, data manipulation, DBA duties, Erwin and Cognos.

* Designed ETL do-and-don't protocols, table relationship standards, coding standards, and data de-duplication and cleansing standards.

* Provided mentoring and knowledge transfer to developers, guiding them through every aspect of the development phases.

* Attention to detail. Able to handle multiple tasks concurrently.

* Effective communicator. Excellent speaking and writing skills.

* Awarded the BMW Financial Services Spot Award for designing, planning

and executing the smoothest ETL process.

* Databases: DB2 UDB, Oracle 10g, 9i, 8i & 7, Microsoft SQL Server 2000 & 2005, Sybase 11.

* Tools: Informatica 4.5, 5.0, 6.x, 7.1.2, 8.1.1, 8.6, Cognos ReportNet, Cognos Query Studio, Cognos Report Studio, Cognos Framework Manager, Business Objects, Autosys, Erwin, Perl, Bourne and Korn shell scripts, WinSCP, FTP, Test Qualifier, WinCVS, Citrix, Quality Manager.

Project Experience

Amtrak Apr 2009 - Present

Enterprise Architect

Amtrak provides passenger rail transportation service across the nation, operating both high-speed and conventional train services between Boston and Washington, DC.

My current responsibility is to architect a historical asset repository to capture the changes for all assets related to railway tracks. This includes creating historical Oracle SDO Geometry data for tracks, stations, signals, transformers, etc. The SDO Geometry data is then displayed on Google Earth, where three- and four-dimensional topologies are drawn.

The source contains over three hundred tables, which are consolidated to about one hundred tables in the data mart. I designed the data mart model using Erwin and employed Informatica 8.6 to load data from the source to the target. Since historical data is required, I used the MD5 checksum generator function to detect the incremental rows and used them to create type two slowly changing dimension and fact rows in the target database tables. An active flag indicates the current active records, and all records, including the historical ones, are bounded by their start and end dates.
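
A minimal SQL sketch of such a checksum-driven type two load, assuming hypothetical staging and dimension tables (STG_TRACK_ASSET, DIM_TRACK_ASSET) whose ROW_CHECKSUM column has already been populated by the checksum generator; the names are illustrative, not the actual Amtrak schema.

    -- Expire the current version of any asset whose checksum has changed.
    UPDATE dim_track_asset d
       SET d.active_flag = 'N',
           d.end_date    = TRUNC(SYSDATE) - 1
     WHERE d.active_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_track_asset s
                    WHERE s.asset_id     = d.asset_id
                      AND s.row_checksum <> d.row_checksum);

    -- Insert a new current version for every new or changed asset.
    INSERT INTO dim_track_asset
          (asset_key, asset_id, asset_name, row_checksum,
           active_flag, start_date, end_date)
    SELECT dim_track_asset_seq.NEXTVAL, s.asset_id, s.asset_name, s.row_checksum,
           'Y', TRUNC(SYSDATE), DATE '9999-12-31'
      FROM stg_track_asset s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_track_asset d
                        WHERE d.asset_id     = s.asset_id
                          AND d.active_flag  = 'Y'
                          AND d.row_checksum = s.row_checksum);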

American Electric Power (AEP) May 2008 - Feb 2009

Columbus OH

Data Warehouse Architect

AEP supplies electricity to several states in the Midwest and employs over 19,000 people.

My responsibility was to architect the data flows and reporting needs among PeopleSoft Enterprise Performance Management (EPM), Microsoft Project Portfolio and Project Servers, workflow management and the PeopleSoft Finance Version 9 system.

I created a new Portfolio and Project Server Management data mart to replace the old IT Project Management data mart. It sourced its data from EPM and the MS Portfolio and Project Servers, and provided high-level project attributes as well as low-level task labor hours and costs. Apart from labor costs, PeopleSoft EPM provided material, software, hardware and professional services costs. This enabled users to manage all company projects from a single point.

My duties were to provide the architecture methodology, a new incremental loading strategy that used an MD5 checksum generator to detect modified rows, the ETL strategy, source data analysis, and ETL and report coding. Using Informatica 8.6, I coded incremental mappings to load the Portfolio and Project data mart.

My other assignment was to create a virtual enterprise data warehouse

architecture where costs and revenues from several finance systems could be

reported using a standard set of key performance indicators.

Informatica 8.6

New Informatica features adopted included, but were not limited to:

Used web services to exchange data between PowerCenter and Internet web pages via the Web Services Hub.

Used the MD5 function to generate checksums and detect modified rows.

Used the new SQL transformation to look up large tables in another database. This produced better throughput than joining to the large tables through federated database links in the source qualifier.

Used the new Java transformation to parse HTML strings and remove the HTML tags, a process that could not easily be achieved using standard Informatica transformations.
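
For comparison only (and not the Java transformation approach described above), a rough tag strip can also be pushed into an Oracle source with a regular expression; the table and column names here are hypothetical.

    -- Strip HTML tags from a free-text column (illustrative names only).
    SELECT note_id,
           REGEXP_REPLACE(note_html, '<[^>]+>', '') AS note_text
      FROM support_note;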

Business Objects XI

Designed Universe using BO Designer.

Coded reports using Web Intelligence. Created master and child reports.

Sterling Commerce, a subsidiary of AT&T Feb 2005 - May 2008

Dublin OH

Data Warehouse Architect

Sterling Commerce is the market leader in providing B2B software and services. Its customers are mainly Fortune 500 multinational corporations. The B2B software enables data transfers through secure, encrypted channels.

The ERP system was PeopleSoft V9, which contained the AP, AR, GL, Journal, Billing, Contract, eProcurement and CRM modules. The marketing department installed the Salesforce.com application and the Call Center Support Group installed the Vantive customer support ticket system. The Software Subscription Department employed a Global Billing System.

I led a team of data modelers and ETL developers to build data interfaces, an ODS, a data warehouse and data marts to link the systems above. Primarily, I was responsible for driving the team to deliver projects on time and on budget. My greatest strength was setting the right direction at the beginning of each project to avoid major rework later. My other skill was solving complicated issues with simple solutions.

Architecture duty

. Set up the overall system architecture framework.

. Interviewed users on requirements and consolidated current and future needs.

. Provided coding estimates to users.

. Designed the building phases.

. Designed logical and physical data hub, ODS, data warehouse and data mart schemas using the best of breed between the Kimball and Inmon methodologies.

. Anticipated whether future requirements could be met with minimal alterations.

. Set up the ETL strategy and standards.

. Enabled incremental loading.

. Detected source delta rows using Change Data Capture (CDC), checksums, column comparison or database triggers (a trigger-based sketch follows this list).

. Analyzed source system and derived source system data models.

. Ensured data from different source systems could be easily integrated

at the target tables.

. Designed data process flow from source systems to data hub and from

data hub to other systems.

. Defined the most efficient place to process data, whether at the database layer or in the Informatica PowerCenter engine.

. Determined a way to handle dirty data.

. Provided Do and Don't protocols.

. Designed and prototyped coding templates and standards.

. Transferred knowledge to developers and trained them on the templates and business logic.

. Assigned work to developers according to their strengths and trained them to attain the next level of skills.
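
A minimal sketch of the trigger-based option mentioned above, using hypothetical CUSTOMER and CUSTOMER_DELTA tables; the actual source tables and columns differ.

    -- Every insert or update on the source table is recorded in a delta
    -- table that the nightly ETL reads and then purges.
    CREATE TABLE customer_delta (
        customer_id  NUMBER      NOT NULL,
        change_type  VARCHAR2(1) NOT NULL,              -- I = insert, U = update
        change_ts    DATE        DEFAULT SYSDATE NOT NULL
    );

    CREATE OR REPLACE TRIGGER trg_customer_delta
    AFTER INSERT OR UPDATE ON customer
    FOR EACH ROW
    DECLARE
        v_change_type CHAR(1);
    BEGIN
        IF INSERTING THEN
            v_change_type := 'I';
        ELSE
            v_change_type := 'U';
        END IF;
        INSERT INTO customer_delta (customer_id, change_type)
        VALUES (:NEW.customer_id, v_change_type);
    END;
    /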

ETL duty

. Ensured the following ETL objectives were met

1. Data integrity

2. Conform to data governance

3. Provide data cleansing

4. Provide data standardization

5. Provide data de-duplication

6. Create a single version of truth data

7. High performance

8. Efficient data loading

9. Adhere to standards

10. Ease of maintenance - Able to restart easily

11. Robust mappings - Able to self-correct errors

12. Ease of enhancement - Anyone can work on anyone else's mappings

13. Stability

14. Simplicity

15. Ease of troubleshooting

16. Data traceability

17. Able to report on historical data

. Solved day to day developers' issues.

. Reviewed Erwin data models, coding and reports to see if they met

coding standards.

. Reviewed Change Data Capture (CDC), Slowly Changing Dimension (SCD) and other mappings.

. Tested historical data marts. A historical data mart captured data and the transaction timestamps, so the data could be reported at any point in time in history, presenting a different picture at each point in time.

. Performed incremental data loading. Collected all changes for a table and pushed the delta rows to the data warehouse and data marts. This enabled incremental loading of both dimension and fact tables, so the full history of the data could be maintained.

. Maintained a data warehouse layer to process all business rules at a central hub to achieve a single version of the truth. The data was then distributed to data marts as needed, which conformed the dimension tables across all data marts.

. Tuned SQL at source qualifiers and lookup transformations.

. Ensured unique key conditions were met at every table, source qualifier, lookup and target transformation.

. Ensured error handling was met.

. Worked on the most complicated mappings and reports.

. Performed SQL tuning using hash, merge or nested loop joins.

. Performed session tuning using process partitioning, pipe files, external loaders, and dropping and recreating indexes.

. Bypassed database transaction logs by using external loaders and pipe

files.

. Upgraded Informatica from version 7.1.2 to 8.1.1.

. Coded Unix Perl, Bourne and Korn shell scripts.

. Queried Informatica's repository to obtain the following metadata information (an illustrative query follows this list).

1. Which mappings will be impacted by a proposed code change?

2. Which SQ or Lookup transformation SQL override has a keyword embedded?

3. Which mappings were saved by a specific date?

4. Which mappings loaded zero rows and what errors did they produce?

5. Which mappings contain a specific transformation?

6. Which sessions use bulk or normal load?

7. Which sequence counters do not have the recycle flag turned on?
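
An illustrative repository query for item 3, run against the PowerCenter repository database. REP_ALL_MAPPINGS is a PowerCenter MX view, but the exact view and column names used below are assumptions and vary by version; verify them against the repository documentation before use.

    -- List mappings by the date they were last saved (illustrative only;
    -- SUBJECT_AREA and MAPPING_LAST_SAVED are assumed column names).
    SELECT subject_area,
           mapping_name,
           mapping_last_saved
      FROM rep_all_mappings
     ORDER BY mapping_last_saved DESC;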

Report generation duty

. Created Cognos 8 Model using Framework Manager.

. Created Reports using Report Studio and Analysis Studio.

. Created Reports using ReportNet.

. Created master, child, detail and drill through reports.

. Created ad hoc queries using Query Studio.

. Tested drill-through reports to see if the sum of the detail records matched up to the master records.

. Solid SQL skill set.

. Assigned work.

. Ensured business rules were met.

. Ensured a consistent look and feel across all reports.

Cognos 8 Framework Manager

Set up data source connections to access data in the GSDM snowflake schema.

Loaded the table structures into a raw Import View namespace and assigned the relationship cardinalities.

Marked all data warehouse internal keys and columns as hidden.

Built the relationship on the surrogate keys between the fact and the

dimension tables.

Created a Business View namespace. In the business view, changed all the technical column names to business names and, for each column, set the column usage to identifier, attribute or fact.

To avoid fact-table-to-fact-table joins, created a Presentation View namespace for each fact table. This view contained every dimension table required by the fact table. The conformed dimension tables were linked in from a common business view namespace, which could be used in more than one model.

Created Embedded and Stand Alone Filters and calculations where

necessary.

Assigned user security to the model.

Published the packages to the Cognos 8 server.

Cognos 8 Report Studio

Created master and child reports via setting up of drill through

properties.

Created crosstab, list columns and trend analysis reports.

Cognos 8 features used included, but were not limited to:

Blocks, Child Report, Count, Crosstab, Filter on Child Report, Grouped

By, List Column, Master Report, Master and Child Report, Passing

Parameters, Prompt Pages, Setup Drill Through properties and Sorting.

Cognos 8 Query Studio

Used Query Studio to perform ad hoc queries. A saved query can then be used to launch Report Studio so that a more polished report can be designed.

Cognos 8 Transformer

Created transformer models. Assigned dimension levels, categories and

measures. Checked models. Created and published cubes. Created cube

groups.

Cognos 8 Analysis Studio

Used Analysis Studio to perform OLAP analysis on cubes. Performed

slice and dice along the dimensional levels.

Cognos ReportNet 1.1

Used ReportNet to create reports, including master and child reports, containing crosstabs, list columns, maps, charts, grouping, aggregation, sorting and filters.

Cognos 7 PowerPlay

Used PowerPlay to analyze Power Cubes. Sliced and diced cubes.

BMW Financial Services Apr 2004 - Feb 2005

Dublin OH

Lead data warehouse architect.

BMW Financial Services Spot Award holder

. Led a team of five people to convert the Enterprise Data Warehouse (EDW) to a daily incremental loading process.

. Expanded the data warehouse and merged data from six additional data sources.

. Implemented five new data marts and six mini cubes.

. Designed and implemented a new ETL strategy, coding standards, code reuse, and unit and regression testing methodologies.

. Designed a high-performance ETL flow, tuned SQL statements and databases, and implemented an efficient multi-stage data cleansing strategy.

. For ease of maintenance, implemented robust program execution restart strategies. Programs could stop and restart safely without truncating or rolling back loaded data.

. Implemented an error recycling process.

. Extensively queried Informatica metadata to investigate coding bugs.

. Mentored and trained ETL developers.

MS SQL Server experience

. Tuned the de-duplication SQL to make use of the surrogate key in the outer loop query, improving query performance by 400% (a sketch follows this list).

. Performed complicated SQL tuning by breaking queries down into smaller SQL statements and tuning each one using hash, nested loop or merge joins.

. Coded stored procedures when the routine could not easily be performed in Informatica.
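
A sketch of the surrogate-key de-duplication and join-hint tuning described above, using hypothetical CUSTOMER_STAGE and DIM_CUSTOMER tables; it is illustrative only, not the BMW code.

    -- Keep the row with the lowest surrogate key for each business key and
    -- delete the rest; driving the outer query on the indexed surrogate key
    -- keeps the correlated lookup cheap.
    DELETE s
      FROM customer_stage AS s
     WHERE s.customer_sk > (SELECT MIN(d.customer_sk)
                              FROM customer_stage AS d
                             WHERE d.customer_name = s.customer_name
                               AND d.birth_date    = s.birth_date);

    -- Forcing a hash join on a large staging-to-dimension comparison when
    -- the optimizer picks a slower plan.
    SELECT s.customer_sk
      FROM customer_stage AS s
           INNER HASH JOIN dim_customer AS c
              ON c.customer_name = s.customer_name;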

Michigan State Office of Retirement Services - A Covansys project.

Lansing MI Feb 2002 - Apr 2004

Operational Data Store (ODS Hub) lead architect

. Designed logical and physical data models for the data hub using Erwin

based on business requirement analysis.

. Extracted data from three legacy systems, merged them, de-duplicated

the data, cleansed and standardized the data.

. Merged the data with new references and sent the data to downstream systems.

. Designed and implemented ETL strategy, data hub interfaces, migration,

integration testing, database sizing, audit trail, error handling and

security.

. Performed SQL tuning.

. Mentored team members on coding the ETL modules using SQL Server 2000.

. Coded in Data Transformation Services (DTS).

MS SQL Server experience

. Performed DBA duties.

. Performed database file backups.

. Attached database files to databases. Created database users and assigned user access rights.

. Created logical and physical models. Created database objects such as tables, indexes, clustered indexes, triggers, and check and default constraints (see the DDL sketch after this list).
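
An illustrative DDL sketch of the kinds of objects listed above; the table and constraint names are hypothetical, not the actual retirement system schema.

    -- Table with a clustered primary key, a check constraint and a default
    -- constraint, plus a supporting nonclustered index.
    CREATE TABLE member_account (
        member_id    INT      NOT NULL,
        account_type CHAR(1)  NOT NULL
            CONSTRAINT ck_member_account_type CHECK (account_type IN ('A', 'I')),
        created_date DATETIME NOT NULL
            CONSTRAINT df_member_account_created DEFAULT (GETDATE()),
        CONSTRAINT pk_member_account PRIMARY KEY CLUSTERED (member_id)
    );

    CREATE INDEX ix_member_account_created ON member_account (created_date);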

State Teachers Retirement Systems. A Covansys project.

Columbus OH Sep 2001 - Dec 2001 and Apr 1999 - Jun 2000

Data Warehouse Guru

. Provided technical team leadership, mentoring team members in designing, coding and implementing a data warehouse using Informatica.

. Designed data model in Erwin.

. Mined Informatica's repository and performed SQL tuning, shell and Perl script coding, database design and data mart construction.

. Designed do and don't ETL coding methodologies.

. Implemented high performance loading and error recycling strategies.

Submitorder.com - A Covansys project. Jun 2000 - Aug 2001

Dublin OH

Data Warehouse Architect/DSS Application Architect

. Designed DSS data model and implemented ETL.

. Designed star schemas.

. Coded in Informatica to extract XML data from job queues, parsed it and populated tables in the Oracle 9i databases.

Lucent Technologies. A Covansys project. Feb 2000 - Sep 2000

Columbus OH

Team Lead at Lucent Technologies

. Managed a team working on scalability and availability so the system could scale with the workload.

National Australia Bank. Aug 1998 - Jan 1999

Melbourne Vic Australia

Data Warehouse Senior Consultant

. Performed data model design in Oracle 8i.

. Coded in PL/SQL for the ETL processing.

. Coded Unix shell scripts.

. Populated data warehouse and data marts.

. Generated high risk lending reports for auditing and monitoring

purposes.

IBM-GSA TELSTRA ALLIANCE. Feb 1998 - Aug 1998

Melbourne Vic Australia

Senior Consultant

. Designed and coded the import and export modules for the churning of long distance carriers, using Oracle Designer 2000 and the Oracle database.

. Performed coding in PL/SQL.

. Coded Unix shell scripts.

Australian Hospital Care Group. Jun 1996 - Jan 1998

Melbourne Vic Australia

Data Mart Senior Consultant

. Implemented data marts to consolidate financial data from all hospital

branches to generate monthly KPI, profit and loss as well as actual vs

budget reports.

Security Mailing Services Pty Ltd. May 1993 - May 1996

Melbourne Vic Australia

Chief Analyst Programmer

. Developed shell and Clipper scripts to cleanse and format data for the

high volume Xerox printers.

Leigh Mardon Pty Ltd. Sep 1991 - May 1993

Melbourne Vic Australia

Programmer Analyst

. Developed programs to format incoming bank data for check printing.

TRI-M Technologies(S) Pte Ltd. Aug 1989 - Jul 1991

Singapore

System Engineer

. Maintained the ASK ManMan MRP II system.

. Developed operational reports in PowerHouse Quiz.

Laser Lab Ltd. Aug 1988 - Jul 1989

Melbourne Vic Australia

Software Engineer

. Maintained the MRP system.

. Coded in dBASE III for ETL processing and reporting.

. Created Bills of Materials (BOM) hierarchies for in-house manufactured products.

References

Available upon request


