
ETL Developer

Location: Greensboro, NC
Posted: March 12, 2015


Objective: ETL (BODS) Developer position in Greensboro, NC

SUMMARY

. 8+ years of total IT experience in data warehousing on large and mid-size projects in fast-paced environments for Bank of America, CVS Caremark, SaskPower and CIBC Bank.
. 3+ years of experience with SAP BusinessObjects Data Services (BODS) 4.x/XI 3.2/3.1 and Informatica PowerCenter 8.6/8.1, with a good understanding of business needs and of extracting, transforming and loading large data volumes from various sources such as SQL Server, Oracle, DB2, Teradata, and flat files.
. Worked on tickets and provided support to resolve production issues.
. Worked extensively with the various BODS transform categories: Data Integrator, Data Quality, Platform, and Text Data Processing.
. Strong experience integrating BODS with SAP ECC, HANA and non-SAP data sources.
. Experience with SAP modules (FI/CO, SD), T-codes, transport requests, mapping, gap analysis, configuration, testing, SAP ECC and the ASAP methodology.
. Experience writing SQL, PL/SQL and shell scripts.
. Designed, edited and tested source-transformation-target flows, exception handling and email alerts.
. Broad understanding of SAP ECC tables and SAP BW objects.
. Performed unit and user acceptance testing; wrote test cases, test plans and ETL technical specification documents.
. 4 years of experience with BI reporting tools: MicroStrategy, SAP BusinessObjects suite 4.1, Crystal Reports, SSRS, Tableau, QlikView.
. Knowledge of conceptual, logical and physical data modelling, normalization, and tools such as Erwin and Visio.
. Experience identifying and resolving performance bottlenecks at various levels such as sources, mappings and sessions.
. Familiar with Scrum and Agile methodologies, and with OLAP and OLTP database schemas.
. Capable of understanding business needs and translating business requirements into technical solutions.
. Diverse background with fast learning and creative analytical abilities, along with good technical, communication and interpersonal skills.
. Served as a liaison with other divisions such as Sales, Accounts Receivable, and Risk Management to ensure accurate and timely transaction processing.
. Experience working with offshore teams in India and North America; provided 24/7 production support.

TECHNICAL SKILLS

ETL Tools: BODS 4.x/XI 3.2/3.1, Informatica PowerCenter 9.1/8.6/8.1, SSIS 2005/2008/2012
RDBMS: Oracle 10g/9i, SQL Server 2000/2005/2008, DB2, AS/400, Teradata
BI Tools: SAP BusinessObjects 4.1, Crystal Reports, SSRS, MicroStrategy, Tableau, QlikView
Operating Systems: UNIX, Linux, Windows NT/2000/XP/7
Defect Management Tools: HP Quality Center 9.x, Test Director, Remedy 7.x
Languages: SQL, PL/SQL, ABAP, Shell Scripting
Modeling Tools: MS Visio, Erwin

EXPERIENCE

Bank of America, Jacksonville, FL  Sept 2014 - Feb 2015
Developer - Contractor
Description: Bank of America is an American multinational banking and financial services corporation headquartered in Charlotte, North Carolina. It is the second largest bank holding company in the United States by assets.

Responsibilities:
. Worked with Teradata DBAs, data analysts, ETL developers and business analysts to analyze and gather business requirements.
. Designed complex mappings and mapplets, created sessions and workflows to load data, and monitored workflow progress.
. Implemented Slowly Changing Dimensions (Type 1/2/3) and transforms such as Case, Map_Operation, Merge, Query, Row_Generation, SQL, Validation, Data_Transfer, Date_Generation, Effective_Date, Hierarchy_Flattening, History_Preserving, Key_Generation, Map_CDC_Operation, Pivot, Reverse Pivot, Table_Comparison, and XML_Pipeline (see the SQL sketch after this list).
. Defined mapping parameters and variables, breakpoints and email alerts as per requirements.
. Worked on CDC (Change Data Capture) to capture insert, update, and delete changes made to enterprise data sources.
. Wrote SQL and PL/SQL scripts; created and edited shell scripts for pre-session and post-session processing.
. Scheduled and monitored BODS jobs sequentially and conditionally.
. Created and edited Local, Central and Profiler repositories as per requirements.
. Wrote BRDs, ETL technical specification documents, test plans, ETL test cases, and other supporting documents.
. Performed source-to-target testing, performance tuning and exception handling.
. Scheduled and monitored jobs using Autosys.
. Worked on data cleansing, data profiling and CASS error-handling strategies and standards used by USPS.
. Worked on tickets and provided technical support to resolve production issues; worked with the QA team to follow up on testing.
. Participated in daily/weekly team meetings on the project; provided 24/7 on-call production support.
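For illustration only, a minimal SQL sketch of the Slowly Changing Dimension Type 2 pattern referenced above. The stg_customer and dim_customer tables, their columns and the 'expire then insert' flow are hypothetical placeholders, not the actual project schema:

    -- Step 1: expire the open dimension row when a tracked attribute has changed.
    UPDATE dim_customer
    SET    effective_end_date = CURRENT_DATE - 1,
           current_flag       = 'N'
    WHERE  current_flag = 'Y'
      AND EXISTS (
            SELECT 1
            FROM   stg_customer s
            WHERE  s.customer_id = dim_customer.customer_id
              AND (s.customer_name <> dim_customer.customer_name
               OR  s.address       <> dim_customer.address));

    -- Step 2: insert a new current version for new customers and for the rows just expired.
    INSERT INTO dim_customer
           (customer_id, customer_name, address,
            effective_start_date, effective_end_date, current_flag)
    SELECT s.customer_id, s.customer_name, s.address,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    LEFT JOIN dim_customer d
           ON  d.customer_id  = s.customer_id
           AND d.current_flag = 'Y'
    WHERE  d.customer_id IS NULL;   -- no open row exists, so a new version is needed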

Environment: SAP BODS, Teradata, Teradata SQL Assistant, Oracle, Toad, DVO, PL/SQL, MS Visio, Linux, Remedy, SharePoint, Autosys, Hadoop

CVS Caremark, Scottsdale, AZ  Aug 2013 - Feb 2014
Developer - Contractor
Description: CVS Caremark (NYSE: CVS), headquartered in Woonsocket, RI, is the largest pharmacy health care provider in the US with integrated offerings across the entire spectrum of pharmacy care.

Responsibilities:
. Worked on BODS 4.0.
. Created complex jobs, workflows, data flows, and scripts using various transforms (Integrator, Quality, Platform and Text Data) to load data from multiple sources into the desired targets.
. Worked with DBAs, stakeholders and analysts on data requirements and wrote the entire ETL design specification document.
. Implemented Slowly Changing Dimensions (Type 1/2/3) and transforms such as Case, Map_Operation, Merge, Query, Row_Generation, SQL, Validation, Data_Transfer, Date_Generation, Effective_Date, Hierarchy_Flattening, History_Preserving, Key_Generation, Map_CDC_Operation, Pivot, Reverse Pivot, Table_Comparison, and XML_Pipeline.
. Defined mapping parameters and variables, breakpoints and email alerts as per requirements.
. Worked on CDC (Change Data Capture) to capture insert, update, and delete changes made to enterprise data sources (see the SQL sketch after this list).
. Wrote SQL and PL/SQL scripts; created and edited shell scripts for pre-session and post-session processing.
. Scheduled and monitored BODS jobs; created RFC (Remote Function Call) connections to SAP systems.
. Created and edited Local, Central and Profiler repositories as per requirements.
. Extracted data from various data sources such as SAP ECC tables, HANA, Oracle, SQL Server and flat files to provide analytical, BI reporting and data warehouse solutions in a structured, data-centric environment.
. Performed data cleansing, data mapping and data conversion.
. Wrote BRDs, ETL technical specification documents, test plans, ETL test cases and the migration plan document, and performed development testing.
. Created, edited and tested various reports, dashboards, graphs and charts using SAP BusinessObjects (WebI).
. Worked on tickets and provided support to resolve production issues; worked with the QA team to follow up on testing.
. Worked extensively with the SAP SD (Sales & Distribution) module team on data migration from the legacy system to SAP.
. Participated in daily/weekly team meetings on the project and interacted with DBAs, analysts, stakeholders, developers and testers.
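As a hedged illustration of the CDC approach mentioned in the list above (which in BODS is typically handled with Table_Comparison and Map_CDC_Operation), the sketch below derives insert/update/delete deltas by comparing a staging snapshot to the target. The stg_orders and ods_orders tables and their columns are hypothetical placeholders:

    -- New rows: present in today's extract but not yet in the target.
    SELECT s.order_id, 'I' AS change_type
    FROM   stg_orders s
    LEFT JOIN ods_orders t ON t.order_id = s.order_id
    WHERE  t.order_id IS NULL
    UNION ALL
    -- Changed rows: the key exists in both but a tracked column differs.
    SELECT s.order_id, 'U'
    FROM   stg_orders s
    JOIN   ods_orders t ON t.order_id = s.order_id
    WHERE  s.order_status <> t.order_status
        OR s.order_amount <> t.order_amount
    UNION ALL
    -- Deleted rows: present in the target but missing from the latest extract.
    SELECT t.order_id, 'D'
    FROM   ods_orders t
    LEFT JOIN stg_orders s ON s.order_id = t.order_id
    WHERE  s.order_id IS NULL;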

Environment: BODS 4.x, SAP ECC 6.0, MS SQL Server 2008, Oracle 10g, Toad, PL/SQL, MS Visio, UNIX, Quality Center 11, Remedy, Windows, SharePoint, Siebel, Business Objects WebI, Explorer, LiveOffice, John Galt, Dashboards, ABAP.

SaskPower, Regina, SK  Jan 2009 - Jun 2013
Developer (BODS)
Description: SaskPower is the principal electric utility company in Saskatchewan. The environment included SQL Server, Oracle, SAP, mainframe and custom applications.

Responsibilities:
. Worked with SAP functional consultants (FI/CO), DBAs, business analysts, data analysts and stakeholders to understand the business processes, workflows, business rules and business needs.
. Analyzed and designed ETL jobs for SAP and non-SAP data to deliver BI reporting and DW solutions.
. Created complex jobs, workflows, dataflows, datastores, formats and scripts using various transforms (Integrator, Quality, Platform and Text Data) to connect to sources and load data into the desired targets.
. Implemented Slowly Changing Dimensions (Type 1/2/3) and transforms such as Case, Map_Operation, Merge, Query, Row_Generation, SQL, Validation, Data_Transfer, Date_Generation, Effective_Date, Hierarchy_Flattening, History_Preserving, Key_Generation, Map_CDC_Operation, Pivot, Reverse Pivot, Table_Comparison, and XML_Pipeline.
. Defined mapping parameters and variables, breakpoints and email alerts as per requirements.
. Extracted data from various data sources such as SAP ECC, Oracle, HANA, SQL Server and flat files.
. Worked on CDC to capture insert, update, and delete changes made to enterprise data sources.
. Wrote SQL and PL/SQL scripts.
. Performed data conversion, data cleansing and data profiling (see the SQL sketch after this list).
. Scheduled and monitored BODS jobs.
. Wrote detailed ETL design specification documents.
. Created, edited and tested various reports, dashboards, graphs and charts using BusinessObjects.
. Wrote test cases, performed testing and followed up with the QA team.
. Worked on tickets and provided support to resolve production issues.
. Participated in daily/weekly team meetings.
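As a simple illustration of the data profiling work noted above, a SQL sketch over a hypothetical src_customer table (table and column names are placeholders) that checks completeness, uniqueness and value distribution:

    -- Completeness and uniqueness profile for key columns.
    SELECT COUNT(*)                                               AS total_rows,
           SUM(CASE WHEN customer_name IS NULL THEN 1 ELSE 0 END) AS null_names,
           COUNT(DISTINCT customer_id)                            AS distinct_ids,
           MIN(created_date)                                      AS earliest_record,
           MAX(created_date)                                      AS latest_record
    FROM   src_customer;

    -- Frequency distribution to spot unexpected or badly coded status values.
    SELECT customer_status,
           COUNT(*) AS row_count
    FROM   src_customer
    GROUP BY customer_status
    ORDER BY row_count DESC;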

Environment: Data Services XI 3.2/3.1, Oracle, Toad, SAP ECC 6.0, SQL Server 2008, ABAP, Erwin, Windows, TFS, UNIX, Remedy, Quality Center, SharePoint, Business Objects suite.

Lafarge Corp., Toronto, ON  Mar 2008 - Jan 2009
ETL Developer (Informatica)
Description: Lafarge North America is the largest diversified supplier of construction materials in the U.S. and Canada.

Responsibilities:
. Designed ETL packages using Informatica PowerCenter 8.6.
. Worked with SAP functional consultants (SD), DBAs and business analysts on data requirements to deliver BI solutions.
. Designed complex ETL jobs, source-transformation-target flows and mappings using various transformation types: active, passive, connected and unconnected.
. Worked extensively with the PowerCenter client tools (Designer, Repository Manager, Workflow Manager, Workflow Monitor).
. Implemented Source Qualifier, SCDs (Type 1/2/3), Joiner, Filter, Router, Rank, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator, Stored Procedure, Union and other transformations (see the SQL sketch after this list).
. Defined mapping parameters and variables, set breakpoints and configured email alerts.
. Created and edited mapplets, validated mappings and implemented error handling.
. Wrote SQL and PL/SQL scripts.
. Wrote BRDs, technical specification documents, test plans, test cases and the migration plan.
. Extracted large data volumes from various sources such as Oracle, SQL Server, SAP R/3, and flat files.
. Performed development testing and worked with the QA team.
. Provided 24/7 technical support to resolve production issues.
. Participated in daily/weekly team meetings on the project.
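For illustration, a SQL equivalent of a simple Lookup-plus-Aggregator mapping like those listed above; the stg_sales and ref_product tables and their columns are hypothetical, not the actual Lafarge schema:

    -- Lookup: enrich each sales line with its product category from a reference table.
    -- Aggregate: roll revenue up by category and month, as an Aggregator transformation would.
    SELECT p.product_category,
           EXTRACT(YEAR  FROM s.sale_date)  AS sale_year,
           EXTRACT(MONTH FROM s.sale_date)  AS sale_month,
           SUM(s.quantity * s.unit_price)   AS total_revenue
    FROM   stg_sales s
    LEFT JOIN ref_product p                 -- left join keeps rows with no lookup match,
           ON p.product_id = s.product_id   -- mirroring an unmatched-lookup default of NULL
    GROUP BY p.product_category,
             EXTRACT(YEAR  FROM s.sale_date),
             EXTRACT(MONTH FROM s.sale_date);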

Environment: Informatica PowerCenter 8.6, SAP, MS SQL Server 2005/2008, Oracle, Toad, PL/SQL, MS Visio, UNIX, Quality Center, Windows.

CIBC Bank, Toronto, ON  May 2005 - Feb 2008
ETL Analyst (Informatica)
Description: The Canadian Imperial Bank of Commerce, commonly known as CIBC, is one of Canada's chartered banks, the fifth largest by deposits. The bank is headquartered at Commerce Court in Toronto, Ontario.

Responsibilities:
. Gathered requirements from stakeholders, DBAs, business analysts and data analysts to provide BI/DW solutions.
. Analyzed database tables and data requirements for loading data from Oracle, SQL Server, and flat files.
. Created various transformation types: active, passive, connected and unconnected.
. Organized meetings and worked with DBAs, business analysts and data analysts to understand ETL data requirements, and documented detailed ETL design technical specifications.
. Worked extensively with the PowerCenter client tools (Designer, Repository Manager, Workflow Manager, Workflow Monitor).
. Successfully implemented various transformations such as Source Qualifier, SCDs (Type 1/2/3), Joiner, Filter, Router, Rank, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator, Stored Procedure and Union.
. Created and edited mapping parameters and variables, set breakpoints and implemented exception handling.
. Worked on tickets to resolve production issues; provided 24/7 technical support.
. Wrote SQL and PL/SQL scripts.
. Wrote test cases, performed testing and followed up with the QA team (see the SQL sketch after this list).
. Scheduled and monitored jobs.
. Participated in daily/weekly team meetings.
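As a hedged example of the kind of source-to-target testing mentioned above, a reconciliation query over hypothetical src_transactions and tgt_transactions tables, comparing row counts and amount totals on both sides of a load:

    -- Row counts and amount checksums should match after a successful load;
    -- a mismatch points to dropped, duplicated or mistransformed rows.
    SELECT 'source' AS side,
           COUNT(*)        AS row_count,
           SUM(txn_amount) AS total_amount
    FROM   src_transactions
    UNION ALL
    SELECT 'target',
           COUNT(*),
           SUM(txn_amount)
    FROM   tgt_transactions;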

Environment: Informatica 8.6/8.1, Oracle, SQL Server 2005, SQL, PL/SQL, Visio, Windows, Quality Center, SSRS.

EDUCATION

. Bachelor's in Computer Science - University of Saskatchewan, Canada
. 10-week part-time course in SAP BI 7.0 from IIBS, Toronto, ON

SOFT SKILLS

. Excellent communication and problem-solving skills.
. Reliable team member; organized, with a positive attitude, and works well under pressure.


