Omer
Informatica/ETL Developer
Ph: 214-***-****
Email: *********@**********.***
Professional Summary:
. IT professional with 7 years of experience in the design, development
and implementation of ETL and RDBMS projects for the Financial,
Banking, Pharmacy, Insurance and Utilities industries.
. Expertise in data warehousing, ETL architecture and data profiling using
Informatica PowerCenter 8.6/8.5/8.1/7.1/6.2 Client and Server tools, and
in designing and building Enterprise Data Warehouses/Data Marts.
. Adept at understanding business processes / requirements and implementing
them through mappings and transformations.
. Involved in Database design, entity relationship modeling and dimensional
modeling using Star and Snowflake schemas.
. Extensively worked with mappings using different transformations like
Filter, Joiner, Router, Source Qualifier, Expression, Union, Update
Strategy, Unconnected / Connected Lookup, Aggregator and SCD Type-2.
. Experience in tuning Mappings and Sessions for better Performance.
. Experience in loading various data sources like Oracle, SQL Server,
Teradata, DB2 and Flat Files into Datamarts.
. Experience in requirement gathering and documenting.
. Worked in Production support team for maintaining the mappings, sessions
and workflows to load the data in Data warehouse.
. Experience in performance tuning and debugging of existing ETL processes.
. Experience in creating, scheduling and running Sessions/Tasks, Workflows
and Batch processes using Workflow Manager or the pmcmd command-line
utility.
. Experience in Oracle 10g/9i/8i.
. Experience in writing Triggers, Stored Procedures, Functions and
Packages using PL/SQL.
. Experience in OLAP Tools like Business Objects 6/5.1, Web Intelligence
6/2.6.
. Experience in the Electronics/Financial/Health Care/Insurance/Wireless/
Agricultural/Mortgage industries.
. Experience in UNIX Shell Scripting.
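Workflow runs launched outside Workflow Manager are typically driven through a shell wrapper around the pmcmd utility mentioned above. A minimal sketch follows; the integration service, domain, folder and workflow names are hypothetical, and credentials are read from environment variables (via pmcmd's -uv/-pv options) rather than hard-coded:

```shell
# Sketch of a pmcmd wrapper for starting an Informatica workflow.
# INTEG_SVC, DOMAIN, FOLDER and WORKFLOW are hypothetical placeholders.
INTEG_SVC="is_dev"
DOMAIN="dom_dev"
FOLDER="DW_LOADS"
WORKFLOW="wf_load_sales"

# Build the command; -uv/-pv tell pmcmd to read the user and password
# from the named environment variables, keeping them out of `ps` output.
CMD="pmcmd startworkflow -sv $INTEG_SVC -d $DOMAIN -uv PM_USER -pv PM_PASS -f $FOLDER -wait $WORKFLOW"

echo "$CMD"
# Uncomment on a host with the PowerCenter client installed:
# $CMD || { echo "workflow $WORKFLOW failed" >&2; exit 1; }
```

In practice a wrapper like this is what an enterprise scheduler (Autosys, Maestro, Tidal) invokes for each batch window.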
Technical Skills:
ETL Tools:          Informatica PowerCenter/PowerMart/Power Designer/Real
                    Time 9.1.1/8.6.1/7.1.3/6.2.2
Databases:          Oracle 10g/9i/8i/7.3, MS SQL Server 2000/2005,
                    Teradata V2R4/V2R3, DB2 UDB 7.1, MySQL, MS Access,
                    Netezza
Modelling Tools:    Erwin 4.5
Programming Skills: SQL, T-SQL, PL/SQL, UNIX shell scripting, Perl,
                    Java 1.4, HTML 4+, CSS, JavaScript
Tools/Utilities:    TOAD 7.x/8.x/9.x, WinSQL, SQL Navigator,
                    UltraEdit-32, WinSCP, Harvest, Maestro, Crystal
                    Reports 7/8, Business Objects 5.x/6.x, Tidal, PuTTY,
                    MS Office
Scripts:            Perl, UNIX shell scripting, JavaScript
PROFESSIONAL EXPERIENCE
Rackspace Hosting, San Antonio, TX
Feb'12 - Current
Role: Sr. Informatica Developer
. Exposure to all phases of data integration and/or warehousing life cycle
including planning, analysis, design, development, implementation and
maintenance of data and meta-data.
. Worked with business requirement team, analysts and developers to ensure
stakeholder's requirements are addressed through integration solutions in
a manner that aligns with timelines, budgets and quality expectations.
. Extracted, transformed and loaded data using Informatica PowerCenter
9.1.0.
. Designed and developed mappings, transformations, sessions and
workflows in Informatica PowerCenter.
. Created external data transformation components.
. Performed production migrations of data transformation services.
. Worked with the multiple file formats, relational database systems and
different application platforms.
. Ensured adherence to locally defined standards for all developed
components.
. Provided technical documentation of Source and Target mappings.
. Good exposure to data analysis, data profiling, data management,
troubleshooting, reverse engineering and performance tuning at various
levels of the ETL process.
. Strong experience in performance tuning for Informatica processing.
. Ensured performance metrics were met and tracked.
. Wrote and maintained unit tests.
. Supported the development and design of the internal data integration
framework.
. Participated in design and development reviews.
. Worked with system owners to resolve source data issues and refine
transformation rules.
. Conducted QA reviews and performed product migrations.
. Developed data transformation designs.
Environment: Informatica PowerCenter 9.1.0, Informatica Analyst, Oracle
11g, MS SQL Server 2008, MySQL, TOAD 11, Erwin and Linux.
Avnet Inc., Phoenix, AZ
Apr'11-Jan'12
Role: Sr. ETL Developer
One of the world's largest trans-national electronics distributors of
electronic parts, enterprise computing and storage products and embedded
subsystems, Avnet provides a vital link in the technology supply chain.
I worked on QRT, a quoting system that determines the current
availability of parts with the best possible quote.
Responsibilities:
. Gathered requirement changes from the functional team and incorporated
them in Informatica and Business Objects.
. Interacted directly with business users and the Data Architect on
ongoing changes to the data warehouse design.
. Designed the ETL processes using Informatica tool to load data from
Oracle, flat files into the target Oracle Database.
. Followed & Implemented Standards for BI & DW at various levels of the
SDLC.
. Developed complex mappings in Informatica to load the data from various
sources using transformations like Source Qualifier, Expression, Lookup
(connected and unconnected), Aggregator, Update Strategy, Filter, Router,
Transaction Control etc.
. Used Informatica workflow Manager to create, schedule, monitor and send
the messages in case of process failures.
. Designed SQL queries with multiple joins to pull relevant data during
the import stage.
. Designed and modified PL/SQL Stored Procedures to modify data flow.
. Used triggers in order to enforce business rules.
. Developed FTP scripts to send the data extracts to various downstream
applications using Informatica.
. Provided support for user BO report issues and Informatica loading
issues.
. Tuned and improved the performance of Informatica jobs. Translated
business requirements into Informatica mappings. Performed unit testing
of mappings.
. Delegated and tracked change requests in Informatica.
. Created the transformation routines to transform and load the data.
Developed processes for automation of loading data using parameter driven
sessions for batch schedule processes, verification and re-conciliation
of data stored in several different source systems.
. Worked with analysts and data source systems experts to map requirements
to ETL code.
Environment: Informatica PowerCenter 9.1/8.6, TOAD, PL/SQL Developer, Data
Mining, Oracle, DB2, Teradata, Erwin 4.0, Windows 2000, XML, SQL, PL/SQL,
UNIX/Perl/shell scripting.
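The FTP scripts used above to send data extracts downstream can be sketched as a post-session shell script. Everything here is a hypothetical stand-in (host, credentials, directory and file names), and the actual transfer is left commented out since it needs a live FTP endpoint:

```shell
# Sketch of a post-session FTP delivery script for a data extract.
# EXTRACT_DIR, the file naming and the downstream host are hypothetical.
EXTRACT_DIR="/tmp/qrt_extracts"
STAMP=$(date +%Y%m%d)
EXTRACT_FILE="qrt_quotes_${STAMP}.dat"

mkdir -p "$EXTRACT_DIR"
: > "$EXTRACT_DIR/$EXTRACT_FILE"   # stands in for the session's output file

# Compress before shipping to cut transfer time.
gzip -f "$EXTRACT_DIR/$EXTRACT_FILE"

# Deliver to the downstream application (disabled in this sketch):
# ftp -n downstream.example.com <<EOF
# user ftpuser ftppass
# binary
# put $EXTRACT_DIR/$EXTRACT_FILE.gz
# bye
# EOF
echo "prepared $EXTRACT_FILE.gz"
```

A script like this would be configured as an Informatica post-session success command so the extract ships only after a clean load.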
PNC Bank, Pittsburgh, PA
Dec'09- Mar'11
Role: Sr. Informatica Developer
. Analyze, design, develop, test, install, and configure the data
warehousing applications. Conduct unit testing, integration testing and
system testing of the mappings and writing the Unit and System Test Plan.
. Responsible for Production Turnover for moving Informatica objects
(workflows/mappings/ sessions), Oracle objects (functions, stored
procedures, DDL queries), UNIX objects (shell scripts) and mainframe jobs
from Test to QA to PROD. Migrating Harvest packages (UNIX shell scripts)
from Test to QA to Production Environments. Creating PTMs and DTG
requests for developers in order to migrate objects to the Production
Environment.
. Create Sessions and Workflows to load data from the Oracle Databases that
were hosted on UNIX servers.
. Write complex SQL override scripts at the source qualifier level to
avoid Informatica Joiners and Lookups, improving performance given the
heavy data volume. Execute jobs in the CA7 scheduler (mainframe
scheduler).
. Modify the existing ETL code (mappings, sessions and workflows) and the
shell scripts as per the user requirements. Monitoring workflows/mappings
in Informatica for successful execution.
. Create and modify Oracle stored procedures to implement complex
business logic.
. Serve as point of contact between the developer and the administrators
for communications pertaining to successful execution of job. Resolve
issues that cause the production jobs to fail by analyzing the ETL code
and log files created by failed jobs on the Informatica server. Utilize
Lotus Notes for communicating with the developers as well as the managers
and upload documents in SharePoint.
. Develop mapping logic using various transformations like Expression,
Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update
strategy and Sequence Generator.
. Provide backup support of databases, ETL applications and reporting
environment.
. Involved in analysis of data for creation of test data and improving
data quality by embedding logic in mapping programs. Create high-level
technical documentation. Involved in developing design documents.
. Participate in weekly end user meetings to discuss data quality,
performance issues and ways to improve data accuracy and new
requirements, etc.
Environment: Informatica Power Center 8.6.1, Oracle 11g/10g, SQL Server,
PL/SQL, TOAD 9.5.1, SQL Plus, UNIX Shell Scripting, Mainframes, CA7
scheduler.
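The failed-job triage described above (analyzing log files written by failed jobs on the Informatica server) can be sketched as a small scan script. The log directory and message format below are hypothetical stand-ins for the real session logs:

```shell
# Sketch of a failed-session triage script: scan log files for error
# markers and list the offending sessions. Paths and the
# "Severity: ERROR" format are hypothetical examples.
LOG_DIR="/tmp/infa_logs"
mkdir -p "$LOG_DIR"

# Two sample logs standing in for real session logs.
echo "Severity: INFO  Message: commit complete" > "$LOG_DIR/s_m_load_acct.log"
echo "Severity: ERROR Message: ORA-01400 cannot insert NULL" > "$LOG_DIR/s_m_load_txn.log"

# grep -l prints only the names of files containing a match.
FAILED=$(grep -l "Severity: ERROR" "$LOG_DIR"/*.log)
echo "failed sessions: $FAILED"
```

The output of such a scan is what would be forwarded (via Lotus Notes or SharePoint, per the process above) to the owning developer.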
Hoffman La Roche,
May'08 - Nov'09
Role: Sr. ETL Analyst/Programmer
Sales and Marketing Data Warehouse
The Sales and Marketing Data Warehouse is an extraction of data from the
Siebel Sales Force application and legacy systems. The other sources
integrated are DDD and Xponent data from IMS, loaded into an ODS
(Operational Data Store). This central location of data makes it easier
for Roche to acquire tools to create and manage common reports, so that
each group can benefit from or leverage work done by other groups. The
existing mainframe system was replaced by the data warehouse, which
saves the company millions of dollars every year.
Responsibilities:
. Involved in Data modeling and design of data warehouse in star schema
methodology with conformed and granular dimensions and FACT tables.
. Using Informatica Repository Manager, maintained all the repositories
of various applications, created users and user groups, and managed
security access control.
. Analyzed existing transactional database schemas and designed star
schema models to support the users' reporting needs and requirements.
. Created Informatica Mappings to load data using transformations like
Source Qualifier, Sorter, Aggregator, Expression, Joiner, Connected and
Unconnected lookups, Filters, Sequence, Router and Update Strategy.
. Worked on Xponent Data provided by IMS.
. Extensively used SQL*Loader and Informatica to extract, transform and
load data from MS SQL Server, flat files and Oracle into Oracle.
. Extracted data from the mainframe system using the PowerExchange tool,
and extracted DB2 tables using the Shadow Direct tool.
. Performed security and user management and repository backups using
the same tool.
. Implemented Slowly Changing Dimensions (SCDs, Both Type 1 & 2).
. Cleansed the source data, extracted and transformed data with business
rules, and built reusable mappings, known as 'Mapplets' using Informatica
Designer.
. Involved in the development of Informatica mappings and also performed
tuning for better performance.
. Used parallel processing capabilities, session partitioning and target
table partitioning utilities.
. Wrote a UNIX process for loading zipped versions of files, thereby
improving load time and saving disk space.
. Extensively worked on tuning (Both Database and Informatica side) and
thereby improving the load time.
. Extensively used PL/SQL for creating packages, procedures and functions.
. Automated the entire process using UNIX shell scripts.
. Used Autosys to schedule UNIX shell scripts, PL/SQL scripts and
Informatica jobs.
. Wrote UNIX shell scripts for getting the data from all systems into
the data warehousing system. The data was standardized to store various
business units in tables.
. Tested all the applications, transported the data to the target
warehouse Oracle tables, and used the Test Director tool to report bugs
and track fixes.
. Tested the target data against source system tables by writing some QA
Procedures.
. Created Migration Documentation and Process Flow for mappings and
sessions.
. Presented the advanced features of Business Objects to users and
developers to enable them to develop queries and reports easily.
Environment: Informatica Power Center 5.1/6.2, HP UNIX, Windows NT, DB2,
AS/400, Oracle 8i, SQL, PL/SQL, SQL*Loader, TOAD, SQL-Navigator.
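The zipped-file load mentioned above works by streaming compressed source files straight to the loader through a pipe, so an uncompressed copy never lands on disk. A minimal sketch, with hypothetical paths and `wc` standing in for the real SQL*Loader step:

```shell
# Sketch of loading from compressed files without unzipping to disk.
# DATA_DIR and the sample data are hypothetical.
DATA_DIR="/tmp/dw_inbound"
mkdir -p "$DATA_DIR"

# Create a small compressed sample standing in for a source extract.
printf '1|ACME|100\n2|GLOBEX|200\n' > "$DATA_DIR/sales.dat"
gzip -f "$DATA_DIR/sales.dat"

# Stream the compressed file through a pipe; in the real process this
# pipe fed the loader instead of wc.
ROWS=$(gunzip -c "$DATA_DIR/sales.dat.gz" | wc -l | tr -d ' ')
echo "loaded $ROWS rows"
```

Keeping the files gzipped end to end is what yields both the disk-space saving and the load-time improvement claimed above, since less I/O is done overall.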
PayPal, San Jose, CA
July'07 - Apr'08
Role: Data Warehouse ETL Developer
Description:
. ECTL of Oracle tables to Teradata using Informatica and BTEQ scripts.
Migrated SAS code to Teradata BTEQ scripts.
. Predicted current high-value sellers who are likely to churn based on
their similarities to past high-value sellers. Identified the dollar
opportunity from reducing churn of high-value sellers.
Responsibilities:
. Responsible for preparing the technical specifications from the
business requirements.
. Analyzed requirements and worked out solutions. Developed and
maintained detailed project documentation.
. Used Informatica and generated flat files to load data from Oracle to
Teradata, with BTEQ/FastLoad scripts for incremental loads. Used
staging and DW table concepts to load data and applied the Star Schema
concept. Created UDFs in a Java transformation to complete some tasks.
. Designed, developed and implemented the ECTL process for the Marketing
team for existing tables in Oracle. Wrote BTEQ scripts to support the
project.
. Wrote stored procedures in PL/SQL and UNIX Shell Scripts for Automated
execution of jobs
. Used the ClearCase version control system to manage code in different
code streams.
. Performed data-oriented tasks on Master Data projects especially
Customer/Party, like standardizing, cleansing, merging, de-duping,
determining survivorship rules.
. Responsible for the design, development, testing and documentation of
the Informatica mappings, PL/SQL, Transformation, jobs based on Paypal
standards.
. Initiated, defined, managed and enforced DW data QA processes across
applications, and interacted with the QA and data quality teams.
. Identify opportunities for process optimization, process redesign and
development of new process.
. Anticipate & resolve data integration issues across applications and
analyze data sources to highlight data quality issues. Did performance
and analysis for Teradata Scripts.
. Migrated SAS code to Teradata BTEQ scripts to compute scores, taking
into account parameters such as login details and transaction dollar
amounts. Worked with marketing data for various reports.
Environment: Oracle 9i, Informatica PowerCenter 8.1, PL/SQL, Teradata
V2R6, Teradata SQL Assistant, FastLoad, BTEQ scripts, SAS code,
ClearCase, Java, Perl scripts, XML sources.
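Incremental BTEQ loads like those described above are commonly wrapped in a shell script that generates the BTEQ input and then invokes the client. In this sketch the logon string, database and table names are hypothetical, and the actual bteq call is commented out since it needs a Teradata client:

```shell
# Sketch of a wrapper that generates a Teradata BTEQ script for an
# incremental load. Logon credentials and table names are hypothetical.
BTEQ_SCRIPT="/tmp/incr_load.bteq"

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_pass;
-- Insert only rows newer than the current high-water mark.
INSERT INTO dw.seller_fact
SELECT s.seller_id, s.txn_amt, s.txn_date
FROM   stage.seller_stg s
WHERE  s.txn_date > (SELECT MAX(txn_date) FROM dw.seller_fact);
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

echo "wrote $BTEQ_SCRIPT"
# On a host with the Teradata utilities installed:
# bteq < "$BTEQ_SCRIPT"
```

The `.IF ERRORCODE` dot command makes the wrapper's exit status usable by a scheduler, so a failed insert stops the downstream batch.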
Centre for Good Governance, Hyderabad, India
Oct'06- Jun'07
Role: ETL Informatica Developer
This project helps the Government of Andhra Pradesh increase crop
productivity, improve the quality of agricultural produce, conserve and
sustainably use water, raise fertilizer use efficiency, and save on
fertilizer and labour expenses.
Responsibilities:
. Contributed to the development of system requirements and design
specifications
. Participated in the design and development of Dimensional modelling.
. Developed complex mappings in Informatica to load the data from
various sources using transformations like Source Qualifier, Lookup
(connected and unconnected), Expression, Aggregator, Update Strategy,
Joiner, Filter and Router
. Developed Mapplets to implement business rules using complex logic
. Converted the PL/SQL Procedures and SQL*Loader scripts to Informatica
mappings
. Tuned the Sessions for better performance by eliminating various
performance bottlenecks
. Created and scheduled Sessions and Batches through the Informatica
Server Manager
. Wrote UNIX shell scripts to automate the data transfer (FTP) process
to and from the source systems and to schedule weekly and monthly
loads/jobs
Environment: Informatica PowerCenter 6.2, Business Objects, Oracle
Applications 11i, Oracle 9i, SQL Server, SQL*Loader, HP-UX, ERwin 4.0,
Test Director, WinRunner.
EDUCATION
Bachelor of Engineering, INDIA