Ismail Mohammed
************@*****.***
Ph. # 972-***-****
Summary:
. IT professional with 9 years of experience in the design, development,
and implementation of ETL and RDBMS projects for the Financial, Banking,
Pharmacy, Insurance, and Utilities industries.
. Expertise in data warehousing, ETL architecture, and data profiling using
Informatica PowerCenter 8.6/8.5/8.1/7.1/6.2 (client and server tools), and
in designing and building Enterprise Data Warehouses and Data Marts.
. Adept at understanding business processes / requirements and implementing
them through mappings and transformations.
. Involved in Database design, entity relationship modeling and dimensional
modeling using Star and Snowflake schemas.
. Extensively worked with mappings using transformations such as Filter,
Joiner, Router, Source Qualifier, Expression, Union, Update Strategy,
Unconnected/Connected Lookup, Aggregator, and SCD Type-2 (a sketch of the
SCD Type-2 pattern follows this list).
. Experience in tuning Mappings and Sessions for better Performance.
. Experience in loading various data sources like Oracle, SQL Server,
Teradata, DB2 and Flat Files into Datamarts.
. Experience in requirement gathering and documenting.
. Worked in Production support team for maintaining the mappings, sessions
and workflows to load the data in Data warehouse.
. Experience in Performance Tuning and Debugging of existing ETL processes.
. Experience in preparing, scheduling, and running Sessions/Tasks,
Workflows, and Batch processes using Workflow Manager or the pmcmd command.
. Experience in Oracle 10g/9i/8i.
. Experience in writing Triggers, Stored Procedures, Functions, and
Packages using PL/SQL.
. Experience in OLAP Tools like Business Objects 6/5.1, Web Intelligence
6/2.6.
. Experience in Electronics /Financial/ Health Care/ Insurance/ Wireless/
Agricultural/Mortgage Industries.
. Experience in UNIX Shell Scripting.
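For illustration only, the core of the SCD Type-2 pattern mentioned above
can be sketched as two SQL statements; the tables and columns here
(dim_customer, stg_customer, etc.) are hypothetical names, not taken from
any particular project:

    -- Step 1: expire the current dimension rows whose attributes changed
    -- (NULL handling omitted for brevity).
    UPDATE dim_customer d
       SET d.effective_end = TRUNC(SYSDATE) - 1,
           d.current_flag  = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.name <> d.name OR s.city <> d.city));

    -- Step 2: insert a new current row for new and changed customers.
    INSERT INTO dim_customer
           (customer_id, name, city, effective_start, effective_end,
            current_flag)
    SELECT s.customer_id, s.name, s.city, TRUNC(SYSDATE),
           DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');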
SKILLS:
ETL Tools:          Informatica PowerCenter/PowerMart/Power Designer/
                    Real Time 9.1.1/8.6.1/7.1.3/6.2.2
Databases:          Oracle 10g/9i/8i/7.3, MS SQL Server 2000/2005,
                    Teradata V2R4/V2R3, DB2 UDB 7.1, MySQL, MS Access,
                    MS Office, Netezza
Modeling Tools:     Erwin 4.5
Programming Skills: SQL, T-SQL, PL/SQL, UNIX shell scripting, Perl,
                    Java 1.4, HTML 4+, CSS, JavaScript
Tools/Utilities:    TOAD 7.x/8.x/9.x, WinSQL, SQL Navigator,
                    UltraEdit-32, WinSCP, Harvest, Maestro,
                    Crystal Reports 7/8, Business Objects 5.x/6.x,
                    Tidal, PuTTY
Scripts:            Perl, UNIX shell scripts, JavaScript
PROFESSIONAL EXPERIENCE
Rackspace Managed Hosting, San Antonio, TX
Feb '12 - Present
Role: Sr. ETL Developer/Data Analyst/On-Site Lead
Rackspace, a service leader in cloud computing, delivers enterprise-level
hosting services to businesses of all sizes around the world. The project,
Polaris-QB, migrated complete customer and transaction data from
QuickBooks/HMDB (PostgreSQL) to the Oracle Billing and Revenue Management
(BRM) system. Using Informatica PowerCenter, I designed ETL mappings that
moved data from the source system (QB) to staging and then to XML, which
was fed through CMT/Opcode and loaded into BRM.
Responsibilities:
. Interacted with business users and gathered requirements based on
changing needs. Incorporated identified factors into Informatica mappings
to build the Data Mart.
. Managed a team of 5 (onsite-offshore).
. Developed a standard ETL framework to enable the reusability of similar
logic across the board. Involved in System Documentation of Dataflow and
methodology.
. Involved in documenting HLD and LLD.
. Performed POCs on various business scenarios to facilitate business
decisions.
. Used Informatica Designer to create complex mappings using different
transformations like Filter, Router, Connected & Unconnected lookups,
Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator
transformations to pipeline data to Data Mart.
. Wrote shell scripts to process and modify the generated XML files.
. Developed PL/SQL procedures and functions for implementing business
logic (a sketch appears after this job entry).
. Conducted database testing to check constraints, field sizes, indexes,
stored procedures, etc.
. Created testing metrics using MS Excel.
. Implemented various Performance Tuning techniques on Sources, Targets,
Mappings, and Workflows.
. Defects were tracked, reviewed and analyzed.
. Validation of Informatica mappings for source compatibility due to
version changes at the source level.
. Used Source Analyzer and Warehouse designer to import the source and
target database schemas, and the Mapping Designer to map the sources to
the target.
. Implemented address cleansing of all customer addresses migrated from
the legacy systems into Oracle BRM for proper taxation.
. Standardized addresses for various countries using Informatica Data
Quality's Address Doctor transformation.
. Performed data profiling of the product data.
Environment/Tools:
. Informatica PowerCenter 9.1, Oracle 11g, PostgreSQL, UNIX, Informatica
Data Quality (IDQ), Informatica Data Profiling, QuickBooks, Oracle BRM, XML
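As an illustration of the PL/SQL helpers mentioned above, a minimal sketch
of a cleansing function follows; the function name, logic, and mappings are
hypothetical, not the actual project code:

    -- Hypothetical helper: map free-text country values to ISO codes
    -- before address validation.
    CREATE OR REPLACE FUNCTION normalize_country (p_country IN VARCHAR2)
      RETURN VARCHAR2
    IS
    BEGIN
      RETURN CASE UPPER(TRIM(p_country))
               WHEN 'UNITED STATES'  THEN 'US'
               WHEN 'USA'            THEN 'US'
               WHEN 'UNITED KINGDOM' THEN 'GB'
               ELSE UPPER(TRIM(p_country))
             END;
    END normalize_country;
    /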
Avnet Inc., Phoenix, AZ
Apr'11-Jan'12
Role: Sr. ETL Developer
One of the world's largest transnational distributors of electronic parts,
enterprise computing and storage products, and embedded subsystems, Avnet
provides a vital link in the technology supply chain. I worked on QRT, a
quoting system that finds the best possible quote based on the current
availability of parts.
Responsibilities:
. Gathered requirement changes from the functional team and incorporated
them in Informatica and Business Objects.
. Interacted directly with business users and the data architect on
ongoing changes to the data warehouse design.
. Designed the ETL processes using Informatica tool to load data from
Oracle, flat files into the target Oracle Database.
. Followed & Implemented Standards for BI & DW at various levels of the
SDLC.
. Developed complex mappings in Informatica to load the data from various
sources using transformations like Source Qualifier, Expression, Lookup
(connected and unconnected), Aggregator, Update Strategy, Filter, Router,
Transaction Control etc.
. Used Informatica Workflow Manager to create, schedule, and monitor
workflows and to send messages in case of process failures.
. Designed SQL queries with multiple joins to pull relative data during
import state.
. Designed and modified PL/SQL Stored Procedures to modify data flow.
. Used triggers to enforce business rules (a sketch appears after this
job entry).
. Developed FTP scripts to send the data extracts to various downstream
applications using Informatica.
. Provided support for user BO report issues and Informatica loading
issues.
. Tuned and improved the performance of Informatica jobs. Translated
business requirements to Informatica mappings. Involved in unit testing
of mappings.
. Delegated and tracked change requests in Informatica.
. Created the transformation routines to transform and load the data.
Developed processes to automate data loading using parameter-driven
sessions for batch schedules, and for verification and reconciliation of
data stored in several different source systems.
. Worked with analysts and data source systems experts to map requirements
to ETL code.
Environment: Informatica PowerCenter 9.1/8.6, TOAD, PL/SQL Developer, Data
Mining, Oracle, DB2, Teradata, Erwin 4.0, Windows 2000, XML, SQL, PL/SQL,
UNIX/Perl/Shell scripting.
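A minimal sketch of the kind of business-rule trigger mentioned above; the
table, columns, and rule are hypothetical:

    -- Hypothetical rule: reject quotes whose unit price falls below cost.
    CREATE OR REPLACE TRIGGER trg_quote_check
      BEFORE INSERT OR UPDATE ON quotes
      FOR EACH ROW
    BEGIN
      IF :NEW.unit_price < :NEW.unit_cost THEN
        RAISE_APPLICATION_ERROR(-20001,
          'Quote price below cost is not allowed');
      END IF;
    END;
    /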
State Farm Insurance, Bloomington, IL
Dec'09 - Mar'11
Role: Sr. Informatica Developer
State Farm Insurance is one of the nation's leading auto insurance
companies, with more than 10 million policyholders. The Insurance Data
Warehousing project was developed to streamline the entire insurance
process and to track various financials on a daily, weekly, and monthly
basis. Policy, Premium, Claims, and Payments were the primary modules we
dealt with.
Responsibilities:
. Interacted with business community and gathered requirements based on
changing needs. Incorporated identified factors into Informatica
mappings to build the DataMart.
. Developed a standard ETL framework to enable the reusability of
similar logic across the board. Involved in System Documentation of
Dataflow and methodology.
. Assisted in designing Logical/Physical Data Models, forward/reverse
engineering using Erwin 4.0.
. Developed mappings to extract data from SQL Server, Oracle, Flat files
and load into DataMart using the PowerCenter.
. Developed common routine mappings. Made use of mapping variables,
mapping parameters and variable functions.
. Used Informatica Designer to create complex mappings using different
transformations like Filter, Router, Connected & Unconnected lookups,
Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator
transformations to pipeline data to DataMart.
. Wrote procedures and queries to retrieve data from the DWH and
implemented them in the Data Mart (a sketch appears after this job
entry).
. Wrote UNIX shell scripts to execute the workflows.
. Applied expert knowledge of SQL queries, triggers, and PL/SQL
procedures and packages to apply and maintain business rules.
. Conducted database testing to check constraints, field sizes, indexes,
stored procedures, etc.
. Created testing metrics using MS Excel.
. Implemented various Performance Tuning techniques on Sources, Targets,
Mappings, and Workflows.
. Defects were tracked, reviewed, and analyzed.
. Validation of Informatica mappings for source compatibility due to
version changes at the source level.
. Performed configuration management to migrate Informatica
mappings/sessions/workflows from the Development to the Test to the
Production environment.
. Worked closely with the business analysts' team to resolve problem
tickets and service requests. Helped the 24/7 production support team.
Environment: Informatica PowerCenter 8.6/8.5, SQL Server 2005/2000,
Microsoft Visual Studio 2005, .NET Framework 2.0, Oracle 9i, SQL, PL/SQL,
IBM AIX, UNIX shell scripts, Actuate reporting tool, SSRS, Erwin, STAR
Team, Remedy.
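A minimal sketch of the kind of DWH-to-Data-Mart load procedure mentioned
above; all table and column names are hypothetical:

    -- Hypothetical daily load: rerunnable because the target rows for
    -- the load date are cleared first.
    CREATE OR REPLACE PROCEDURE load_daily_premiums (p_load_date IN DATE)
    IS
    BEGIN
      DELETE FROM dm_premium_daily WHERE load_date = p_load_date;

      INSERT INTO dm_premium_daily (policy_id, premium_amt, load_date)
      SELECT f.policy_id, SUM(f.premium_amt), p_load_date
        FROM dwh_premium_fact f
       WHERE f.txn_date = p_load_date
       GROUP BY f.policy_id;

      COMMIT;
    END load_daily_premiums;
    /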
Hoffman La Roche
May'08 - Nov'09
Role: Sr. ETL Analyst/Programmer
Sales and marketing Data Warehouse
The Sales and Marketing Data Warehouse is an extraction of data from the
Siebel Sales Force application and legacy systems. The other integrated
sources are DDD and Xponent data from IMS, loaded into an ODS (Operational
Data Store). This central location of data makes it easier for Roche to
acquire tools to create and manage common reports, so that each group can
leverage work done by other groups. The existing mainframe system was
replaced by the data warehouse, saving the company millions of dollars
every year.
Responsibilities:
. Involved in Data modeling and design of data warehouse in star schema
methodology with conformed and granular dimensions and FACT tables.
. Using Informatica Repository Manager, maintained all the repositories of
various applications; created users and user groups and managed security
access control.
. Analyzed existing transactional database schemas and designed star
schema models to support the users' reporting needs and requirements.
. Created Informatica Mappings to load data using transformations like
Source Qualifier, Sorter, Aggregator, Expression, Joiner, Connected and
Unconnected lookups, Filters, Sequence, Router and Update Strategy.
. Worked on Xponent Data provided by IMS.
. Extensively used SQL*Loader and Informatica to extract, transform, and
load data from MS SQL Server, flat files, and Oracle into Oracle.
. Used the PowerExchange tool to extract data from the mainframe system
and the Shadow Direct tool to extract DB2 tables.
. Performed security and user management and repository backups using the
same tool.
. Implemented Slowly Changing Dimensions (SCDs, Both Type 1 & 2).
. Cleansed the source data, extracted and transformed data with business
rules, and built reusable mappings, known as 'Mapplets' using Informatica
Designer.
. Involved in the development of Informatica mappings and also performed
tuning for better performance.
. Used parallel processing capabilities, session partitioning, and target
table partitioning utilities.
. Wrote a UNIX process to load zipped versions of files, improving load
time and saving disk space.
. Extensively worked on tuning (Both Database and Informatica side) and
thereby improving the load time.
. Extensively used PL/SQL for creating packages, procedures, and functions
(a sketch appears after this job entry).
. Automated the entire process using UNIX shell scripts.
. Used Autosys to schedule UNIX shell scripts, PL/SQL scripts, and
Informatica jobs.
. Wrote UNIX shell scripts to bring data from all source systems into the
data warehousing system; the data was standardized to store the various
business units in tables.
. Tested all the applications, transported the data to the target
warehouse Oracle tables, and used the Test Director tool to report and
track bugs through to resolution.
. Tested the target data against source system tables by writing QA
procedures.
. Created Migration Documentation and Process Flow for mappings and
sessions.
. Presented the advanced features of Business Objects to users and
developers to enable them to develop queries and reports easily.
Environment: Informatica PowerCenter 5.1/6.2, HP-UX, Windows NT, DB2,
AS/400, Oracle 8i, SQL, PL/SQL, SQL*Loader, TOAD, SQL Navigator.
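A minimal sketch of the kind of PL/SQL packaging mentioned above; the
package, tables, and columns are hypothetical:

    CREATE OR REPLACE PACKAGE sales_dw_pkg AS
      PROCEDURE load_dim_product;
    END sales_dw_pkg;
    /

    CREATE OR REPLACE PACKAGE BODY sales_dw_pkg AS
      -- Hypothetical Type-1 refresh of a product dimension from staging.
      PROCEDURE load_dim_product IS
      BEGIN
        MERGE INTO dim_product d
        USING stg_product s
           ON (d.product_id = s.product_id)
         WHEN MATCHED THEN
           UPDATE SET d.product_name = s.product_name
         WHEN NOT MATCHED THEN
           INSERT (product_id, product_name)
           VALUES (s.product_id, s.product_name);
        COMMIT;
      END load_dim_product;
    END sales_dw_pkg;
    /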
PayPal, San Jose, CA
July'07 - Apr'08
Role: Data Warehouse ETL Developer
Description:
. ECTL of Oracle tables to Teradata using Informatica and BTEQ scripts;
migrated SAS code to Teradata BTEQ scripts.
. Predict current high-value sellers who are likely to churn based on
their similarities to past high-value sellers; identify the dollar
opportunity from reducing churn of high-value sellers.
Responsibilities:
. Responsible for preparing the technical specifications from the
business requirements.
. Analyzed the requirements and worked out the solution. Developed and
maintained detailed project documentation.
. Used Informatica with generated flat files to load data from Oracle to
Teradata, and BTEQ/FastLoad scripts for incremental loads. Used the
stage/work/DW table concept to load data and applied the Star Schema
concept. Created UDFs in the Java transformation to complete some tasks.
. Designed, developed, and implemented the ECTL process for the Marketing
team for existing tables in Oracle. Wrote BTEQ scripts to support the
project (a sketch of the SQL embedded in such scripts appears after this
job entry).
. Wrote stored procedures in PL/SQL and UNIX shell scripts for automated
execution of jobs.
. Used a version control system (ClearCase) to manage code in different
code streams.
. Performed data-oriented tasks on Master Data projects especially
Customer/Party, like standardizing, cleansing, merging, de-duping,
determining survivorship rules.
. Responsible for the design, development, testing, and documentation of
the Informatica mappings, PL/SQL, transformations, and jobs based on
PayPal standards.
. Initiated, defined, and managed the implementation and enforcement of
DW data QA processes; interacted with the QA and data quality teams.
. Identified opportunities for process optimization, process redesign,
and development of new processes.
. Anticipated and resolved data integration issues across applications
and analyzed data sources to highlight data quality issues. Performed
performance analysis of Teradata scripts.
. Migrated SAS code to Teradata BTEQ scripts to compute scores, taking
into account parameters such as login details and transaction dollar
amounts. Worked with marketing data for various reports.
Environment: Oracle 9i, Informatica PowerCenter 8.1, PL/SQL, Teradata
V2R6, Teradata SQL Assistant, FastLoad, BTEQ scripts, SAS code, ClearCase,
Java, Perl scripts, XML sources.
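A minimal sketch of the kind of incremental-load SQL embedded in the BTEQ
scripts mentioned above; the tables are hypothetical:

    -- Load only the staged rows not already present in the target
    -- (an anti-join; BTEQ session commands omitted).
    INSERT INTO dw_seller_txn (seller_id, txn_dt, txn_amt)
    SELECT s.seller_id, s.txn_dt, s.txn_amt
      FROM stg_seller_txn s
      LEFT JOIN dw_seller_txn t
        ON t.seller_id = s.seller_id
       AND t.txn_dt    = s.txn_dt
     WHERE t.seller_id IS NULL;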
Equitable Resources, Pittsburgh, PA
Nov '06 - Jun'07
Role: ETL Developer
The customer announced its plan to acquire two natural gas companies, to
be integrated into the operation of the customer's utilities. Among the
more complex elements of this acquisition is the transition into the
customer's IT infrastructure. The customer wants to transfer the data
from the current systems of these companies to its CIS system.
Informatica 8.1 was used as the ETL tool for the integration process, and
Mercury Quality Center was used for test management.
Responsibilities:
. Understand the Functional requirements.
. Created a user ID and folder, with the necessary privileges, for each
new developer account.
. Developed mappings in Informatica to load the data from various
sources into Data Warehousing using different types of transformations
like source qualifier, expression, filter, aggregator, update
strategy, lookup, sequence generator, joiner, normalizer.
. Assisting in Informatica Administration.
. Managing Domain, Nodes, Grids, Integration Service, Repository
Service, Folders through Admin Console.
. Taking Backup of Informatica repository database.
. Managing Integration service on grid and on node.
. Used Informatica Workflow Manager to create, Schedule, execute and
Monitor Sessions and Workflows.
. Developed UNIX shell scripts as part of the ETL process to schedule
tasks/sessions.
. Migrated data mappings to production and monitored, troubleshot, and
restarted the batch process using Informatica. Handled migration from dev
to test and from test to prod.
. End-to-End Integration testing of ETL-process from source-system to
target data mart.
. Worked with static and dynamic memory caches for better throughput of
sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator
transformations.
. Scheduled sessions and batches on Informatica server using workflow
manager.
. Involved in unit testing of mappings and mapplets, as well as
integration testing and user acceptance testing.
. Used the Debugger to run Debug sessions setting Breakpoints across
instances to verify the accuracy of data.
. Experience in coding using SQL, SQL*Plus, and PL/SQL
procedures/functions.
. Extensively used ETL to load data with PowerCenter/PowerConnect from
source systems such as flat files and Excel files into staging tables,
and then loaded the data into the target database.
. Created mappings based on procedure logic to replace procedures and
functions.
. Involved in fixing invalid Mappings, testing of Stored Procedures and
Functions, testing of Informatica Sessions, and the Target Data.
. Performed Informatica and UNIX backups.
PHILIPS, Bridgeport, CT
Nov'05- Sept '06
Role: ETL Consultant
Description: Philips' concentration in the United States mirrors its
global focus on lifestyle, health care, and technology. The main
objective of this project was to analyze the sales data for Philips'
products and customers. The POS sales data contains sales, product,
inventory balance, and demographic information for Philips U.S.
operations.
Responsibilities:
. Analyzed requirements and functional documents, explained them to the
team, and imparted business knowledge to the team.
. Worked on Dimension as well as Fact tables, developed mappings and
loaded data on to the relational database.
. Augmentation of mappings to take care of new emerging requirements.
. Developed mappings (interfaces) and the data transfer strategy from
data sources to target systems.
. Developed mappings across multiple schemas/databases to load
incremental data into the dimensions.
. Extensively used filter conditions and expressions against the source
database to filter out invalid data (a SQL sketch appears after this job
entry).
. Extensively used ETL to load data from flat files, both fixed-width and
delimited, and from the relational database (Oracle 9i). Involved in data
analysis for the source and target systems, with a good understanding of
data warehousing concepts: staging tables, dimensions, facts, and Star
and Snowflake schemas.
. Good exposure to normalized and denormalized data.
. Experience in Integration of various data sources like Oracle, SQL
Server, Fixed Width and Delimited Flat Files, COBOL files & XML Files.
. Developed and tested all the Informatica mappings, sessions, and
workflows, involving several tasks.
. Worked extensively on different types of transformations like source
qualifier, expression, filter, aggregator, update strategy, lookup,
sequence generator, joiner.
. Analyzed the session and error logs for troubleshooting mappings and
sessions.
. Designed the Universe using data modeling techniques: implementing
contexts, aggregate awareness, hierarchies, and predefined conditions;
linking and joining tables; indicating cardinalities; creating aliases to
resolve loops; subdividing into contexts; and creating objects grouped
into classes.
. Generated reports using Deski and InfoView with analytical ability and
creativity, using multiple data providers, synchronization, BO formulas,
variables, sections, breaks, formatting, drilldowns, hyperlinks, etc.
. Used Informatica Version Control for checking in all versions of the
objects used in creating the mappings, workflows to keep track of the
changes in the development, test and production environment.
. Responsible for providing development support to offshore team, also
involved in changing the business logic (Ralph Kimball Preferred)
depending upon the client requirements.
Environment: Informatica 7.1.2, BO 6.5, Oracle 9i, UNIX
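A minimal sketch of the kind of source-side filtering mentioned above, as
a SQL override of the sort placed in a Source Qualifier; the table and
conditions are hypothetical:

    -- Keep only valid POS rows before they enter the pipeline.
    SELECT pos.store_id, pos.product_id, pos.sale_dt, pos.qty, pos.amount
      FROM pos_sales pos
     WHERE pos.qty > 0
       AND pos.sale_dt IS NOT NULL;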
Centre for Good Governance, Hyderabad, India
Oct'04-Feb'05
Role: ETL Informatica Developer
This project helps increase crop productivity, improve the quality of
agricultural produce, conserve water and use it sustainably, raise
fertilizer use efficiency and savings, and reduce labor expenses for the
Government of Andhra Pradesh.
Responsibilities:
. Contributed to the development of system requirements and design
specifications.
. Participated in the design and development of dimensional modeling.
. Developed complex mappings in Informatica to load the data from
various sources using various transformations like Source Qualifier,
Look up (connected and unconnected), Expression, Aggregate, Update
Strategy, Joiner, Filter and Router
. Developed Mapplets to implement business rules using complex logic
. Converted the PL/SQL Procedures and SQL*Loader scripts to Informatica
mappings
. Tuned the Sessions for better performance by eliminating various
performance bottlenecks
. Created and scheduled Sessions and Batches through the Informatica
Server Manager. Wrote UNIX shell scripts to automate the data transfer
(FTP) process to and from the source systems and to schedule weekly and
monthly loads/jobs.
Environment: Informatica PowerCenter 6.2, Business Objects, Oracle
Applications 11i, Oracle 9i, SQL Server, SQL*Loader, HP UNIX, ERwin 4.0,
Test Director, WinRunner.
EDUCATION
Bachelor of Engineering, INDIA