Alapi
ETL/Informatica Developer
E-Mail: *********@*****.***
Phone: 224-***-****
Professional Summary:
. Over 7 years of experience in all phases of the Data Warehouse
Lifecycle, including Requirements Gathering, System Analysis, Design,
Development, and Validation of Data Marts and Data Warehouses using
ETL, Data Modeling and Reporting tools.
. Extensive experience in Data Integration using SQL and the ETL tool
Informatica PowerCenter 6.x/7.x/8.1.1/8.6.1/9.1.0/9.5.1.
. Experience in designing and developing ETL Solutions and Informatica
Workflows to support operational management of data solutions while
adhering to industry best practices.
. Experience in creating Reusable Tasks (Sessions, Command, Email) and
Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer,
Assignment, Worklet, Control).
. Well experienced in Error Handling and Troubleshooting using various
log files.
. Extensively worked on developing and debugging Informatica mappings,
mapplets, sessions and workflows.
. Worked on Performance Tuning, identifying and resolving performance
bottlenecks at various levels such as sources, targets, mappings and
sessions.
. Understanding and working knowledge of Informatica CDC (Change Data
Capture); implemented CDC using stored procedures, triggers and
Informatica PowerExchange.
. Experienced with Informatica Data Explorer (IDE) / Informatica Data
Quality (IDQ) tools for Data Analysis / Data Profiling and Data
Governance.
. Experience in working with Mainframe files, COBOL files, XML, and Flat
Files
. Outstanding experience in retrieving data by writing simple and
complex SQL queries and in using PL/SQL, Indexes, Functions,
Procedures, Triggers and Cursors.
. Skilled in creating Test Plans from Functional Specifications and
Detailed Design Documents; thorough with the deployment process from
DEV to QA to UAT to PROD.
. Experience in creating customized MicroStrategy Reports using the MSTR
objects and Metrics.
. Experience in Reporting tools using MicroStrategy, Business Objects,
Cognos and Hyperion. Created various Grid reports, master/detail
reports, cross tab reports and charts.
. Experience in Data Analysis and Ad-hoc Analysis as needed to support
project goals.
. Proficient in working with UNIX/Linux environments and writing UNIX
shell scripts; also worked with Java and Sybase.
. Experienced in working with tools like TOAD, SQL Server Management
Studio and SQL*Plus for development and customization.
. Experience with Teradata as the target for the data marts; worked
with BTEQ, FastLoad and MultiLoad.
. Proficient in using pmcmd and pmrep commands (see the command-line
sketch after this list).
. Experience in mentoring and providing knowledge transfer to team
members, support teams and customers.
. Excellent skills in analysis and in gathering business requirements
from end users and clients, and experience in documenting developed
applications.
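
A minimal command-line sketch of the pmcmd/pmrep usage mentioned above;
the domain, integration service, repository, folder and workflow names
are illustrative placeholders only, not values from an actual project.

    #!/bin/ksh
    # Start an Informatica workflow, wait for it to finish, then list the
    # workflows in the folder with pmrep. All names are placeholders.
    INFA_USER=etl_user
    INFA_PWD=********               # normally sourced from a secured file
    DOMAIN=Domain_Dev
    INT_SVC=IS_DEV
    REPO=REP_DEV
    FOLDER=FIN_DM
    WORKFLOW=wf_load_fin_dm

    # Kick off the workflow and block until it completes (-wait)
    pmcmd startworkflow -sv $INT_SVC -d $DOMAIN \
          -u $INFA_USER -p $INFA_PWD \
          -f $FOLDER -wait $WORKFLOW
    RC=$?
    if [ $RC -ne 0 ]; then
        echo "Workflow $WORKFLOW failed with return code $RC"
        exit $RC
    fi

    # Connect to the repository and list the workflows in the folder
    pmrep connect -r $REPO -d $DOMAIN -n $INFA_USER -x $INFA_PWD
    pmrep listobjects -o workflow -f $FOLDER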
Technical Skills:
ETL / DWH Tools: Informatica Power Center 8.1.1/8.6.1/9.1.0/9.5.1,
Informatica Power Exchange 8.1.1, SSIS, DTS.
BI / Reporting Tools: MicroStrategy Web 8.01/9.2.1/9.4.1, Hyperion 9.0
Interactive Reporting, Business Objects XIR2,
Cognos 8, SSRS, SAS
Operating Systems: UNIX (Solaris, HP-UX, AIX), Linux, MS Windows
NT/2K/2003/XP/7
Databases/Modeling Tools: Oracle 9i/10g/11g, MS SQL Server 2005/2008,
Sybase, Teradata, IBM DB2, Netezza, Erwin 4.1.4/4.1
Rational Tools: Rational Clear Case, Clear Quest, Test Manager,
Robot, RUP,
Rational Rose, Rational Requisite Pro
Utilities: Toad, SQL Developer, SQL plus, Rapid SQL, Teradata
SQL Assistant, NZSQL, SQL*Loader
Testing Tools: HP ALM Quality Center 8.2/11.00 Enterprise Edition,
QTP 8.2, Test Director
Programming: UNIX Shell Scripting, SQL, PL/SQL, C, C++, HTML,
XML, ASP.NET.
Professional Experience:
Client: American Water, Voorhees, NJ
Aug '12 - Till Date
Role: Informatica Developer
Responsibilities:
. Involved in all phases of the SDLC from requirements, design,
development, testing, training and rollout to field users, as well as
support for the production environment.
. Worked with the Business Analysts and the QA team for validation and
verification of the development.
. Involved in designing relational models for ODS and datamart.
. Data Quality Analysis to determine cleansing requirements.
. Responsible for developing, support and maintenance for
the ETL (Extract, Transform and Load) processes using Informatica
power center.
. Extracted data from flat files, Oracle, DB2, and MS SQL Server
2005/2008 and loaded the data into the target database.
. Extensively used Informatica to load data from fixed
width and delimited Flat files.
. Worked with Repository Manager, Designer, Workflow Manager and
Workflow Monitor; also imported and created Source Definitions using
Source Analyzer and Target Definitions using Warehouse Designer.
. Developed complex mappings using corresponding Sources, Targets and
Transformations like Update Strategy, Lookup, Stored Procedure, SQL,
Sequence Generator, Joiner, Aggregator, Java and Expression
transformations to extract data in compliance with the business
logic.
. Developed Informatica components in Java and Sybase environments.
. Created SSIS Packages to migrate slowly changing dimensions
. Extensively used Informatica Power center and created mappings using
transformations to flag the record using update strategy for
populating the desired slowly changing dimension tables.
. Implementation experience of CDC (Change Data Capture) using stored
procedures, triggers and Informatica PowerExchange.
. Querying and analyzing multiple databases and handling the errors as
per the client specifications.
. Created server optimized database routing and mappings and focused on
performance tuning.
. Handled large database queries and applied transformations to make
the business solution applicable to the project.
. Involved in Performance tuning.
. Created Transformations and Mappings using Informatica Designer and
processing tasks using Workflow Manager to move data from multiple
sources into targets.
. Designed dynamic SSIS packages to transfer data across different
platforms, validate and cleanse data during transfer, and archive data
files for different DBMSs.
. Wrote stored procedures in PL/SQL and UNIX Shell Scripts for automated
execution of jobs
. Developed Korn shell (ksh) scripts for Informatica pre-session and
post-session procedures (see the shell sketch after this list).
. Identified the errors by analyzing the session logs.
. Designed and created an Autosys job plan to schedule our processes.
. Worked with Mainframe files, COBOL files, XML, and Flat Files.
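
A minimal sketch of a post-session Korn shell script of the kind noted
above; the directories, file name handling and log locations are assumed
placeholders rather than actual project values.

    #!/bin/ksh
    # Post-session step: archive the processed flat file and write an
    # audit entry. Paths and names below are illustrative placeholders.
    SRC_DIR=/data/etl/incoming
    ARCH_DIR=/data/etl/archive
    LOG_FILE=/data/etl/logs/post_session.log
    FILE_NAME=$1                    # file name passed in by the session
    STAMP=`date +%Y%m%d%H%M%S`

    if [ -f $SRC_DIR/$FILE_NAME ]; then
        # Compress and move the processed file so the next run starts clean
        gzip -c $SRC_DIR/$FILE_NAME > $ARCH_DIR/${FILE_NAME}.${STAMP}.gz &&
            rm $SRC_DIR/$FILE_NAME
        echo "$STAMP archived $FILE_NAME" >> $LOG_FILE
    else
        echo "$STAMP WARNING: $FILE_NAME not found in $SRC_DIR" >> $LOG_FILE
        exit 1
    fi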
Environment: Informatica Power Center 8.6 (Informatica Designer, Repository
Manager, Workflow Manager, Workflow Monitor, Repository Server
Administration Console), SSIS, Erwin, SQL, PL/SQL, Sybase, MS SQL Server
2005/2008, Java, Oracle 11g, UDB, DB2, Flat Files, Solaris 10
National Grid, Long Island, NY
June '11 - Aug '12
Role: Sr. ETL/ BI Developer
Project: Data Warehouse / Business Intelligence Project, BI Analytics
Responsibilities:
. Responsible for developing, testing, implementing and supporting
applications. Worked closely with ETL Leads/Managers.
. Responsible for the development of Informatica Mappings, workflows,
development of UNIX (AIX) scripts and run documentation.
. Created AIX UNIX scripts to run ETL Informatica batches. Worked with
Command line utilities.
. Used SQL*Loader in UNIX scripts to load the Stage tables from
delimited Flat Files (see the sketch after this list).
. FTPed the files from one server to another.
. Tested and Documented the Financial and Other Applications in development
and in Production for AIX 5.3 TL10 OS Upgrade.
. Extensively worked with Financial Data. Did Load refresh for
BUDGET/FISCAL Year 2011. Updated the Financial Hierarchy as per the Users
request in development and in Production.
. Created the Migration document and listed out the Informatica and
Database objects that needed to be migrated to Production as per
Release Management.
. Worked on a Request for Service (RFS) from Users and created an
Extract for TAX AUDIT from the Integrated Electric Marketing Data Mart
(IEMDM) related to Revenue data. Loaded data into a Delimited Flat
File as per the requirement and created an MS Access database file.
. Created a Revenue Report for Users from Electric Marketing Data Mart.
. Used DB links in SQLs. Created views from multiple source tables and used
them in Informatica Mappings. Created DDLs for the Tables.
. Used BMC Remedy User Tool for creation of Tickets.
. Provided Production Support as needed.
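
A minimal sketch of calling SQL*Loader from a UNIX script to load a
delimited flat file into a stage table, as referenced above; the control
file contents, table, connection string and directory names are assumed
placeholders.

    #!/bin/ksh
    # Load a pipe-delimited revenue extract into a stage table.
    # Connection, table and path names are illustrative placeholders.
    CTL_DIR=/app/etl/ctl
    LOG_DIR=/app/etl/logs

    # stg_revenue.ctl (placeholder control file) would contain:
    #   LOAD DATA
    #   INFILE '/app/etl/data/revenue.dat'
    #   TRUNCATE INTO TABLE STG_REVENUE
    #   FIELDS TERMINATED BY '|' TRAILING NULLCOLS
    #   (ACCOUNT_ID, SERVICE_MONTH, REVENUE_AMT)

    sqlldr userid=stg_user/********@DEVDB \
           control=$CTL_DIR/stg_revenue.ctl \
           log=$LOG_DIR/stg_revenue.log \
           bad=$LOG_DIR/stg_revenue.bad \
           errors=50
    if [ $? -ne 0 ]; then
        echo "SQL*Loader load of STG_REVENUE failed; see the log file"
        exit 1
    fi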
Environment: Informatica PowerCenter 8.6.1, Oracle 10g/9i, SQL *Loader,
MicroStrategy 8.1.1, AIX 5.3 TL10, UNIX (AIX) Scripting, Remedy 7.0.1, Text
Editor, File Viewer, FileZilla FTP Client, MS Access, Windows XP.
Merkle, Marlborough, MA
Sept '10 - June '11
Role: ETL/BI Developer
Project: Biogen Idec Campaign Management solution (BIIB CMS) / WEB
Analytics
Responsibilities:
. Analyzed the Business Requirements for Biogen Idec's Campaign
Management Solution.
. Designed the Source and Target System for the ETL Process.
. Designed the File Handling from the client BioGen by using UNIX Scripts.
. Implemented File Validation and File Rejection techniques on the File
System.
. Extensively worked with Fixed Width and Delimited Flat Files.
. Designed and created the Informatica PowerCenter Mappings for Data
Process from the File Landing Area to the Informatica Target Directory.
. Created the Parameter Files for dynamically handling the Source Files
from the client, Biogen (see the sketch after this list).
. Extensively worked with Data Cleansing and Data Scrubbing operations
using Informatica functions.
. Followed Team Based Development for Informatica ETL Development.
. Designed the File System to send the client files to Knowledge Link
(KL), Merkle's proprietary software for standard data Hygiene and
Match Processing for Customer Name and Address Standardization as per
client business need.
. Followed the Merkle Standards for Processing the Merkle Client Files.
. Used Pre / Post Process Shell commands in Informatica and extensively
worked with UNIX shell scripts for File transfers.
. Created the Jobs and dependencies in Tidal Enterprise Scheduler for
client Files and for Informatica ETL Workflows.
. Created the High Level and Low Level Design Documents for the Informatica
Mappings.
. Followed the Standards and Requirements given by UNICA, a Campaign
Tool, to process the client Files as per UNICA Standards.
. Worked with UNICA tables like Individual, Address, Contact Preferences,
Email, Phone, Individual Address, Demographics and Individual Attribute
Extended for Customer Database.
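
A minimal sketch of a pre-process shell step that builds an Informatica
parameter file for the day's client file, along the lines described
above; the folder, workflow, session and directory names are assumed
placeholders, not actual Merkle or Biogen values.

    #!/bin/ksh
    # Find the latest landed client file and write the parameter file the
    # Informatica session reads at run time. All names are placeholders.
    LANDING_DIR=/data/cms/landing
    PARAM_FILE=/data/cms/param/wf_cust_load.par
    RUN_DATE=`date +%Y%m%d`

    # Pick up the most recent client file for this run
    SRC_FILE=`ls -1t $LANDING_DIR/BIIB_CUST_*.txt 2>/dev/null | head -1`
    if [ -z "$SRC_FILE" ]; then
        echo "No client file found in $LANDING_DIR for run $RUN_DATE"
        exit 1
    fi

    # Write the parameter file; the section header names the folder,
    # workflow and session, and the escaped $ signs stay literal
    echo "[CMS.WF:wf_cust_load.ST:s_m_cust_load]" > $PARAM_FILE
    echo "\$InputFile_Customer=$SRC_FILE" >> $PARAM_FILE
    echo "\$\$RUN_DATE=$RUN_DATE" >> $PARAM_FILE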
Environment: Informatica PowerCenter 8.6.1, Sterling, Tidal 5.0 Enterprise
Scheduler, UNIX Shell Scripting, UNICA IMOD, UltraEdit-32 Text Editor, File
Viewer, FileZilla FTP Client, SQL Server 2008, Windows XP.
PNC Financial Services, Pittsburgh, PA
May' 09 - Sept '10
Role: ETL Consultant
Project: Finance Services
Responsibilities:
. Created data model structure and ER modeling with all related entities
and relationship with each entity based on the rules provided by the
business manager using Erwin.
. Analyzed the current architecture to propose an ETL design solution
for the DWH; coordinated with source system owners, monitored
day-to-day ETL progress, and handled Data Warehouse target schema
design and maintenance.
. Involved in understanding requirements and in modeling activities of
the attributes identified from different source systems, which are in
Oracle, Teradata.
. Staged and validated the data, and finally loaded it into the Data
Warehouse using Informatica.
. Used a Type 2 mapping to update a slowly changing dimension table to
keep full history.
. Created and maintained metadata and ETL documentation that support
business routines and detailed source-to-target data mappings.
. Designed and developed complex Aggregate, Join, Router, Look up and
Update transformation rules (business rules).
. Coded using Teradata BTEQ SQL; wrote UNIX scripts to validate, format
and execute the SQLs in the UNIX environment (see the BTEQ sketch
after this list).
. Resolved various defects in a set of wrapper scripts that executed
the Teradata BTEQ, MLOAD and FLOAD utilities and UNIX Shell scripts.
. Knowledge in generating Mload and Tpump scripts to load the data into
Teradata tables.
. Used Teradata manager, Index Wizard and PMON utilities to improve the
performance of mappings.
. Carried out performance-tuning on data sources and targets using
different methods, executing the mappings remotely from Informatica
Power center.
. Worked on Informatica Data Quality (IDQ) in driving better business
outcomes with trusted enterprise data and empowering the data quality
and data governance.
. Worked on Testing and validation of the developed Informatica
mappings.
. Wrote PL/SQL Packages, procedures and functions in Oracle for
business rules conformity.
. Involved in Unit Testing, Integration, and User Acceptance Testing of
Mappings.
. Worked with Senior Developer in Documenting the ER Diagrams, Logical
and Physical models, business process diagrams and process flow
diagrams.
. Created a database for various phases and assisted the QA team in
building test cases for each.
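
A minimal sketch of the kind of UNIX wrapper used to run a Teradata BTEQ
script and check its return code, as mentioned above; the logon file,
database, table and path names are assumed placeholders.

    #!/bin/ksh
    # Run a BTEQ script and verify its exit status. Names are placeholders.
    SQL_DIR=/app/td/sql
    LOG_DIR=/app/td/logs
    SQL_NAME=refresh_acct_summary

    # refresh_acct_summary.btq (placeholder BTEQ script) would contain:
    #   .RUN FILE=/app/td/logon.txt;      (holds the .LOGON command)
    #   DATABASE FIN_DM;
    #   DELETE FROM ACCT_SUMMARY;
    #   INSERT INTO ACCT_SUMMARY (ACCT_ID, FISCAL_YEAR, TOTAL_AMT)
    #   SELECT ACCT_ID, FISCAL_YEAR, SUM(TXN_AMT)
    #   FROM   STG_ACCT_TXN
    #   GROUP BY ACCT_ID, FISCAL_YEAR;
    #   .IF ERRORCODE <> 0 THEN .QUIT 8;
    #   .QUIT 0;

    bteq < $SQL_DIR/${SQL_NAME}.btq > $LOG_DIR/${SQL_NAME}.log 2>&1
    RC=$?
    if [ $RC -ne 0 ]; then
        echo "BTEQ step $SQL_NAME failed with return code $RC"
        exit $RC
    fi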
Environment: Informatica PowerCenter 8.1, Teradata V2R6/V2R5, Oracle 9i,
Erwin 4.1, UNIX, SQL Plus, SQL*Loader.
Infotech Enterprise Ltd., India.
May '07 - Mar '09
Role: Database/ETL Developer
Responsibilities:
. Extracted data from multiple sources, which include relational
sources, flat files and XML files.
. Worked on Informatica - Source Analyzer, Warehouse Designer, Mapping
Designer, Mapplets, and Transformations for Extraction, Transformation
and Loading of data.
. Used Joiner Transformations to extract data from multiple sources and
Sequence Generator Transformations to create unique primary key
values.
. Used Aggregator, Sequence Generator, Joiner, Lookup, Expression,
Filter, Rank, Router, Update Strategy Transformations to populate
target.
. Involved in analysis, design, development, test and implementation of
transformations and mappings.
. Coordinated execution of the test plans and verified data to be loaded
into Warehouse.
. Implemented SCD 1, SCD 2 techniques for handling history requirements.
. Worked cooperatively with the team members to identify and resolve
various issues relating to Informatica and databases.
. Involved in the coding of Stored Procedures and Triggers using PL/SQL
for various Business Requirements.
. Designed and developed all the tables, views for the system in Oracle.
. Involved in Data Loading and Extracting functions using SQL*Loader.
. Implemented various validation rules on the data by applying various
integrity constraints as well as data validation rules.
. Used debugger to test the mappings developed and fixed the bugs.
. Reduced bottlenecks and helped queries execute faster by dropping or
creating indexes as required.
. Used parameters and variables at mapping and session levels to tune
the Performance of mapping.
Environment: Informatica PowerCenter 7.1, Oracle 8i, SQL*Plus,
SQL*Loader, PL/SQL, Windows NT.