Jyothi Kothagundla
Ph: 408-***-****
Professional Summary:
Over 7 years of experience with ETL tools, using IBM WebSphere DataStage 8.x/7.x and Informatica 8.x in
Educational, Retail, Insurance, Telecommunications, Pharmaceutical, and Financial firms.
Extensive experience working with Ascential DataStage, IBM InfoSphere Information Server (IIS) DataStage
8.5, and IBM WebSphere Information Server 8.x.
Working knowledge of Informatica PowerCenter 8.x and Informatica PowerExchange for data migration and
data integration projects, using Designer (Source Analyzer, Warehouse Designer, Mapping Designer,
Mapplet Designer, Transformation Developer), Repository Manager, Workflow Manager, and Workflow Monitor.
Experienced in using the highly scalable parallel-processing infrastructure of DataStage Enterprise Edition (PX).
Proficient in designing IBM DataStage 8.5 parallel jobs.
Experience with Oracle Forms and Reports development.
Proficient in working with ETL processes across Oracle, MS Access, SQL Server, and DB2 databases.
Strong experience with data modeling techniques; proficient in the design and implementation of Star and
Snowflake schemas.
Extensive experience working on Data Marts and Data Warehouses.
Proficient with ERwin 4.1/7.2 and Datanamic DeZign V5 for logical and physical data models.
Excellent experience in designing and developing jobs using DataStage Designer, DataStage Manager, and DataStage
Director.
Strong experience working with DataStage plug-in stages such as Oracle Connector, Oracle Enterprise, and RDBMS in IBM
DataStage 8.x/7.x.
Excellent knowledge of analyzing data dependencies using DataStage metadata and preparing job sequences for
existing jobs to facilitate scheduling of multiple jobs.
Worked on the DataStage production job scheduling process using Control-M Enterprise/Desktop Manager.
Strong in writing UNIX shell scripts for data validation, data cleansing, etc.
Involved in all stages of the Software Development Life Cycle.
Worked extensively on Oracle PL/SQL stored procedures, packages, functions, snapshots, and triggers.
Skilled in translating user requirements and writing Technical Specification Documents.
Highly organized, detail-oriented professional with strong technical and communication skills.
Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently.
Educational Qualifications:
Bachelor’s in Computer Science, Hyderabad, India.
Master’s in Information Technology, Texas, USA.
Skillset:
DataStage 8.x/7.x (Administrator, Manager, Designer, Director), Dimensional Modeling, UNIX (Sun Solaris), Linux, Windows
9x/NT/XP, SQL, PL/SQL, C, Visual Basic, UNIX Shell Scripting (Korn Shell, Bash Shell), HTML, DHTML,
JavaScript, XML, Oracle 9i/10g, MS Access, SQL*Plus, SQL Developer, TOAD, Oracle Forms 6i/9i/10g, Oracle
Reports 6i/9i/10g, SQL*Loader, BI Publisher, Oracle 11i Applications (OLTP) Implementation & Development: Financials:
Receivables, General Ledger, Payables.
Projects Summary:
Sephora USA, San Francisco, CA Jan’2013 – April’2013
Sr. ETL Developer
Provide accurate Gift Card reporting (sales and total redemptions) for Sephora USA and Sephora Canada, across all channels
(Stores, Dotcom, Mobile), sold directly from Sephora and through 3rd-party channels, at a weekly, MTD, and YTD cadence.
Environment:
IBM DataStage 8.1, AS/400 mainframe, SQL Server, Oracle 10g, SQuirreL SQL, SQL Developer, SQL Server Management
Studio, Jira, Subversion (SVN).
Responsibilities:
• Gathering and Analyzing project requirements for the Gift Card Sales Activations and Redemptions.
• Analyzed the source data coming from the Value Link MCS vs. Dotcom systems and prepared the documentation.
• Translating the business requirements into ETL Mapping documents.
• Designing the end table structures for the EDW Gift_Card_Details and Gift_Card_Transactions.
• Wrote SQL queries for calculating sales details for all the Gift Card information coming from the MCS source system
(GDDYTXN and Value Link); see the sketch after this list.
• Designed and developed ETL jobs in DataStage for Sephora Gift Card sales through Mobile, Dotcom, and Brick-and-Mortar
stores for the USA and Canada.
• Developed the ETL Server jobs for Gift Card Activations, Redemptions, Reloads, and Reconciled data, and loaded them into
the target EDW tables for the end-user reports.
• Created Sequencer jobs to load the incremental data based on reporting needs (Weekly/Monthly/Yearly).
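The following is a minimal sketch of that kind of sales/redemption rollup in Oracle SQL; the gift_card_transactions table and all column names are illustrative assumptions, not the actual EDW schema:

    -- Hypothetical rollup of YTD gift card sales vs. redemptions by channel.
    SELECT country_cd,
           channel_cd,                                                   -- Store / Dotcom / Mobile
           SUM(CASE WHEN txn_type = 'ACTIVATION' THEN txn_amt END) AS sales_amt,
           SUM(CASE WHEN txn_type = 'REDEMPTION' THEN txn_amt END) AS redeemed_amt
      FROM gift_card_transactions
     WHERE txn_dt >= TRUNC(SYSDATE, 'YYYY')                              -- YTD window
     GROUP BY country_cd, channel_cd;

The same query windowed on TRUNC(SYSDATE, 'MM') or a week boundary would produce the MTD and weekly cadences.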
Gap Inc, San Francisco, CA. Oct’2011 – Dec’2012.
Sr. ETL Consultant
GLV is the Global Logistics Visibility system, from which logistics-related data is imported into the PROMPT system. To update
PROMPT with the latest logistics data from GLV, input feed files are provided, and through separate
inbound batches the GLV data is loaded into the PROMPT master tables, which are used for creating the Purchase Orders.
Environment:
Ascential DataStage 7.5.1 (Parallel Jobs), Informatica PowerCenter 8.0, Oracle (Forms, Reports 10g), Mainframe DB2 Gap
host, Mainframe ESP scheduler, UNIX AIX, MS Visio, Serena Dimensions, UNIX shell scripting.
Responsibilities:
• Gathering and analyzing project requirements for the GLV data integration with the PROMPT system.
• Translating the business requirements and writing Functional Specification Documents, Technical Specification
Documents, and ETL Mapping Documents for the Freight Rate, Transit Time Planner, Agent Commission, and
XfrPointCtryOfOrigin interfaces.
• Created Mapping documents for Phase 1 and Phase 2 GLV Interfaces with the XML files as the source and Mainframe
DB2 as the target database.
• Worked on the Environment readiness for the project build and implementation with the ESB, SCM, ISA, MQ and VP
teams.
• Coordinating with the offshore team, sharing the business requirements, assigning tasks, and scheduling weekly status
meetings.
• Designed and Developed ETL jobs for Transit Time Planner interface in DataStage 7.5.1 Designer.
• Responsible for developing and modifying the existing application Forms for the SIM modules.
• Prepared Unit Test Plans, Unit Test Results, and Job Docs for the ETL batch jobs to be scheduled from ESP.
• Created UNIX shell scripts for the ETL batch to be run in Daily/On-Request mode from ESP.
• Worked on the code migrations, fixing issues and bugs during the Integration Testing Phase.
• Working on Production requests, Change tasks, and environment readiness for the Phase 1 project deployment.
• Creating Weekly Status Reports with the Project plan and Implementation dates.
Blackhawk Networks, Pleasanton, CA May’2011 – October’2011
ETL Consultant
This project covers Blackhawk core system data and reporting maintenance: data centralization and data integration of the
interfacing systems called TRACS, IMS, BLISS, PAY GO, Galileo, and eFunds.
EDW is the Enterprise Data Warehouse, which stores Blackhawk enterprise data for reporting and analysis. This system extracts
Sales, Supply Chain, and Issuance data from the various interfacing systems (TRACS, IMS, BLISS, PAY GO, Galileo, and eFunds),
directly or through external data file loads. EDW massages the data it pulls and stores it as Facts and Dimensions.
Environment:
IBM DataStage Information Server 8.x (Parallel Jobs/Server Jobs), IBM DB2 UDB 8.0 database, MicroStrategy Business
Intelligence tool, Aqua Data Studio 9.0.13, Windows XP, Linux, UNIX scripting, Tidal.
Responsibilities:
• Gathering and Analyzing user requirements for the ETL EDW development.
• Designed and developed ETL DataStage parallel jobs for Data Centralization (CDR) and EDW target tables, from
source sequential files to target DB2 tables through staging tables.
• Designed and developed ETL parallel jobs in DataStage 8.x for the BLISS Order and Order Response files.
• Worked on the EDW ETL job customizations for files coming from multiple sources: the ERP and IMS interfacing
systems.
• Developed DataStage Sequence jobs with the Execute Command stage, using UNIX commands to check source file
patterns.
• Developed ETL job scheduling for the EDW batch jobs using Perl scripts and Tidal.
• Monitored DataStage ETL Production CDR batch jobs on the daily run and compared the GPR and GIFT (BLISS)
counts with the CDR for the Enterprise Reporting team (a sketch of such a reconciliation query follows this list).
• Attending daily scrum meetings and weekly BI meetings, discussing and solving technical and production
issues.
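A minimal sketch of such a daily count reconciliation in SQL; the bliss_transactions and cdr_transactions tables and the :run_dt bind variable are illustrative assumptions, not the actual EDW objects:

    -- Compare GPR/GIFT transaction counts from BLISS against CDR for one run date.
    SELECT b.product_line,                              -- e.g. 'GPR' or 'GIFT'
           COUNT(b.txn_id)                   AS bliss_cnt,
           COUNT(c.txn_id)                   AS cdr_cnt, -- counts matched rows only
           COUNT(b.txn_id) - COUNT(c.txn_id) AS missing_in_cdr
      FROM bliss_transactions b
      LEFT JOIN cdr_transactions c
        ON c.txn_id = b.txn_id
     WHERE b.txn_dt = :run_dt
     GROUP BY b.product_line;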
University of Arizona, Tucson, Arizona Jan’2010 – March’2011
Database/ETL Consultant
Mosaic is the name of The University of Arizona's Enterprise System Replacement Project (ESRP). The purpose of this project
is to update and augment the University’s aging core administrative systems. It is a multi-phase project with five key areas:
Student Administration, Financials (Kuali Financial System, KFS), Human Resources/Payroll, Research Administration (Kuali
Coeus, KC), and Business Intelligence.
The Kuali Coeus DataWarehouse (EPM) will be built to replace the existing financial system (FRS), as well as part of the existing
grant administration program (SPINS), with a new BI solution serving all levels of the University. This DataWarehouse is split
into layers, including the Operational Warehouse Staging (OWS) and the Multi-Dimensional DataWarehouse (MDW).
Environment: IBM InfoSphere DataStage 8.5 (Parallel Jobs), IBM DataStage Information Server 8.1, IBM InfoSphere FastTrack
8.0.1, Oracle Business Intelligence Suite Enterprise Edition (OBIEE) Answers/Dashboards, Control-M Desktop/Enterprise
Manager, DeZign 5.1, Oracle 10g, Toad, Windows XP, KC 3.0, Linux, Microsoft Excel 2007.
Responsibilities:
• Gathered requirements from HPC Analytics for KC Phase 1.0 for Proposal and Award Modules.
• Analyzed the requirements and designed the Dimensional Model for Proposal, Award, and Budget data in DeZign V5.
• Prepared Technical Design Specification documentation for Data Extraction, Data Transformation, and Data Loading
into the EPM Data Warehouse.
• Created Mapping Documents for the ETL process in MS Excel 2007.
• Developed ETL job designs in DataStage that extract data from the source (Kuali Coeus) transactional database and
load it into the Staging and Target DataWarehouse built on Oracle 10g.
• Designed and developed Sequence jobs that handle the Initial load and the Incremental load for the Research and
Financial System’s EPM.
• Designed various DataStage jobs for the Dimension and Fact tables of the Star schema DataWarehouse using the
CDC, Lookup, Transformer, Dataset, Join, and Oracle Enterprise stages.
• Developed DataStage jobs that maintain the historical data in the DataWarehouse using Slowly Changing Dimension
(SCD) Type 1 and Type 2 logic (see the sketch after this list).
• Designed a Sequence job in DataStage that extracts the data on a daily basis, loads it into the target DataWarehouse, and
notifies the status of the load upon completion.
• Developed job sequences with restartable checkpoints, and implemented proper failure actions.
• Created test scripts for testing DataStage jobs.
• Extensively used SQL overrides for the auto-generated SQL queries in the Oracle Enterprise stage of DataStage
8.0.1.
• Worked on the modifications of DataStage jobs that were developed through IBM InfoSphere FastTrack.
• Scheduled ETL jobs in the Prod and Test environments through Control-M for the daily run.
• Developed ad hoc Reports in OBIEE (Analytics Answers) to manage the Student Administration Report Inventory.
• Developed OBIEE Dashboards for Student Administration using the concepts of Pivots, Prompts, and SQL Functions
from the Term Enrollment, Class Enrollment, and Student Profile Subject Areas.
• Actively participated in weekly Operational team meetings and was involved in solving technical issues.
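A minimal SCD Type 2 sketch in Oracle SQL, equivalent in effect to the DataStage SCD logic described above; dim_award, stg_award, dim_award_seq, and all columns are illustrative assumptions, not the actual EPM schema:

    -- Step 1: expire the current dimension row when tracked attributes change.
    UPDATE dim_award d
       SET d.effective_end_dt = TRUNC(SYSDATE) - 1,
           d.current_flag     = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_award s
                    WHERE s.award_id = d.award_id
                      AND (s.award_status <> d.award_status
                           OR s.sponsor_cd <> d.sponsor_cd));

    -- Step 2: insert a new current version for new and changed awards.
    INSERT INTO dim_award (award_key, award_id, award_status, sponsor_cd,
                           effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_award_seq.NEXTVAL, s.award_id, s.award_status, s.sponsor_cd,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_award s
     WHERE NOT EXISTS (SELECT 1 FROM dim_award d
                        WHERE d.award_id = s.award_id
                          AND d.current_flag = 'Y');
    COMMIT;

Type 1 attributes would instead be updated in place on the current row, with no new version inserted.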
Astellas Pharma US, IL
Informatica Developer Sep’2009 – Dec’2009
The purpose of the project is to implement the EDGE Continuous Loop Promotion (CLP) system for Astellas. The vision
of the EDGE solution is to allow Astellas Sales Representatives to interact with their customers through the delivery of
targeted messages to Health Care Professionals, and to let Astellas control and utilize its sales force in the most effective manner.
The EDGE CLP project is a data integration and data conversion project with three major components, Siebel
SFA (ATLAS), SQL Server 2005 (EXPLORIA), and Oracle 10g (EDGE CIR), loaded into the CLP Enterprise Data Warehouse
(EDGE CIR).
Environment: IBM DataStage 8.x, Oracle 10g, Autosys, Toad, Subversion, SQL*Plus, MS Visio, Erwin 7.3, PL/SQL, SQL
Server 2005, Windows XP.
Responsibilities:
• Worked on CLP Phase 1.2 to build the EDGE CIR repository in Oracle 10g.
• Created Mapping Documents for ATLAS and EXPLORIA in MS Excel, based on the business rules.
• Created Design Specification Documents for the CLP Phase 1.2 process.
• Developed several ETL mappings for data integration of the source systems, channeled through the
“LND”, “CRF”, and “IDS” batch stages.
• Created various procedures, functions, and packages in PL/SQL for the daily feed of Contacts with IMS ID HCP
Segmentation.
• Developed PL/SQL code with exception handling to keep track of rejected records in the data transfer from
source to target (see the sketch after this list).
• Wrote various SQL scripts to implement the business rules.
• Developed a PL/SQL package for the batch process cycle of ESP and SFA source data extraction and load into the target.
• Implemented Slowly Changing Dimension (SCD) logic in Informatica to maintain history for every data load from the
Siebel and SQL Server source systems.
• Implemented table lookups for the HCP and Reps Contacts that were cached on the Oracle server.
• Used a linked server as a bridge between the SQL Server source and the Oracle 10g target database to process the data transfer.
• Created various UAT scripts for the Integration and the Conversion processes.
• Involved in the data cleansing, data transformation, and data conversion of landing data, converting it into
CRF format.
• Designed the data models for the batch process and error handling using Erwin 7.3.
• Created various Subject Areas and Domains for SFA, CIR, and ESP in Erwin, and used reverse engineering to implement
the CLP Phase 1.1 changes.
• Involved in source system dependencies for the EDGE CIR load in Oracle PL/SQL for the ATLAS batch job flow, and
worked on the daily feeds extracted from ATLAS.
• Designed ETL sessions in the Informatica Workflow Designer for the Integration process.
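A minimal sketch of that reject-tracking pattern in PL/SQL; stg_contact, tgt_contact, and etl_reject_log are illustrative assumptions, not the project's actual objects:

    BEGIN
      FOR rec IN (SELECT contact_id, ims_id, hcp_segment FROM stg_contact) LOOP
        BEGIN
          INSERT INTO tgt_contact (contact_id, ims_id, hcp_segment)
          VALUES (rec.contact_id, rec.ims_id, rec.hcp_segment);
        EXCEPTION
          WHEN OTHERS THEN
            -- Log the failing key and the Oracle error, then continue the feed.
            INSERT INTO etl_reject_log (contact_id, err_code, err_msg, load_dt)
            VALUES (rec.contact_id, SQLCODE, SUBSTR(SQLERRM, 1, 400), SYSDATE);
        END;
      END LOOP;
      COMMIT;
    END;
    /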
AIG Claims Services, NY Jul’2008 – Aug’2009
DataStage/ETL Developer
AIG (American International Group, Inc.) is a group of companies that serve various customers through one of the most
extensive worldwide property-casualty and life insurance networks. The purpose of the project is to integrate the data of the
21st Century Insurance Group on the West Coast with AIG. The process extracts the data stored in different DB2 UDB
databases and loads it into the DataWarehouse.
Environment: IBM DataStage Information Server 8.1 (Designer, Manager, Director), Subversion, QualityStage, DB2 UDB,
Oracle 10g, UNIX, SQL, PL/SQL, Toad, SQL*Loader, UNIX Shell Scripting, Autosys, Erwin 7.2, MS Excel.
Responsibilities:
• Prepared technical design/specifications for Data Extraction, Data Transformation, Data Cleansing, and Data Loading.
• Loaded APS Policy source data from DB2 into the Oracle 10g Staging Schema.
• Designed DataStage parallel jobs with Change Capture and Change Apply stages, and created Policy Data Sections from
the Staging Tables.
• Developed user-defined Transforms, Routines, Stage Variables, and Stored Procedures.
• Automated and fine-tuned DataStage jobs and sequences for loading source-system data into the Data Warehouse.
• Extensively wrote user-defined SQL to override the auto-generated SQL queries in DataStage.
• Wrote UNIX scripts as required for preprocessing steps and for validating input and output data elements, along with
DataStage routines.
• Developed back-end interfaces using PL/SQL stored Packages, Procedures, Functions, Bulk Collections, Bulk
Variables, and Triggers (see the sketch after this list).
• Participated in the review of Technical and Business Transformation Requirements documentation.
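A minimal sketch of the bulk-collection pattern in PL/SQL; stg_policy and dw_policy are illustrative assumptions and are presumed to share the same column structure:

    DECLARE
      TYPE t_policy_tab IS TABLE OF stg_policy%ROWTYPE;
      l_policies t_policy_tab;
      CURSOR c_pol IS SELECT * FROM stg_policy;
    BEGIN
      OPEN c_pol;
      LOOP
        FETCH c_pol BULK COLLECT INTO l_policies LIMIT 1000;  -- batch fetch caps memory use
        EXIT WHEN l_policies.COUNT = 0;
        -- One round trip per batch instead of one per row.
        FORALL i IN 1 .. l_policies.COUNT
          INSERT INTO dw_policy VALUES l_policies(i);
        COMMIT;
      END LOOP;
      CLOSE c_pol;
    END;
    /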
AT&T, Alpharetta, GA Jan’2007 Jun’2008
ETL Consultant/Data Analyst
The objective of this project is to develop and maintain a database system that meets internal and external client needs by
gathering business and market intelligence to analyze customers’ demographic patterns, such as preferences and expenditure
patterns.
Environment: Ascential DataStage 7.5.2 Enterprise Edition, Autosys, IBM AIX, UNIX Shell Scripting, Oracle 10g, DB2
UDB, Sybase, MS Access, PL/SQL, TOAD.
Responsibilities:
• Used DataStage to extract client data from Oracle and MS Access sources and transform it into a target Business
Warehouse.
• Created various Functions in DataStage for mappings and other statistical calculations.
• Designed, developed, and deployed DataStage jobs using various stages such as Inter-Process (IPC), Link Collector,
Link Partitioner, and Hashed File.
• Consistently followed naming conventions and parameterized variables from the Global Environment down to the
stage level.
• Used the DataStage RDBMS, DB2, and Oracle plug-ins for pulling data from the source tables.
• Created various Aggregate Tables and Snapshots for the Enterprise Data Warehouse (see the sketch after this list).
• Hands-on experience with the Autosys job scheduler, used to automate the regular monthly run of the EDW cycle in
the production, UAT, and devflow environments.
• Developed job sequences by identifying independent and dependent flows, with proper rollback strategies incorporated
in the sequences.
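A minimal sketch of one such aggregate snapshot in Oracle; mv_monthly_usage, fact_usage, and their columns are illustrative assumptions:

    -- Pre-aggregates monthly usage per customer for reporting queries.
    CREATE MATERIALIZED VIEW mv_monthly_usage
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
    SELECT customer_id,
           TRUNC(call_dt, 'MM') AS usage_month,
           SUM(minutes_used)    AS total_minutes,
           SUM(charge_amt)      AS total_charges
      FROM fact_usage
     GROUP BY customer_id, TRUNC(call_dt, 'MM');

A scheduled refresh after each monthly EDW cycle keeps the snapshot in step with the fact table.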
HDFC, Hyderabad, India May’2004 – Jun’2005
Oracle Developer
Environment: Oracle 9i, Forms Builder 6i, Reports Builder 6i, SQL, SQL*Plus, PL/SQL, SQL*Loader, XML Publisher, UNIX.
Responsibilities:
• Performed Conversion/Interface programs using PL/SQL.
• Utilized SQL*Loader to load data from flat files into the base tables.
• Designed various database objects such as Tables, Indexes, Views, Sequences, and Collections.
• Created Stored Procedures, Packages, and Functions with PL/SQL at the database level.
• Developed customized Reports using XML Publisher.
• Developed front-end Forms and Reports using Oracle Forms and Reports Builder.
• Wrote SQL scripts by analyzing various test cases.