DIVYA CHERUKURI
Email: *****@******.***
Ph: 408-***-****
Summary:
About 7 years of experience in data warehousing using Informatica PowerCenter 9.x/8.x/7.x.
Experience in various phases of the SDLC, including data analysis, design, development and testing.
Exposure to data modeling techniques such as Star schema and Snowflake schema, used in relational and multidimensional modeling with Erwin and Visio.
Worked with the Informatica PowerCenter client tools: Designer, Workflow Manager, Workflow
Monitor and Repository Manager.
Extensively worked with third-party tools such as TOAD, SQL*Plus and PL/SQL Developer.
Created ETL mappings using Informatica PowerCenter to extract data from multiple sources such as XML, SQL Server, flat files and Oracle into the staging area and then into the Data Warehouse/Data Marts.
Experience in moving program files to the server via FTP using tools such as FileZilla, WinSCP, PuTTY and Telnet.
Implemented Slowly Changing Dimension methodology for accessing the full history of accounts
and transaction information.
Hands-on experience in tuning mappings, and in identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings and sessions.
Strong skills in SQL and PL/SQL packages, functions, stored procedures, triggers, SQL tuning and materialized views to implement business logic in Oracle databases.
Experienced in coordinating cross-functional teams, project management and presenting technical ideas to diverse groups.
Technical Skills:
ETL Tool: Informatica PowerCenter 9.1/8.x/7.x
Databases & Tools: Oracle 8i/9i/10g/11g, DB2, SQL*Plus, TOAD, SQL Developer
Data Modeling: Star schema and Snowflake schema modeling, Dimension and Fact tables, using Erwin
Operating Systems: Windows 95/98/NT/2000/XP, Linux
Languages: PL/SQL, SQL, C, C++, HTML, XML, Unix Shell Scripting, Java
Other Tools/Packages: Quality Center, FTP, SQL*Loader
Experience:
Comcast Corporation, Herndon, VA June 2011 – Present
Role: ETL Developer
Comcast Corporation is one of the nation's largest video, high-speed Internet and phone providers to residential and business customers. It is principally involved in the operation of cable systems through Comcast Cable and in the development, production and distribution of entertainment, news, sports and other content for global audiences through NBCUniversal. The purpose of the project is to perform daily reconciliation of stable accounts found in eCUST.
Responsibilities:
• Created requirements and functional specifications based on the CMS/IMS Migration Functional Specification.
• Involved in creating the mapping document for the eCUST Strategic BULK Reconciliation.
• Worked with the ETL Architect in design review meetings and in making design decisions.
• Identified accounts to reconcile by a set of defined rules, rather than driving the process from a list of accounts.
• Created and maintained the staging tables from the source systems (NETREC, DSSRPT, ELOC, EPC).
• Created source definitions from the operational sources and target definitions from the target
database.
• Extracted data from billing systems and from ELOC.
• Performed bulk download and rebuild operations for customer profile and product data for approximately 20 million accounts.
• Created mappings using different transformations to identify the accounts to be reconciled.
• Created reusable transformations to use in multiple mappings.
• Worked on reconciling the accounts into the eCUST staging tables and merging with eCUST Production.
• Analyzed the eCUST physical data model and the relationships between the tables.
• Worked on development of product data and customer data; created services, resources, product groups and aspects for the CDV and HSD LOBs.
• Used AutoSys for Job scheduling, set up batches and sessions to schedule the loads at the
required frequency and tested the data loaded into the target tables as expected.
• Optimized/tuned mappings for better performance, for example through SQL overrides in Source Qualifiers, lookup overrides in Lookups, and other fixes based on the bottlenecks identified.
• Performed troubleshooting by checking logs and using the Debugger to identify complex issues.
• Analyzed long-running SQL queries and tuned them.
• Designed and developed PL/SQL packages and procedures for interfaces.
• Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status.
• Involved in performance tuning of the SQL queries and PL/SQL. The intent was to keep performance constant in spite of added functionality.
Environment: Informatica PowerCenter 9.1, AutoSys, SQL Developer, Oracle 11g, TOAD.
Thermo Scientific/Dionex Corporation, Sunnyvale, CA Mar 2010 – May 2011
Role: ETL Developer
Thermo Scientific Dionex products are designed to provide leading performance in sample preparation, separation, analysis, and reporting on the components of chemical mixtures and compounds. The project is a corporate warehouse, Service Analytics, which helps in understanding critical metrics about the service group.
Responsibilities:
• Understanding the requirement specifications, reviewing the functional specifications document
and developing the solution design document.
• Developed mapplets and reusable transformations in the mappings to facilitate daily, monthly and yearly loading of the data.
• Developed the interfaces using PL/SQL stored procedures, implementing cursors.
• Included error handling and exception handling by logging errors to error tables and sending an alert message via e-mail to the concerned distributors.
• Worked on integration testing of all the modules and preparation of test plans.
• Responsible for retrofitting the code to the QA environment, and extending support for QA and UAT to fix bugs.
• Involved in bug fixing of the current production issues, and delivery in a short time frame.
• Developed source and target definitions to extract data from flat files and relational sources into the Data Warehouse.
• Created different transformations for applying the key business rules and functionalities on the source
data.
• Used the PMCMD command to start and run workflows from the UNIX environment.
• Involved in performance tuning of sources, targets, mappings, and sessions. Also involved in tuning of SQL queries and PL/SQL procedures.
• Extensively used the Debugger to identify data transfer problems in mappings.
• Analyzed long-running SQL queries and tuned them.
• Ensuring that all production changes are processed according to release management policies and
procedures.
• Ensuring that appropriate levels of quality assurance have been met for all new and existing applications/CRs.
• Ensuring that application changes are fully documented and supportable.
• Proactively identifying the changes required within the production environment and working on the
enhancements/Change requests.
• Worked with shell scripting: file archiving and generating automatic e-mails to end users.
• Responded to and resolved the performance issues.
• Prepared the coding standards and quality assurance policies and procedures.
Environment: Informatica PowerCenter 8.6.1, Oracle 10g, TOAD, UNIX Shell Scripting, Windows XP.
Mylan Inc., Pennsylvania, PA Apr 2009 – Feb 2010
Role: ETL Developer
Mylan is one of the world's leading generics and specialty pharmaceutical companies, providing products to customers. Mylan Laboratories Limited is also one of the world's largest producers of APIs used to make generic antiretroviral (ARV) therapies for the treatment of HIV/AIDS. Mylan has attained leading positions in key international markets through its wide array of dosage forms and delivery systems, significant manufacturing capacity, global commercial scale and a committed focus on quality and customer service. The purpose of this project is to maintain a data warehouse that enables management to make corporate decisions.
Responsibilities:
• Created the Technical Specification document, including high-level and low-level design, based on the Functional document.
• Involved in creating logical and physical designs of databases using Erwin.
• Created Staging, Dimension and Fact tables based on the cloud data and the warehouse design.
• Participated in the build of the Data Warehouse, which included the design of a Data Mart using Star schema.
• Created a repository using Informatica PowerCenter Repository Manager.
• Extracted data from flat files and an Oracle database, and applied business logic to load it into the central Oracle database.
• Worked with the Informatica Designer and created various mappings using transformations such as Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression and Aggregator.
• Created and ran sessions/workflows using Workflow Manager to load the data into the Target
Database.
• Optimized/tuned mappings for better performance, for example through SQL overrides in Source Qualifiers, lookup overrides in Lookups, and other fixes based on the bottlenecks identified.
• Created reusable transformations and mapplets and used them in mappings.
• Extensively used shell scripts to automate the pre-session and post-session processes.
• Performed data manipulation using basic functions and Informatica transformations.
• Used session partitions, dynamic cache memory and index caches for improving performance of
Informatica services/ server.
• Extensively worked on the SQL tuning process to increase source qualifier throughput by analyzing queries with explain plans and creating new indexes, partitions and materialized views.
• Worked with static and dynamic memory caches for better throughput of sessions containing Rank, Lookup, Joiner and Aggregator transformations.
• Created various tasks such as Event Wait, Event Raise, E-mail and Command.
• Automated worklets and session schedules using PMCMD and UNIX shell scripts.
• Troubleshot problems by checking session and error logs; also used the Debugger for complex problem troubleshooting.
Environment: Informatica PowerCenter 8.6.1, Oracle 11g, UNIX, CSV files, Excel files, SQL, PL/SQL, UNIX shell scripting, Windows XP, TOAD, Erwin 4.5.
Best Buy, Minneapolis, MN Jan 2008 – Mar 2009
Role: ETL Developer
Best Buy Co., Inc. is North America's number-one specialty retailer of consumer electronics, personal computers, entertainment software and appliances. Worked on extracting data from various sources into the Enterprise Data Warehouse, called PMS (Product Management System), which holds sales data, purchase data, valued-customer information and employee information. The project aims to help in making decisions for new product improvements, analysis of existing products, and improved customer service.
Responsibilities:
• Participated in the review meetings to finalize the Functional documents based on the
requirements specifications.
• Involved in the design of the project to convert the functional specifications into data flow diagrams.
• Created the Mapping documents, identifying the lineage across the source and the target tables.
• Worked extensively on transformations like Source Qualifier, Joiner, Filter, Router, Expression,
Lookup, Aggregator, Update Strategy, Sequence generator and Stored procedure
transformations.
• Created variables in the Expression transformation to compare the current record with the previous record and combine the data for these records based on the same claim numbers.
• Debugged the mappings using breakpoints and tested Informatica sessions to detect the root cause of any data issues.
• Developed mapplets and worklets wherever reusability was required.
• Created reusable transformations to use in multiple mappings.
• Used the Informatica Debugger to identify data issues and fix the mappings.
• Created workflows in the Workflow Manager and monitored the runs in the Workflow Monitor.
• Wrote SQL and PL/SQL code, stored procedures and packages for dropping and re-creating indexes and generating Oracle sequences.
• Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.
• Extensively used ETL to load data from a wide range of sources such as flat files (CSV, fixed-width or delimited).
• Created tasks, sessions, events, timers, etc. in the Workflow Manager and monitored them in the Workflow Monitor.
• Used AutoSys for Job scheduling, set up batches and sessions to schedule the loads at the
required frequency and tested the data loaded into the target tables as expected.
• Migrated Informatica objects from Development to QA and from QA to Production, and set up the UNIX directories for parameter files, logs and bad files.
• Worked cooperatively with team members to identify and resolve Informatica and other database-related issues.
Environment: Informatica PowerCenter 8.6.1, Oracle 10g, AutoSys, TOAD, UNIX Shell Scripting, Windows XP.
Kanbay, Hyderabad Mar 2006 – Dec 2007
Role: ETL / DB application developer
Kanbay International is a premier global systems integrator providing high-quality, high-value solutions to the insurance, banking, lending, credit card, and capital markets industries. The combination of business focus and technology expertise makes Kanbay a unique player in the technology consulting arena. This project is for the corporate data warehouse that helps in decision making. The system helps customer service representatives deal and transact with customers' loans, credit, debit, portfolios, investments, etc. The operational data of different financial departments is loaded into the central Data Warehouse and transformed into different regional data marts.
Responsibilities:
• Worked on 24X7 system on rotational/shift basis, for the Production support.
• Worked on the enhancements using the Informatica client tools Source Analyzer, Warehouse
Designer, Mapping Designer, Mapplet Designer, and Transformation Developer for defining Source
& Target definitions and coded the process of data flow from source system to data warehouse.
• Involved in analyzing source system data.
• Created source definitions from the operational sources and target definitions from the target database.
• Involved in the end-to-end process of extraction, transformation and loading of source data to staging and then to the data mart.
• Created numerous Informatica mappings using XML files, Flat files, Oracle as sources and targets.
• Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status.
• Developed mappings in Informatica to load the data from various sources into the Data Warehouse,
using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup,
Sequence Generator, Filter, Sorter, Source Qualifier.
• Worked extensively on Slowly Changing Dimensions.
• Involved in performance tuning and finding out the bottlenecks.
• Extensively used SQL and PL/SQL scripts.
• Developed reusable transformations, mapplets and Worklets, wherever required.
• Used Informatica scheduler to schedule the jobs.
Environment: Informatica PowerCenter 7.1.3, Oracle 10g/9i, Erwin 4.5/4.0, PL/SQL, UNIX, PuTTY, TOAD.
Education:
Bachelor of Technology in Computer Science, JNTU, Hyderabad.