Sridevi Rao
******@*****.***
SUMMARY:
• About 11 years of extensive experience in the analysis, design, development, testing, implementation,
support, and administration of highly complex ETL / data warehouse applications using IBM
InfoSphere DataStage, Informatica, and Microsoft SSIS, across business domains including
Automotive, Education, Oil and Gas, Finance, Insurance, and Telecom in a global environment.
• Full life cycle experience, from requirements through testing, covering data warehouse and ETL
processes, OLAP, application development, performance tuning, DataStage administration, and
product support.
• Solid experience in leading data analysis, conversion and migration, and backend, system
integration, and user acceptance testing.
• Proficient with data modeling, standardization, data integration and metadata management.
• Solid experience in designing DataStage jobs using multiple data sources, including sequential
files, Dataset, hashed and EBCDIC files, and ODBC, DRS, and DB2 stages.
• Extensive experience in data extraction from various data sources, including SQL Server, Oracle,
DB2, and Teradata databases, and in transforming and loading the data into data warehouses / data marts.
• Extensive experience in developing server and parallel jobs using DataStage V8.x/7.x/6.x.
• Expertise in developing ETL processes to extract, transform, and load data from various sources,
including legacy systems, into data warehouses using DataStage Designer and Parallel Extender.
• Expert in the use of stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Funnel,
Filter, Copy, Remove Duplicates, Link Collector, Link Partitioner, and Change Capture in
Parallel Extender jobs.
• Proficient with the DataStage client tools: Designer, Manager, Director, and
Administrator. Used Manager to import/export DataStage projects and jobs and define table
definitions in the repository, and Director to debug, validate, schedule, run, and monitor DataStage jobs.
• Carried out DataStage configuration tasks, such as adding projects, purging log files, and issuing
commands, using the DataStage Administrator tool.
• Proficient in the use of Ascential ProfileStage and MetaStage for repository management and
Ascential QualityStage for standardizing and cleansing data; also worked on stages such as
Transfer, Select, Sort, Build, and Unijoin.
• Experienced in data warehouse concepts and in the design and implementation of star schema,
snowflake schema, and multidimensional models using the ERwin tool, as well as UNIX shell scripting.
• Designed and developed standard business reports using Cognos and Business Objects.
• Working knowledge of TSO, ISPF, JCL, and COBOL in a mainframe environment.
• Expert in unit and system integration testing and strong interpersonal and client interface skills.
Skills Years of Experience
Analysis, design, development, and testing of ETL jobs 10 years
Large data warehouses 8+ years
Extraction, cleansing, and validation of source data 10 years
DataStage administration and support 2 years
IBM DataStage/Informatica/DTS/SSIS 10+ years
SQL, stored procedures, functions 10 years
SQL Server 6+ years
Oracle 6+ years
DB2, Teradata 3+ years
Data modeling 2+ years
UNIX scripting 4 years
Business requirements, functional and technical specs 8 years
TECHNICAL SKILLS:
ETL Tools: DataStage 8.5/8.1/7.x/6.x (Designer, Manager, Director, and Administrator),
ProfileStage 7.x, MetaStage 7.x, QualityStage 7.x, MS DTS / SSIS,
Informatica 8.6/8.1 (PowerCenter Designer, PowerCenter Repository
Manager, PowerCenter Workflow Manager, PowerCenter Workflow Monitor,
Informatica PowerExchange)
Databases: Oracle 11g/10g/9i/8i, DB2 UDB 9.x/8.x, SQL Server 2008/2005/2000, Teradata
BI Tools: Cognos 8, MicroStrategy 7.0, Business Objects
Languages: SQL, PL/SQL, TSQL, SP, ASP, Java, Cobol II, JCL, XML, XSL,
XSLT, JavaScript, HTML 4.0, DHTML, Shell Scripting
Data Modeling: ERwin 5.x/7.3, Visio 2003
Database Tools: SQL*PLUS, Management Studio, Query Analyzer, Enterprise Manager, SQL
Navigator, SQL*Loader, PL/SQL Developer, TOAD, QueryMan
Testing tools: Quality Center / Test Director, QTP, Load Runner
GUI: Developer 2000 (Forms, Graphics, Reports and Report Server)
Web/App servers: IIS 5.0, Apache Web server, Netscape Suite
Methodologies: Star Schema, Snowflake, Data warehouse life cycle, SDLC, Agile/RUP, ITIL
Platforms: Windows 7/XP/ 2000, Unix/Linux, MVS and DOS
Others: Maximo, CVS, Visual Source Safe, PVCS, MS Office, MS Access,
AutoSys, SSH client
PROFESSIONAL EXPERIENCE:
FORD MOTOR COMPANY / Allen Park, MI April 12 – Present
ETL Admin Specialist
Ford Motor Company, a global automotive industry leader based in Dearborn, Michigan, manufactures or
distributes automobiles across six continents. With about 168,000 employees and about 65 plants
worldwide, the company's automotive brands include Ford and Lincoln. The company provides financial
services through Ford Motor Credit Company. The Information Technology team enables manufacturing
facilities to receive parts just-in-time, lets customers check the status of their vehicles online, and
empowers employees to facilitate the sale of millions of vehicles. The Analytic and Collaboration
Engineering team provides support for IBM InfoSphere DataStage products for use in the client's
enterprise data centers across Ford Motor Company.
Responsibilities:
• Provided support to evaluating and engineering IBM InfoSphere / DataStage products for use in
client's enterprise data centers across the Ford Motor Company.
• Supported over 30 DataStage servers serving applications globally.
• Install/upgrade the Information Server product suite on different tiers (server and client), including
DataStage, QualityStage, and Information Analyzer, and perform post-install configuration.
• Apply fixes, patches and Service Packs as required and provided, keeping the DS environment
up-to-date.
• Maintain ownership of release activities which interact with ETL projects. Work with Prod Support
to maintain and administer the ETL deployment environment.
• Strong knowledge of SUSE Linux operating systems and WebSphere application servers.
• Provide technical expertise, direction, and development support to the ETL development team,
and support developers with DataStage product-related issues; develop new tools and processes
to ensure effective use of the DataStage product.
• Responsible for administering and maintaining the DataStage shared environment and Fast/CKS
environments.
• Worked with application owners, project managers, and tech leads to create the ETL projects.
• Performed DataStage administration tasks such as creating projects in the Dev, Test, and Prod
environments. Defined environment variables and configuration files.
• Supported data warehouse projects in the US, India, Germany, and Argentina across the client's
enterprise data centers.
• Responsible for all project migration activities in the client's enterprise data centers across Ford
Motor Company, migrating from versions 7.5 and 8.1 to 8.5.
• Monitored and analyzed the audit usage data to ensure the optimal usage and efficient
performance of the infrastructure.
• Worked with application owners and development teams to ensure the shared environment
resources are optimally utilized.
• Created Design documentation and implementation plan for reference/deployment in shared
environment.
• Analyze existing infrastructure and DataStage jobs, ensuring best practices are being followed,
and recommend short-term and long-term improvement roadmaps.
• Implement procedures for maintenance, monitoring, backup, and recovery operations for the ETL
environment.
• Responsible for maintaining run books and project documentation in SharePoint.
• Created PMR tickets and interacted with the IBM engineering team to resolve DataStage issues.
Environment: WebSphere DataStage / QualityStage 7.5/8.1/8.5, Oracle 11g, Teradata, SQL Server
2008, TOAD 8.0, Hummingbird Connectivity/FTP, Windows, SUSE Linux/UNIX, shell script, AutoSys
GMAC / Ally Financial Inc., Southfield, MI July 10 – March 12
ETL Lead
Ally Financial Inc. (formerly GMAC Inc.) is one of the world's largest automotive financial services
companies. The company offers a full suite of automotive financing products and services in key markets
around the world. The Core Policy Systems Project (CPS) / Insurance Data Warehouse (DW) program
will provide reporting and data warehouse capabilities for policy, claims and earnings information for
products handled by iWarranty. The program is to support actuarial, finance, operations and reinsurance
functions primarily for insurance divisions in North America and United Kingdom. The CPS is being
implemented in two phases: (1) for the Car Care Plan (CCP) in the United Kingdom, and (2) for Motors
Insurance Corporation (MIC) in North America.
Responsibilities:
• Responsible for all activities related to the design and development of enterprise-wide data
analysis and reporting solutions.
• Interfaced with business team and customers to understand and gather the business requirements
through various meetings.
• Analyzed the requirements, business rules, scope of application, and relationship within &
between groups of data and prepared Technical specifications and test plan.
• Worked closely with Data modelers / DBA team in designing data model and the database for the
Data warehouse and Operational Data Stores.
• Performed DataStage administration tasks such as creating projects in Dev and Test, and
defining environment variables and configuration files.
• Designed and created ETL templates for the jobs to make the development task easy.
• Led and supervised a team of five to six onshore and offshore consultants.
• Created source-to-target data mappings. Verified the source and target metadata.
• Designed and developed the slowly changing dimension jobs using Type 1 and Type 2 logic.
• Wrote shell scripts and parameter files to call the ETL job sequences.
• Responsible for loading the data into staging tables and into data marts for the US and UK.
• Extensively used Complex Flat File and dataset files for designing the ETL jobs.
• Extensively used Join, Merge, Lookup, Link Partitioner, Funnel, Filter, Remove Duplicates,
Transformer, Aggregator, and Copy stages during ETL development.
• Performed unit, integration, and performance tests per the approved test plan and documented
and reviewed the test results.
• Migrated the tested code to higher environments and documented the Data Warehouse
development process and support documents.
• Worked on job tuning for optimum performance using stages such as Peek, Copy, and Modify.
• Provided rotational 24x7 on-call and production support. Tracked trouble tickets in Remedy.
Environment: WebSphere DataStage/QualityStage 8.1, Oracle 10g, TOAD 8.0, Quality Center 9.0,
Cognos 8, Hummingbird Connectivity/FTP, Windows, Linux/UNIX
Michigan State University, East Lansing, MI Nov 09 – July 10
ETL Technical Lead
The Enterprise Business Systems Project (EBSP) is working to create streamlined business processes
and interconnected administrative systems for Michigan State University's finance, human resources,
and research administration areas. The multi-year initiative will change the way the university does
business in the future. Once fully implemented, the applications will be used by virtually all faculty and
staff at the university. The EBSP is working to create connecting administrative business systems that
support an advancing university.
Responsibilities:
• Technical Subject Matter Expert (SME) in Data Warehouse, Data Mart and ETL process.
• Arranged meetings with the business team and customers to understand and gather the requirements.
Prepared functional and technical specifications and a test plan.
• Analyzed the scope of the application and defined relationships within and between groups of data.
Performed source-to-target data mapping. Verified the source and target metadata.
• Actively involved in data modeling and designing the database for the Data warehouse.
• Installed and configured DataStage in the development and test environments.
• Designed DataStage jobs and processes for extracting, cleansing, transforming, integrating,
and loading data into the Data Warehouse database.
• Performed import and export operations and managed table definitions using DataStage Designer.
• Designed and developed the slowly changing dimension jobs using Type 1 and Type 2 logic.
• Used Change Capture, Join, Merge, Lookup, Link Partitioner, Funnel, Filter, Remove
Duplicates, Transformer, Aggregator, and Copy stages during ETL development.
Extensively used dataset files for designing the ETL jobs.
• Used DataStage Director to run and monitor the jobs, and automated ETL jobs using the
DataStage Scheduler.
• Created test cases based on requirements and performed Unit, System and Integration testing
of the modules and helped in User Acceptance Testing (UAT).
• Documented the test results and performed data and production validation after the jobs were
migrated into production. Resolved data conflicts.
• Wrote several SQL scripts for back-end testing.
• Performed performance tuning at both the database and DataStage job levels.
• Provided DataStage administration, configuration management, and application support.
• Documented the Data Warehouse development process and support documents.
• Managed work schedules and assigned work and provided technical support to team members.
Environment: WebSphere DataStage/QualityStage 8.1, Oracle 10g, PL/SQL Developer, Quality Center,
Cognos 8, SmartCVS 6, TOAD, SSH Secure Shell Client/FTP, Windows, Linux/UNIX
Chevron Products Co. Houston, TX Dec 06 – Oct 09
Technical Lead / DataStage Analyst
Chevron Products Co. is one of the largest refiners and marketers of petroleum products in the United
States. With eight refineries and more than 8,100 retail outlets, it serves customers in 29 states. The
Lynx project consists of two sub-projects, IA and RIA, and is planned for two releases. The Lynx IA layers
are where all the systems come together. The Information Architecture (IA) project is a
foundational element of the Lynx project. The purpose of the IA project is to support Supply Chain
Management (SCM), consisting of data management, integration of systems and tools, and reporting
functionality. The Refining IA Layer exists separately for each refinery, and resides within the refinery
gates. It contains refinery-specific plans and schedules, master data, operational data, and processing
history. The Supply Chain IA Layers are regional views of data. They contain shared information from
sources throughout the supply chain, including forecasts, plans, schedules, economics, actuals, global
master data, and lookback. The Supply Chain layer supports multi-refinery reporting and analytics.
Responsibilities:
• Successfully led a team of on-site and off-site developers and testers.
• Interfaced with business team and customers to gather requirements and understand the scope
of the projects through various requirement meetings.
• Analyzed Functional requirements and prepared Technical specifications.
• Analyzed the scope of the application and defined relationships within and between groups of data.
Performed source-to-target data mapping.
• Created the Technical Design Document for the ETL process with source-to-target mappings and
extract filters, giving a high-level overview of the process using diagrams built in Visio.
• Interacted with business users, data architects, and database administrators in designing
the database for the data warehouse.
• Installed and configured DataStage in the development and test environments.
• Designed DataStage jobs and processes for extracting, cleansing, transforming, integrating,
and loading data into the Data Warehouse database.
• Performed import and export operations and managed table definitions using DataStage Manager.
• Designed and developed the slowly changing dimension jobs using Type 1 logic.
• Designed and developed the data migration and data conversion from the El Segundo,
Richmond, Pascagoula, and Pembroke refineries.
• Extensively used Join, Merge, Lookup, Link Partitioner, Funnel, Filter, Remove Duplicates,
Transformer, Aggregator, and Copy stages during ETL development. Wrote and used
user-defined queries and DataStage macros and functions.
• Created batches (DS Job Control) and sequencers to control sets of DataStage jobs.
• Used DataStage Director to run and monitor the jobs, and automated ETL jobs using the
DataStage Scheduler.
• Wrote shell scripts to run DataStage jobs and stored procedures.
• Involved in performance tuning of the ETL process and performed the data warehouse testing.
• Provided DataStage administration, configuration management, and application support.
• Documented the Data Warehouse development process and support documents.
• Coordinated a team of four developers both off-shore and on-site and provided technical
guidance.
Environment: Ascential DataStage 7.5.1, MS SQL Server 2005, SSIS, Oracle 9i, SQL Server
Management Studio 2005, HP Quality Center 9.0, Cognos 8, Visual SourceSafe, TOAD, FTP, Windows/UNIX
IBM / Fireman's Fund Inc., Novato, CA Jun 06 – Nov 06
ETL Developer
Fireman's Fund offers unique specialized products, while remaining competitive in areas such as upscale
homeowners and commercial Insurances. Enterprise Metrics project is designed to support the Business
Objects reporting application. This project involved extracting data from mainframe and loading into Data
Marts. The major job involved is cleansing the data and transforming the data to the staging area, then
loading the data into the Data Marts.
Responsibilities:
• Involved in analyzing the scope of the application, defining relationships within and between
groups of data, and star schema design.
• Used DataStage Designer to design the jobs and processes for extracting, cleansing,
transforming, integrating, and loading data into the Data Warehouse database.
• Performed import and export operations and managed table definitions using DataStage Manager.
• Extensively used Flat File Stage, Hashed File Stage, DB2 UDB Stage, FTP Plug-in Stage and
Aggregator Stage during ETL development.
• Used DataStage Director to run and monitor the jobs, and automated DataStage jobs using ESP
to execute and schedule them.
• Made extensive use of Join, Merge, Lookup, Link Partitioner, Funnel, Filter, Transformer,
Aggregator, and Copy stages during ETL development.
• Wrote and modified existing UNIX shell scripts to run the DataStage jobs.
• Performed unit, system and regression testing.
• Tuned the jobs for higher performance by dumping the lookup data into hashed files.
• Maintained and supported existing jobs and modified them according to customer and
business needs.
• Documented the Data Warehouse development process and performed knowledge transfer to
Business Intelligence developers.
Environment: Ascential DataStage 7.5.1, UNIX shell scripting, IBM AIX 5.7, IBM IMS, DB2 UDB, Oracle
9i, Business Objects, Lotus Notes 6, ESP, Clear Case, Windows
Wellington Management - Boston, MA Jul 04 – May 06
Data Stage Developer
Wellington Management is one of America's oldest investment management firms. Wellington
Management provides investment services to many of the world's leading public and private institutions.
RMG project is designed to support the Risk Metrics and other management reporting applications.
Responsibilities:
• Designed, developed, and implemented transformation processes using DataStage to load data into
the Data Warehouse from various sources, including XML files.
• Developed the jobs by selecting the source data, defining stage variables, mapping data,
applying data transformation rules, and defining target sources.
• Worked closely with the business analyst in developing the source-to-target data mappings.
• Defined and formatted the custom bucket list used for sending the referential data to the Risk
Metrics server.
• Worked with DataStage Manager to import metadata from the repository, create new job
categories, and create new data elements.
• Developed parallel jobs using stages including Join, Transformer, Sort, Merge, Filter,
and Lookup.
• Participated in the development of the ETL design document and transformation rules.
• Modified UNIX shell scripts and Perl scripts to support the ETL processes.
• Prepared test cases for unit and integration testing, tested the jobs, and supported user
acceptance testing. Scheduled the jobs, FTPed the data to the RM server, and archived and purged the data.
• Participated in the installation and upgrade of DataStage from 7.x to 7.5.
Environment: DataStage 7.5, Oracle 9i, SQL/PL-SQL, PVCS, UNIX shell scripting, Perl scripts, FTP,
XML, PL/SQL Developer, AccuRev, Maestro, Windows 2000, Sun Solaris
BellSouth - Atlanta, Georgia Jul 02 – Jun 04
Data Stage Developer
The Decision Support System (DSS) was designed to support various external/internal clients. The system
resided on an MVS/Teradata platform. Data from multiple sources was loaded, transformed, and reported
from the Teradata warehouse. This project involved developing ad hoc reports for the FCC to keep track
of the telecom services provided by BellSouth in various states.
Responsibilities:
• Identified sources and analyzed source data.
• Involved in designing the data model and creating storage objects using ERwin.
• Data Architect for Internet Data Aggregation. Provided data models and data maps (extract,
transform and load analysis) of the data marts and feeder systems in the aggregation effort.
• Used DataStage Manager, Designer, Administrator, and Director for creating and implementing
DataStage jobs.
• Developed various jobs using Hashed File, Aggregator, and Sequential File stages in DataStage
server jobs.
• Identified other source systems such as SQL Server and DB2, along with their connectivity and
related tables and fields, and ensured data suitability for mapping using ProfileStage.
• Developed various jobs using Data Set, Sequential File, and File Set stages in Parallel Extender jobs.
• Extracted data from Oracle, SQL Server, and DB2, and transformed and loaded it into Teradata
warehouses. Created shared containers for use in multiple jobs.
• Performed Unit testing and Integration testing of the module.
• Created shell scripts, which in turn call the DataStage jobs with all paths to sources and
targets as well as connection information.
• Used Ascential QualityStage in Parallel Extender for batch processing and cleansing the data.
• Worked on developing metrics, compound metrics, and filters for complex reports using
MicroStrategy. Ensured that all the identified reports produce the desired results using ProfileStage.
• Documented the purpose of each mapping to help personnel understand the process
and incorporate changes as and when necessary.
Environment: Ascential DataStage 7.x Parallel Extender, ProfileStage 7.x, QualityStage 7.x,
Teradata V2R4/V2R3, Oracle 9i, SQL Server 2000, DB2, COBOL II, JCL, Business Objects, ERwin 5.5,
Windows 2000, IBM AIX 5.2, UNIX shell scripting, MVS/TSO
PROFESSIONAL DEVELOPMENT:
• IT Service Management - ITIL
• Business Process Management (BPM)
• Pursuing PBDM/MBA at Athabasca University, CAN
Others: US Permanent Resident (Green card holder)