
Data Warehouse Azure

Location:
Carlsbad, CA
Posted:
March 04, 2025


Suresh Kumar Ravirala

******.********@*****.*** 760-***-****

Summary:

18.5 years of experience with ETL tools (DataStage, Informatica, and SSIS) and reporting tools (Business Objects 6.5, SSRS), a year of experience in Big Data Hadoop, and knowledge of IBM Cloud Pak for Data (CP4D).

16.5 years of experience in the IT industry with data warehouse tools: DataStage (11 years), Informatica (4 years), SSIS & SSRS (6 months), Business Objects (~1 year), and IBM Cognos ReportNet (~1 year).

16.5 years of data integration experience, including ETL, data warehousing, and data conversion.

Strong skills in ETL (IBM InfoSphere DataStage and Informatica).

Certified IBM InfoSphere DataStage Developer for version 8.5.

Excellent SQL, PL/SQL skills.

Experience in data ingestion into HDFS using the Hadoop ecosystem and Sqoop, and in performing data transformation/analysis using Pig and Hive.

About 11 years of ETL tool experience using IBM Information Server (DataStage and QualityStage 8.x) and Ascential DataStage 7.x/6.0, designing, developing, testing, and maintaining jobs using Designer, Manager, Director, and Administrator.

Experienced in troubleshooting DataStage jobs and addressing production issues such as performance tuning and fixing data issues.

Excellent knowledge of studying data dependencies using DataStage metadata and preparing job sequences for existing jobs to facilitate scheduling of multiple jobs.

Experience in integrating various data sources with multiple relational database (RDBMS) systems: Oracle, Teradata, Sybase, SQL Server, MS Access, and DB2.

Worked on integrating data from flat files, COBOL files, and XML files.

Expertise in delivering DW/BI solutions using Waterfall and Agile (Scrum) methodologies.

Experience in all phases of DW/BI projects: requirements gathering, analysis, design, development, testing, implementation/deployment, production support, and documentation, along with cross-team communication and knowledge transfer activities.

Experience in building proofs of concept and solution architecture.

Experience in ETL and database performance tuning, and in designing and developing highly scalable systems that handle very large volumes of data globally.

Experience working with leading databases: Teradata 12, Oracle 9i/10g, SQL Server.

Expertise in working on Unix/Linux platforms and Unix shell scripting.

Excellent Analytical and problem-solving skills.

Self-motivated, adaptive to new technologies, team player with good interpersonal and communication skills.

Excellent Team leading and people management skills.

Experienced in project management: leading, planning, managing priorities, and scoping and estimation for large projects/programs.

Experienced in managing project communications within teams, across teams, and with higher management, and in delivery status tracking.

Excellent communication and interpersonal skills with the ability to work in a team as well as individually.

Managed all development and support efforts for the Data Integration/Data Warehouse team.

Worked on QualityStage stages (Standardize and Investigate stages).

Educational Qualification:

The Specialized Certificate in Business Intelligence Analysis from University of California San Diego (UCSD Extension), June 3, 2019.

Certified IBM InfoSphere DataStage Developer for version 8.5.

B.Tech from JNTU Hyderabad, 2001.

IT Experience:

Worked as Sr. DataStage Consultant at GAP Inc., Rancho Cordova, CA, from Nov 2022 to present.

Worked as Sr. DataStage Consultant at VSP (TSP Tech), Rancho Cordova, CA, from Aug 2021 to Nov 2022.

Worked as DataStage Consultant at Bank of America (Randstad), Thousand Oaks, CA, from May 27, 2019 to May 14, 2021.

Worked as ETL Architect at Chevron, Bakersfield, CA, from Feb 12, 2019 to May 24, 2019.

Worked as Sr. DataStage Consultant/Hadoop Developer at General Motors, Chandler, AZ, from July 4, 2017 to Dec 2017.

Worked as ETL DataStage Consultant at Apria Healthcare, Lake Forest, CA, from Sep 19, 2015 to July 2017.

Worked as ETL DataStage Consultant at Flagstar Bank, Troy, MI, from Nov 9, 2014 to Sep 2015.

Worked as Sr. Technical Lead at IHS, Denver, from April 4, 2010 to Jan 2014.

Worked as Sr. Software Engineer at Wipro Technologies, Bangalore, from Jan 14, 2008 to April 1, 2010.

Worked as Technical Consultant at PT. Emerio Indonesia from March 26, 2007 to Dec 2007.

Worked as Sr. Software Engineer at Patni Computer Systems Limited, Mumbai, from February 2006 to Dec 18, 2006.

Worked as Software Engineer at Smart Vision Solutions, Hyderabad, from September 2003 to Feb 2006.

Technical Skills:

ETL Tools: DataStage (7.1, 7.5.1, 8.1, 8.5, 9.1, 11.3, 11.5, 11.7)

Informatica (PowerCenter 7.1, 8.1, 9.1)

SSIS 2008R2, 2012

Reporting/OLAP Software: Business Objects 6.5, SSRS 2012

Operating Systems: Windows 2000/NT, UNIX AIX 5.3

Databases: Oracle 9i, 10g

Programming Languages: SQL, PL/SQL.

Big Data Skills: Hadoop, HDFS, PIG, SQOOP, HIVE

Cloud Tools: IBM Cloud Pak for Data (CP4D), Azure Data Factory

Certifications: IBM Information Management DataStage Developer V8.5

Project Name: Inbound Modernization & GLV Migration, Gap INC, Rancho Cordova, US

Role: Sr DataStage Consultant

Period: March 2022 to present.

ETL Tool: DataStage 7.1, 8.1, 11.7

Cloud Tools: IBM Cloud Pak for Data (CP4D), Azure Data Factory

Scheduling Tool: CAWA

Roles & Responsibilities: -

Worked on enhancement projects; migrated DataStage jobs from 11.3 to 11.7 and moved code from DataStage 7.1 to 11.7 for a few adapters.

Used the DataStage Designer to debug processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database, as well as the process of loading data into the data mart.

Used the IBM WebSphere DataStage Designer to develop jobs for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.

Received source data in the form of Oracle tables and sequential files.

Handled errors extensively using exception handling, for ease of debugging and for displaying error messages in the application.

Validated the current production data, the existing data, and the programming logic involved.

Debugged ETL jobs to check for errors and warnings associated with each job run.

Identified all data sources and data load sizes, analyzed the data, and was responsible for identifying data mapping rules.

Used the DataStage Director and its run-time engine for testing and debugging job components and monitoring the resulting executables.

Designed and developed the audit process.

Currently working in the development project and guiding the team.

Designer role:

Based on the requirements, identified changes in the application and analyzed the impact of the code changes.

Involved in design and development of complex jobs.

Developed ETL jobs and built sequencers.

Project Name: VSP, Rancho Cordova, US

Role: Sr DataStage Consultant

Period: Aug 2021 to Feb 2022.

ETL Tool: DataStage 11.7

Scheduling Tool: Autosys

VSP Vision Care (VSP) is a vision care health insurance company operating in Australia, Canada, Ireland, the United States, and the United Kingdom. It is a doctor-governed company divided into five businesses: eye care insurance, high-quality eyewear, lens and lens enhancements, ophthalmic technology, and connected experiences that strengthen the relationship between patients and their eye doctors. It serves about 80 million members worldwide and is the largest vision insurance company in the United States.

Formed in 1955 as a nonprofit organization by a group of optometrists in Oakland, California, it became a national provider and expanded internationally by 2007. In 2003 the Internal Revenue Service revoked VSP's tax-exempt status, citing exclusionary, members-only practices and high compensation to executives.

Roles & Responsibilities: -

Used the DataStage Designer to debug processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database, as well as the process of loading data into the data mart.

Used the IBM WebSphere DataStage Designer to develop jobs for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.

Received source data in the form of Oracle tables and sequential files.

Handled errors extensively using exception handling, for ease of debugging and for displaying error messages in the application.

Validated the current production data, the existing data, and the programming logic involved.

Debugged ETL jobs to check for errors and warnings associated with each job run.

Identified all data sources and data load sizes, analyzed the data, and was responsible for identifying data mapping rules.

Used the DataStage Director and its run-time engine for testing and debugging job components and monitoring the resulting executables.

Designed and developed the audit process.

Worked in the development project and guided the team.

Designer role:

Based on the requirements, identified changes in the application and analyzed the impact of the code changes.

Involved in design and development of complex jobs.

Developed ETL jobs and built sequencers.

Project Name: MSP (Mortgage & Sales Package), Bank of America, Agoura Hills CA

Role: ETL Consultant

Period: May 2019 to May 2021.

ETL Tool: DataStage 8.7, 9.1, 11.7

DB: Oracle 11g

Scheduling Tool: Control-M

Designed various block diagrams and logic flowcharts.

Prepared various software design documents.

Documented all program-level and user-level processes.

Migrated DataStage 8.7 jobs to DataStage 11.7.

Understood the server job designs and re-designed them as parallel jobs.

Designed, developed, and maintained various ETL jobs that source data from AS400/Oracle databases and load it into the Oracle data warehouse, using stages such as Transformer, Join, Merge, Funnel, Pivot, and Unstructured Data.

Developed Unix shell scripts for file transfers and basic transformation functions.

Developed post-load validation jobs to validate the data extraction.

Prepared the ETL architecture and associated procedures.

Analyzed performance and monitored workloads for capacity planning.

Worked with tools such as IBM InfoSphere DataStage 9.1, Toad, WinSCP, PuTTY, Autosys, and SQL Server 2012.

Worked in Agile methodology.

Involved in release planning, story writing, and sprint planning.
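The file-transfer and validation scripts mentioned above typically land a file, check it is non-empty, and apply small pre-load transforms before the ETL job picks it up. The sketch below is a minimal, self-contained illustration of that pattern; the directory, file name, and transfer step are hypothetical placeholders, not the actual client scripts.

```shell
#!/bin/sh
# Hypothetical sketch: land a source file and validate it before the ETL load.
# LANDING_DIR and the file name are illustrative placeholders.

LANDING_DIR="${LANDING_DIR:-/tmp/etl_landing}"
FILE="daily_extract.dat"

mkdir -p "$LANDING_DIR"

# In the real job this step would be an sftp/scp pull from the source host.
# Here we create a small sample file so the sketch runs on its own.
printf 'row1\r\nrow2\r\n' > "$LANDING_DIR/$FILE"

validate_file() {
    f="$1"
    # Reject missing or empty files so the downstream job fails fast.
    [ -s "$f" ] || { echo "ERROR: $f missing or empty"; return 1; }
    # Basic transform example: strip Windows carriage returns in place.
    tr -d '\r' < "$f" > "$f.tmp" && mv "$f.tmp" "$f"
    echo "OK: $f ($(wc -l < "$f") lines)"
}

validate_file "$LANDING_DIR/$FILE"
```

A real version would add logging, retries on the transfer, and an archive step after a successful load.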

Project Name: Data Foundation Integration to Insight Program (DFi2i), Bakersfield CA, US

Role: ETL Architect

Period: Feb 2019 to May 2019.

ETL Tool: DataStage 11.5

DB: Teradata 16.0

Scheduling Tool: ControlM

Chevron SJVBU is upgrading its information integration, governance, and reporting/analytics capabilities to increase business decision speed and quality by streamlining the delivery of relevant, trusted, and accurate information.

SJV has launched the Data Foundation Integration to Insight (DFi2i) program. The DFi2i Foundational Capabilities (FC) Project is the initial work stream under the program, designed to deploy the fundamental technologies that start enabling Chevron's future analytics journey.

Chevron has three data marts that encompass Operations, Finance, and Health, Environment and Safety information.

Chevron intends to accomplish the following objectives via the DFi2i FC project:

Consolidate the existing FDS, RMIS, and SJVDW data marts, and CRDB, into an enterprise data warehouse supported by Teradata Data Warehouse Appliance technology.

Design and implement an Enterprise Logical Information Model.

Roles & Responsibilities: -

Used the DataStage Designer to debug processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database, as well as the process of loading data into the data mart.

Used the IBM WebSphere DataStage Designer to develop jobs for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.

Received source data in the form of Oracle tables and sequential files.

Handled errors extensively using exception handling, for ease of debugging and for displaying error messages in the application.

Validated the current production data, the existing data, and the programming logic involved.

Debugged ETL jobs to check for errors and warnings associated with each job run.

Used XML transformations to get the desired results, working with all file types, including flat files and hash files.

Responsible for debugging ticketing issues in the production phase.

Used the DataStage Director and its run-time engine for testing and debugging job components and monitoring the resulting executables.

Provided production support and performed enhancements on multiple existing projects.

Designer role:

Based on the requirements, identified changes in the application and analyzed the impact of the code changes.

Got estimate sign-off from the user and designed the Low-Level Design document.

Involved in development of complex jobs.

Project Name: Campaigning & Lead Management, General Motors, Chandler, AZ, US

Role: Sr DataStage Consultant

Period: July 2017 to Dec 2017.

ETL Tool: DataStage 8.1, 9.1, 11.3 & 11.5; SSIS 2012 and SSRS; Hadoop, HDFS, Sqoop, Hive, and Spark

Scheduling Tool: Autosys

General Motors Company, commonly known as GM, is an American multinational corporation that designs, manufactures, markets, and distributes vehicles and vehicle parts, and sells financial services. With global headquarters at the Renaissance Center in Detroit, Michigan, GM manufactures cars and trucks in 35 countries.

Roles & Responsibilities: -

Used the DataStage Designer to debug processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database, as well as the process of loading data into the data mart.

Used the IBM WebSphere DataStage Designer to develop jobs for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.

Received source data in the form of Oracle tables and sequential files.

Handled errors extensively using exception handling, for ease of debugging and for displaying error messages in the application.

Validated the current production data, the existing data, and the programming logic involved.

Debugged ETL jobs to check for errors and warnings associated with each job run.

Identified all data sources and data load sizes, analyzed the data, and was responsible for identifying data mapping rules.

Used XML transformations to get the desired results, working with all file types, including flat files and hash files.

Responsible for debugging ticketing issues in the production phase.

Responsible for using data mapping to direct data into the correct systems, and for data reconciliation and validation.

Used the DataStage Director and its run-time engine for testing and debugging job components and monitoring the resulting executables.

Used database objects such as packages, procedures, and functions according to client requirements.

Used the Hierarchical Data stage to read JSON and XML files.

Provided production support and performed enhancements on multiple existing projects.

Designed and developed the audit process.

Worked in the development project and on migration of code from 9.1 to 11.5, and guided the team.

Designer role:

Based on the requirements, identified changes in the application and analyzed the impact of the code changes.

Got estimate sign-off from the user and designed the Low-Level Design document.

Involved in development of complex jobs.

Hadoop Developer:

Involved in development of complex jobs.

Created a process to pull data from existing applications and land it on Hadoop.

Used Sqoop to pull data from source databases such as Oracle.

Created Hive tables on top of the data extracted from the source systems.

Partitioned the Hive tables depending on the load type.

Created Hive tables to show the current snapshot of the source data.
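The ingestion steps above can be sketched as a shell wrapper around Sqoop and Hive. This is a hedged illustration, not the actual GM pipeline: the JDBC URL, table, and partition column are hypothetical placeholders, and the script only prints the commands it would run rather than executing them against a cluster.

```shell
#!/bin/sh
# Hypothetical sketch of a Sqoop-to-Hive ingestion with date partitioning.
# The JDBC URL, table, and partition values are illustrative placeholders.

SRC_JDBC="jdbc:oracle:thin:@//src-host:1521/ORCL"   # hypothetical source DB
SRC_TABLE="SALES.ORDERS"
LOAD_DATE="2017-12-01"
HDFS_DIR="/data/raw/orders/load_date=$LOAD_DATE"

# Pull the source table into HDFS (printed here instead of executed).
echo "sqoop import --connect $SRC_JDBC --table $SRC_TABLE" \
     "--target-dir $HDFS_DIR --num-mappers 4"

# Register the landed directory as a partition of an external Hive table,
# which matches the per-load-type partitioning described above.
echo "hive -e \"ALTER TABLE orders ADD IF NOT EXISTS" \
     "PARTITION (load_date='$LOAD_DATE') LOCATION '$HDFS_DIR'\""
```

A current-snapshot view, as described above, would then typically be a Hive view or table selecting only the latest partition.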

Project Name: Apria Health Care EDW, Apria Health Care Lake Forest, CA, US

Role: DataStage Consultant

Period: Jan 2016 to July 2017

ETL Tool: DataStage 8.1, 9.1 & 11.3; SSIS 2012 and SSRS

Scheduling Tool: Autosys

Apria Healthcare is one of the nation's leading providers of home respiratory services and certain medical equipment, including oxygen therapy, inhalation therapies, sleep apnea treatment, enteral home nutrition, and negative pressure wound therapy. Apria owns and operates more than 400 locations throughout the United States and serves more than 1.8 million patients each year.

Supported the production activities, participated in enhancements, and guided the team.

Roles & Responsibilities: -

Used the DataStage Designer to debug processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database, as well as the process of loading data into the data mart.

Used the IBM WebSphere DataStage Designer to develop jobs for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.

Received source data in the form of Oracle tables and sequential files.

Handled errors extensively using exception handling, for ease of debugging and for displaying error messages in the application.

Validated the current production data, the existing data, and the programming logic involved.

Debugged ETL jobs to check for errors and warnings associated with each job run.

Identified all data sources and data load sizes, analyzed the data, and was responsible for identifying data mapping rules.

Used XML transformations to get the desired results, working with all file types, including flat files and hash files.

Responsible for debugging ticketing issues in the production phase.

Responsible for using data mapping to direct data into the correct systems, and for data reconciliation and validation.

Used the DataStage Director and its run-time engine for testing and debugging job components and monitoring the resulting executables.

Used database objects such as packages, procedures, and functions according to client requirements.

Used the Hierarchical Data stage to read JSON and XML files.

Provided production support and performed enhancements on multiple existing projects.

In the HTF multivendor application, used the Unstructured Data stage to read XLSX files.

Designer role:

Based on the requirements, identified changes in the application and analyzed the impact of the code changes.

Got estimate sign-off from the user and designed the Low-Level Design document.

Involved in development of complex jobs.

Along with DataStage, supported the SSIS application and developed SSRS reports.

Developer SSIS role:

Involved in designing, developing, and deploying reports in the MS SQL Server environment using SSRS 2012 and SSIS in Business Intelligence Development Studio (BIDS).

Created complex ETL packages using SSIS to extract data from staging tables into partitioned tables with incremental loads.

Created reusable SSIS packages to extract data from multi-format flat files, Excel, and XML files into the UL database and DB2 billing systems.

Developed, deployed, and monitored SSIS packages.

Created SSIS packages using SSIS Designer to export heterogeneous data from OLE DB sources (Oracle) and Excel spreadsheets to SQL Server 2012.

Performed data reconciliation, validation, and error handling after extracting data into SQL Server.

Worked on SSIS packages and DTS Import/Export for transferring data from databases (Oracle and text-format data) to SQL Server.

SSRS: Migrated AS400 reports to SSRS, developed the reports in SSRS, and deployed them in the client application.
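Incremental loads like those described above typically track a high-water mark per source table and extract only rows changed since the last successful run. The sketch below shows that pattern in shell; the table name, `ModifiedDate` column, and watermark file are hypothetical placeholders, and the script emits the extraction query rather than running it through an SSIS package.

```shell
#!/bin/sh
# Hypothetical sketch of the high-water-mark pattern behind an incremental load.
# The table, column, and watermark file are illustrative placeholders.

TABLE="stg.Orders"
WATERMARK_FILE="${WATERMARK_FILE:-/tmp/orders.watermark}"

# Last successfully loaded timestamp; default far in the past on first run.
LAST_TS="1900-01-01 00:00:00"
if [ -f "$WATERMARK_FILE" ]; then
    LAST_TS=$(cat "$WATERMARK_FILE")
fi

# Build the extraction query the package would issue.
QUERY="SELECT * FROM $TABLE WHERE ModifiedDate > '$LAST_TS'"
echo "$QUERY"

# After a successful load, advance the watermark (here the current time,
# for illustration; a real job would use the max ModifiedDate loaded).
date '+%Y-%m-%d %H:%M:%S' > "$WATERMARK_FILE"
```

The key design point is that the watermark is advanced only after the load succeeds, so a failed run is simply retried with the old watermark.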

Project Name: Migration from 9.1 to 11.3, Apria Health Care Lake Forest, CA, US

Role: DataStage Consultant

Period: Sep 2015 to Jan 2016

ETL Tool: DataStage 8.1, 9.1 & 11.3

Scheduling Tool: Autosys

Apria Healthcare is one of the nation's leading providers of home respiratory services and certain medical equipment, including oxygen therapy, inhalation therapies, sleep apnea treatment, enteral home nutrition, and negative pressure wound therapy. Apria owns and operates more than 400 locations throughout the United States and serves more than 1.8 million patients each year.

Apria Healthcare migrated from DataStage 8.1 & 9.1 to IBM DataStage and QualityStage 11.3; the projects were moved into the 11.3 environment successfully, without issues.

Roles & Responsibilities: -

Worked closely with the DataStage admin to set up the environment.

Identified the list of projects to be moved and moved them to the DataStage 11.3 environment.

Tested all three environments (Dev, Test & PROD).

Once the jobs were scheduled in 11.3, unscheduled them in the lower environment.

Prepared the risk management strategy for how risks are identified, analyzed/quantified, mitigated, reported, escalated, and tracked.

Project Name: 3X Sub Servicing EDW, FlagStar Bank, Troy Michigan, US

Role: DataStage Consultant

Period: Nov2014 to Sep2015

ETL Tool: DataStage 9.1, DataStage 11.3, IBM Cognos ReportNet

Flagstar Bank is a subsidiary of Flagstar Bancorp, which is listed on the New York Stock Exchange under the symbol FBC. The company is the largest publicly held savings bank headquartered in the Midwest US.

Flagstar Bank is migrating from its legacy system to an enterprise data warehouse using the IBM InfoSphere suite, with IBM DataStage and QualityStage 11.3. The project is currently in the initiation phase: preparing the auditing and error-handling documents and the ETL framework document, and developing the reusable jobs.

Roles & Responsibilities: -

Prepared the design document.

Developed reusable objects for error handling and auditing tables.

Developed DataStage jobs.

Developed DataStage job POCs for SCD Types 1, 2 & 3.

Worked on QualityStage stages (Standardize and Investigate stages).

Designer role:

Understood the existing system, gathered the requirements, and got sign-off from the customer.

Prepared the High-Level Design and Low-Level Design documents for the project.

Supported the team during the design phase.

Involved in development of complex jobs.

Reviewed the jobs developed by the team members.

Involved in Scrum calls to update the customer on project status on a daily basis.

Project Name: AB Posting, IHS, Calgary, Canada

Role: DataStage Designer & Lead

Period: June 2012 to Jan 2013

ETL Tool: DataStage 8.1, 8.5

This project was developed and its solution delivered using the DataStage tool. The purpose of the project is to insert or update the ABCrown database to keep history data current and accurate for AB Regulatory customers. The source data is in XML format, available on a government site.

Roles & Responsibilities: -

Designer role:

Understood the existing system, gathered the requirements, and got sign-off from the customer.

Prepared the High-Level and Low-Level Design documents for the project.

Supported the team during the design phase.

Involved in development of complex jobs.

Reviewed the jobs developed by the team members.

Involved in Scrum calls to update the customer on project status on a daily basis.

Project Name: Alberta Crown Data loader, IHS, Calgary, Canada

Role: DataStage Designer & Lead Developer

Period: Jan 2010 to May 2012

ETL Tool: DataStage 8.1

The Alberta Crown Data Loader project was initiated to develop and implement a DataStage solution that replicates the current ABCrown manual-entry association of the same prototype. The purpose of the prototype is to insert or update the ABCrown database to keep history data current and accurate for AB Regulatory customers.

Roles & Responsibilities: -

Designer role:

Understood the existing system, gathered the requirements, and got sign-off from the customer.

Prepared the High-Level and Low-Level Design documents for the project.

Supported the team during the design phase.

Involved in development of complex jobs.

Reviewed the jobs developed by the team members.

Involved in Scrum calls to update the customer on project status on a daily basis.

Project Name: UK Regulatory, IHS, UK

Role: DataStage Designer & Developer

Period: April 2010 to Dec 2010

ETL Tool: DataStage 8.1

The UK Regulatory data loading project will develop and implement a DataStage solution that replicates the current UK Regulatory manual-entry association of the same prototype. The purpose of the prototype is to insert or update the UK Regulatory database to keep history data current and accurate for UK Regulatory customers.

Roles & Responsibilities: -

Designer role:

Understood the existing system, gathered the requirements, and got sign-off from the customer.

Prepared the High-Level and Low-Level Design documents for the project.

Supported the team during the design phase.

Involved in development of complex jobs.

Reviewed the jobs developed by the team members.

Involved in Scrum calls to update the customer on project status on a daily basis.

Project Name: SMART 5, Lloyds TSB, UK

Role: Sr.DataStage Developer

Period: Aug 2008 to Dec 2009

ETL Tool: DataStage 7.1, 7.5

SMART (Sales Management and Relationship Toolkit) is an intranet web-based CRM system for corporate customers. SMART delivers sales, pipeline management, relationship management, account planning processes, and information for front-line business staff in Major Corporate, Large Corporate & PACS, and (ultimately) FI. The SMART application is a composite of a Siebel front-end application, supported by data drawn primarily from the Wholesale Data Warehouse using the DataStage ETL tool, with financial and analytical reporting provided through Business Objects. It forms a key part of Corporate Markets' strategic architecture at present; IT has recommended that it be replaced by the HBoS Microsoft Dynamics platform.

The current project was initiated to implement business-critical fixes to ensure that the solution remains robust through 2009, until completion of the integration.

The current marketing bulk-upload process will be enhanced to provide a strategic, automated, less manual solution with appropriate error handling and job status reporting. This process will also be brought within the overall service cover and SLA provided by ITSD, in line with the existing SMART batch processes.

Roles & Responsibilities: -

Performed development and design using IBM DataStage.

Project Name: CORMAC (Corporate Markets Customer), Lloyds TSB, UK

Role: Sr. DataStage Developer

Period: Jan 2008 to Nov 2009

ETL Tool: DataStage 7.1, 7.5

As part of this project Corporate Markets Customer (CorMaC) has introduced a new Customer Master Database (CMD) that will drive accurate, compliant reporting of risk and new compliant credit application and assessment processes. CMD will store and maintain a single view of LTSB corporate customers. The overall aim is to obtain Advanced Internal Ratings Based (AIRB) status under the BASEL II accord as expected of a Tier 1 financial services organization.

The CORMAC project involved many applications and tools: EBCDIC files, Teradata sources, Siebel sources, web services, XML output, and Oracle sources.

There are six feeds in the CORMAC project:

1) Data feeds from CMD database to SMART application

2) Data feeds from CMD database to Rhea application

3) Data feeds from Crisp application to CMD database

4) Data feeds from Rhea application to CMD database

5) Loading the CAP & Common Systems accounts into the CMD database (added based on CR054/055)

6) Changes done as part of Address ID resolution in the CMD-to-SMART data feed

Roles & Responsibilities: -

Involved in requirement gathering.

Prepared the Low-Level Design document for the project and its various components.

Supported the team with the design and trained team members on the project.

Involved in development of complex jobs (SCD, Siebel loading, XML output, web service load).

Reviewed the jobs developed by the team members.

Raised change requests using HP Service Manager to move code to UAT and PROD.

Involved in preparing the weekly task sheet and conducting the weekly meeting with the client to update work status.

Prepared training documents.

Conducted training sessions for freshers.

Project Name: MIS+, Bank of Danamon, Jakarta, Indonesia

Role: Consultant

Period: March 2007 to Dec 2007

ETL Tool: Informatica 7.1, 8.1

As part of this project, the Bank of Danamon replaced its DTS (SQL Server) system with Informatica 7.1; all applications using DTS were moved to Informatica 7.1. Various source systems load the MIS+ warehouse, some daily and some monthly. This project created the Informatica ETL mappings required for data feeds to and from SQL Server and for updating attributes and entities within MIS+ based on specified events. The scope of the project was to build, test, implement, and schedule ETL workflows.

Roles & Responsibilities: -

Performed development using Informatica ETL.

Project Name: SALESFORCE-PGP DATAWAREHOUSE INTEGRATION, P&G, Chicago, USA

Role: Senior Software Developer

Period: Oct 2006 to Dec 2006

ETL Tool: Informatica 8.1

P&G is migrating to an externally hosted, web-based CRM solution (Salesforce.com), as a result of which it needs to synchronize specific functional entities between the PGP data warehouse and Salesforce.com. P&G will use Informatica 8.1 with the Informatica connector to feed data to and from Salesforce.com.

Roles & Responsibilities: -

Performed development using Informatica ETL.

Project Name: GE Fleet Services, GE, Chicago, USA

Role: Senior Software Developer

Period: Feb 2006 to Oct 2006

ETL Tool: Informatica 7.5


