Data Project

Location:
San Ramon, CA
Salary:
130000
Posted:
March 14, 2017

Resume:

Gowrisankar S

+1-952-***-**** acy910@r.postjobfree.com

Senior ETL Consultant with 11+ years of end-to-end BI development experience spanning ETL, data profiling, data quality, data modeling, and reporting.

Experience Summary:

11+ years of total experience in business requirements analysis, data integration, data warehouses/data marts, business intelligence, ETL, dimensional data modeling, coding, development, testing, and implementation with RDBMS and non-RDBMS systems.

9 years of experience designing and developing ETL mappings and scripts using Informatica PowerCenter 9.6/9.5/9.1/8.6/8.5/8.1/7.1/7.0.

3 years of experience designing and developing ETL mappings and scripts using SAP Data Services (BODS).

2 Years of experience in SAP HANA modeling.

2 years of experience with big data on Hadoop.

11+ years of database experience using Oracle, DB2, MS SQL Server, Teradata, Sybase, Netezza, and SAP HANA.

2 years of experience with the IBM InfoSphere CDC Management Console.

Extensive experience working in the wealth management, finance, mortgage, retail, and sales-shipment domains.

Experience in translating user requirements into functional and technical specifications and ETL mapping documents.

Extensive knowledge of dimensional data modeling, including star and snowflake schemas.

Business intelligence experience using Oracle BI (OBIEE) and Business Objects.

Extensively worked on SQL & PL/SQL programming.

Skilled in UNIX shell scripting and experience on different UNIX platforms.

Excellent troubleshooting and production support skills.

Experience in working in an onsite-offshore structure and effectively coordinated tasks between onsite and offshore teams.

A highly motivated self-starter and a good team-player with excellent verbal and written communication skills.

Proficient in production support activities with effective change management using Visual SourceSafe (VSS), and in handling incident tickets, change requests, and problem tickets using Remedy Incident Management and Quality Center.

Involved in carrying out quality reviews, audits, design and development reviews, review of use-cases, test-cases, and business scenarios, root-cause analysis, coordinating user acceptance testing, training, and implementing new processes.

Education:

Bachelor's degree in Computer Science Engineering from JNT University, Hyderabad.

Technical summary:

Technologies: Informatica PowerCenter 7.x/8.x/9.x, Oracle 9i/10g/11g/12c, SQL Server 2000/2005/2008, Sybase ASE/Sybase IQ, Netezza, Tidal Scheduler, UNIX/HP-UX/Sun Solaris, DataStage 8.x, InfoSphere Change Data Capture (CDC) Management Console, Hadoop, SAP Data Services (BODS), SAP HANA modeling

Domain Expertise: Sales and shipment, mortgage, finance, human resources, and insurance

Industry Expertise: Oil and gas, banking, retail, mortgage, manufacturing, and IT.

Professional experience:

Client: Bank of the West, San Ramon, CA

Project title: ECRM (Enterprise Customer Relationship Management)

Role: Sr. ETL Developer / ETL Architect

Duration: Aug 2015 – present.

The objective of ECRM is to provide a 360-degree customer view across all lines of business in the bank, enabling better sales and prospecting. The project integrates the onboarding touchpoint system with the CRM (Microsoft Dynamics) through an Enterprise Service Bus (ESB), with near-real-time processing of referral and prospect files between the onboarding systems and Microsoft Dynamics. The solution improves the referral process for the bank's new products and supports employee incentives.

Responsibilities:

Understanding the business requirements and creating technical artifacts.

Involved in designing ETL architectures and flows for the Enterprise Service Bus (ESB).

Involved in creating PL/SQL packages to process near-real-time data between Touchpoint-FIS and Microsoft Dynamics (MD).

Involved in loading flat files to relational tables.

Involved in creating Informatica jobs to capture history and current data using slowly changing dimension (SCD) concepts.

Involved in reading heterogeneous sources to consume transactional data.

Involved in reading reference data tables to consume master data.

Involved in the design of data high-availability jobs.

Created Tidal scheduling plan based on dependency.

Involved in creating UNIX scripts for scheduling jobs.

Involved in creating UNIX scripts for file processing.

Involved in database scripting.

Involved in designing scheduling process based on the dependency.

Involved in project tracking and task assignments.

Involved in creating deliverables and tasks.

Involved in implementation activities during release management.

Supported and participated in the SIT and UAT testing cycles and worked on quick remediation of defects that fell out of testing.

Provided support to the training team to develop training materials for downstream users, the support team, and the scheduling team.

Provided post-go-live support and quick remediation of reported issues.

Trained users on best practices and ways to use the system more efficiently.

Technologies: Informatica 9.x, Touchpoint-FIS, Microsoft dynamics, Oracle PL/SQL, ESB, SQL Server, Tidal, Linux

Client: General Mills, Minneapolis, MN

Project title: Ref R3 remediation

Role: Sr. BI Developer / ETL Architect

Duration: Sep 2014-Aug 2015

The objective of REF R3 is to remediate risk while migrating the current BW system, which runs on Oracle, to SAP HANA. To achieve this, SAP Data Services was introduced to pull data from the BW system using open hubs and from ECC tables, and to load it into a warehouse from which downstream systems can consume data. Major enhancements enable predictive and sentiment analysis based on competitors' retail data from Nielsen; this greatly improved sales of the company's food products, and the system now suggests to manufacturing units which products are in high demand and why customers are interested in them.

Responsibilities:

Led the onshore team and coordinated with the offshore team.

Understanding the business requirements and building the jobs for REF R3 remediation.

Involved in designing technical specifications.

Involved in design of ETL artifacts.

Involved in working on CDC dataflows.

Involved in loading flat files to relational tables.

Involved in creating Data Services jobs to capture history and current data using slowly changing dimension (SCD) concepts.

Involved in reading heterogeneous sources to consume transactional data.

Involved in reading reference data tables to consume master data.

Created control scripts to handle large volumes of transactional data.

Involved in the design of data high-availability jobs.

Created Tidal scheduling plan based on dependency.

Involved in creating UNIX scripts for scheduling jobs.

Involved in database scripting.

Involved in designing scheduling process based on the dependency.

Involved in project tracking and task assignments.

Involved in creating deliverables and tasks.

Supported and participated in the SIT and UAT testing cycles and worked on quick remediation of defects that fell out of testing.

Provided support to the training team to develop training materials for downstream users, the support team, and the scheduling team.

Provided post-go-live support and quick remediation of reported issues.

Trained users on best practices and ways to use the system more efficiently.

Technologies: SAP BODS, SAP HANA, SQL Server, Oracle 11g, Tidal, Linux

Client: Tyson Foods, AR

Project title: EDM

Role: Sr. BI Developer / ETL Architect

Duration: Nov 2013 – Sep 2014

The objective of EDM is to build a data mart on data pertinent to infant chickens produced, chickens survived, and diseases. The mart provides a holistic view of the chickens' growth tenure and the best methods to improve survival rates, and hence greater margins for the company. Data was consumed from heterogeneous systems using SAP BODS into SAP HANA views; the system lets users generate near-real-time reports on both transactional and historical data.

Responsibilities:

Led the team from offshore.

Understanding the business requirements and building the jobs for EDM.

Involved in designing technical specifications.

Involved in the design phase of the modeling solution for EDM.

Involved in design of data marts, aggregate tables and warehouse objects.

Involved in creating HANA attribute views, analytic views, and calculation views.

Involved in loading flat files to relational tables.

Involved in creating BODS jobs to capture history and current data using slowly changing dimension (SCD) concepts.

Involved in creating UNIX scripts for scheduling jobs.

Involved in database scripting.

Involved in designing scheduling process based on the dependency.

Involved in creating test cases and unit testing.

Involved in project tracking and task assignments.

Involved in creating deliverables and tasks.

Involved in month-end close processes.

Interacted with end users to understand their issues in using the system.

Trained users with best practices and ways to use system more efficiently.

Technologies: SAP BODS, SAP HANA, Oracle.

Client: Nationstar, TX

Project title: EDW- Enterprise data warehouse

Role: Sr. ETL Developer

Duration: Dec 2012 – Oct 2013

The objective of EDW is to build an enterprise data warehouse covering the company's assets, customers, and accounts, helping the company provide better service to its customers. As part of the project, data was consumed from various source systems through the IBM CDC Management Console and DataStage, and data replication was achieved across numerous data sources.

Responsibilities:

Involved in the design, development, and deployment stages of the project.

Led the team from offshore.

Understanding the business requirements and building the jobs for EDW.

Development of dimension, fact tables.

Involved in working on CDC Management Console subscriptions, datastores, and mappings.

Involved in loading flat files to relational tables.

Involved in creating DataStage jobs to capture history and current data using slowly changing dimension (SCD) concepts.

Involved in database scripting.

Involved in designing scheduling process based on the dependency.

Involved in creating jobs and sequences in DataStage.

Involved in creating test cases and unit testing.

Involved in project tracking and task assignments.

Involved in creating deliverables and tasks.

Supported and participated in the SIT and UAT testing cycles and worked on quick remediation of defects that fell out of testing.

Provided support to the training team to develop training materials for downstream users, the support team, and the scheduling team.

Provided post-go-live support and quick remediation of reported issues.

Technologies: DataStage 8.x, InfoSphere CDC Management Console, Netezza, SQL Server, DB2

Client: AHMSI, TX

Project title: IDR

Role: Sr. BI Developer

Duration: Oct 2010 – Nov 2012

AHMSI utilizes LPS's mortgage servicing software for its loan servicing activities. Data obtained through passport queries and three custom download files (daisy, dares, and CPI) is used to feed downstream applications, provide data for operational and management reporting, and support large ad hoc queries across the organization. Data comes from multiple source systems downstream (AIRS, MI Reporting, IROC, and ICAR) and is loaded into the data warehouse. The goal of this project is to create a centralized repository of AHMSI-focused data by building an integrated system of record through the consolidation of many disparate source applications. The project leverages Oracle Business Intelligence Enterprise Edition (OBIEE) to provide the solution.

Responsibilities:

Led the team from offshore.

Involved in data modeling and data architecture changes.

Understanding the business requirements and building the mappings for IDR data warehouse.

Development of dimension, fact tables.

Involved in working on COBOL copybooks.

Involved in loading flat files to relational tables.

Involved in scheduling process.

Developed encoding and decoding functions to generate hash keys for CDC.

Involved in creating test cases and unit testing.

Technologies: Informatica 9.0.1, Oracle, DAC, OBIEE

Client: Key Energy, TX

Project title: KEY HR analytics

Role: Sr. BI Consultant

Duration: Mar 2010 – Jun 2010

Key Energy is one of the pioneers in rental services, well services, coiled tubing, and fluid management. The objective of the project is to integrate processes across geographies for the financial, sales-shipment, and procurement modules by extending the OTB data model and ETLs to accommodate customization of the source application for EBS Financial and HR module modifications.

Responsibilities:

Understanding the business requirements and building the mappings for OBIEE R2 data warehouse.

Development of dimension, fact tables.

Developed stored procedures to load data from staging tables to dimension and fact tables.

Involved in loading flat files to relational tables.

Involved in scheduling process.

Developed encoding and decoding functions to generate hash keys for CDC.

Involved in creating test cases and unit testing.

Technologies: Informatica 8.6.1, Oracle, DAC, OBIEE

Client: Manheim GR

Project title: CRR DataMart

Role: Sr BI Consultant

Duration: Dec 2009 – Mar 2010

The Manheim project will provide a reporting solution that consolidates the various sources of dealer debt and activity data into a single interface for end users to leverage in support of their credit- and collections-related processes and analysis. Manheim currently relies on several systems to provide data and reporting in order to obtain a view of its current debt exposure as well as to support credit management and collections activities. These systems include auction-level iSeries reports, corporate iSeries reports, manually maintained spreadsheets, and some limited data obtained via Business Objects. They provide bits and pieces of the information needed, but cannot provide a consolidated view of Manheim's current debt exposure. In addition, auction debt is maintained and managed separately from MAFS debt, adding to the complications in obtaining a view of total dealer debt. The project will leverage Oracle Business Intelligence Enterprise Edition (OBIEE) to provide the solution.

Responsibilities:

Understanding the business requirements and building the mappings for CRR data mart

Development of dimension, fact tables.

Developed stored procedures to load data from staging tables to dimension and fact tables.

Involved in scheduling process.

Developed encoding and decoding functions to generate hash keys for CDC.

Involved in creating test cases and unit testing.

Technologies: Informatica 8.6.1, Oracle, DAC, PowerExchange.

Client: ICICI Prudential, MH, India

Project title: IRIS

Role: BI Consultant

Duration: Jun 2009 – Dec 2009

The objective of IRIS is to provide an integrated reporting system across the company, giving a holistic view of the company's assets. Reporting is currently done against various sources such as Excel, PDF, and Business Objects; this project delivers a 360-degree, centralized view of customers, accounts, and products. It supports more lucrative business by analyzing customer interest in various products and enabling better services, and it provides near-real-time data availability for users to generate ad hoc reports as well.

Responsibilities:

Understanding the business requirements and building the mappings for IRIS data mart (Sybase RDBMS).

Development of dimension, fact tables.

Generated control files to load data from flat files into relational targets.

Developed stored procedures to load data from staging tables to dimension and fact tables.

Involved in scheduling process.

Involved in the SMS process for MIS users.

Involved in user acceptance testing.

Involved in building strategies.

Developed encoding and decoding functions to generate hash keys for the tables without Primary keys.

Technologies: Informatica 8.6.1, DB2, Sybase IQ, Sybase ASE, Oracle, PowerExchange.

Client: HP, TX

Project title: BPMS (Business Performance Management System)

Role: BI Consultant

Duration: Mar 2007 – Dec 2008

BPMS refers to the Business Performance Management System. It manages HP sales and shipment for the EMEA region, providing complete information about orders, acknowledgements, deliveries, and shipment of HP products; customers can track the status of their ordered products directly with the help of BPMS. BPMS now serves as Formula, a sales reporting system. Formula extracts data from different sources such as Eclipse, Mirror Formula, Polaris, FDWE, and C4, and the data is loaded into the Formula database, from which all users can generate reports.

SVR, the Service Value Reporting solution, provides a worldwide service value reporting environment with integrated data from Passkey 2000, Workflow Management IM, and ABE. It is based on a single worldwide data mart database with a web-based user interface layer.

Responsibilities:

Involved in the ETL work as per the client's requirements.

Involved in the migration of assets.

Implemented SQL scripts and shell scripts for scheduling.

Worked on the Tidal scheduler to schedule jobs as per the scheduling requirements.

Worked on data quality issues when dealing with data pertinent to various code pages.

Involved in working on change requests from customers.

Involved in fixing L4 issues escalated by L3 support.

Implemented test cases and defect logs.

Involved in unit testing and UAT.

Involved in user interaction and assistance.

Involved in deployment process.

Technologies: Informatica 7.4.1, Informatica 6.x, Oracle, SQL Server, SSIS, UNIX/HP-UX, EV reports.

Client: Target, MN

Project title: EDW

Role: ETL Developer

Duration: Apr 2005 – Jan 2007

This project pertains to a sales management system. The scope of the project is to capture the sales and marketing processes in a data warehouse. The organization needs a data warehouse for effective decision making and analysis: information about overall sales and growth, profit and loss analysis, and branch-wise analysis. Data from the operational source system was extracted, transformed, and loaded into the data warehouse using Informatica.

Responsibilities:

Involved in ETL mapping development.

Involved in database objects development.

Involved in reading from and loading to heterogeneous sources.

Involved in creating mapplets for holding reusable logic.

Involved in designing scheduling process.

Involved in performance tuning.

Involved in creating stored procedures to move data from one layer to another.

Involved in customer interaction and assistance.

Implemented test cases and defect logs.

Involved in the deployment process.

Technologies: Informatica 7.4.1, Informatica 6.x, Oracle, Business Objects


