
Data Manager

Location:
Columbus, OH
Posted:
May 23, 2017


Satish Devatha

BI and ETL Senior Developer

ac0gbe@r.postjobfree.com

614-***-****

Objective:

To establish myself as a BI and ETL consultant on ETL (data warehouse, data mart, data integration, and data conversion) and BI (reporting and decision support system) projects, applying my technical and interpersonal skills.

Professional Profile:

11+ years of extensive experience in data warehouse, data mart, data integration, reporting, and ETL data conversion projects.

Ability to manage multiple projects simultaneously and meet project deadlines, with exposure to all phases of the data warehousing life cycle.

Experience implementing data warehouse projects, mainly extraction, transformation, and loading (ETL) processes using Informatica PowerCenter.

Extensive knowledge of Cognos and Informatica.

Expert in producing high-level and low-level designs from business requirements.

Expert in Cognos 10 Framework Manager: creating models in Framework Manager, and designing and developing complex reports in Report Studio according to requirements.

Knowledge of creating Informatica Data Quality (IDQ) rules in Informatica Developer.

Created complex SQL queries to validate reports.

Expertise in designing reports using Report Studio and Query Studio.

Working knowledge of Metric Studio, Analysis Studio, and Event Studio.

Created and published packages using Framework Manager, and generated various kinds of reports in Report Studio, including templates, lists, cross-tabs, different chart types, and conditional blocks.

Extensive experience in both development and production support projects.

Expert at analyzing data and providing ETL & Reporting solutions.

Finance and Insurance Domain Knowledge.

Good working knowledge of other ETL tools such as Ab Initio, DataStage, and SSIS.

Good understanding of data mart architecture, data modeling, and dimensional modeling using star and snowflake schemas.

Scheduled Informatica and Cognos jobs using third-party scheduling tools such as ESP, Zena, and Cronacle.

Created COBOL programs to extract data through Informatica.

Ability to quickly grasp technical aspects and willingness to learn.

Good interpersonal and communication skills.

Current & Previous Work experience summary

Currently working as Senior Developer at Nationwide Insurance, Columbus, from Jan ’16 to date

Worked as Senior Developer at Tech Mahindra, Hyderabad, from Jan ‘10 to Dec ‘15

Worked as Asst. Systems Engineer at TCS, Bangalore, from July ’07 to Jan ’10

Worked as Associate Projects at Covansys India Pvt Ltd., Bangalore, from July ’05 to July ’07

Data Warehousing Tools

BI Tools : Cognos 10.x and 8.x suites, including Framework Manager, Query Studio, Report Studio, Analysis Studio, Event Studio

ETL Tools : Informatica 9.x (Repository Manager, Designer, Workflow Manager, Workflow Monitor)

Database : Oracle, DB2, Microsoft SQL Server 2005 & 2008, Teradata

Scripting : NT scripting for automation

Quality Tools : Visual SourceSafe

Data Quality Tools : Informatica Data Quality

Project Experience Details

Duration : Sept 2016 to date

Client : Nationwide Insurance, USA

Project Name : Producer Information Management

Technologies : Informatica 9.x, DB2

Project summary

DPIM is a producer-centric corporate model that requires seamless service delivery across multiple products and producer touch points, and a whole new level of producer knowledge across the enterprise for sales and servicing.

DPIM provides a consolidated view of the producer and allows Nationwide to take advantage of enhanced selling capabilities, such as active marketing, cross-enterprise business models, and cross-sell analysis. To enable this view and the enhanced sales and servicing capabilities, DPIM will become the system of record for producer preference data (i.e., opt-out preferences, billing preferences, etc.) and household account data, and the system of reference for all other customer-related data. To meet the enterprise customer goals, certain existing systems need to be retired or migrated to DPIM. For example, customers from Nationwide Finance can log on to the Internet (e.g., to set preferences, for servicing across the enterprise).

Roles & Responsibilities:

As a Senior Developer, responsible for the following activities:

Preparing and reviewing all necessary project-related documents, including the model specification, ETL specification, test plan, test cases, and test results.

Involved in system documentation of data flow and methodology.

Created reusable mapplets and reusable transformations, and performed unit tests in Informatica PowerCenter.

Designed complex business rules in IDQ.

Developed complex IDQ rules usable both in Informatica and online.

Developed web services through IDQ for online interaction.

Created reusable mapplets using Informatica Developer.

Worked on performance tuning of Informatica designs and mappings.

Extensively used transformations such as router, lookup, source qualifier, joiner, expression, sorter, XML Parser, XML Generator, update strategy, union, aggregator, normalizer, and sequence generator.

Created various mappings to generate the XML files for the sourcing team.

Involved in deployment and ESP scheduling for production migration.

Supporting the Data Quality team on PDS, DBAI, and AMF production defects.

Duration : Jan 2016 to Aug 2016

Client : Nationwide Insurance, USA

Project Name : Denodo to ETL Conversion for Sourcing

Technologies : Informatica 9.x, DB2, and Denodo

Project summary

Nationwide is one of the largest insurance and financial services companies in the world, focusing on domestic property and casualty insurance, life insurance and retirement savings, asset management, and strategic investments.

Nationwide is building Distribution Partner Information Management (DPIM), consolidating its various producers’ information into one enterprise-level area using a data integration framework. Currently, PDS, DBAI, and AMF source system data is pulled through the Denodo data virtualization tool, but extraction is slow. To overcome these performance issues, Denodo is being replaced with Informatica to extract data from producer systems such as PDS, DBAI, and AMF using the GALAXY framework.

Roles & Responsibilities:

As a Senior Developer, responsible for the following activities:

Preparing and reviewing all necessary project-related documents, including the model specification, ETL specification, test plan, test cases, and test results.

Working with the team in an Agile model, sharing work by creating cards according to mapping complexity.

Created mapplets for code reusability.

Created common mappings and reused them across different source systems.

Designed the ETL and defined a suitable architecture for data integration from multiple, disparate data sources.

Extensively developed low-level designs (mapping documents) based on an understanding of the PDS, DBAI, and AMF source systems.

Extensively used transformations such as router, lookup, source qualifier, joiner, expression, sorter, XML Generator, update strategy, union, aggregator, normalizer, and sequence generator.

Created various mappings to generate the XML files for the sourcing team.

Involved in deployment and ESP scheduling for production migration.

Organization : Tech Mahindra, Hyderabad

Duration : Apr 2015 to Dec 2015

Client : Selective Insurance Ltd, USA

Project Name : BI Metrics

Technologies : Cognos 10.2 BI suite, MS SQL Server 2012, Informatica 9.x

Project summary

Selective Insurance Group, Inc. is primarily a holding company for seven customer-focused property and casualty (P&C) insurance companies rated “A+” (Superior) and ranked as the 48th largest property and casualty insurance group in the United States. The client maintains operational data in an IBM mainframe system. Operational data is extracted and processed into an enterprise data warehouse on SQL Server 2008, then copied into data marts such as CLAIMS, CAMIS, and KM REPORTING using the Informatica ETL tool. Reports are created using the Cognos 8.x tool.

Roles & Responsibilities:

As a Senior Developer, responsible for the following activities:

Preparing and reviewing all necessary project-related documents, including the model specification, report specification, test plan, test cases, and test results.

Conducting daily meetings with the client for task sharing and approvals.

Developed the relational model in Framework Manager per the specs.

Created dashboards for metrics reporting and business validation using Cognos Workspace, and created supporting reports using Report Studio.

Designed the ETL and defined a suitable architecture for data integration from multiple, disparate data sources.

Extensively developed low-level designs (mapping documents) based on an understanding of the different source systems.

Created complex SQL queries for validating ETL and report data.

Created various mappings to populate the data feeding the dashboard tables.
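The validation queries mentioned above typically reconcile row counts and amount totals between the ETL source and the reporting target. As an illustrative sketch only: the projects used SQL Server/DB2, and the table and column names below are hypothetical stand-ins, not the actual project schema; sqlite3 is used here just so the example is self-contained.

```python
# Hedged sketch of ETL-vs-report data reconciliation; schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical staging (source) and mart (target) tables with sample rows.
cur.executescript("""
CREATE TABLE stg_claims (claim_id INTEGER, paid_amount REAL);
CREATE TABLE mart_claims (claim_id INTEGER, paid_amount REAL);
INSERT INTO stg_claims VALUES (1, 100.0), (2, 250.5), (3, 75.25);
INSERT INTO mart_claims VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

# Reconciliation check: row-count and amount-total differences
# between source and target should both be zero.
cur.execute("""
SELECT (SELECT COUNT(*) FROM stg_claims)
     - (SELECT COUNT(*) FROM mart_claims),
       (SELECT SUM(paid_amount) FROM stg_claims)
     - (SELECT SUM(paid_amount) FROM mart_claims)
""")
count_diff, sum_diff = cur.fetchone()
print(count_diff, sum_diff)  # both 0 when source and target reconcile
```

A real validation suite would add per-key comparisons (e.g., a LEFT JOIN looking for unmatched keys), but the count/sum pattern above is the usual first pass.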

Organization : Tech Mahindra, Hyderabad

Duration : Nov 2013 to Mar 2015

Client : AJG-GB, USA

Project Name : RISX-FACS Claims

Technologies : Cognos 8.4 BI suite, Cognos 10.2 and MS SQL Server 2008

Project summary

Gallagher Bassett is the largest property/casualty third-party administrator, offering insights and services in claims management, information management, medical cost containment, and consultative services, including risk control consulting and appraisal services. Most of the data flows from a Tandem environment into the data warehousing database using MSBI (SSIS). A business model is built in Framework Manager and published to the content store; reports and cubes are developed from the FM packages. Various reports have been developed using Report Studio in subject areas such as Loss, Claims, Managed Care, Banking, Payment, and Liability, with a cube developed for each subject area.

Roles & Responsibilities:

As a Senior Developer, responsible for the following activities:

Preparing and reviewing all necessary project-related documents, including the model specification, report specification, test plan, test cases, and test results.

Coordinating with the onsite team and getting deliverables approved by the client.

Conducting daily meetings with the offshore and onsite teams for task sharing.

Created complex reports using Report Studio.

Used JavaScript in Report Studio report development.

Responsible for code migration during deployments.

Created SQL queries for validating ETL and report data.

Involved in job monitoring of the ETL process for daily and monthly loads.

Resolved report performance issues.

Involved in performance improvement of DTS packages.

Organization : Tech Mahindra, Hyderabad

Duration : Jan 2010 to Oct 2013

Client : Selective Insurance Ltd, USA

Project Name : SI- Knowledgement Management

Technologies : Cognos 10.2 BI suite, MS SQL Server 2008, Informatica 9.x

Project summary

Selective Insurance Group, Inc. is primarily a holding company for seven customer-focused property and casualty (P&C) insurance companies rated “A+” (Superior) and ranked as the 48th largest property and casualty insurance group in the United States. The client maintains operational data in an IBM mainframe system. Operational data is extracted and processed into an enterprise data warehouse on SQL Server 2008, then copied into data marts such as CLAIMS, CAMIS, and KM REPORTING using the Informatica ETL tool. Reports are created using the Cognos 8.x tool.

Roles & Responsibilities:

As a Senior Developer, responsible for the following activities:

Preparing and reviewing all necessary project-related documents, including the model specification, report specification, test plan, test cases, and test results.

Coordinating with the onsite team and getting deliverables approved by the client.

Conducting daily meetings with the offshore and onsite teams for task sharing.

Developed the relational model in Framework Manager per the specs.

Created the dashboards and applied global filters in Cognos Connection.

Created complex reports using Report Studio.

Created IQDs using Framework Manager in order to build the Transformer models for cubes.

Monitoring the processes before loading data into the summary tables, and providing SLA-driven project support.

Created various Transformer models and published the cubes to the content store.

Redesigned ETL flows and reports due to performance issues.

Extensively used transformations such as router, lookup, source qualifier, joiner, expression, sorter, XML, update strategy, union, aggregator, normalizer, and sequence generator.

Created reusable mapplets and reusable transformations.

Created SQL queries for validating ETL and report data.

Worked on the Informatica upgrade from 8.6 to 9.1.

Created various mappings to populate the data into the star schema.
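Mappings that populate a star schema generally follow a lookup-and-load pattern: resolve each natural key to a dimension surrogate key, then load fact rows that reference the dimension. The Python sketch below is only an illustration of that logic under assumed names (dim_customer, fact_sales, etc.), not the Informatica objects from the project.

```python
# Hypothetical sketch of surrogate-key assignment for a star-schema load.
dim_customer = {}   # natural key -> surrogate key (the dimension table)
next_sk = 1         # next surrogate key to hand out

def lookup_or_insert(natural_key):
    """Return the surrogate key, inserting a new dimension row if unseen."""
    global next_sk
    if natural_key not in dim_customer:
        dim_customer[natural_key] = next_sk
        next_sk += 1
    return dim_customer[natural_key]

# Illustrative source rows: (customer natural key, sale amount).
source_rows = [("CUST-A", 120.0), ("CUST-B", 80.0), ("CUST-A", 40.0)]

# Fact rows reference the dimension by surrogate key, not natural key.
fact_sales = [(lookup_or_insert(cust), amount) for cust, amount in source_rows]
print(fact_sales)  # [(1, 120.0), (2, 80.0), (1, 40.0)]
```

In Informatica terms this corresponds roughly to a lookup transformation on the dimension plus a sequence generator for new surrogate keys; the code is just the pattern in miniature.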

Organization : TCS, Bangalore

Duration : July 2008 to Jan 2010

Client : GE-Health Care.

Project Name : Quantum Leap

Technologies : Cognos ReportNet 1.1, Cognos 8.3, Informatica 8.1.1, Cronacle, Teradata, Windows NT

Project Summary:

The project maintains the enterprise data warehouse of GEHC, with modules covering purchase orders, invoice details, and shipping details of its healthcare products. The application receives data from more than 90 GE supplier origins worldwide and processes it using Cronacle, which runs the workflows and job chains on schedule. Users run the reports, compare them with previous runs, and calculate the MPI of the material for the previous and current year.

Roles & Responsibilities:

Working on system problem reports raised by users

Understanding the existing model and client requirements

Generating Report Studio and Query Studio reports in Cognos ReportNet and Cognos 8.3

Creating the Framework model and creating packages per user requirements

Working on user requirements in order to rectify their problems

Backtracking the flow, analyzing the logic, and resolving issues per user requirements

Involved in testing and validating the upgrade of the Framework model from ReportNet to Cognos 8.3

Involved in testing and validating the upgrade of Cognos ReportNet reports to Cognos 8.3 reports

Tuning report queries as suggested by DBAs to improve performance by changing the reporting application views

Monitoring the jobs and workflows

Verifying the log files while running the jobs

Organization : TCS, Bangalore

Duration : Aug 2007 to Jun 2008

Client : GE-Health Care.

Project Name : Supplier Quality Data Store

Technologies : Cognos ReportNet 1.1, Informatica 8.1.1, Cronacle, Teradata, Windows NT

Project Summary:

The project maintains the enterprise data warehouse of GEHC, with modules covering supplier and invoice details of healthcare products. The application receives data from different GE supplier origins worldwide and processes it using Cronacle, which runs the workflows and job chains on daily, weekly, and monthly schedules. Users run the reports for analysis, mainly of the products delivered by suppliers.

Roles & Responsibilities:

As a Developer, handled the following responsibilities:

Understanding the existing business model and client requirements

Generating reports in Cognos ReportNet

Working with the Cognos ReportNet 1.1 MR2 version

Creating Report Studio and Query Studio reports

Monitoring the processes before loading data into the dimension and fact tables

Monitoring the jobs and workflows

Verifying the log files while running the processes.

Organization : Covansys India Pvt Ltd, Bangalore

Duration : Feb 2007 to July 2007

Client : Inovant LLC A VISA Solutions Company.

Project Name : CCDRI (Commercial Card Data Repository Integration)

Technologies : Cognos ReportNet 1.1, Ab Initio 1.13, DB2, Windows NT

Project Summary:

The project builds a data warehouse of credit card customer details for VISA customers outside the U.S.A. CCDRI mainly maintains the commercial credit card details of VISA customers. Through CCDRI analysis, the data is provided to different applications (QDW, QDS) to benefit customers as well as clients. The objective of the data mart is to capture information about the customers/clients of the CCDRI system; the data mart serves the reporting needs of different VISA-related applications.

Roles & Responsibilities:

Understanding the existing business model and client requirements

Generating reports in Cognos ReportNet

Working with the Cognos ReportNet 1.1 MR2 version

Monitoring the processes before loading data into the STAGE and ODS tables

Verifying the log files while running the processes

Organization : Covansys India Pvt Ltd, Bangalore

Duration : July 2005 to Jan 2007

Client : State of Ohio, USA

Project Name : eTABS

Technologies : Cognos ReportNet 1.1, Cognos 8, Informatica 7.1.3, DB2, Windows NT

Project Summary:

The project builds a data warehouse that analyzes the taxation and benefits of employers and employees in an integrated manner for the Ohio state government. The objective of the data mart is to capture information about employers’ and employees’ unemployment taxation and insurance; the data mart serves the reporting needs of the Ohio state government.

Roles & Responsibilities:

Understanding the existing business model and client requirements

Generated reports in Cognos ReportNet and Cognos 8

Writing test scenarios, unit testing, and peer review at the developer level

Working with the Cognos ReportNet 1.1 MR2 version

Developed various reports and customized them as per requirements

As report author, generated new reports (Report Studio, Query Studio) based on client requirements using ReportNet 1.1

Developed complex functionality such as conditional formatting and conditional display

Creating views from facts and dimensions

Working with set operators in Report Studio

Created Informatica mappings and workflows.

Involved in writing new programs to extract source data and transform it per the mapping provided by the BI team.

Educational Qualification

Degree : B.Tech

University : Kakatiya University, AP

Year : 2001


