
Data Manager

Location:
Grafton, WI
Posted:
March 11, 2019


Resume:

Full Name: Raghu Ram Chittibommala

Cell: 331-***-****

Email: ************@*****.***

Summary

More than 12 years of experience in application and product development across the full SDLC, primarily using Informatica PowerCenter, Oracle, SQL Server, Business Objects and Crystal Reports. Expert in data analysis and well-equipped to provide software solutions. Delivers quality applications; an excellent team player with strong technical and communication skills.

Education

B.Com, LLB (India)

Certifications & Training

Trained in Informatica PowerCenter (9.5) Developer

HR Management, Michigan State University, USA

Training in Tableau 8.3

Technical Skills:

Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS

ETL Tools: Informatica PowerCenter 9.1/8.6/8.5/8.1/7.1 (Designer, Workflow Manager, Workflow Monitor)

Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, DB2 v8.1, Teradata

Data Modeling Tools: Erwin, MS Visio

OLAP Tools: Cognos 8.0/8.1/8.2/8.4, Business Objects XI R2/6.x/5.x

Languages: SQL, PL/SQL, UNIX shell scripts, VBScript

Scheduling Tools: Autosys, Control-M

Professional Experience

Client: Computech Corp. Sept 2015 - Dec 2018

Role: Sr. ETL/Informatica Developer

Description: This project involved developing a data warehouse from different data feeds and other operational data sources. Built a central database where data comes from different sources such as Oracle, SQL Server and flat files. Actively involved as an analyst in preparing design documents and interacted with the data modelers to understand the data model and design the ETL logic.

Responsibilities:

Responsible for Business Analysis and Requirements Collection.

Worked on migrating Informatica environments from a physical Linux environment to virtual machines with HA in Informatica services. This involved installation and configuration of Informatica IDQ and PowerCenter services.

Worked on Informatica PowerCenter and Informatica PowerExchange/PowerConnect in all phases of design, development, implementation, support, migration and upgrade of data warehousing applications, using Informatica PowerCenter 10.x, 9.x and 8.x, SQL, PL/SQL, Oracle and UNIX.

Full life-cycle project delivery experience from product definition to implementation including: requirements and specification writing, design documentation, unit and system testing, optimization and performance analysis, quality assurance, and release into a production environment.

Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Parsed high-level design specification to simple ETL coding and mapping standards.

Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.

Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.

Created mapping documents to outline data flow from sources to targets.

Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.

Extracted the data from flat files and other RDBMS databases into the staging area and populated it into the data warehouse.

Maintained stored definitions, transformation rules and target definitions using Informatica Repository Manager.

Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.

Developed mapping parameters and variables to support SQL overrides.

Created mapplets for reuse across different mappings.

Developed mappings to load into staging tables and then to Dimensions and Facts.

Used existing ETL standards to develop these mappings.

Worked on different tasks in workflows such as Session, Event Raise, Event Wait, Decision, E-mail, Command, Worklet, Assignment and Timer tasks, as well as scheduling of the workflow.

Created sessions and configured workflows to extract data from various sources, transform the data, and load it into the data warehouse.

Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
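For illustration, the Type 2 pattern expires the current dimension row and inserts a new version; a minimal SQL sketch follows, assuming a hypothetical customer dimension and staging table (the actual mappings implemented this with Lookup and Update Strategy transformations rather than hand-written SQL).

-- Sketch of Type 2 SCD logic; all table and column names are hypothetical.
-- Step 1: close out the current version of rows whose tracked attributes changed.
UPDATE dim_customer d
SET    d.current_flag  = 'N',
       d.effective_end = TRUNC(SYSDATE) - 1
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    (s.customer_name <> d.customer_name
                       OR s.customer_segment <> d.customer_segment));

-- Step 2: insert a new current version for changed and brand-new customers.
INSERT INTO dim_customer
       (customer_key, customer_id, customer_name, customer_segment,
        effective_start, effective_end, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.customer_segment,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');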

Extensively used SQL*Loader to load data from flat files into database tables in Oracle.

Modified existing mappings for enhancements of new business requirements.

Used Debugger to test the mappings and fixed the bugs.

Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and for backup of the repository and folders.

Involved in Performance tuning at source, target, mappings, sessions, and system levels.

Prepared migration document to move the mappings from development to testing and then to production repositories.

Environment: Informatica PowerCenter 8.6.1, Workflow Manager, Workflow Monitor, Informatica PowerConnect/PowerExchange, Data Analyzer 8.1, PL/SQL, Oracle 10g/9i, Erwin, Autosys, SQL Server 2005, Sybase, UNIX AIX, Toad 9.0, Cognos 8.

Client: XL Global Services Inc, Stamford, CT Nov 2013 to Aug 2015

Role: ETL Consultant

Description: XL Global Services Inc. provides the backbone information technology support to the XL Capital group of companies, a leading provider of insurance and reinsurance coverage, innovative risk management and financial solutions. As part of providing financial solutions, XL Global Services Inc. generates various reports presenting a comprehensive credit and risk analysis for its customers. The project was designed to develop and maintain data marts, uploading data from various centers, held in different systems, using ETL tools.

Responsibilities:

Logical and physical data modeling for the DW database was done in Erwin using a star schema.

Using Informatica PowerCenter Designer, analyzed the source data to extract and transform it from various source systems (Oracle 10g, DB2, SQL Server and flat files), incorporating business rules using the different objects and functions that the tool supports.

Using Informatica PowerCenter created mappings and mapplets to transform the data according to the business rules.

Used various transformations like Source Qualifier, Joiner, Lookup, SQL, Router, Filter, Expression and Update Strategy.

Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement.

Developed stored procedures and used them in Stored Procedure transformations for data processing; also used data migration tools.
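As a minimal sketch of the kind of procedure wired into a Stored Procedure transformation (the procedure name, table and thresholds below are hypothetical, not taken from the project):

-- Hypothetical PL/SQL procedure of the sort called from an Informatica Stored Procedure transformation.
CREATE OR REPLACE PROCEDURE get_policy_risk_band (
    p_policy_id IN  NUMBER,
    p_risk_band OUT VARCHAR2
) AS
    v_exposure NUMBER;
BEGIN
    -- Total exposure for the policy; NVL guards against policies with no exposure rows.
    SELECT NVL(SUM(exposure_amount), 0)
    INTO   v_exposure
    FROM   policy_exposure
    WHERE  policy_id = p_policy_id;

    p_risk_band := CASE
                       WHEN v_exposure >= 1000000 THEN 'HIGH'
                       WHEN v_exposure >= 100000  THEN 'MEDIUM'
                       ELSE 'LOW'
                   END;
END get_policy_risk_band;
/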

Documented Informatica mappings in Excel spreadsheets.

Tuned the Informatica mappings for optimal load performance.

Used the BTEQ, FEXP (FastExport), FLOAD (FastLoad) and MLOAD (MultiLoad) Teradata utilities to export and load data to/from flat files.

Created and Configured Workflows and Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.

Generated reports using OBIEE 10.1.3 for future business use.

Carried primary responsibility for problem determination and resolution for each SAP application system's database server and application server.

Worked along with UNIX team for writing UNIX shell scripts to customize the server scheduling jobs.

Constantly interacted with business users to discuss requirements.

Environment: Informatica PowerCenter Designer 8.6/8.1, Informatica Repository Manager, Oracle 10g/9i, DB2 6.1, Erwin, TOAD, SAP 3.1.H, UNIX (SunOS), PL/SQL, SQL Developer

Client: EQT Corporation, Pittsburgh, PA Aug 2010 – Oct 2013

Role: Sr. ETL/Informatica Developer

Description: EQT Corporation is an integrated energy company, supplying natural gas, crude oil, and gas-related services to its customers. The main objective of the project was to help the organization's decision-making team monitor and improve sales and explore avenues for new business opportunities. The DW team is responsible for building the Global Data Warehouse and providing reports for the Production and Midstream groups. Worked on four capital projects: EPC BI, EQUITRANS BI, EGC BI and ETRM BI. The data was extracted from flat files, Oracle, SQL Server and DB2 into the Operational Data Source (ODS); the data from the ODS was then extracted, transformed with business logic, and loaded into the Global Data Warehouse using Informatica PowerCenter 9.1.0 tools.

Responsibilities:

Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.

Designed ETL specification documents for all the projects.

Created tables, keys (unique and primary) and indexes in SQL Server.
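For example (illustrative SQL Server DDL only; the real table, key and index names are not given in this resume):

-- Hypothetical dimension table with a primary key, a unique business key and a supporting index.
CREATE TABLE dbo.DimWell (
    WellKey   INT IDENTITY(1,1) NOT NULL,
    WellId    VARCHAR(20)       NOT NULL,
    WellName  VARCHAR(100)      NULL,
    BasinName VARCHAR(50)       NULL,
    CONSTRAINT PK_DimWell PRIMARY KEY CLUSTERED (WellKey),
    CONSTRAINT UQ_DimWell_WellId UNIQUE (WellId)
);

-- Non-clustered index to support frequent filtering by basin.
CREATE NONCLUSTERED INDEX IX_DimWell_BasinName ON dbo.DimWell (BasinName);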

Extracted data from flat files, DB2, SQL Server and Oracle to build the Operational Data Source. Applied business logic to load the data into the Global Data Warehouse.

Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.

Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.

Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.

Extensively used the Add Currently Processed Flat File Name port to load the flat file name and to load contract number coming from flat file name into Target.

Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.
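A typical Pre-SQL/Post-SQL pairing on a target, sketched with hypothetical object names (Oracle syntax; the actual statements used on this project are not given in the resume):

-- Target Pre-SQL: empty the staging table before the session loads it.
TRUNCATE TABLE stg_meter_readings;

-- Target Post-SQL: record that the load finished (audit table and columns are hypothetical).
INSERT INTO etl_load_audit (table_name, load_finished_at)
VALUES ('STG_METER_READINGS', SYSDATE);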

Worked on different tasks in Workflow Manager like Sessions, Events raise, Event wait, Decision, E-mail, Command, Worklets, Assignment, Timer and Scheduling of the workflow.

Extensively used workflow variables, mapping parameters and mapping variables.

Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
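A common way to drive such incremental loads is a mapping-variable high-water mark referenced in the Source Qualifier SQL override; a sketch with hypothetical names follows.

-- Illustrative incremental-extract filter; table, column and variable names are hypothetical.
-- $$LastExtractDate is a mapping variable persisted in the repository between runs
-- (typically advanced in the mapping, e.g. with SETMAXVARIABLE).
SELECT t.transaction_id,
       t.account_id,
       t.transaction_amount,
       t.last_update_ts
FROM   src_transactions t
WHERE  t.last_update_ts > TO_TIMESTAMP('$$LastExtractDate', 'YYYY-MM-DD HH24:MI:SS')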

Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.

Implemented Informatica recommendations, methodologies and best practices.

Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.

Involved in Unit, Integration, System, and Performance testing levels.

Written documentation to describe program development, logic, coding, testing, changes and corrections.

Migrated the code into QA (Testing) and supported QA team and UAT (User).

Created detailed Unit Test Document with all possible Test cases/Scripts.

Conducted code reviews developed by my team mates before moving the code into QA.

Provided support to develop the entire warehouse architecture and plan the ETL process.

Modified existing mappings for enhancements of new business requirements.

Prepared migration document to move the mappings from development to testing and then to production repositories.

Involved in production support.

Worked as a fully contributing team member under broad guidance, with independent planning and execution responsibilities.

Environment: Informatica PowerCenter 9.1.0, Oracle 11g, SQL Server 2008, IBM iSeries (DB2), MS Access, Windows XP, Toad, Tidal, Cognos 8.4.1, SQL Developer.

Client: NYK Line North America Inc, Secaucus, NJ October 2008 - July 2010

Role: Sr ETL Developer

Description: NYK Line is one of the world's premier full-service intermodal carriers. The company utilizes a vast network of ocean vessels, barges, railroads and motor carriers to link international shippers with consignees. Services offered include intermodal services, terminals and warehousing, insurance, and repair and maintenance.

Modules: Shipment Data Mart, Job order Cost Mart, Net Contribution Mart, DnD Mart (Detention & Demurrage)

Responsibilities:

Involved in the analysis of the user requirements and identifying the sources.

Created technical specification documents based on the requirements by using S2T Documents.

Involved in the preparation of High level design documents and Low level design documents.

Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, ODS and Mart.

Prepared ETL standards, Naming conventions and wrote ETL flow documentation for Stage, ODS and Mart.

Followed the Ralph Kimball approach (a bottom-up data warehouse methodology in which individual data marts such as the Shipment Data Mart, Job Order Cost Mart, Net Contribution Mart and Detention & Demurrage Mart provide the views into organizational data and are later combined into the Management Information System (MIS)).

Prepared a Level 2 update plan to assign work to team members and track the status of each task.

Administered the repository by creating folders and logins for the group members and assigning necessary privileges.

Designed and developed Informatica mappings and sessions based on business user requirements and business rules to load data from source flat files and Oracle tables into target tables.

Worked on various kinds of transformations like Expression, Aggregator, Stored Procedure, Lookup, Filter, Joiner, Rank, Router and Update Strategy.

Developed reusable Mapplets and Transformations.

Used debugger to debug mappings to gain troubleshooting information about data and error conditions.

Involved in monitoring the workflows and in optimizing the load times.

Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
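The resume does not say whether this was log-based (PowerExchange) or compare/timestamp-based change capture; purely as an illustration of the compare-based variant, with hypothetical shipping-related names:

-- Illustrative compare-based change detection between a staging extract and the ODS.
SELECT s.booking_id,
       s.container_no,
       s.charge_amount
FROM   stg_dnd_charges s
LEFT JOIN ods_dnd_charges o
       ON o.booking_id = s.booking_id
WHERE  o.booking_id IS NULL                                    -- new rows
   OR  o.charge_amount <> s.charge_amount                      -- changed amounts
   OR  NVL(o.container_no, '~') <> NVL(s.container_no, '~');   -- changed container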

Involved in writing procedures, functions in PL/SQL.

Developed mappings in Informatica using BAPI and ABAP function calls in SAP.

Used Remote Function Call (RFC) as the SAP interface for communication between systems.

Implemented RFCs for the caller and the called function modules running in the same system.

Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.

Worked with the SQL*Loader tool to bulk-load data into the database.

Prepared UNIX shell scripts and scheduled them in Autosys for automatic execution at specific times.

Used Rational ClearCase for version control of all files and folders (check-out, check-in).

Prepared test Scenarios and Test cases in HP Quality Center and involved in unit testing of mappings, system testing and user acceptance testing.

Defect tracking and reporting were done with Rational ClearQuest.

Environment: Informatica PowerCenter 8.6/8.1, SQL*Loader, IDOC, RFC, HP Quality Center, Oracle 9i/10g, Autosys, Rational ClearCase, Rational ClearQuest, Windows XP, TOAD, UNIX.

Client: Confidential, Chicago June 2006 to Sept 2008

Role: Sr. ETL Developer

Description: This project's responsibility was to develop an Enterprise Data Warehouse (EDW) that completely integrates the business into a single environment. The data warehouse provides easy access to detailed data on a single platform and facilitates enterprise-wide data analysis and reporting within the business environment. It was built using Informatica PowerCenter 8.6.1, extracting data from various sources including flat files, SAP ABAP, Teradata and Oracle.

Responsibilities:

Analyzed the requirements and framed the business logic for the ETL process.

Extracted data from Oracle as one of the source databases.

Involved in JAD sessions for the requirements gathering and understanding.

Involved in the ETL design and its documentation.

Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system using ER-STUDIO.

Followed Star Schema to design dimension and fact tables.

Experienced in handling slowly changing dimensions.

Collected and linked metadata from diverse sources, including relational databases (Oracle), XML and flat files.

Responsible for the development, implementation and support of the databases.

Extensive experience with PL/SQL in designing, developing functions, procedures, triggers and packages.

Developed mappings in Informatica to load the data including facts and dimensions from various sources into the Data Warehouse, using different transformations like Source Qualifier, JAVA, Expression, Lookup, Aggregate, Update Strategy and Joiner.
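The surrogate-key resolution that the Lookup transformations perform during a fact load can be sketched in SQL as below; all names are hypothetical, and the actual loads were Informatica mappings rather than hand-written SQL.

-- Sketch of a fact load that resolves dimension surrogate keys; names are hypothetical.
INSERT INTO fact_sales (date_key, product_key, customer_key, quantity, sales_amount)
SELECT d.date_key,
       p.product_key,
       c.customer_key,
       s.quantity,
       s.sales_amount
FROM   stg_sales s
JOIN   dim_date     d ON d.calendar_date = s.sale_date
JOIN   dim_product  p ON p.product_id    = s.product_id
JOIN   dim_customer c ON c.customer_id   = s.customer_id
                     AND c.current_flag  = 'Y';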

Developed reusable Mapplets and Transformations.

Used a data integrator tool to support batch and real-time integration, working on the staging and integration layers.

Optimized the performance of the mappings through various tests on sources, targets and transformations.

Designed and developed Informatica mappings and workflows; identified and removed bottlenecks to improve the performance of mappings and workflows.

Reviewed existing code and led efforts to tweak and tune the performance of existing Informatica processes.

Scheduled the sessions to extract, transform and load data into the warehouse database per business requirements.

Scheduled the tasks using Autosys.

Loaded the flat files data using Informatica to the staging area.

Created shell scripts for generic use.

Created high level design documents, technical specifications, coding, unit testing and resolved the defects using Quality Center 10.

Developed unit/assembly test cases and UNIX shell scripts to run along with daily/weekly/monthly batches to reduce or eliminate manual testing effort.

Environment: Windows XP/NT, Informatica PowerCenter 9.1/8.6, UNIX, Teradata 14, Oracle 11g, Oracle Data Integrator, SQL, PL/SQL, SQL Developer, Erwin, Oracle Designer, MS Visio, Autosys, Korn Shell, Quality Center 10.

Client: T-Mobile, Seattle WA Aug 2003 to Jan 2006

Role: ETL DEVELOPER

Description: T-Mobile is one of the largest telecom companies in the USA. Joined the existing onshore BI team as an ETL Developer and successfully designed and developed business solutions. The project aimed to fulfill T-Mobile's need for reporting to better understand market trends, behavior and future opportunities, and to improve their decision-making process. Coordinated with the business and P&A teams to understand the system requirements, then analyzed and designed ETL solutions to accomplish them. Involved in various successful releases to meet T-Mobile's reporting needs under the order-activation (OA) functional area.

Responsibilities:

Gathered business requirements from Business Analyst.

Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.

Designed and developed Informatica ETL mappings to extract master and transactional data from heterogeneous data feeds and load it into the data warehouse.

Installed and Configured the Informatica Client tools.

Worked on loading of data from several flat files to XML Targets.

Designed the procedures for getting the data from all systems to Data Warehousing system.

Created the environment for Staging area, loading the Staging area with data from multiple sources.

Analyzed business process workflows and assisted in the development of ETL procedures for moving data from source to target systems.

Used workflow manager for session management, database connection management and scheduling of jobs.

Created UNIX shell scripts for Informatica ETL tool to automate sessions.

Monitored scheduled, running, completed and failed sessions using the Workflow Monitor, and debugged mappings for failed sessions.

Environment: Informatica PowerCenter 5.1.2/7.1, Erwin 4.5, Oracle 9i, Windows NT, flat files, SQL, Relational Tools, ClearCase, UNIX (HP-UX, Sun Solaris, AIX) and UNIX shell scripts.
