
VENKATA N BONAM

732-***-****

acqnc9@r.postjobfree.com

SUMMARY

Over 13 years of IT experience in business requirements analysis, application design, data modeling, coding, development, testing, and implementation of business applications in RDBMS, Data Warehouse/Data Mart, ETL, OLAP, and Client/Server environments.

Strong experience in Data Warehousing and ETL using Informatica Power Center 9.6.1/9.5/9.1/8.1.1, OLAP, OLTP, SQL, PL/SQL, Complex Stored Procedures, and Triggers.

Strong experience in Informatica Power Exchange to extract data from applications such as SAP and mainframes, and in Informatica Metadata Manager

Extracted data from SAP R/3 using ABAP and SAP data sources with BCI (Business Content Integration), used RFC/BAPI to load data into SAP systems, and extracted data from SAP BI using OHS (Open Hub Services) and Open Hub Destinations (OHD)

Strong knowledge of InfoCubes and ODS objects for the SD, MM, CRM, HR, SEM, FI/CO, and GL modules.

Integrated various data sources/targets with multiple relational databases such as Oracle, Teradata, and DB2, as well as flat files

Experience working on end-to-end data warehouse implementations, with a strong understanding of data warehouse concepts, ETL, star schemas, and data modeling.

Experience in debugging and performance tuning of targets, sources, mappings and sessions.

Strong skills in back-end testing using SQL queries on SQL Server, Oracle, and MS Access

Experience in creating BTEQ scripts and good knowledge of Teradata utilities such as FastLoad, MultiLoad, and Teradata Parallel Transporter to load data from various source systems into Teradata.

Experience working with Business Intelligence tools such as Business Objects 6.0/5.1/5.0, MS OLAP, and Cognos Suite 7.1 (Impromptu, PowerPlay, ReportNet), and in the design and development of mappings and scheduling of batches for ETL projects using Informatica as the ETL tool.

Configured Database and ODBC connectivity to various source/target databases.

Experience in the Software Development Life Cycle (SDLC), ISO 9001:2000, SEPG methodologies, CMM Level 5, and validations to ensure complete quality assurance and control

Ability to work in Multi-Platform Environments like Windows and UNIX.

TECHNICAL SKILLS

Data Warehousing Tools: Informatica Power Center 9.5/9.1.1/8.1.1, Informatica Power Exchange, Cognos & Business Objects, MicroStrategy, Erwin 4.5, Trillium

Databases: Oracle 10g/9i/8.0, Teradata V2R6, MS SQL Server 2006/2007, DB2 UDB, MS Access

Operating Systems/Network: Windows 98/2000/XP/2003 and UNIX (AIX, Sun Solaris, HP-UX, RS 6000)

Technologies: JavaScript, UNIX shell scripting, XML, Perl and Korn scripting, C#, .Net, ASP.Net, C, and C++.

Testing Tools: QTP, Win Runner, Manual Testing, Test Director, Quality Center.

Scheduling tools: Autosys, Maestro, Tidal

ERP: SAP ECC 6, SAP R/3 4.6 C, BW 3.0B, Salesforce.com

EDUCATION

MADRAS UNIVERSITY

Completed Master’s in Computer Applications (MCA)

ANDHRA UNIVERSITY

Completed Bachelor of Science (B.Sc.)

EXPERIENCE

eBay Inc. / PayPal San Jose, CA

Technical Lead May 2015 - Current

Coordinate with business users to collect requirements and create functional and technical design documents

Create mappings per the mapping design documents to load data from the SAP HANA staging layer into the SAP HANA business layer; review code developed by the offshore Deloitte team to confirm it follows standards

Deployed all code developed on eBay servers to PayPal servers.

Environment: Informatica 9.6.1, Power Exchange, SAP ECC 6, SAP HANA, Windows XP, Business Objects, Control-M

Idaho Power Boise, ID

Informatica Lead Aug 2014-Apr 2015

Coordinated with data analysts and business users to create source-to-target mapping documents from SAP sources such as ABAP tables, SAP data sources, SAP functions, and SAP BW (OHD)

Created Informatica mappings using ABAP-type extraction to load data from SAP tables, and generated BCI mappings to load data from SAP Business Content into the staging area

Created SCD Type 2 mappings to load data from the staging area to the foundation area using various transformations

Created reusable code to generate MD5 hashes to identify changes coming from the source (a SQL sketch of this pattern follows this section)

Created technical designs, unit test scripts, and test documents, and migrated code to higher environments

Participated in daily standups to track project and issue status.

Worked with SAP ADMIN to identify authorization issues for BCI loads.

Environment: Informatica 9.5, Power Exchange, Informatica metadata manager, SAP ECC 6, SQL Server 2012, PL/SQL, Windows XP, Business Objects, UC4
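
The MD5-based change detection noted above was implemented in Informatica (typically the MD5() function in an Expression transformation feeding the SCD Type 2 logic). As a rough, illustrative sketch only, the same pattern can be expressed in SQL; the statement below assumes hypothetical STG_CUSTOMER and DIM_CUSTOMER tables on SQL Server and is not the actual project code.

    -- Illustrative only: detect changed rows by comparing an MD5 hash of the
    -- tracked columns, then expire the current dimension row (SCD Type 2).
    -- STG_CUSTOMER / DIM_CUSTOMER and their columns are hypothetical names.
    WITH stg AS (
        SELECT
            CUST_ID,
            CUST_NAME,
            CUST_CITY,
            -- hash of all tracked attributes; '|' separators keep values distinct
            HASHBYTES('MD5', CONCAT(CUST_NAME, '|', CUST_CITY)) AS ROW_MD5
        FROM STG_CUSTOMER
    )
    -- Close out current dimension rows whose stored hash no longer matches staging
    UPDATE d
    SET    d.CURR_FLAG  = 'N',
           d.EFF_END_DT = CAST(GETDATE() AS DATE)
    FROM   DIM_CUSTOMER d
    JOIN   stg s
           ON  s.CUST_ID   = d.CUST_ID
           AND d.CURR_FLAG = 'Y'
           AND s.ROW_MD5  <> d.ROW_MD5;
    -- A follow-up INSERT then adds the new versions (and brand-new customers)
    -- with CURR_FLAG = 'Y' and an open-ended EFF_END_DT.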

Sony Electronics Limited San Diego, CA

Tech Lead/Architect May 2013- July 2014

Responsibilities:

Worked with upstream teams to identify the list of fields that were logically/structurally impacted or required conversion

Created a ripple query to identify the impacted fields across all ETL/DB objects and compared the results with manual analysis to identify gaps

Created new ETL interfaces to load data from retiring applications such as mainframe and Oracle applications into the new SAP (DRP) system

Created new BAPI/IDoc interfaces to extract data from SAP BAPIs and IDocs.

Worked closely with data modelers to create standards and guidelines for data modeling and ETL processes

Reviewed and approved functional specifications/technical designs for all impacted and new ETLs, and created UAT/SIT documents for the same after the required changes were complete.

Involved in designing the CDW (various levels of staging) from the ultimate source to the target, and created standard guidelines for Informatica mappings

Designed and developed mappings to load data from Salesforce.com using Informatica Power Exchange for Salesforce

Extensively used the Informatica Metadata Manager tool to identify data flow using data lineage, and created Metadata Manager services.

Extensively used Teradata utilities and configured Informatica sessions for Teradata parallel processing.

Configured sessions for Teradata parallel processing API targets and enabled session recovery for Teradata PT using stream mode

Extensively used Informatica Power Exchange for CDC.

Used data lineages from metadata manager for auditing the flow of data from source to target.

Extensively used Informatica metadata manager and business glossary to identify the impact analysis.

Involved in loading Power Center resources and relational DB schemas into the metadata repository

Created Salesforce lookup/merge transformations to look up source data and delete duplicates.

Communicated the impact analysis to downstream systems so the changes could be handled on the reporting end

Extensively worked with offshore teams and evaluated timely code changes made as part of the conversion.

Developed and reviewed Informatica code written by team members and approved it per the standard guidelines document

Fine-tuned ETL mappings with bottlenecks to improve performance.

Held daily meetings with team members to resolve open issues/tickets and showstoppers to meet project plan deadlines.

Worked with the Informatica admin team in upgrading the Informatica repository from 8.6 to 9.5 and tested objects after the upgrade.

Environment: Informatica 8.6/9.5, Power Exchange, Informatica metadata manager, Teradata 13, SAP ECC 6, Oracle 11g, PL/SQL, SQL Server, Windows XP, Business Objects, Control-M

Devon Energy Corporation Houston, TX

Contractor August 2012- April 2013

Responsibilities:

Created technical designs and defined mapping and transformation rules per the functional documents.

Extensively used Informatica mappings and workflows. Designed and created complex mappings using various transformations such as Lookup, Update Strategy, Expression, Aggregator, Normalizer, Union, SAP Functions, ABAP, IDoc Interpreter, BAPI, and SAP Application Source Qualifier

Developed inbound/outbound mappings using IDocs and BAPIs, and set up Business Content mappings (Listener, Send Request, Processing, and Cleanup) to load the EDW for the Business Objects reporting system.

Captured CDC from Salesforce objects based on the created date and last modified date fields.

Configured sessions and workflows to process Salesforce data.

Implemented an SCD Type 2 design for the CDW.

Designed and developed the Landing and ODS framework and ETL using complex Oracle SQL queries and Informatica mappings with transformations such as Filter, Router, Lookup, and Expression.

Provided post-go-live support and troubleshooting of production jobs.

Tuned Informatica, database, and reporting performance. Steps included indexes, Informatica and database partitioning, Explain Plan, effective query rewriting, and SQL hints.

Created mappings to extract data from SAP APO/BI cubes and DSOs using OHS (Open Hub Services) and coordinated with the SAP BW team in configuring the process chains.

Coordinated with the SAP BW team in clearing logs and rerunning failed SAP process chains.

Created basic BTEQ (Basic Teradata Query) scripts to generate keys (illustrated in the sketch after this section).

Created a business glossary and performed data lineage analysis on how the metadata for each business term is derived.

Extensively used match and merge techniques to produce golden records using the Informatica MDM Hub, and was involved in ORS creation.

Extensively involved in unit/integration testing in the sandbox/development environment. Participated in migration to the QA and production environments, troubleshooting of data loads/data validation, go-live, and support

Environment: Informatica 9.1.1, Power Exchange, SAP ECC 6, Teradata 13, Oracle 10g, PL/SQL, SQL Server, UNIX scripting, Windows XP, Business Objects, Salesforce.com, Tidal, Erwin.
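
The key-generation BTEQ scripts above were project-specific; as an illustration only, a surrogate key can be assigned in Teradata SQL by offsetting ROW_NUMBER() with the current maximum key. The table and column names below (STG_ACCOUNT, DIM_ACCOUNT, ACCOUNT_KEY) are hypothetical, and in practice the statement would run inside a BTEQ script between .LOGON and .QUIT commands.

    /* Illustrative Teradata SQL for surrogate-key generation (hypothetical names).
       New staging rows get MAX(existing key) + ROW_NUMBER() so keys stay unique. */
    INSERT INTO DIM_ACCOUNT (ACCOUNT_KEY, ACCOUNT_ID, ACCOUNT_NAME)
    SELECT COALESCE(mx.MAX_KEY, 0)
             + ROW_NUMBER() OVER (ORDER BY s.ACCOUNT_ID) AS ACCOUNT_KEY,
           s.ACCOUNT_ID,
           s.ACCOUNT_NAME
    FROM   STG_ACCOUNT s
    CROSS JOIN (SELECT MAX(ACCOUNT_KEY) AS MAX_KEY FROM DIM_ACCOUNT) mx
    WHERE  NOT EXISTS (SELECT 1 FROM DIM_ACCOUNT d WHERE d.ACCOUNT_ID = s.ACCOUNT_ID);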

Ericsson Plano, TX

Informatica BI Lead Mar 2011- July 2012

Responsibilities:

Worked with business analysts to gather requirements and created functional specifications

Coordinated with the data modeler in designing schemas for the finance, planning, and manufacturing modules.

Worked in an offshore/onshore model; held daily meetings for coordination and planning.

Handled client enquiries and prepared daily workloads for the team members.

Created mapping documents and estimates for each mapping.

Worked with the offshore team and conducted daily meetings to get status updates.

Worked with the SAP team to identify the tables, function modules, and data sources.

Created BCI and ABAP mappings to extract data from various data sources and tables.

Used RFC/BAPI to load data into SAP.

Extensively used various transformations such as Aggregator, Expression, connected and unconnected Lookups, and Update Strategy transformations to load data into the target.

Worked on performance issues and bottlenecks to reduce the load time and monitored the data loads.

Created mappings to support delta loads into target.

Used Teradata utilities FastLoad, MultiLoad, and TPump to load data

Performed tuning and optimization of complex SQL queries using Teradata Explain (see the sketch after this section). Created several custom tables, views, and macros to support reporting and analytic requirements. Involved in collecting statistics on important tables so the Teradata Optimizer produces better plans.

Created UNIX shell Scripts to schedule workflows.

Environment: Informatica 8.6.1, Power Exchange, SAP ECC 6, Oracle 10g, PL/SQL, UNIX scripting, Teradata 12, Windows XP, Business Objects, Autosys, Erwin.
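
The Teradata tuning described above relied on EXPLAIN output and on collected statistics so the optimizer can build better plans. The commands below are a generic sketch only; SALES_FACT and its columns are placeholder names, not the actual project tables.

    -- Illustrative only; SALES_FACT and its columns are hypothetical names.
    -- 1) Inspect the optimizer's plan (join order, redistribution, spool estimates)
    EXPLAIN
    SELECT store_id, SUM(sale_amt)
    FROM   SALES_FACT
    WHERE  sale_dt BETWEEN DATE '2011-01-01' AND DATE '2011-12-31'
    GROUP  BY store_id;

    -- 2) Collect statistics on frequently joined/filtered columns so the
    --    optimizer estimates row counts more accurately
    COLLECT STATISTICS ON SALES_FACT COLUMN (store_id);
    COLLECT STATISTICS ON SALES_FACT COLUMN (sale_dt);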

Thomson Reuters Newark, NJ

Informatica Lead Aug 2010-Jan 2011

Responsibilities:

Extensively used Informatica Power Exchange to extract data from the SAP FICO and SAP SD modules.

Worked with SAP ABAP team in extracting data from critical SAP tables like KONV using BAPI code.

Involved in migrating ABAP code from Dev to QA and production.

Created various transformations to move SAP data into Oracle tables.

Analyzed complex customer business requirements in the areas of GL Account, Cost Center, and Profit Center Accounting, and mapped them to standard SAP processes, solutions, and products within the project scope.

Used Autosys to schedule workflows in production environment.

Extensively worked with data modeler in designing GL account, Cost center schemas.

Developed detailed functional specifications and enhancement/modification documentation, and tracked development progress in partnership with on-site ABAP team specialists

Offered solutions to top management regarding project-related queries

Participated in testing at all levels (unit, integration, and user testing); designed and drove testing initiatives; developed and executed test plans; provided leadership to others in a support role; documented processes and trained end users

Environment: Informatica 8.6.1, Power Exchange, SAP ECC 6, Oracle 10g, PL/SQL, UNIX scripting, UNIX AIX & XP, Business Objects, Autosys.

Johnson & Johnson Health Care Systems Piscataway, NJ

BI Lead Jun 2009-July 2010

Responsibilities:

Extensively used Informatica Power Exchange to extract data from SAP ECC 6, SAP APO (SCM) 7.0, SAP BI 7.0, Oracle 10g, and SQL Server.

Extracted data from standard and customized data sources using BCI Extraction.

Created Send request/Process request/Listener mappings to load data from SAP data sources.

Led the BI team and coordinated with the offshore Wipro team; analyzed individual team members' performance and motivated them to perform even better

Analyzed assigned projects and distributed tasks to team members per their areas of expertise

Worked with the SAP team to identify the right data sources with respect to the functional design specs.

Generated ABAP code for extracting data from SAP tables for stream and file modes using Informatica.

Created SAP BW OHS (open hub services) source definitions in Informatica and configured work flow to run from SAP process chain.

Extensively worked on debugging process chain failures and fixing them to rerun the process chains and invoke the Informatica workflows that load SAP InfoCube data

Created step-by-step documentation for loading SAP BW data into the Oracle target using OHS (Open Hub Services)

Extensively worked on identifying the bottlenecks and fine-tuned the mappings.

Extensively used a synonym-switch process to load data into the load and report tables in the data mart.

Created Informatica mappings using various transformations.

Worked with Session Logs and Workflow Logs for Error handling and troubleshooting in DEV/QA/PRD environment.

Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD

Created and managed daily, weekly cycle and monthly data operations, workflows and scheduling processes using Maestro

Involved in FDS reviews and designed the TDS and design standards.

Environment: Informatica 8.6.1, Informatica Power Exchange 8.6.1, Oracle 10g, SAP ECC 6.0, Sap BI 7.0, SAP APO (SCM) 7.0, Maestro, Cognos 8.4.

Disney Studios & Home Entertainment Burbank, CA

Sr. Informatica Developer Oct 2008 - Apr 2009

Responsibilities:

Extracted, transformed, and loaded data from multiple source databases (SAP, RDBMS, Sybase, flat files, etc.)

Worked closely with SAP Functional team to gather required resources

Developed data maps to extract data from SAP using Informatica Power Exchange

Developed mapping to extract data from SAP Data source using BCI (Business Content Integration) mapping wizard, and activated the data sources from Informatica Power Center.

Imported Power center objects like Listener, Send Request, Process Request, Cleansing mappings

Customized fields in Data source and Activated data source from SAP

Created Send and Process Request files for the Full, Delta, Delta Initialization and Delta Repeat loads.

Created Request files from SAP Data source Using BCI (Business Content Integration)

Filtered changed data using Power Exchange CDC and loaded it to the Oracle target

Developed Send and Process Request workflows to source data from SAP to the ODS (target staging tables in DB2, using the BCI Listener)

Used RFC/BAPI Transformations to update and process data from SAP Applications.

Created RFC/BAPI Transformations in mappings and configured transformations

Generated XSD files for table type parameters in RFC/BAPI mappings

Created various transformations like Expression, Lookup, Joiner, Router, Filter, Normalizer, and Aggregator to extract data from sources such as Oracle 10g, SQL Server, and files into the target (DB2).

Created UNIX shell script to run pre-session and post-session commands

Wrote PL/SQL stored procedures and triggers for validating business rules and transformations.

Involved in upgrading Production server from Informatica power center 7.1.2 to Informatica 8.1.1

Created UNIX shell script to schedule the jobs with Maestro tool

Applied Labels to the Latest versions of workflows & Used Deployment Techniques to migrate workflows from development to QA and to Production.

Environment: Informatica 8.1.1, Power Exchange, SAP ECC 6, R/3, DB2, Oracle 10g, SQL Server, XML, PL/SQL, IBM iSeries, Trillium, UNIX AIX & XP, MicroStrategy.

CARLSON Minneapolis, MN

Informatica Consultant Sep 2007 – Sep 2008

Responsibilities:

Extracted data from various Source Systems like Oracle, XML Files and Flat Files as per the requirements using Informatica Power Center 7.1.2

Identified the data transport mechanisms and database tables to be used in production following project implementation

Created Mapplets for reusable business logic to be used in multiple locations

Defined SAP BW as source system for Informatica Power Center

Created batch processes using Informatica PowerConnect for SAP to move large amounts of data from the SAP database to staging

Built and configured SAP BW components. Imported InfoSource definitions from the SAP Administrator Workbench into the Power Center Warehouse Designer.

Configured the saprfc.ini file with Type A and Type R entries, which are required to connect to the SAP system

Created inbound/outbound IDoc mappings for SAP ALE IDoc extraction

Created SAP ALE IDoc source definitions

Created and edited SAP ALE IDoc transformations to change the data segments

Created and ran UNIX scripts for all pre- and post-session ETL jobs.

Used Autosys to run the ETL jobs

Worked with Mapping Parameters and Mapping Variables in various mappings to pass the values between sessions

Checked EDT (database) to determine successful transaction of test data (as per the business requirements) from the Jobs by establishing database connectivity and by using SQL commands

Installed and configured Informatica in UNIX test region and assisted installation in production region

Performed Requirements Analysis by coordinating with Business Analyst and Obtained sign-off on Requirements

Wrote SQL overrides in the Source Qualifier to filter the data pulled from the source table and to perform incremental loading of data into the target table (a sketch of such an override follows this section). Developed and implemented standards and guidelines for Cognos reporting

Managed a one-time migration of data during planned weekend outages

Identified the source system tables and manual files that contain the data required to populate the targets.

Environment: Informatica Power Center 8.1.1, Oracle, DB2/AS400, Sybase, Cognos Report net, SAP R/3 4.6 C, SQL server, Autosys, Power Exchange, Crystal reports, MS SQL, PL/SQL, UNIX Scripts and Windows Server 2000.
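
The Source Qualifier SQL overrides mentioned above typically filter on an audit timestamp and a mapping parameter so that each run pulls only new or changed rows. The query below is a sketch of that pattern only, assuming an Oracle source: ORDERS, LAST_UPD_TS, and the $$LAST_EXTRACT_DATE parameter are illustrative names rather than the actual project objects.

    -- Illustrative Source Qualifier override (hypothetical table and columns).
    -- $$LAST_EXTRACT_DATE is a mapping parameter supplied from the parameter file;
    -- Informatica substitutes its value before the query is sent to the database.
    SELECT o.ORDER_ID,
           o.CUSTOMER_ID,
           o.ORDER_AMT,
           o.LAST_UPD_TS
    FROM   ORDERS o
    WHERE  o.LAST_UPD_TS > TO_DATE('$$LAST_EXTRACT_DATE', 'YYYY-MM-DD HH24:MI:SS')
    ORDER  BY o.LAST_UPD_TS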

Countrywide Insurance Corp Pasadena, CA

Informatica Developer Dec 2005 - Jul 2007

Responsibilities:

Analyzed business process and gathered core business requirements. Interacted with business analysts and end users and identified the potential bottlenecks.

Involved in design/development of the business warehouse model.

Implemented dimension model (logical and physical data modeling) in the existing architecture using Erwin.

Designed ETL processes to populate the fact and dimension tables with staging table data.

Used Informatica designer to extract data from different data sources.

Created mappings and Mapplets to load data from source systems into data warehouse.

Created different transformations such as Joiner, Look-up, Rank, Expressions, Aggregator and Sequence Generator to implement the business rules.

Identified & Implemented Slowly Changing dimension methodology for accessing the full history of accounts and transaction information

Responsible for Session partitioning for concurrent loading of data in to the target tables to improve the performance.

Optimized the performance of the mappings by various tests on sources, targets and transformations.

Performed incremental aggregation to load incremental data into Aggregate tables.

Created workflows using Workflow manager for different tasks like sending email notifications, timer that triggers when an event occurs, and sessions to run a mapping.

Responsible for Scheduling of jobs using Autosys.

Involved in creation, backup, restoring, and performance tuning of Repository.

Environment: Informatica 6.2, Autosys, Business Objects 5.1/6.1, Dashboard Manager, TOAD, Erwin 4.0, SQL Server v7.0, Oracle 9i, Solaris 2.8

Bear Stearns & Co New York, NY

Informatica Consultant Jun 2004 – Nov 2005

Responsibilities:

Defined source-to-target mappings for the transformations.

Created Reusable Transformations and Mapplets.

Responsible for scheduling jobs and preparing technical documentation.

Implemented repository-level and data-level security methods to restrict and control the data exposed to end users.

Worked with Sessions, Batches and Workflows.

Loaded data and verified the source and target databases.

Conducted system and regression testing, identified application errors, and interacted with developers to resolve technical issues.

Performed positive and negative testing of the application for identification of bugs after fixing of errors in each subsequent build during the process of development.

Created Universe as a Source for the Business Objects Reporting and Analysis.

Environment: Informatica Power Center 6.1, XML, Business Objects 6.0, UNIX (SOLARIS 7.0/8.0), Oracle 8i, MS SQL Server 2000, TOAD, Windows 2000 server

Cigna Philadelphia, PA

ETL Developer Oct 2003 - May 2004

Responsibilities:

Extensively used Informatica Power Center for Extracting, Transforming, and Loading into different databases.

Created mappings using Informatica designer, sessions using workflow manager and tested the sessions using test data.

Extensively used ETL to load data from Oracle database, SQL server database, and Flat files to Oracle.

Created necessary test plans and design documents for the mappings.

Analyzed the data model and identified heterogeneous data sources.

Designed and tested the different mappings according to the requirements.

Used Source Analyzer and Warehouse Designer to import the source and target database schemas.

Worked extensively on different types of transformations like Source Qualifier, Expression, Filter, Aggregator, Rank, Lookup, Update Strategy, Stored Procedure, Sequence Generator and Joiner.

Integrated various sources into the staging area in Data Warehouse.

Environment: Informatica Power Center 6.1 (Power Center, Designer, Workflow Manager, Repository Manager), Oracle, SQL Server 2000, PL/SQL, Windows Server 2000, and UNIX

Elite Technologies, Hyderabad, India

Informatica Developer Mar 2002 - Aug 2003

Responsibilities:

Gathered and analyzed business requirements and translated them into specifications; worked as a team member on the design and development of the Sales Data Mart.

Involved in extracting and transforming data before loading it into the target (warehouse) Oracle tables. The transformations included generating unique sequence numbers and aggregations for the star schema target tables (see the sketch after this section). The data sources were Oracle database tables and flat files, and the target tables were in an Oracle database.

Created mappings using Informatica Designer to build business rules to load data and tuned them to enhance performance

Identified bottlenecks in ETL processes

Scheduled and ran sessions, extraction, and load processes, and monitored sessions using the Informatica Server Manager

Environment: Informatica 5.1, Oracle 8i, PL/SQL, SQL, Windows 2000 Server.
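
The unique sequence numbers mentioned above for the star-schema surrogate keys can come either from Informatica's Sequence Generator transformation or from an Oracle sequence on the database side. A minimal sketch of the database-side option follows; SEQ_SALES_KEY, SALES_FACT, and STG_SALES are hypothetical names.

    -- Illustrative only (hypothetical object names): an Oracle sequence used to
    -- assign surrogate keys while loading a star-schema fact table.
    CREATE SEQUENCE SEQ_SALES_KEY START WITH 1 INCREMENT BY 1 CACHE 100;

    INSERT INTO SALES_FACT (SALES_KEY, PRODUCT_KEY, TIME_KEY, SALE_AMT)
    SELECT SEQ_SALES_KEY.NEXTVAL,
           s.PRODUCT_KEY,
           s.TIME_KEY,
           s.SALE_AMT
    FROM   STG_SALES s;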


