Data Developer

Location: San Ramon, CA
Salary: $85/hr
Posted: February 05, 2018


VENUBABU DONTHULA

510-***-****

ac4csm@r.postjobfree.com

LinkedIn: http://linkedin.com/in/venubabu-d-525a9127

Summary:

Over 13 years of experience in analysis, design, development, testing and deployment using Informatica 10.1/9.6.1/8.x/7.x, IDQ, Oracle, UNIX and Cognos.

About 4 years of Dimensional Data Modeling experience using Erwin, the Ralph Kimball approach, Star/Snowflake modeling, Data Marts, OLAP, Fact and Dimension tables, and Physical and Logical data modeling.

12 years of strong experience in Data Design/Analysis, Business Analysis, User Requirement Gathering and Analysis, Gap Analysis, Data Cleansing, Data Transformations, Source Systems Analysis and Reporting Analysis.

12 years of experience in ETL design and development using data sources such as Oracle, Teradata, DB2, SQL Server, flat files, CCF files and XML.

Has performed a variety of roles including Informatica Architect, Admin, Developer and ETL Analyst.

Involved in functional and technical system analysis and design, system architecture design, presentation and documentation.

Hands-on experience identifying and resolving performance bottlenecks at various levels (sources, mappings and sessions), optimizing SQL scripts and improving database load performance.

Strong experience in complex PL/SQL packages, functions, cursors, triggers, views, materialized views.

Experience in Informatica IDQ tool.

Experienced in Master Data Management (MDM) for delivering consistent, precise, high-quality data.

Expert-level mastery in designing and developing complex Informatica mappings to extract data from diverse sources including flat files (fixed-width, delimited), XML, RDBMS tables and legacy system files; proficient in creating workflows/worklets and mappings/mapplets.

Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.

Business Intelligence experience using Cognos and OBIEE and experienced in coordinating with cross-functional teams.

Automation and scheduling of Informatica sessions and batches using the Tivoli scheduling tool.

Experience reviewing test plans, test cases and test case execution; understanding business requirement documents and functional specs and writing test cases using Quality Center. Also played an active role in User Acceptance Testing (UAT) as well as unit, system and integration testing.

Has very good analytical, communication and interpersonal skills.

Education Qualifications:

Master of Technology from Andhra University, India (2002)

Technical Skills:

Operating System: UNIX, Linux, Windows 98/NT/2000/XP/VISTA

Languages: PL/SQL, SQL, C, C++, XML and Korn shell scripting

RDBMS: Oracle 11g/10g/9i/8i, Teradata, Mainframes DB2, MSSQL Server.

GUI: Toad, SQL Developer, Teradata SQL Assistant, Erwin

Scheduler: ESP Scheduler, Tivoli Scheduler.

ETL Tools: Informatica 10.1/9.6.1/8.x/7.x, Informatica IDQ.

Reporting Tools: Cognos 10.2/8.4, Cognos TM1, Tableau.

Professional Experience:

Informatica Lead/Architect @ Bank of the West, San Ramon, CA Oct '16 – till date

Sr. Informatica Developer/Architect @ Kaiser Permanente, Pleasanton, CA Feb '12 – Sep '16

Sr. Informatica Developer @ McGraw-Hill Companies, NY Jan '11 – Jan '12

ETL Designer @ Aviva Insurance, Des Moines, IA Feb '10 – Dec '10

ETL Analyst @ KITS, India Jul '04 – Jan '10

Work Experience:

Informatica Lead/Architect @ Bank of the West, San Ramon, CA Oct '16 – till date

SAM (Suspicious Activity Monitoring):

Bank of the West is upgrading its Suspicious Activity Monitoring platform from the existing Actimize FORTENT application to the Actimize SAM application. The project is responsible for sourcing transactional, customer and account data from the roughly 24 different systems the bank runs, sending that data to the SAM application in the accepted format for suspicious activity monitoring, and reconciling the data against the bank's books.

CTR (Currency Transaction Report):

The Bank Secrecy Act requires banks to file a CTR (FinCEN Form 104) for each transaction in currency (deposit, withdrawal, exchange, purchase or other payment or transfer) of more than $10,000 (or US dollar equivalent) by, through, or for the bank. Certain types of currency transactions need not be reported, such as those involving “exempt persons”, a group which can include retail or commercial customers meeting specific criteria for exemption.
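
For illustration only, a minimal SQL sketch of the kind of threshold check this rule implies; the table and column names (CASH_TXN, EXEMPT_PERSON) are assumptions, not the project's actual schema:

    -- Flag customers whose aggregate cash activity on a business day exceeds
    -- the $10,000 CTR threshold, excluding "exempt persons".
    SELECT t.customer_id,
           TRUNC(t.txn_date)  AS business_date,
           SUM(t.amount_usd)  AS total_cash_usd
    FROM   cash_txn t
    WHERE  t.txn_type IN ('DEPOSIT', 'WITHDRAWAL', 'EXCHANGE')
    AND    NOT EXISTS (SELECT 1
                       FROM   exempt_person e
                       WHERE  e.customer_id = t.customer_id)
    GROUP  BY t.customer_id, TRUNC(t.txn_date)
    HAVING SUM(t.amount_usd) > 10000;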

Responsibilities:

Involved in analysis, requirements gathering, functional/technical specification, design, development, testing and deployment.

Responsible for defining, documenting and publishing Functional Specifications, Non-Functional Specifications and Business Requirements Documents (BRDs).

Led the development team during the development process, including helping with tasks and transferring Informatica knowledge.

Modified the ETL code as per the changed business requirements.

Worked closely with reporting team and helped them whenever they had any ETL issues.

Performed performance tuning at both the mapping and session level to increase throughput, and optimized SQL queries.

Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the Business requirements.
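
As a sketch of the kind of consistency check meant here, assuming hypothetical staging and warehouse tables STG_TXN and DW_TXN:

    -- Compare row counts and amount totals between staging and target.
    SELECT 'STG_TXN' AS table_name, COUNT(*) AS row_cnt, SUM(amount) AS total_amt FROM stg_txn
    UNION ALL
    SELECT 'DW_TXN', COUNT(*), SUM(amount) FROM dw_txn;

    -- List keys loaded into staging that never reached the target.
    SELECT s.txn_id
    FROM   stg_txn s
    LEFT   JOIN dw_txn d ON d.txn_id = s.txn_id
    WHERE  d.txn_id IS NULL;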

Expertise in Informatica performance tuning activities: partitioning, analyzing thread statistics, indexing, aggregations and pushdown optimization.

Designed and developed various ETL mappings.

Implemented an ETL automation process for ETL code integrity testing.

Environment: Informatica 10.1/9.6.1, Informatica IDQ 9.5, Oracle 11g, UNIX.

Sr. ETL Informatica Developer/Architect @ Kaiser Permanente, Pleasanton, CA Feb '12 – Sep '16

EFPA (Enterprise Financial Planning &Analysis):

NPR (National Performance Reporting): Worked with LDGL and One-Link data sources and built the ETL process for National Functions and Hospital/Health Plan (HP/H) for all KP regions.

FABS (Forecasting and Budgeting System):

FABS is a budget development system for TPMG and the NCAL Hospital and Health Plan. It provides budgeting and planning services for 2,000 managers and analysts on a continuous-availability schedule. The TPMG solution is a driver-based forecasting and budgeting application logically grouped into five main models: BW, MD, Non-Payroll, OMS and RSR.

MAEDS (Medicare Advantage Encounter Data Submission):

The KPMETA application processes, stores and tracks Medicare Cost and Medicare Advantage encounter data reportable to CMS for all regions of Kaiser. KPMETA produces reports listing errors detected by KPEG and CMS and provides them to the regions for correction, along with a summary of the results of each batch that was processed. KPMETA also provides its encounter and status data for reporting and analysis. The CCF output batches are verified by KPEG and transformed into 837i and 837p transactions for submission to CMS. CMS returns 277CA responses, and KPEG transforms these into the 277CA CCF format and sends them to KPMETA.

Responsibilities:

Involved in the Design, Development, Testing phases of the Data warehouse and in Software Development Life Cycle (SDLC).

Analyzed the functional specs provided by the data architect and created technical specs documents for all the mappings.

Developed logical and physical data models that capture current-state/future-state data elements and data flows using Erwin.

Designed and developed Informatica Mappings to load data from Source systems to ODS and then to Data Mart.

Used Oracle's Explain Plan utility to tune SQL queries and reduce their run time.
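
The usual Explain Plan workflow in Oracle looks like the following; the query and table name are placeholders, not the project's actual SQL:

    EXPLAIN PLAN FOR
    SELECT *                         -- candidate query being tuned
    FROM   ods_member_claims
    WHERE  member_id = :member_id;

    -- Show the optimizer's plan: access paths, join order and estimated cost.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);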

Created Informatica mappings for initial load and daily updates.

Extensively used transformations such as SQL Transformation, connected/unconnected Lookup, Stored Procedure, Java, Transaction Control, Union, Normalizer, Joiner, Update Strategy and Application Source Qualifier.

Created Stored Procedure transformations to drop and recreate the indexes, truncate the data, to disable and enable the constraints and to analyze the tables after loading the data.
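
A minimal PL/SQL sketch of the pre-load and post-load procedures such a Stored Procedure transformation would call; the table, index and constraint names are hypothetical:

    CREATE OR REPLACE PROCEDURE pre_fact_load AS
    BEGIN
      -- Disable constraints and drop the index so the bulk load runs faster.
      EXECUTE IMMEDIATE 'ALTER TABLE fact_sales DISABLE CONSTRAINT fk_sales_customer';
      EXECUTE IMMEDIATE 'DROP INDEX idx_fact_sales_date';
      EXECUTE IMMEDIATE 'TRUNCATE TABLE fact_sales';
    END pre_fact_load;
    /

    CREATE OR REPLACE PROCEDURE post_fact_load AS
    BEGIN
      -- Recreate the index, re-enable constraints and refresh optimizer statistics.
      EXECUTE IMMEDIATE 'CREATE INDEX idx_fact_sales_date ON fact_sales (sale_date)';
      EXECUTE IMMEDIATE 'ALTER TABLE fact_sales ENABLE CONSTRAINT fk_sales_customer';
      DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'FACT_SALES');
    END post_fact_load;
    /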

Created Dynamic files and Implemented Error handling strategy.

Designed reusable transformations and shortcuts to share across different mappings. Created mapping variables and parameters.

Used aggregate calculations in source qualifier by SQL override.

Used various Mapplets and Worklets for repeated business logic.

Performed root cause analysis and fixed mapping problems.

Created various shell scripts for manipulating flat files and used them in Command tasks.

Extensively involved in unit testing, system testing and troubleshooting defects.

Environment: Informatica 9.6.1/9.x, Informatica IDQ 9.5, Oracle 11g, Cognos 10.2, Cognos TM1, UNIX, SSIS.

Sr. Informatica Developer @ McGraw-Hill Companies, NY Jan '11 – Jan '12

Platts real-time news, pricing, analytical services and conferences help markets operate with transparency and efficiency. Traders, risk managers, analysts and industry leaders depend upon Platts to help them make better trading and investment decisions. Platts serves the oil, natural gas, electricity, nuclear power, coal, petrochemical and metals markets. Data from different source companies is loaded into the data warehouse using Informatica ETL. Reports are generated from the data warehouse for business users according to the requirements.

Responsibilities:

Involved in development of Logical and Physical data models that capture current state/future state data elements and data flow using Erwin Modeling Tool.

Followed Star schema to organize the data in the Data Mart.

Extracted the data from different sources such as flat files and Oracle, transformed the data based on the business rules and loaded it into the Oracle target database.

Extensively used various types of transformations such as Expression, Connected Lookup (both static and dynamic), unconnected Lookup, Filter, Router, Normalizer, Update Strategy to transform the data.

Implemented Slowly Changing dimensions Type 1, Type 2 methodology for accessing the full history of accounts and transaction information.
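
In the mappings this was done with Lookup and Update Strategy transformations; expressed as set-based SQL, the Type 2 logic is roughly the following (DIM_ACCOUNT, STG_ACCOUNT and their columns are illustrative, not the actual schema):

    -- Step 1: expire the current dimension row for any account whose attributes changed.
    UPDATE dim_account d
    SET    d.eff_end_date = SYSDATE,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_account s
                   WHERE  s.account_id = d.account_id
                   AND    (s.status <> d.status OR s.branch_cd <> d.branch_cd));

    -- Step 2: insert a new current row for every changed or brand-new account.
    INSERT INTO dim_account (account_key, account_id, status, branch_cd,
                             eff_start_date, eff_end_date, current_flag)
    SELECT dim_account_seq.NEXTVAL, s.account_id, s.status, s.branch_cd,
           SYSDATE, DATE '9999-12-31', 'Y'
    FROM   stg_account s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_account d
                       WHERE  d.account_id   = s.account_id
                       AND    d.current_flag = 'Y');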

Addressed various performance issues; optimized the mappings, targets, sources and sessions wherever applicable to improve performance.

Used the Debugger to debug mappings and troubleshoot data issues.

Involved in production support during the vital stages of loading the Data Mart.

Responsible for monitoring loads and tackling performance challenges whenever issues were observed during the production load.

Developed mapplets and worklets.

Designed reusable transformations and shortcuts to share across different mappings.

Developed and tested stored procedures, functions and packages in PL/SQL for data ETL.
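
A small sketch of the style of ETL helper package referred to here; the names (ETL_UTIL, ETL_LOAD_LOG) and logic are illustrative assumptions:

    CREATE OR REPLACE PACKAGE etl_util AS
      FUNCTION clean_phone(p_raw IN VARCHAR2) RETURN VARCHAR2;
      PROCEDURE log_load(p_job IN VARCHAR2, p_rows IN NUMBER);
    END etl_util;
    /

    CREATE OR REPLACE PACKAGE BODY etl_util AS
      FUNCTION clean_phone(p_raw IN VARCHAR2) RETURN VARCHAR2 IS
      BEGIN
        -- Keep digits only so phone numbers compare consistently across sources.
        RETURN REGEXP_REPLACE(p_raw, '[^0-9]', '');
      END clean_phone;

      PROCEDURE log_load(p_job IN VARCHAR2, p_rows IN NUMBER) IS
      BEGIN
        -- Record the row count for each load so reconciliation reports can use it.
        INSERT INTO etl_load_log (job_name, row_cnt, load_ts)
        VALUES (p_job, p_rows, SYSTIMESTAMP);
      END log_load;
    END etl_util;
    /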

Identified performance bottlenecks and fine-tuned ETL mappings and workflows to improve performance.

Day-to-Day Production support.

Recovering failed jobs.

Tuning of Informatica mappings.

Created various reports and dashboards using Tableau.

Environment: Informatica 8.6.1, Oracle 10g, Informatica IDQ, SSIS, Tableau 6.1, Windows, HP UNIX.

ETL Designer @ Aviva Insurance, Des Moines, IA Feb '10 – Dec '10

Aviva is one of the largest life insurance companies in the U.S. It offers a broad range of life insurance and annuity products. Aviva has embarked upon an innovative road map to enhance financial reporting capabilities, including financial processing, data management, financial modeling and reporting. Data from different source systems is loaded into the Insurance Annuity Schema (IAS), which acts as an integration layer. The financial data (FI) is fed into the Oracle E-Business Suite through the Financial Accounting Hub (FAH) for GL processing, and the non-financial (MI) data into the Data Store. The processed GL data from Oracle E-Business Suite is then transported from FAH (R12) to the Data Store and the reporting layer, which provides enhanced reporting capabilities to the business community accessing the system.

Responsibilities:

Modified mappings as per the changed business requirements.

Extensively used various types of transformations such as Expression, Connected Lookup (both static and dynamic), unconnected Lookup, Filter, Router, Normalizer, Update Strategy to transform the data.

Extracted the data from different sources such as flat files, XML and Oracle, transformed the data based on the business rules and loaded it into the target database.

Extensively worked with Oracle E-Business Suite (OEBS), FAH, AP, GL modules.

Implemented balance-and-control and restart-and-recovery processes.

Created Dynamic parameter files and changed Session parameters, mapping parameters, and variables at run time.

Used debugger to debug the mappings and to troubleshoot the information.

Responsible for monitoring loads and tackled performance challenges whenever issues are observed during the production load.

Identified and tracked slowly changing dimensions and implemented them in complex mappings.

Designed reusable transformations and shortcuts to share different mappings.

Interacted with the BO analyst to assist in understanding the source and target systems.

Debugged BO reporting problems and provided solutions.

Developed and tested stored procedures, functions and packages in PL/SQL for data ETL.

Used ESP Tool to schedule shell scripts and Informatica Jobs.

Responsible for implementing Incremental Loading mappings using Mapping Variables and Parameter Files.
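
A sketch of the pattern: the Source Qualifier SQL override filters on a mapping variable that the parameter file seeds and that the mapping advances (for example via SetMaxVariable) after each successful run; the table and variable names here are illustrative:

    -- Source Qualifier SQL override; $$LAST_EXTRACT_DATE comes from the parameter file.
    SELECT txn_id, policy_no, gl_account, amount, last_update_dt
    FROM   src_financial_txn
    WHERE  last_update_dt > TO_DATE('$$LAST_EXTRACT_DATE', 'YYYY-MM-DD HH24:MI:SS');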

Responsible for determining the Performance bottlenecks and fixing the bottlenecks.

Responsible for unit testing and integration testing of mappings and workflows.

Involved in troubleshooting the load failure cases, including database problems.

Involved in Production Deployment and Support.

Environment: Informatica 8.6.1, Oracle ERP, MS SQL Server, Business Objects XIR2, Teradata V12, DB2, HP UNIX.

ETL Analyst @ KITS, India Jul '04 – Jan '10

A comprehensive accounting system was developed for the company. The system started with entry vouchers. A unique security requirement was that vouchers were not passed and posted until approved by the concerned authority, and the system showed all pending vouchers. Voucher types include sales, purchase, receipts, payments and journal entries, and provision for bill matching was made in the system. This provides extensive analysis of cash flow projections, aging analysis, interest calculations and reminder letters.

The system also generates a user analysis report showing errors created by any user. The report section includes common reports such as the general ledger and final statements.

One unique feature of the trial balance report is that the user can move from the account balance to the ledger and finally to the voucher entry stage.

Responsibilities:

Analysis, requirements gathering, functional/technical specification, development, deployment and testing.

Involved in the design, development and testing phases of the data warehouse.

Logical & Physical Database Layout Design using ERwin.

Involved in Design and Data Modeling using Star schema.

Created Informatica mappings for initial load and daily updates.

Involved in fixing invalid mappings, testing stored procedures and functions, and unit and integration testing of Informatica mappings, sessions, workflows and the target data.

Developed several mappings to load data from multiple sources to data warehouse.

Developed and tested stored procedures, functions and packages in PL/SQL for data ETL.

Logical & Physical Database Layout Design.

Involved in Developing ETL and Aggregate Strategy

Worked with slowly changing dimensions Type 1, Type 2 and Type 3.

Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup and Router transformations to populate target tables in an efficient manner.

Involved in Performance tuning activity in different levels.

Performance tuning using round-robin, hash auto-key and key range partitioning.

Used parameters and variables for incremental and full loads of data into target tables.

Identified and tracked slowly changing dimensions and implemented them in complex mappings.

Developing, scheduling and monitoring sessions.

Extensively involved in Unit testing, System testing and troubleshoot defects.

Performed Data conversions.

Extensively used mapping parameters, mapping variables and parameter files.

Generated Reports using Metadata Reporter

Environment: Informatica 5.1/7.4/8.1, Mainframes, Oracle, PL/SQL, Erwin, TOAD 6.X


