
Sr. Informatica Developer

Location:
Moline, IL, 61265
Salary:
negotiable
Posted:
August 23, 2011

Contact this candidate

Resume:

Prasad

810-***-**** | onew1c@r.postjobfree.com

Professional Summary

Sr. ETL Professional with 5+ years of experience and strong hands-on expertise in the design, development, and deployment of Data Warehousing applications based on the Kimball Data Warehousing methodology.

• Deep understanding of the concepts and components used in the implementation of Enterprise Data Warehouses and Data Marts.

• Well versed in developing ETL solutions with Informatica PowerCenter 7.x, 8.x, and 9.0.1.

• Experience in installing Informatica PowerCenter and upgrading it from 8.x to 9.x.

• Extensively worked on Informatica PowerCenter Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator and Union.

• Proficient in using PowerCenter transformation language built-in functions to design data conversion routines for a wide variety of sources such as flat files and RDBMS.

• Skilled in developing complex, optimized SQL queries for extracting data from multiple relational data sources.

• Applied mapping tuning techniques such as pipeline partitioning to speed up data processing. Conducted session thread analysis to identify and fix performance bottlenecks. Used bulk loading to reduce data-write I/O cycles.

• Good understanding of Dimensional Modeling concepts such as Star and Snowflake schemas as applied in the development of Data Marts and Enterprise Data Warehouses.

• Proficient in translating Business Processes / Requirements into technical requirements.

• A team player with excellent analytical skills and strong verbal and written communication.

Technical Skills

ETL: Informatica PowerCenter 7.x, 8.x, 9.0.1; Toad; PL/SQL Developer; SQL*Loader

Data Sources: Oracle 10g, Oracle 11g, COBOL flat files, CSV files

Languages: SQL, PL/SQL, Unix shell scripting, C, C++, Java, UML

Operating Systems: Linux, HP-UX, Windows Server 2003

CASE Tools: Mercury Quality Center, Rational ClearCase, MS Visio, Rational Rose

Scheduling: Autosys, Tidal

Education

M.Tech, Andhra University, India

Specialization – Computer Science & Technology

Professional Experiences

Client: John Deere, Moline IL Nov 2010 – Present

Role: Programmer Analyst

Dealer Scorecard is a sales performance management system that tracks product sales by dealers across various divisions and territories. It helps in understanding the potential of dealers to sell particular products in specific regions. Based on performance, dealers are provided with add-on packages assisting them in building a better business. Analytical information about seasonal and geographical data helps marketing and sales teams develop promotional campaigns and reach potential customers in a given geography.

Responsibilities

• Participate in business process analysis and technical design sessions with business and technical staff to develop requirements documents and ETL source-to-target specifications. Perform impact analysis for change requests and coordinate change management with the deployment teams.

• Perform data analysis on source systems to provide input for building the logical data model for facts and dimensions. Coordinate with the DBA on physical data model implementation.

• Develop mappings & mapplets for historical and incremental loads using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Router, Lookup, and Filter.

• Implement parameterization to support loads based on region and time window. Load data into a Type-2 slowly changing dimension for new product categories, dealers, geographies, and customers.

• Design and develop PL/SQL procedures and call them through Stored Procedure transformations in mappings.

• Develop shell scripts to apply various slicing & dicing operations such as split, sort, concatenate, and merge, and to FTP data files.

• Create unit test cases and support system, integration, and user acceptance testing.

• Monitor the production cycles for Daily, Weekly & Monthly loads.
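
A Type-2 slowly changing dimension load of the kind described above is often expressed in plain Oracle SQL along the following lines. This is an illustrative sketch only; the table, column, and sequence names are hypothetical placeholders, not the project's actual schema:

```sql
-- Illustrative Type-2 SCD sketch; dealer_dim, dealer_stg, and
-- dealer_key_seq are hypothetical names, not the real objects.

-- Step 1: expire the current row when a tracked attribute has changed.
UPDATE dealer_dim d
   SET d.effective_end_dt = SYSDATE,
       d.current_flag     = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM dealer_stg s
                WHERE s.dealer_id = d.dealer_id
                  AND s.region   <> d.region);

-- Step 2: insert a new current row for changed and brand-new dealers
-- (after Step 1, changed dealers no longer have a current row).
INSERT INTO dealer_dim
      (dealer_key, dealer_id, region,
       effective_start_dt, effective_end_dt, current_flag)
SELECT dealer_key_seq.NEXTVAL, s.dealer_id, s.region,
       SYSDATE, NULL, 'Y'
  FROM dealer_stg s
 WHERE NOT EXISTS (SELECT 1
                     FROM dealer_dim d
                    WHERE d.dealer_id = s.dealer_id
                      AND d.current_flag = 'Y');
```

In PowerCenter the same expire/insert logic is typically split across a Lookup on the current dimension rows and an Update Strategy transformation, but the row-level effect is the same as this SQL.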

Environment: Informatica 9.1, Oracle 11g, Linux, PL-SQL, Unix Shell Scripting, Business Objects
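
The file "slicing & dicing" steps mentioned in the responsibilities above (split, sort, merge) can be sketched with standard Unix tools. File names, record layouts, and chunk sizes here are hypothetical, and the FTP step is shown only as a comment:

```shell
#!/bin/sh
# Create a small sample extract so the sketch is self-contained.
# Format: dealer_id|amount (a hypothetical layout).
printf 'D3|300\nD1|100\nD2|200\nD1|150\n' > sales_extract.dat

# Split the extract into 2-line chunks (production runs would use
# far larger chunks for parallel processing).
split -l 2 sales_extract.dat chunk_

# Sort each chunk on the dealer-id field...
for f in chunk_aa chunk_ab; do
  sort -t'|' -k1,1 "$f" -o "$f.sorted"
done

# ...then merge the pre-sorted chunks into one ordered load file.
sort -m -t'|' -k1,1 chunk_aa.sorted chunk_ab.sorted > sales_sorted.dat

# Transfer of the prepared file to the load server would follow, e.g.:
# ftp -n staging-host <<EOF
#   put sales_sorted.dat
# EOF
```

The merge step (`sort -m`) avoids re-sorting already-sorted chunks, which is the point of splitting large extracts in the first place.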

Client: CSX Corp., Seattle, WA June 2009 – Oct 2010

Role: Programmer Analyst

MIS Financial Scorecard is a BI program designed to analyze financial performance and improve decision making in financial planning. MIS provides an analytical perspective with a deep dive into the General Ledger and Accounts Payable & Receivable information generated by the various group companies and the departments within them. MIS enables hierarchical drill-down and roll-up across the organization hierarchy for understanding line-item-level, year-over-year comparisons between actual and planned expenses.

Responsibilities

• Involved in all phases of the SDLC, from requirements, design, development, testing, and training through rollout to field users and production support.

• Extensively interacted with users; involved in requirement gathering and prototyping, and prepared various documents such as the Interface Requirements Document, Customer Requirements Document, integration test plan, unit test plan, and release notes.

• Analyzed data sources to identify common attributes across multiple tables and finalize a single source of data for master data attributes.

• Designed and developed historical and incremental load mappings.

• Implemented transformation logic for applying various business rules and data standardization rules for source data coming from multiple systems into the data warehouse.

• Extensively worked on Oracle SQL query performance tuning; created DDL scripts and database objects such as tables, indexes, and sequences. Worked closely with DBAs to create physical databases.

• Involved in creating the schema from the data model on the target server using forward and reverse engineering, and generated the script files using ERwin 4.0.

• Improved ETL process performance through query optimization, pipeline partitioning, and memory/CPU management. Interfaced with DBAs to review volumetric requirements.
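
The DDL work described above might look like the following sketch. The object names and columns are hypothetical, chosen only to illustrate the kinds of tables, indexes, and sequences involved in a GL-style fact load:

```sql
-- Hypothetical examples of the DDL objects mentioned above.
CREATE TABLE gl_txn_fact (
    txn_key      NUMBER        NOT NULL,  -- surrogate key from sequence
    account_key  NUMBER        NOT NULL,  -- FK to account dimension
    period_key   NUMBER        NOT NULL,  -- FK to fiscal-period dimension
    amount       NUMBER(18,2)
);

-- Composite index supporting the common account-by-period drill-down.
CREATE INDEX gl_txn_fact_acct_ix
    ON gl_txn_fact (account_key, period_key);

-- Sequence used to generate the fact surrogate key during loads.
CREATE SEQUENCE gl_txn_key_seq
    START WITH 1 INCREMENT BY 1 CACHE 100;
```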

Environment: Informatica 8.1, Oracle 10g, Linux, PL-SQL, Unix Shell Scripting, Control M

Client: RALPHS Corp., Los Angeles, CA Mar 2008 – May 2009

Role: Programmer Analyst

The main objective of the Retail Business Analytical Process project is to analyze sales information. The Ralphs grocery group has over 33 distinct systems associated with order and sales data. The design replaces the existing code with Informatica: logic was built through Informatica mappings to cleanse the data and load it into the data mart. This helps management forecast cost and expense analysis for better utilization of resources, thereby supporting sales promotion techniques and decisions to improve market share.

Responsibilities

• Extensively interacted with users; involved in requirement gathering, prototyping, and publishing report layouts.

• Created source-to-target mappings for the staging & ODS layers.

• Mapped dimension sources to dimension hierarchy tables. Built self-referencing Lookup transformations to convert flat (serial) dimension attributes into a hierarchical structure.

• Implemented ETL mappings loading Type-2 Slowly changing Dimensions.

• Implemented transformation logic for applying various business rules and data standardization rules for source data coming from multiple systems into the data warehouse.

• Documented unit test cases and integration testing steps and captured the results. Prepared release notes for migrating objects from DEV to INT to UAT environments.

• Member of the team performing production support, job monitoring, and troubleshooting under a Level-II SLA.
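
Loads into a self-referencing hierarchy table like the one described above are commonly verified with an Oracle hierarchical query. This is an illustrative sketch; the table and column names are hypothetical, not the actual Ralphs schema:

```sql
-- Walk a hypothetical self-referencing product-category hierarchy,
-- reporting each row's depth and its full path from the root.
SELECT category_id,
       parent_category_id,
       LEVEL                                    AS depth,
       SYS_CONNECT_BY_PATH(category_name, '/')  AS category_path
  FROM category_dim
 START WITH parent_category_id IS NULL          -- roots of the hierarchy
CONNECT BY PRIOR category_id = parent_category_id;
```

A query like this quickly exposes orphaned rows or cycles introduced by a faulty hierarchy load.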

Environment: Informatica 8.1, Oracle 10g, HP-UNIX, PL-SQL, Unix Shell Scripting, Autosys

Client: Network Appliance Inc. Santa Clara, CA Feb 2006 – Mar 2008

Role: Programmer Analyst

Network Appliance Information Management: Network Appliance Inc. (NetApp) is a world leader in unified storage solutions for today's data-intensive enterprises. The client had a vast data warehouse (in an Oracle database) and a number of downstream data marts storing history information for high-level analysis. The data warehouse and data marts were loaded on a daily basis from multiple data sources, and reports were published each day from the data warehouse. This was a maintenance project with occasional enhancement and development requirements.

Responsibilities

• Involved in creating business rules, data cleansing rules & source to target mapping documents.

• Developed Mappings that includes transformations like Source Qualifier, Aggregator, Expression, Lookup, Filter, and Joiner.

• Developed Mapplets, Reusable Transformations to populate the Data into Data Warehouse.

• Defined Target Load Order Plan and Constraint based loading for loading data correctly into different Target Tables.

• Designed mappings with mapping parameters and mapping variables for incremental loading.

• Designed test mappings to identify performance bottlenecks.

• Involved in improving the performance for mappings and sessions.

• Involved in ETL testing.

• Interacted with dependent source system business users and technical team to understand the overall data flow cycle and develop various data audit measures to perform sanity checks on the incoming transactional data through FTP process.

• Participated in identifying the job flow and creating Autosys Job scripts based on dependencies between these jobs.

Environment: Informatica 7.1, 8.1, Oracle 10g, Autosys, Cognos Impromptu


