
Sr. Informatica Developer

Location:
Miami, FL, 33172
Salary:
60k
Posted:
May 13, 2009


Resume:

PROFESSIONAL SUMMARY

• Over ** years of IT experience in Data warehousing and various technologies, with domain knowledge of state/government clients, Supply Chain Management, Banking and Finance, Insurance, Pharmaceutical & Health Care, Oil & Gas and Retail industries.

• Certification on Data Warehousing Concepts.

• Proficient in Data Modeling, Data Integration and Business Intelligence Tools.

• Eight years of experience developing Informatica mappings using Informatica Power Center 8.5.1, 7.1.4, 6.x and 5.x.

• Five years of proficiency in data modeling using Cognos Framework Manager and ReportNet, Cognos Transformer and Cognos Suite 8, with exposure to Cognos 8.3.

• Experience leading teams and serving as an onsite coordinator for onshore/offshore projects.

• Implemented and supported various Data warehouse projects using Ralph Kimball and Bill Inmon methodologies.

• Involved in dimensional modeling following the snowflake schema and star schema; created project and package metadata in Cognos, and created universes in Business Objects.

• Extensive experience in Extraction, Transformation and Loading (ETL) of data into data warehouses, including the creation and scheduling of workflows, worklets, mappings, mapplets and sessions using Informatica Power Center 8.5.1, 7.1.4, 6.x and 5.x.

• Varied experience in designing and developing complex mappings using transformations such as Source Qualifier, Joiner, Aggregator, Router, Filter, Expression, Lookup, Sequence Generator and Update Strategy.

• Extensive development, support and maintenance experience working in all phases of the Software Development Life Cycle (SDLC) especially in Data warehousing and Business Intelligence.

• Experience in debugging sessions and mappings, performance tuning of sessions and mappings, implementing complex business rules and optimizing mappings.

• Proficient in 3NF (3rd Normal Form) and Dimensional Modeling techniques.

• Experience in data modeling of star schema and snowflake schema using tools such as Erwin and Visio.

• Experience in the creation of Multidimensional OLAP Power Cubes using Cognos Transformer.

• Proficient in the OLAP Reporting tools like Cognos and Business Objects.

• Solid experience with RDBMSs such as DB2 UDB, SQL Server, Oracle and Teradata.

• Involved in the Meta Data Management of the Schema Designs.

• Proficient with UNIX Shell Scripting.

• Experience in working with Terabyte Size Source and Target Databases.

• Analyzed, tested and deployed migrations and tool upgrades; established and executed procedures for moving mappings to Production.

• Proficient in the Production Job Scheduling using BMC Control-M, CA-Unicenter Autosys Job Management and ASG Zena.

• Performed 24x7 on-call Production support on a rotational basis.

• Prepared and maintained documentation on all aspects of the technical processes used to design, code and support the client applications.

• Excellent oral, written and interpersonal communication skills; a quick learner, willing to adapt to the team and organizational environment.

TECHNICAL SKILLS

Data warehousing / ETL Tools: Informatica Power Center and Power Exchange 8.5.1, 8.1.1, 7.1.4, 6.x and 5.x

Business Intelligence & Reporting Tools: Cognos 8.3 & 8.2.4, Cognos Suite 8, ReportNet, Framework Manager, Transformer, Report Studio, Analysis Studio, Query Studio, Business Objects XI, Data Integrator.

Data Modeling Tools: Cognos 8.2 Modeling, CA Erwin 4.1 Data Modeler, MS Visio.

Databases / RDBMS: Oracle 10g/9i/8i, DB2 UDB 9.1, MS SQL Server 2008/2005, TOAD 9.0.1, Teradata V2R5, MySQL, T-SQL, PL/SQL.

Microsoft Suite: Visio, Excel, Word, PowerPoint.

ERP Tools: SAP R/3 4.6c, 4.0b and 4.7e.

Operating Systems: AIX UNIX, HP-UX, Windows 95 to XP and Vista, Windows NT.

Scripting Languages: UNIX shell scripting and Korn shell (ksh) scripting.

Ticketing Tools: PeopleSoft Vantive.

Job Control Tools: BMC Control-M, CA Unicenter AutoSys Job Management, ASG Zena.

Domain Knowledge: Banking and Finance, Insurance, Pharmaceutical and Health Care, Oil & Gas, Retail.

EDUCATION

• Master of Science in Software Engineering (MSSE)

• Post Graduate Diploma in Business Administration (PGDBA)

• B.S. (Computer Science and Engineering)

CERTIFICATIONS:

• Data Warehousing Concepts Certification.

PROJECTS PROFILE

Client: Miami-Dade County Public Schools, Miami, FL Jan 2009 to May 2009.

Role: Senior Data warehouse Consultant / BI Architect.

Environment: Informatica Power Center 8.5.1, Cognos 8.3, Microsoft SQL Server 2008, Framework Manager, Report Studio, Query Studio, Metric Studio, Microsoft Visio.

Project Description:

The project was based on the development of a Data Mart for the Food & Nutrition Department at Miami-Dade County Public Schools. As a Data warehouse consultant, studied the existing system and captured the data fed from various sources such as mainframe application systems, flat files and other web-based applications. Created the database tables, ETL mappings and Cognos reports. The ETL logic was implemented through Informatica mappings. A Framework Manager model was built over the fact tables, and packages were published to pull this data for analysis and reporting purposes.

Responsibilities:

• As a Data warehouse Consultant, created the Data Mart for the Food and Nutrition Department for the client.

• Was involved in the analysis, design and implementation of the Data warehouse project, and assisted with the OLAP tool used for report generation.

• Gathered requirements from the business users for upgrading the existing schema model and for further enhancing the application with the new logic.

• Involved in the creation of the Data Model using Cognos Framework Manager.

• Created drill down and drill through reports and prompt pages.

• Generated the required Meta data at the source side, in the Informatica Mappings and in the Cognos Framework Manager.

• Created and maintained the required tables in the staging area and target database.

• Used stored procedures and materialized views, invoked from Informatica mappings, to execute certain steps.

• Developed and implemented the ETL mappings using various Informatica Transformations like Source qualifiers, Joiners, Lookups, Expressions, Update Strategy, Router Transformation, Sequence Generators, Aggregator, Stored procedures and Mapplets.

• Maintained SCD Type 1 and Type 2 dimensions using Informatica ETL mappings (an illustrative SQL sketch follows this list).

• Successfully created workflows, worklets and sessions and moved them into the Production environment.

• Tested the model in the Development and SIT environments and successfully migrated the code into the Production environment after successful UAT at the client location.

• Maintained facts and Dimensions using Star Schema and Snowflake Schema.

• Tuned the existing queries and tweaked the existing ETL code as part of performance tuning.

• Created the Data models and projects and published the required packages using Framework Manager.

• Created and maintained all documentation for the applications developed at the client's location.

• Scheduled the jobs using the Informatica scheduler.

• Created and maintained the Data model and its related architecture and diagrams using Microsoft Visio.
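
Illustrative sketch only (not the project's actual code): the SCD Type 2 handling noted above was built in Informatica with Lookup, Expression and Update Strategy transformations; the set-based SQL below shows the general pattern such a mapping implements. All table and column names (dim_customer, stg_customer, customer_address) are hypothetical, and the dimension's surrogate key (normally populated by a Sequence Generator or IDENTITY column) is omitted for brevity.

-- Hypothetical SCD Type 2 pattern.
-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE dim_customer
SET    current_flag = 'N',
       effective_end_date = CURRENT_TIMESTAMP
WHERE  current_flag = 'Y'
  AND  EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = dim_customer.customer_id
                 AND  s.customer_address <> dim_customer.customer_address);

-- Step 2: insert a new "current" version for new and changed customers.
INSERT INTO dim_customer
       (customer_id, customer_name, customer_address,
        effective_start_date, effective_end_date, current_flag)
SELECT s.customer_id, s.customer_name, s.customer_address,
       CURRENT_TIMESTAMP, NULL, 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON  d.customer_id  = s.customer_id
       AND d.current_flag = 'Y'
WHERE  d.customer_id IS NULL                      -- brand-new customer
   OR  d.customer_address <> s.customer_address;  -- changed attribute

A Type 1 change, by contrast, simply overwrites the attribute in place with an UPDATE and creates no new row.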

Client: Abbott Laboratories, Waukegan, IL July 2008 to Dec 2008.

Role: Senior Data warehouse Consultant / BI Architect.

Environment: Informatica Power Center 8.5.1 & 7.1.4, Cognos 8.3 & 8.2.4, Framework Manager, Transformer, ReportNet 1.1, Report Studio, Analysis Studio, Query Studio, Microsoft Visio, AIX UNIX, Oracle 10g, TOAD 9.0.1, MKS Integrity Client 2007, Lotus Notes 6.5, BMC Control-M.

Project Description:

The project was based on the Global Inventory Supply Chain Management system. As a Data warehouse consultant, studied the existing system and upgraded it with the new logic required by the client. This involved changing the source code of the tables, the ETL mappings and the Cognos reports. The data feed came from various source systems such as SAP and web applications, which were loaded into an Oracle database. The ETL logic was implemented through Informatica mappings. Cubes were built over the fact tables, and packages were published to pull this data for analysis and reporting purposes.

Responsibilities:

• As a Data warehouse Consultant, was involved in the enhancement of the existing model and the implementation of the new model for the Global Inventory Supply Chain system.

• Was involved in the analysis, design and implementation of the Data warehouse project, and assisted with the OLAP tool used for report generation.

• Gathered requirements from the business users for upgrading the existing schema model and for further enhancing the application with the new logic.

• Involved in updating the Data Model using Cognos Framework Manager.

• Created the multidimensional OLAP Power cubes using Cognos Transformer.

• Created drill down and drill through reports using Analysis Studio.

• Generated the required Meta data at the source side, in the Informatica Mappings and in the Cognos Framework Manager.

• Generated and developed the DDL scripts using TOAD in Oracle Database.

• Modified the existing DDL scripts that were in Oracle 10g Database.

• Used stored procedures and materialized views, invoked from Informatica mappings, to execute certain steps (an illustrative Oracle sketch follows this list).

• Developed and implemented the ETL mappings using various Informatica Transformations like Source qualifiers, Joiners, Lookups, Expressions, Update Strategy, Stored procedures and Mapplets.

• Successfully created workflows, worklets and sessions and moved them into the Production environment.

• Tested the model in the Development and SIT environments and successfully migrated the code into the Production environment after successful UAT at the client location.

• Maintained facts and Dimensions using Star Schema and Snowflake Schema.

• Tuned the existing queries and tweaked the existing ETL code as part of performance tuning.

• Created the Data models and projects and published the required packages using Framework Manager.

• Coordinated with the reporting team, developed and enhanced structured reports in Cognos using Report Studio, and generated ad hoc reports using Query Studio.

• Maintained the code using the MKS Integrity 2007 client, which was also used to maintain all of the client's documents used in the Software Development Life Cycle (SDLC).

• Scheduled the jobs using BMC Control-M Enterprise Manager.

• Created and maintained the Data model and its related architecture and diagrams using Microsoft Visio.
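
Illustrative sketch only (not the client's actual objects): the stored procedures and materialized views mentioned above typically pair an aggregate materialized view with a refresh procedure that a mapping's Stored Procedure transformation can call after the load. Object names (mv_inventory_summary, fact_inventory, refresh_inventory_summary) are hypothetical; DBMS_MVIEW.REFRESH is Oracle's standard refresh API.

-- Hypothetical Oracle 10g example.
CREATE MATERIALIZED VIEW mv_inventory_summary
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT plant_code,
       material_number,
       SUM(quantity_on_hand) AS total_on_hand
FROM   fact_inventory
GROUP BY plant_code, material_number;

-- Procedure a mapping's Stored Procedure transformation could invoke.
CREATE OR REPLACE PROCEDURE refresh_inventory_summary AS
BEGIN
  DBMS_MVIEW.REFRESH('MV_INVENTORY_SUMMARY', 'C');  -- 'C' requests a complete refresh
END refresh_inventory_summary;
/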

Client: First Citizens Bank, Raleigh, NC Feb 2008 to June 2008.

Role: Senior Data warehouse Consultant – Data Architect.

Environment: Cognos 8.0, Cognos Suite 8, Framework Manager, ReportNet, Report Studio, CA Erwin 4.1 Data Modeler, AIX UNIX, DB2 UDB 9.1, Oracle 10g, Quest, WinSCP, PuTTY.

Project Description:

The project was based on a Credit Card Conversion. As a Data warehouse consultant, was involved in two modules: studying the existing system and creating a new data model to be developed for Credit Card Authorizations. Was involved in the logical and physical data modeling for the existing model and for the new model; Framework Manager packages were enhanced and created in the process. The data feed came from various source systems such as mainframes and DB2 records, which were loaded into DB2 UDB and Oracle databases. The Data warehouse team extracted the data, implemented the ETL logic and loaded it into the target databases. The Cognos reporting team then pulled this data to generate reports for analysis.

Responsibilities:

• As a Data warehouse Consultant, was involved in the enhancement of the existing model and the creation of the new model for the Credit Card Conversion and Authorization Log.

• Was involved in the analysis, design and implementation of the Data warehouse project, and assisted with the OLAP tool used for report generation.

• Analyzed the existing model and made changes as per the Change Requests from the client.

• Gathered requirements from the business users for upgrading the existing schema model and for creating the new data model.

• Created the new model in 3NF (3rd Normal Form) and designed it as per the client's requirements using Erwin.

• Was involved in star-schema data modeling of the physical and logical models using tools like CA Erwin Data Modeler and MS Visio.

• Involved in the creation of the Data Model using Cognos Framework Manager.

• Generated the reports as per the Clients specifications using Report Studio and Query Studio.

• Generated the required Meta data at the Back end and in the Frame work Manager.

• Generated and developed the DDL scripts using Quest in DB2 UDB (an illustrative DDL sketch follows this list).

• Modified the existing DDL scripts that were in Oracle 10g Database.

• Used stored procedures and triggers to execute certain steps using PL/SQL.

• Developed and executed the Database Load scripts using Korn Shell scripts in AIX UNIX environment.

• Gathered information from the clients and Business users during the development of the Data Model and design.

• Tested the model in Development and QA environment and successfully moved into the Production environment.

• Maintained facts and Dimensions using Star Schema and Snowflake Schema.

• Coordinated with the reporting team to develop the OLAP reports using Cognos.
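
Illustrative sketch only: DDL of the general kind generated through Quest for the authorization model in DB2 UDB 9.1. Table, column and index names are hypothetical and not taken from the client's schema.

-- Hypothetical 3NF tables for an authorization log.
CREATE TABLE card_account (
    account_id        BIGINT        NOT NULL,
    card_number_hash  CHAR(64)      NOT NULL,
    open_date         DATE          NOT NULL,
    status_code       CHAR(2)       NOT NULL,
    PRIMARY KEY (account_id)
);

CREATE TABLE authorization_log (
    auth_id           BIGINT        NOT NULL,
    account_id        BIGINT        NOT NULL,
    auth_timestamp    TIMESTAMP     NOT NULL,
    merchant_id       VARCHAR(20)   NOT NULL,
    auth_amount       DECIMAL(12,2) NOT NULL,
    response_code     CHAR(2)       NOT NULL,
    PRIMARY KEY (auth_id),
    FOREIGN KEY (account_id) REFERENCES card_account (account_id)
);

-- Index supporting the most common reporting access path.
CREATE INDEX ix_auth_acct_date
    ON authorization_log (account_id, auth_timestamp);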

Client: Selective Insurance, Branchville, NJ Feb 2006 to Jan 2008

Role: Senior Data warehouse Developer / Team Lead.

Environment: Informatica Power Center 8.1 and 7.1, MS SQL Server 2005, Erwin, DB2 UDB 9.0, HP-UX, Cognos Suite 8.0, PowerPlay, Transformer and ReportNet, BMC Control-M.

Project Description:

The project comprised three data marts. The team was based on an onsite/offshore model; as the onsite team lead and senior developer, was involved in the logical and physical data modeling along with metadata management. The work also involved new development of mappings and enhancement of the current mappings as per the client's requirements. The data feed came from various source systems such as mainframes, DB2 and XML files, which were loaded into SQL Server and Teradata. The Data warehouse team extracted the data, implemented the ETL logic and loaded it into the data marts. The Cognos reporting team then pulled this data to generate reports for analysis.

Responsibilities:

• Implemented and Supported the Data warehouse project following the Data warehousing methodologies.

• Analyzed the existing ETL process (SQL, PL/SQL), created the ETL design and coded it using the Informatica Designer.

• Was involved in the analysis, design and implementation of the Data warehouse project, and assisted with the OLAP tool used for report generation.

• Managed a team of around 5 people in the module as the onsite module lead.

• Analyzed the existing mappings and made changes as per the Change Requests from the client.

• Gathered requirements from the business users for regular updates to the schema model and the corresponding changes to the mappings.

• Was involved in star-schema and snowflake-schema data modeling of the physical and logical models using tools like Erwin and MS Visio.

• Involved in the Meta Data Management in the Schema designs.

• Extensively used the Extraction, Transformation and Loading (ETL) process to extract the data from various sources and load into the target for reporting purposes.

• Developed new mappings to improve the performance of the existing process as part of performance tuning.

• Involved in the creation and maintenance of Data Marts using Informatica Power Exchange.

• Involved in the creation of multidimensional cubes using Cognos Transformer.

• Generated drill down and drill through reports using Analysis Studio.

• Implemented special data loading techniques using Informatica External loader and Teradata tools, providing efficient load times.

• Used transformations like Source Qualifiers, Lookups, Joiners, Update Strategy, Sequence Generators, Routers, Aggregators, Mapplets, and Connected and Unconnected Lookups.

• Successfully created and implemented workflows, worklets and sessions to run the ETL code and migrated to Production environment.

• Worked on all types of slowly changing dimensions using Update strategy.

• Design and development of ETL Mapping Documents.

• Was also involved in the upgrade from Informatica 7.1 to 8.1 across the project, from the source level down to the session level.

• Implemented Performance tuning of the existing ETL code.

• Modified the existing shell scripts and wrote new script in the UNIX environment.

• Generated and developed the DDL scripts in DB2 UDB and used data as sources tables and target tables.

• Tested the data in the backend databases using indexes, triggers, stored procedures, views and partitions with PL/SQL scripts (an illustrative reconciliation query follows this list).

• Scheduled the jobs using the BMC Control-M job scheduler.

• Maintenance and Production support of the Informatica sessions and workflows.

• Maintained facts and Dimensions using Star Schema and Snowflake Schema.

• Coordinated with the reporting team to develop the OLAP reports in Cognos.

• Coordinated with all technical resources to deliver quality solutions in a timely manner, in line with overall project objectives.
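
Illustrative sketch only: a typical backend validation of the kind referenced above compares row counts and key totals between a staging source and the loaded target; the two result rows should agree. Table and column names (stg_policy_premium, fact_policy_premium, premium_amount, load_date) are hypothetical, and CURRENT_DATE is the ANSI/DB2-style current-date function.

-- Hypothetical ETL reconciliation check.
SELECT 'stg_policy_premium'  AS table_name,
       COUNT(*)              AS row_count,
       SUM(premium_amount)   AS total_premium
FROM   stg_policy_premium
WHERE  load_date = CURRENT_DATE
UNION ALL
SELECT 'fact_policy_premium' AS table_name,
       COUNT(*)              AS row_count,
       SUM(premium_amount)   AS total_premium
FROM   fact_policy_premium
WHERE  load_date = CURRENT_DATE;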

Client: GE Energy Power systems, Atlanta, GA August 2004 to Jan 2006.

Role: Senior Data warehouse Developer / Team Lead

Environment: Informatica 7.x and 6.x, Cognos, Oracle 9i/10g, Teradata, UNIX and Win XP.

Project Description:

The GEPS project comprised almost 80 data marts. The team was based on an onsite/offshore model; as the onsite team lead and senior developer, the responsibility was to gather the business requirements from the users and develop the Informatica mappings. Was involved in the data modeling of the schemas present in the ETL design. Cognos was the reporting tool used for analysis and reporting.

Responsibilities:

• Analyzed the existing ETL process and developed the mappings as requested by the client.

• Worked on the analysis, design and implementation of the business requirements using Informatica mappings.

• Involved in the Data Model schema designs using the Data Model Tools like Visio.

• Created cubes and package designs following relational and dimensional modeling.

• Maintained the Data of the Data Marts using Informatica Power Center.

• Assisted in the Meta Data management of the schema designs.

• Created the mapping documents from Business requirement documents along with the Team members.

• Led a team of around 3 people as the onshore team lead.

• Developed the mappings using various transformations like Aggregators, Sequence Generators, Joiners, Source Qualifiers, Lookups, Update Strategy, and Connected and Unconnected Lookups.

• Worked on various types of slowly changing dimensions using Update Strategy.

• Used Teradata utilities like BTEQ, FastLoad and MultiLoad for loading purposes (an illustrative BTEQ sketch follows this list).

• Maintained the Data Model in Star Schema using Microsoft Visio.

• Documented the ETL mappings, sessions and workflows, and scheduled them.

• Ran the mappings using Korn shell scripts in the UNIX environment.

• Was involved in debugging the mappings and Performance tuning.

• Worked with the reporting team and created OLAP reports using Cognos.

• Created and published the Cognos reports on Web.
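
Illustrative sketch only: a minimal BTEQ script of the kind used alongside the Teradata utilities mentioned above, wrapping standard Teradata SQL in BTEQ session-control statements (the lines beginning with a dot). The logon, database and table names (tdprod, sales_mart, staging_db) are hypothetical.

.LOGON tdprod/etl_user,password;

/* Move the day's staged rows into the target mart table. */
INSERT INTO sales_mart.fact_shipments
SELECT s.shipment_id,
       s.order_id,
       s.ship_date,
       s.quantity
FROM   staging_db.stg_shipments s;

.IF ERRORCODE <> 0 THEN .QUIT 8;

/* Refresh optimizer statistics on the load-date column. */
COLLECT STATISTICS ON sales_mart.fact_shipments COLUMN (ship_date);

.LOGOFF;
.QUIT 0;

Bulk inserts into empty staging tables would typically go through FastLoad, and upserts into populated tables through MultiLoad, with a BTEQ step of this kind handling the final INSERT/SELECT and statistics collection.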

Client: GSK - Glaxo Smith Kline, Philadelphia, PA Feb 2003 to July 2004.

Role: ETL Developer

Environment: Informatica 7.x and 6.x, Business Objects 6.5, Cognos, Oracle 8.1, UNIX, Shell Scripting, Teradata.

Project Description:

The project involved maintenance and enhancement support of the customized ETL application developed for the customers. It was an onsite/offshore project. The feed files were flat text files, which were processed through Informatica mappings with the required transformations and loaded for reporting in OLAP tools like Business Objects and Cognos.

Responsibilities:

• Analysis of the specifications provided by the clients.

• Designed the ETL mappings as per the users' requirements.

• Used various transformations like Source Qualifier, Expression, Filter, Aggregator, Lookup and SQL stored procedures.

• Was involved in Unix Shell scripting.

• Created transformations, sessions and workflows.

• Used Teradata as the Database and loaded the data for Analysis.

• Scheduling of the Informatica workflows and sessions.

• Maintained the Data Model using Snowflake schema.

• Involved in Debugging and Performance Tuning of mappings and sessions.

• Involved with the reporting team and created OLAP reports using Business Objects and Cognos.

Client: British Petroleum (BP), Naperville, IL Aug 2001 to Jan 2003.

Role: ETL Developer

Environment: Informatica 6.x and 5.x, Business Objects 6.5, Oracle 8.1 and Win NT.

Project Description:

The project was basically the development of Informatica mappings based on the mapping documents provided by the Business Analysts.

Responsibilities:

• Development of the Informatica mappings in Power Center 5.x.

• Creation of the Workflows and the Sessions.

• Involved in running some Unix Shell scripts.

• Was involved in debugging the sessions and partially involved in performance tuning.

• Performed the Unit Testing of the mappings that were developed.

• Analysis of the specifications provided by the clients.

• Documentation of the entire work done and overseeing the quality procedures.

Client: British Petroleum (BP), Naperville, IL August 2000 to July 2001.

Role: Developer

Environment: Mainframes, Testing Tools, Business Objects 5.x, Oracle 7.x, Win NT, Unix (HP-UX), SAP R/3.

Project Description:

It was an application support and maintenance project. The project comprised various applications, from mainframes and SAP to data warehousing, running on environments ranging from UNIX to Sun Solaris. Business users raised Vantive tickets, and well-managed teams responded to and resolved the issues faced by the end users.

Responsibilities:

• System management, especially as an SAP Basis Administrator.

• User administration in the Mainframe environment.

• Checked the reports that were generated using the Business Objects tool.

• Analysis of the specifications provided by the clients.

• Development of Reports requested by the users.

• Documentation of the Maintenance support work done.

Client: Matrix Laboratories, Hyderabad, India December 1998 to June 2000.

Role: Developer

Environment: SAP R/3, ABAP, HP UNIX, Windows NT and Oracle.

Project Description:

Matrix Laboratories Limited is a pharmaceutical company involved in bulk drug manufacturing and formulations. The client had a multinational presence and mainly used a supply chain model to distribute its products. The project was a life-cycle implementation of SAP R/3 4.6c for the client. The company earlier ran the older SAP R/3 4.0B version and later upgraded to SAP 7.1.1.

Responsibilities:

• Designed the solution delivery documents after gathering the requirements from Client.

• Analysis of the specifications provided by the client.

• Design and Development of the Module as per the client’s specifications.

• Testing: unit testing, integration testing and user acceptance testing.

• Worked in the Materials Management (MM) module of SAP 4.6c, heading a team of 3 people, and took primary responsibility for the module.


