
Informatica Developer

Location:
Dublin, OH
Salary:
$45/hr
Posted:
October 03, 2017


Resume:

Professional Summary:

*+ years of work experience in the IT industry, specializing in data warehousing using Informatica PowerCenter 8.6 and 9.6.1.

Extensive experience across all phases of the data warehouse project life cycle, from requirements gathering through design and development, quality assurance, and production and maintenance.

Worked extensively on data warehouses and data marts using Informatica PowerCenter as the ETL tool, with Oracle and Teradata as backend databases.

Extensively worked on data extraction, transformation, and loading with RDBMS sources and flat files; proficient with Linux/UNIX environments, file manipulation, and shell scripting.

Expertise in the entire testing process: integration, functional, regression, system, load, UAT, black-box, and GUI testing.

Experienced in developing traceability matrices between requirements and test cases, and in creating test strategies, developing and executing test plans, test cases, test scenarios, and test results.

Expertise in effort estimation, resource planning, test planning, test execution, and defect monitoring.

Strong data warehousing ETL experience with Informatica PowerCenter 9.1/8.6.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).

Hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, TPT); a minimal FastLoad sketch follows.
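
For illustration, a minimal FastLoad run driven from a shell script; the staging table, input file, and logon details below are hypothetical placeholders, not values from an actual project:

#!/bin/sh
# Minimal FastLoad sketch: bulk-load a pipe-delimited flat file into an
# empty Teradata staging table. All names and credentials are placeholders.
fastload <<'EOF'
.LOGON tdpid/etl_user,etl_password;
SET RECORD VARTEXT "|";
DEFINE order_id   (VARCHAR(18)),
       line_qty   (VARCHAR(18)),
       sell_price (VARCHAR(18))
FILE = /data/in/orders.dat;
BEGIN LOADING stage_db.orders_stg
    ERRORFILES stage_db.orders_err1, stage_db.orders_err2;
INSERT INTO stage_db.orders_stg (order_id, line_qty, sell_price)
VALUES (:order_id, :line_qty, :sell_price);
END LOADING;
.LOGOFF;
EOF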

Experience resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning mappings and sessions.

Experience writing UNIX shell scripts to validate data and to execute backend scripts that run ETL jobs.

Proficient in backend testing, writing SQL queries to check data consistency; a minimal sketch follows.
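
A minimal sketch of this kind of backend consistency check, run through BTEQ from the shell; the database, table, and column names are hypothetical:

#!/bin/sh
# Compare row counts between a source staging table and its warehouse
# target, and list any business keys missing from the target.
bteq <<'EOF'
.LOGON tdpid/qa_user,qa_password;

-- Row-count consistency between source and target
SELECT 'SRC' AS side, COUNT(*) AS cnt FROM stage_db.orders_stg
UNION ALL
SELECT 'TGT', COUNT(*) FROM edw_db.orders;

-- Keys present in the source but missing in the target
SELECT s.order_id
FROM   stage_db.orders_stg s
LEFT JOIN edw_db.orders t ON s.order_id = t.order_id
WHERE  t.order_id IS NULL;

.LOGOFF;
.QUIT;
EOF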

Strong domain knowledge of business functions including order management, bookings, manufacturing, revenue, and marketing.

Expertise in performance tuning of Teradata SQL and Informatica mappings; a short sketch of the Teradata side follows.
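
As a concrete sketch of the Teradata side of such tuning, refreshing optimizer statistics and reviewing the EXPLAIN plan; the object names are illustrative:

#!/bin/sh
# Typical first steps when tuning a slow Teradata query: refresh optimizer
# statistics on the join/filter columns, then review the EXPLAIN plan.
bteq <<'EOF'
.LOGON tdpid/etl_user,etl_password;

COLLECT STATISTICS ON edw_db.orders COLUMN (order_id);
COLLECT STATISTICS ON edw_db.orders COLUMN (order_date);

EXPLAIN
SELECT o.order_id, o.net_usd_amount
FROM   edw_db.orders o
WHERE  o.order_date >= DATE '2013-01-01';

.LOGOFF;
.QUIT;
EOF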

Extensive experience with SDLC methodologies including Waterfall and Agile, and readily adaptable to other customized methodologies.

Education:

Bachelor of Technology (B.Tech)

Technical Skills:

Scripting: UNIX shell scripting

ETL Tool: Informatica

Databases: Teradata 14 (SQL Assistant), Oracle R12

Tools/Utilities: SSH Client, Kintana 4.6, U-Deploy 6.2.1, HP Quality Center 12, Dollar Universe 5.3, PVCS

Operating Systems: Linux 6.6, Windows 7

ETL/Database Tools: Teradata SQL Assistant 14, Informatica PowerCenter 9.6.1 (Designer, Workflow Manager, Workflow Monitor), Toad for Oracle 12.1

Domain: Data Warehousing

Professional Experience:

Client: IDEXX Laboratories, Inc., Westbrook, ME July 2016 - August 2017

Project: Salesforce.com (SFDC)

Role: ETL Tester

Project Description:

IDEXX Laboratories, Inc. develops, manufactures, and distributes products and services primarily for the companion animal veterinary, livestock and poultry, water testing, and dairy markets worldwide. It also offers diagnostic, health-monitoring, and food safety testing products for livestock and poultry.

An in-house system called BEACON is currently used as the CRM and is being replaced with Salesforce; the project rebuilds the existing functionality in Salesforce, adds the additional features needed, and migrates the required data sets.

Responsibilities:

• Created the test strategy, test plan, and test cases from the business specification documents.

• Developed the QA test plan from the technical specifications and requirements for this project.

• Reviewed the design, ETL mapping, and SDR documents provided by the development team, and prepared an understanding document summarizing the release scope for the manufacturing tracks.

• Wrote test cases for the scenarios under test, such as count checks, attribute checks, performance, functional, and business-logic checks.

• Uploaded the reviewed test cases to Quality Center and executed them against weekly goals and baselines throughout the test cycle.

• Carried out deployment activities for both the history and incremental code in multiple test instances.

• Performed sanity checks on the code migrated during deployment, then loaded the history and incremental data.

• Ran UNIX shell scripts to execute the ETL jobs and load data as and when required; a minimal wrapper sketch follows.
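
A minimal sketch of such a wrapper, using Informatica's pmcmd utility; the service, domain, folder, and workflow names are placeholders:

#!/bin/sh
# Start an Informatica workflow from the command line and fail the script
# if the workflow does not complete successfully.
pmcmd startworkflow -sv INT_SVC_DEV -d Domain_Dev \
      -u etl_user -p etl_password \
      -f SFDC_MIGRATION -wait wf_load_accounts
if [ $? -ne 0 ]; then
    echo "wf_load_accounts failed" >&2
    exit 1
fi

The -wait flag makes pmcmd block until the workflow finishes, so the shell exit status can gate any downstream load steps.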

Environment: Informatica 9.5.1, Quality Center 11

Client: Cisco Systems Inc, San Jose, CA Feb 2013- Jan 2014

Project: Order management LSS

Role: ETL Developer

Project Description:

Large-scale services are major services that every company needs in order to operate; we are building these fundamental services for our partners, customers, and ourselves. Cisco Direct Booking transactions are Cisco-booked sales order transactions together with their sales credits; net changes to sales credits, along with specific sales order attributes such as line ordered quantity and selling price, are captured using Oracle triggers. Net Change Reporting (NCR) transactions are captured in ERP and flow into NRT Bookings for OTS reporting. Near Real Time (NRT) for Direct Bookings is built on top of NCR Bookings by joining reference data to derive key attributes such as the corporate bookings flag, net USD amount, and base price, along with the Net Change Reporting attributes. NCR transactions flow to Teradata through GoldenGate via audit replication. Objective: availability of near-real-time, consistent, relevant, and accurate data in the data warehouse from ERP systems, ready for business analysis and reporting.

Responsibilities:

Migrated source data to the target Teradata system using Oracle GoldenGate.

Carried out deployment activities for both the history and incremental code in test instances.

Designed ETL in Informatica from the mapping and SDR documents provided by the development team, and prepared an understanding document summarizing the release scope for NCR NRT Bookings.

Wrote test cases for the scenarios under test, such as count checks, attribute checks, performance, functional, and business-logic checks.

Uploaded the reviewed test cases to Quality Center and executed them against weekly goals and baselines throughout the test cycle.

Performed sanity checks on the code migrated during deployment, then loaded the history and incremental data.

Loaded data into the EDW from different sources (Oracle and Teradata).

Wrote Teradata and Oracle queries for data analysis and reconciliation to meet business requirements; a minimal reconciliation sketch follows.
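
A minimal sketch of the reconciliation style meant here, comparing a replicated target table against a source snapshot; all object names are hypothetical:

#!/bin/sh
# Reconcile a replicated warehouse target against a source snapshot:
# emit rows present in the snapshot but missing or changed in the target.
bteq <<'EOF'
.LOGON tdpid/etl_user,etl_password;

SELECT order_id, line_qty, sell_price
FROM   stg_db.bookings_snapshot
MINUS
SELECT order_id, line_qty, sell_price
FROM   edw_db.bookings_3nf;

.LOGOFF;
.QUIT;
EOF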

Analyzed the loaded data and validated it against the given BRD.

Performed data analysis and presented results using SQL and basic UNIX scripts; validated the history data through SQL Assistant queries for all the new tables introduced in the bookings 3NFs as per the flow.

Ran UNIX shell scripts to execute the ETL jobs and load data as and when required.

Checked for failed Replicat errors using UNIX shell scripts, as sketched below.
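
A minimal sketch of such a check, assuming a standard GoldenGate installation directory; the path and log conventions are assumptions:

#!/bin/sh
# Check GoldenGate Replicat health: report process status via ggsci and
# surface recent errors from the GoldenGate error log.
GG_HOME=/opt/goldengate
cd "$GG_HOME" || exit 1

# Status of all Extract/Replicat processes
./ggsci <<'EOF'
INFO ALL
EXIT
EOF

# Most recent errors in the GoldenGate error log
grep "ERROR" "$GG_HOME/ggserr.log" | tail -20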

Executed SQL queries through BTEQ from the UNIX server; a sketch with basic error handling follows.
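
A minimal sketch of running SQL through BTEQ from the UNIX server with basic return-code handling; the logon and query are illustrative:

#!/bin/sh
# Run a query through BTEQ and abort with a non-zero exit code if any
# SQL statement fails (.IF ERRORCODE is BTEQ's error-handling construct).
bteq <<'EOF'
.LOGON tdpid/qa_user,qa_password;
SELECT COUNT(*) FROM edw_db.bookings_3nf;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
echo "bteq exit status: $?"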

Validated the incremental data for all the new tables introduced in the bookings 3NFs as per the flow.

Compared and validated the old code and flow against the new ones.

Executed the UNIX shell scripts that invoked ETL jobs to load data into tables.

Provided IT validation sign-off for both the history and incremental data from the QA end.

Supported the Finance BI team in analyzing data reconciliation issues.

Provided data to business users for application-level testing, ensuring data accuracy and consistency.

Environment: Teradata SQL Assistant 14, Toad for Oracle 12.1, UNIX 6.6, HP Quality Center 12, TES/Tidal 14

Client: Cisco Systems Inc, San Jose, CA Dec 2011-Jan 2013

Project: Change Requests and Quick win Projects

Role: ETL Developer

Project Description:

Change Requests (CRs) are raised whenever there is a production issue or a critical requirement that has to be implemented immediately in the production environment; the scope for a CR is usually a week.

Quick Wins are minor enhancements, mostly DDL changes, that are separated from CRs so they can have an extended SLA.

Responsibilities:

Analyzed the specifications provided by the business for the critical requirements.

Developed new mappings, customized existing ones, and brought data into the data warehouse.

Extensively used flat files as source tables and loaded the data into database tables.

Created mapping documents with all the necessary fields, which helped develop and customize the mappings more efficiently.

Used several transformations in Power Center Designer to build a mapping.

Extensively used mapping variables, mapping parameters, and parameter files in ETL mappings; a sample parameter file is sketched below.
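
For illustration, a minimal PowerCenter parameter file of the kind meant here; the folder, workflow, session, and variable names are hypothetical:

#!/bin/sh
# Write a PowerCenter parameter file: the [folder.WF:workflow.ST:session]
# heading scopes the values to one session; $$ names are mapping
# parameters/variables, $ names are session-level connection overrides.
cat > /infa/params/wf_cr_fix.param <<'EOF'
[CR_FIXES.WF:wf_cr_fix.ST:s_m_cr_fix]
$$LOAD_DATE=2012-06-01
$$SOURCE_SYSTEM=ERP
$DBConnection_SRC=Ora_ERP_Src
$InputFile_1=/data/in/cr_fix.dat
EOF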

Involved in code deployments while pushing from one environment to the other using Repository Manager.

Extensively used connected and unconnected Lookup, Update Strategy, and Router transformations in mappings to bring new fields into the target tables.

Used various built-in functions (date, character, and special functions) to calculate new fields in mappings with the PowerCenter Designer tool.

Modified existing mapplets, added new Lookup and Expression transformations, and reused those mapplets in mappings.

Responsible for monitoring data during the daily and weekly loads and fixing issues related to load failures.

Responsible for Optimization of Informatica mappings, sessions, SQL queries.

Wrote and fixed various Oracle stored procedures; a minimal sketch follows.
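
A minimal sketch of this kind of stored-procedure work, compiled through SQL*Plus; the schema, table, and logic are hypothetical:

#!/bin/sh
# Compile a small PL/SQL procedure that closes an order; SQL*Plus exits
# with a failure code if the DDL does not compile (WHENEVER SQLERROR).
sqlplus -s etl_user/etl_password@ORADEV <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
CREATE OR REPLACE PROCEDURE close_order (p_order_id IN NUMBER) AS
BEGIN
    UPDATE orders
    SET    status = 'CLOSED'
    WHERE  order_id = p_order_id;
    COMMIT;
END close_order;
/
EXIT
EOF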

Environment: Informatica Power Center 8.6, HP Quality Center-10, PVCS, Kintana, Dollar Universe, Citrix

Client: Cisco Systems Inc, San Jose, CA Sep 2009 - Nov 2011

Project: CSSO

Role: ETL Developer/QA Lead

Project Description:

The case management process begins when a request is sent via email, submitted via the Cisco online Case Management Tool (CMT), initiated via other Cisco internal tools such as IC Engine, or received by phone. Typically, for requests initiated via email, CMT, or IC Engine, the service request is automatically created and assigned to a group, from which a CSR will access and manually pick it up. If the CSR identifies that the request has been mis-assigned, or realizes that resolution requires a completely different skill set, the CSR may reassign the service request to the appropriate group. Upon resolution, the CSR delivers the response and updates the service request.

Responsibilities:

Developed new mappings, customized existing ones, and brought data into the data warehouse.

Designed ETL in Informatica from the mapping and SDR documents provided by the development team, and prepared an understanding document summarizing the release scope for EDW-CQA Configuration.

Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.

Migrated source data to the target Teradata system using Oracle GoldenGate.

Carried out deployment activities for both the history and incremental code in development and test instances.

Wrote test cases for the scenarios under test, such as count checks, attribute checks, performance, functional, and business-logic checks.

Created mapping documents to outline the data flow from sources to targets.

Performed sanity checks on the code migrated during deployment, then loaded the history and incremental data.

Loaded data into the EDW from different sources (Oracle and Teradata).

Wrote Teradata and Oracle queries for data analysis and reconciliation to meet business requirements.

Analyzed the loaded data and validated it against the given BRD.

Performed data analysis and presented results using SQL and basic UNIX scripts; validated the history data through SQL Assistant queries for all the new tables introduced in the bookings 3NFs as per the flow.

Ran UNIX shell scripts to execute the ETL jobs and load data as and when required.

Checked for failed Replicat errors using UNIX shell scripts.

Executed SQL queries through BTEQ from the UNIX server.

Validated the incremental data for all the new tables introduced in the bookings 3NFs as per the flow.

Compared and validated the old code and flow against the new ones.

Executed the UNIX shell scripts that invoked ETL jobs to load data into tables.

Provided data to the testing team, ensuring data accuracy and consistency.

Environment: Informatica Power Center 8.6, BTEQ, HP Quality Center-10, PVCS, Kintana, Dollar Universe, Citrix

Client: Cisco Systems Inc, San Jose, CA Dec 2007-Aug 2009

Project: EDW-CQA Configuration

Role: QA Team Member

Project Description:

The client for the project is Cisco Systems Inc., one of the global market leaders in the manufacture of network equipment. Cisco maintains its own data warehouse. The project is Enterprise Data Warehouse – Cisco Quality Assurance (EDW-CQA) Support, a pre-production phase.

The Cisco Enterprise Data Warehouse testing phase also involves the creation and maintenance of multiple test environments, which are owned, created, and maintained by the configuration team. The code, ETLs, and jobs to be tested are deployed into these environments and run for testing purposes.

Responsibilities:

Designed ETL in Informatica from the mapping and SDR documents provided by the development team, and prepared an understanding document summarizing the release scope for EDW-CQA Configuration.

Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.

Migrated source data to the target Teradata system using Oracle GoldenGate.

Carried out deployment activities for both the history and incremental code in development and test instances.

Wrote test cases for the scenarios under test, such as count checks, attribute checks, performance, functional, and business-logic checks.

Created mapping documents to outline the data flow from sources to targets.

Performed sanity checks on the code migrated during deployment, then loaded the history and incremental data.

Loaded data into the EDW from different sources (Oracle and Teradata).

Wrote Teradata and Oracle queries for data analysis and reconciliation to meet business requirements.

Analyzed the loaded data and validated it against the given BRD.

Performed data analysis and presented results using SQL and basic UNIX scripts; validated the history data through SQL Assistant queries for all the new tables introduced in the bookings 3NFs as per the flow.

Ran UNIX shell scripts to execute the ETL jobs and load data as and when required.

Checked for failed Replicat errors using UNIX shell scripts.

Executed SQL queries through BTEQ from the UNIX server.

Validated the incremental data for all the new tables introduced in the bookings 3NFs as per the flow.

Compared and validated the old code and flow against the new ones.

Executed the UNIX shell scripts that invoked ETL jobs to load data into tables.

Provided data to the testing team, ensuring data accuracy and consistency.

Environment: Informatica Power Center 8.1, Dollar Universe, Citrix, TOAD, SSH Client, Oracle, Teradata.

Client: J.D.Irving Limited, Saint John, Canada Jun 2007–Nov 2007

Project: Cognos Migration Project

Role: QA Team Member

Project Description:

J.D. Irving Limited (JDI) is a diverse, family-owned company with operations in Canada and the USA. For over 125 years, JDI has focused on providing quality services and products to customers around the world. JDI was using Impromptu 7.1 to develop reports; we migrated the reports from Impromptu 7.1 to Cognos 8.3.

Responsibilities:

Analyzed reports in Impromptu and created report specifications for Cognos 8.3 BI.

Identified equivalent functions between Impromptu and Cognos 8.3 BI, and verified Impromptu and Cognos 8.3 BI joins using WinSQL.

Developed reports in Cognos 8.3 BI.

Performed unit testing and documented unit test cases.

Tracked the status of reports using a report tracking sheet and tracked effort variance for every report.

Environment: Impromptu, Cognos 8.3, migration tools, MS SQL Server, WinSQL


