
Informatica/IDQ Lead developer

Location:
Austin, TX
Posted:
November 18, 2015


Professional Summary:

Around *+ years of experience in the complete Software Development Life Cycle: Analysis, Design, Development, Testing, and Implementation.

Extensively worked with Informatica Data Quality (IDQ) and Informatica PowerCenter across complete Data Quality and MDM projects.

Implemented end-to-end project tasks effectively, from inception through delivery to the customer.

Proficient in Informatica Data Quality 9.5.1/9.6.1 and Informatica PowerCenter 9.x/8.x/7.x.

Extensively worked with the Informatica Analyst tool 9.5.1/9.6.1 during the initial profiling phase of projects.

Good knowledge of Informatica Data Quality administration tasks as well.

Very strong in data profiling, creating scorecards, creating reference tables, and documenting data quality metrics/dimensions such as accuracy, completeness, duplication, validity, and consistency.
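
These dimensions reduce to measurable checks on the data. As a rough illustration, the sketch below shows the kind of SQL a completeness/duplication/validity profile boils down to; the MEMBER table and its SSN and DOB columns are hypothetical, and in practice the IDQ Analyst tool computes such metrics through profiles and scorecards rather than hand-written queries.

  -- Hypothetical MEMBER table, for illustration only.
  -- Completeness: share of non-null SSNs; duplication: repeated SSNs;
  -- validity: dates of birth inside a plausible range.
  SELECT COUNT(*)                                AS total_rows,
         COUNT(ssn) / COUNT(*)                   AS ssn_completeness,
         COUNT(ssn) - COUNT(DISTINCT ssn)        AS duplicate_ssn_rows,
         SUM(CASE WHEN dob BETWEEN DATE '1900-01-01' AND SYSDATE
                  THEN 1 ELSE 0 END) / COUNT(*)  AS dob_validity
  FROM member;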

Skilled in analyzing scorecard trend charts to determine the thresholds to apply in further development.

Skilled in understanding and developing business rules for the standardization, cleansing, and validation of data in various formats.

Very strong knowledge of Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, and Standardizer.

Extensively worked with Informatica PowerCenter transformations such as Expression, Joiner, Sorter, Filter, and Router, as required.

Very strong knowledge of the end-to-end process of Data Quality and MDM requirements and their implementation.

Good experience with data warehouse concepts such as dimension tables, fact tables, slowly changing dimensions, data marts, and dimensional modeling schemas.

Expertise in data modeling, development and enhancement.

Very strong in the Bill Inmon and Ralph Kimball data warehousing methodologies.

Strong experience in data modeling, covering dimensional modeling and E-R modeling, and in OLTP and OLAP systems for data analysis. Very familiar with SCD Type 1 and Type 2 in snowflake and star schemas.
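
For concreteness, below is a minimal Oracle-style sketch of the SCD Type 2 versioning pattern; CUSTOMER_DIM, CUSTOMER_STG, and the CUSTOMER_DIM_SEQ sequence are hypothetical names, and in PowerCenter this logic is normally built with Lookup and Update Strategy transformations rather than raw SQL.

  -- Step 1: expire the current version when a tracked attribute changes.
  UPDATE customer_dim d
     SET d.effective_end = SYSDATE,
         d.current_flag  = 'N'
   WHERE d.current_flag = 'Y'
     AND EXISTS (SELECT 1 FROM customer_stg s
                  WHERE s.customer_id = d.customer_id
                    AND s.address    <> d.address);

  -- Step 2: insert new and changed customers as the current version.
  INSERT INTO customer_dim
      (customer_key, customer_id, address, effective_start, effective_end, current_flag)
  SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, SYSDATE, NULL, 'Y'
    FROM customer_stg s
   WHERE NOT EXISTS (SELECT 1 FROM customer_dim d
                      WHERE d.customer_id  = s.customer_id
                        AND d.current_flag = 'Y');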

Familiar with the SDLC (Software Development Life Cycle) phases of requirements, analysis, design, testing, and deployment for Informatica PowerCenter work.

Strong expertise in relational database management systems such as Oracle, DB2, MS Access, Teradata, and SQL Server.

Experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Marts and Data Warehouses using Informatica PowerCenter components (Repository Manager, Designer, Workflow Manager, Workflow Monitor, and the Informatica Administration Console).

Strong experience in developing sessions/tasks, worklets, and workflows using the Workflow Manager tools: Task Developer and Workflow & Worklet Designer.

Experience in performance tuning of Informatica mappings and sessions to improve performance on large-volume projects.

Experience in debugging mappings: identified bugs in existing mappings by analyzing data flow and evaluating transformations.

Good experience writing UNIX shell scripts and SQL scripts for development, ETL process automation, error handling, and auditing.

Designed and developed applications using Informatica, UNIX scripting, SQL, and Autosys.

Highly proficient in UNIX shell scripting and in administering job schedulers (UC4, Autosys) on UNIX machines.

In-depth knowledge of working with flat files, COBOL files, and XML files.

Excellent analytical and problem-solving skills, with a strong technical background and interpersonal skills.

Efficient team player with excellent communication and interaction skills.

Good experience working as a lead, managing both small and large teams.

Technical Summary:

Tools: Informatica Data Quality 9.5.1/9.6.1, Informatica Analyst tool 9.x, Informatica PowerCenter 9.x/8.x

Databases: Oracle 11g/10g/9i, DB2 8.x, SQL Server 2000/2005, Teradata

Languages: SQL, PL/SQL, Shell Scripting

Operating Systems: UNIX (SOLARIS, AIX), Linux, Windows 95/98/NT/2000/XP.

DB Tools: SQL*Plus, SQL*Loader, TOAD, Power Designer, Erwin

Scheduling Tools: Autosys, UC4.

Others: MS Office (MS Access, MS Excel, MS PowerPoint, MS Word, MS Visio).

Web: HTML

Professional Experience:

Health and Human Service Commission, State of Texas. September 2015 – November 2015

Austin, Texas

Informatica Data Quality Analyst/Developer

The Health and Human Services Commission is implementing an MDM (Master Data Management) project with the mission of finding the golden record. The State of Texas holds data from different sources, mostly Member and Provider data, all of which is considered in the implementation. Large volumes of data are handled throughout this process.

Responsibilities:

As the lead Data Quality developer on the team, initiated data profiling across different formats of data from different sources.

Analyzed data quality dimensions to determine the actual structure of the data and the rules to be implemented as part of standardization.

Performed validation, standardization, and cleansing of data as part of implementing the business rules.

Worked primarily with data belonging to various Members and Providers throughout development.

Implemented match and merge rules in Informatica MDM 10.1 to find duplicates and derive the golden record.

Extensively worked on Informatica IDE/IDQ.

Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.

Created business rules in Informatica Developer and imported them into Informatica PowerCenter to load standardized, well-formatted data into staging tables.

Very good knowledge of all the data quality transformations used throughout development.

Knowledge of Informatica MDM concepts and the implementation of the de-duplication process.

Environment: Oracle 11g, SQL Developer, Informatica Data Quality 9.6.1, Informatica Analyst tool 9.6.1

Parexel International Corp March 2015 – July 2015

Billerica, MA- 01821

IDQ Developer

The project covered the pharmaceutical/medical business of SIMS (Site Intelligence Management System) and focused on a de-duplication process to remove all possible duplicates from multiple existing sources that match master records in the MDM hub.

Responsibilities:

Extensively worked on Informatica IDE/IDQ.

Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.

Used IDQ's standardized plans for address and name clean-up.

Worked on IDQ file configuration on users' machines and resolved issues.

Used IDQ to complete initial data profiling and remove duplicate data.

Extensively worked on IDQ administration tasks, serving as both IDQ admin and IDQ developer.

Performed multiple tasks effectively and was involved in troubleshooting issues.

Created reference tables, applications, and workflows, and deployed them to the Data Integration Service for workflow execution.

Used UC4 as the job scheduler, running the deployed applications and their workflows on selected recurring schedules.

Played a key role on the team in developing mappings and workflows and in troubleshooting issues.

Worked closely with the SQA team to fix issues, completed tasks effectively and on time, and promoted work to production.

Very familiar with the Sorter, Filter, Expression, Consolidation, Match, Exception, Association, and Address Validator transformations.

Worked extensively with Informatica Designer and Workflow Manager to create sessions and workflows, monitor the results, and validate them against requirements.

Designed reference data and data quality rules using IDQ, and was involved in cleansing data using IDQ in an Informatica Data Quality 9.1 environment.

Developed MDM solutions for workflows, de-duplication, validation, etc., and facilitated data load and syndication.

Worked on deploying Metadata Manager capabilities such as metadata connectors for data integration visibility, advanced search and browse of the metadata catalog, and data lineage with visibility into data objects, rules, transformations, and reference data.

Used IDQ to profile the project source data, define or confirm the metadata definitions, cleanse and accuracy-check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.

Environment: Oracle 11g, PL/SQL, Informatica Data Quality 9.x, IDQ admin console, UC4.

Exilant Technologies May 2014 – February 2015

Cupertino, CA- 95014

Informatica/IDQ Developer

The main project, under the Information Systems and Technology department, was Global Business Intelligence (financial domain): a business and market intelligence resource from which end users profit and which is widely used by executives within major companies worldwide.

Responsibilities:

Worked with heterogeneous sources including relational sources and flat files.

Worked with the data modeler to understand the data warehouse architecture and mapping documents.

Designed mappings implementing complex business logic provided by the data modeler, including dimension and fact tables.

Designed complex mapping logic to implement SCD Type 1 and Type 2 dimensions, and worked on critical dimensional models that structure and organize data uniformly, with constraints placed within the structure, a core concept of data modeling.

Extensively worked on Informatica IDE/IDQ.

Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.

Used IDQ's standardized plans for address and name clean-up.

Worked on IDQ file configuration on users' machines and resolved issues.

Used IDQ to complete initial data profiling and remove duplicate data.

Involved in designing the dimensional model and created the star schema using ER/Studio.

Worked extensively with the data analyst and data modeler to design and understand the structures of dimension and fact tables and the technical specification document.

OLTP systems provide source data to the data warehouse, which supports OLAP data analysis.

Interacted with front-end users to present the proof of concept and to gather the team's deliverables.

Worked on deploying Metadata Manager capabilities such as metadata connectors for data integration visibility, advanced search and browse of the metadata catalog, and data lineage with visibility into data objects, rules, transformations, and reference data.

Performed research to resolve production issues and investigate data discarded during workflow runs.

Extracted data from the relational source VCMG (Oracle) and flat files, performed mappings per company requirements, and loaded the results into Oracle tables.

Used Lookup, Aggregator, Filter, Update Strategy, and Router transformations.

Extensively used the Informatica functions LTRIM, RTRIM, IIF, DECODE, ISNULL, TO_DATE, and DATE_COMPARE in transformations.
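
LTRIM, RTRIM, DECODE, and TO_DATE also exist in Oracle SQL, while IIF and ISNULL are Informatica expression functions whose closest SQL analogues are CASE and NVL. The sketch below shows that same cleansing logic in plain Oracle SQL, against a hypothetical RAW_ORDERS table.

  -- SQL analogue of typical Informatica expression-port logic
  -- (RAW_ORDERS and its columns are hypothetical).
  SELECT RTRIM(LTRIM(customer_name))                       AS customer_name, -- strip padding
         DECODE(status_code, 'A', 'ACTIVE',
                             'I', 'INACTIVE',
                                  'UNKNOWN')               AS status,        -- code lookup
         NVL(region, 'N/A')                                AS region,        -- ISNULL-style default
         CASE WHEN order_amt > 0 THEN order_amt ELSE 0 END AS order_amt,     -- IIF-style branch
         TO_DATE(order_dt_str, 'YYYY-MM-DD')               AS order_dt       -- string to date
  FROM raw_orders;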

Also used user-defined functions, declared once globally and reused across various mappings.

Ensured the integrity, availability, and performance of DB2 database systems through technical support and maintenance; maintained database security and disaster recovery procedures; performed troubleshooting and maintenance of multiple databases; and resolved many database issues accurately and in a timely fashion.

Worked extensively with Informatica Designer and Workflow Manager to create sessions and workflows, monitor the results, and validate them against requirements.

Designed reference data and data quality rules using IDQ, and was involved in cleansing data using IDQ in an Informatica Data Quality 9.1 environment.

Extensively involved in monitoring jobs to detect and fix bugs and to track performance.

Used Informatica Workflow Manager to create, schedule, and monitor workflows and to send messages in case of process failures.

Involved in performance tuning of sources, targets, mappings, sessions, and data loads by increasing the data cache size, sequence buffer length, and target-based commit interval.

Environment: Oracle 10g, DB2, PL/SQL, Informatica PowerCenter 9.x and IDQ 9.x

Sharp Electronics December 2013 – April 2014

Mahwah, NJ

Informatica Developer

Sharp Electronics Corporation (SEC) is the U.S. sales and marketing subsidiary of Japan's Sharp Corporation. SEC was established in the U.S. marketplace in 1962 and is located in Mahwah, New Jersey. Sharp's product line covers office needs such as business copiers, data security solutions, AQUOS LCD TVs and HDTVs, audio systems, calculators, cash registers, projectors, and microwaves.

Responsibilities:

Created multiple PowerCenter mappings (20+) and PowerCenter workflows to accomplish the data transformation and load process.

Used various complex PowerCenter transformations, such as Lookup, Joiner, Expression, Router, Update Strategy, Source Qualifier, Aggregator, SQL, Filter, Sequence Generator, and Normalizer, to accomplish the mapping design.

Re-designed multiple existing PowerCenter mappings to implement change requests (CRs) representing the updated business logic.

Developed MDM solutions for workflows, de-duplication, validation, etc., and facilitated data load and syndication.

Worked on deploying Metadata Manager capabilities such as metadata connectors for data integration visibility, advanced search and browse of the metadata catalog, and data lineage with visibility into data objects, rules, transformations, and reference data.

Used IDQ to profile the project source data, define or confirm the metadata definitions, cleanse and accuracy-check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.

Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans.

Designed complex mapping logic to implement SCD Type 1 and Type 2 dimensions, and worked on critical dimensional models that structure and organize data uniformly, with constraints placed within the structure, a core concept of data modeling.

OLTP systems provide source data to the data warehouse, which supports OLAP data analysis.

Created user-defined functions (UDFs) and reusable mapplets and transformations to simplify maintenance and improve productivity.

Installed and configured Informatica PowerExchange for CDC and Informatica Data Quality (IDQ).

Created custom plans for product-name discrepancy checks using IDQ and incorporated the plans as mapplets into PowerCenter.

Experienced in massive data profiling using IDQ (Analyst tool) prior to data staging.

Performed Unit Testing and Integration Testing of Mappings and Workflows.

Performed UNIX shell scripting and administered the Autosys job scheduler on UNIX machines.

Designed data quality processes using IDQ and developed several Informatica mappings.

Designed reference data and data quality rules using IDQ, and was involved in cleansing data using IDQ in an Informatica Data Quality 9.1 environment.

Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Mapplets and Transformation objects.

Validated and fine-tuned the ETL logic coded into existing PowerCenter mappings, leading to improved performance.

Maintained technical documentation.

Designed the ETL processes using Informatica to load data from Oracle, flat files, and Excel files into a staging database, and from staging into the target Oracle data warehouse database.

Designed and developed the logic for handling slowly changing dimension table loads by flagging records with the Update Strategy transformation to populate the desired records.

Performed data analysis and data profiling using SQL and Informatica Data Explorer on various source systems, including Oracle and Teradata.

Experience with job scheduling tools such as Autosys and Control-M, including setup and detailed job monitoring.

Wrote Teradata SQL queries and created tables and views following Teradata best-practice guidelines.
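
As a small illustration of those practices, the Teradata-dialect sketch below creates a multiset table with an explicit PRIMARY INDEX (which drives hash distribution across AMPs) and a summary view; SALES_FACT and V_SALES_BY_STORE are hypothetical names.

  -- Teradata dialect: the PRIMARY INDEX choice controls row distribution.
  CREATE MULTISET TABLE sales_fact
  (
      sale_id   INTEGER NOT NULL,
      store_id  INTEGER NOT NULL,
      sale_date DATE    NOT NULL,
      amount    DECIMAL(12,2)
  )
  PRIMARY INDEX (sale_id);

  -- A view exposing an aggregated, read-only shape of the fact table.
  CREATE VIEW v_sales_by_store AS
  SELECT store_id, sale_date, SUM(amount) AS total_amount
  FROM sales_fact
  GROUP BY store_id, sale_date;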

Involved in performance tuning and optimization of Informatica mappings and sessions, using features like partitioning and data/index caches to manage very large volumes of data.

Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing; covered expected results, test data preparation and loading, error handling, and analysis.

Used the Debugger to troubleshoot existing mappings.

Also worked with Teradata sources, used to maintain very large databases.

Environment: Informatica PowerCenter 9.x, IDQ 9.x, MDM, Oracle 10g, Teradata 12.0, Autosys, Windows, SQL Server 2005, TOAD.

Panasonic May 2011- November 2013

New Jersey

ETL Informatica Developer

The NLS Shipper Logistics project is a maintenance project covering Panasonic's export, buyback, and import business. NLS is an open and innovative web-based connectivity platform that supports secure, real-time communications, transactions, and logistics operations. Its powerful data management and reporting tools enable customers to use the flow of information effectively and profitably by tracking and tracing shipments from the point of booking the order to the point the customer receives it.

Responsibilities:

Analyzed source data coming from various databases and files.

Identified integration issues across data source systems (Oracle Apps, IMS data, legacy systems, files) and proposed feasible integration solutions.

Created Oracle PL/SQL stored procedures, packages, triggers, and cursors, and handled backup and recovery for the various tables.
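
A minimal sketch of such a procedure is shown below; the SHIPMENT and SHIPMENT_ERR tables and the business rule are hypothetical, but the cursor-loop and error-logging shape is typical of these support routines.

  -- Hypothetical maintenance procedure: flag undelivered shipments past a cutoff,
  -- logging any failure to an error table instead of failing silently.
  CREATE OR REPLACE PROCEDURE flag_late_shipments (p_cutoff IN DATE) AS
      CURSOR c_late IS
          SELECT shipment_id FROM shipment
           WHERE delivered_dt IS NULL AND promised_dt < p_cutoff;
  BEGIN
      FOR r IN c_late LOOP
          UPDATE shipment SET status = 'LATE'
           WHERE shipment_id = r.shipment_id;
      END LOOP;
      COMMIT;
  EXCEPTION
      WHEN OTHERS THEN
          ROLLBACK;
          INSERT INTO shipment_err (err_ts, err_msg) VALUES (SYSDATE, SQLERRM);
          COMMIT;
  END flag_late_shipments;
  /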

Leveraged Explain Plan to improve the performance of SQL queries and PL/SQL Stored procedures.

Identified and tracked slowly changing dimensions (SCDs), using Change Data Capture (CDC) logic for SCD table loads in Oracle.
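
In its simplest timestamp-based form, that CDC logic looks like the sketch below; the ORDERS source, its LAST_UPD_TS column, and the ETL_CONTROL watermark table are hypothetical (PowerExchange CDC reads changes from database logs instead, but the watermark pattern is a common fallback).

  -- Pull only rows changed since the last successful load.
  SELECT o.*
    FROM orders o
   WHERE o.last_upd_ts > (SELECT last_load_ts
                            FROM etl_control
                           WHERE table_name = 'ORDERS');

  -- After a successful load, advance the watermark.
  UPDATE etl_control
     SET last_load_ts = SYSDATE
   WHERE table_name = 'ORDERS';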

Designed complex mapping logic to implement SCD Type 1 and Type 2 dimensions, and worked on critical dimensional models that structure and organize data uniformly, with constraints placed within the structure, a core concept of data modeling.

OLTP systems provide source data to the data warehouse, which supports OLAP data analysis.

Ensured the integrity, availability, and performance of DB2 database systems through technical support and maintenance; maintained database security and disaster recovery procedures; performed troubleshooting and maintenance of multiple databases; and resolved many database issues accurately and in a timely fashion.

Extracted data from Oracle, flat file, Excel, XML, and COBOL sources, and used complex Joiner, Expression, Aggregator, Lookup, Stored Procedure, Filter, Router, and Update Strategy transformations to extract and load data into the target systems.

Installed and configured Informatica PowerExchange for CDC and Informatica Data Quality (IDQ).

Created custom plans for product-name discrepancy checks using IDQ and incorporated the plans as mapplets into PowerCenter.

Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.

Designed data quality processes using IDQ and developed several Informatica mappings.

Designed reference data and data quality rules using IDQ, and was involved in cleansing data using IDQ in an Informatica Data Quality 9.1 environment.

Fixed invalid mappings, debugged mappings in the Designer, and performed unit and integration testing of Informatica sessions, worklets, workflows, and target data.

Created reusable tasks, sessions, worklets, and workflows in Workflow Manager.

Provided user training and production support.

Fine-tuned SQL statements to improve database performance.

Environment: Informatica PowerCenter 8.6.1, MDM and IDQ, Oracle 10g, DB2, SQL*Loader, TOAD.

Whish Works Business Solutions Private Limited, Client: Gamooga March 2009 – April 2011

Mumbai, India

Informatica Developer

The project involves large volumes of data coming from various sources, which are extracted, transformed, and loaded into a data warehouse following a snowflake schema. A BusinessObjects universe is then created so that users can build reports and find discrepancies.

Responsibilities:

Worked with source system developers and business owners to identify data sources for defining data extraction methodologies.

Analyzed complex ETL requirements and tasks and provided estimates.

Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.

Created staging table mappings to hold source data based on Change Data Capture (CDC) logic and then transform the data into the appropriate dimension tables.

Created reusable tasks, sessions, worklets, and workflows in Workflow Manager.

Created mappings to cleanse data and populate staging tables, populate the enterprise data warehouse by transforming the data to business needs, and populate the data mart with only the required information.

Implemented Slowly Changing Dimensions as per the requirements.

Worked with Informatica Administrator to setup project folders in development, test and production environments.

Worked with the Quality Assurance team to build test cases for unit, integration, functional, and performance testing.

Environment: Informatica PowerCenter 8.1, Oracle 9i, TOAD, SQL*Loader.

Dhanush InfoTech January 2007 – February 2009

Client: Process Map

Hyderabad, India

PL/SQL developer

Dhanush InfoTech runs a powerful online platform focused on providing information services that simplify buying, selling, and leasing residential and commercial properties. It is open to the general public. Acre Deals is a web portal designed to meet the need for property search in the Indian market.

Responsibilities:

Coded according to the standards.

Wrote SQL code based on the technical and functional specifications.

Supported the existing system.

Created SQL stored procedures, functions, and triggers.

Created packages and object types.

Tuned the SQL code and SQL queries.

Used database triggers to keep a history of inserts, updates, and deletes, and for all kinds of audit routines.
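
A minimal sketch of such an audit trigger follows. It is written in Oracle-style PL/SQL for consistency with the sketches above, although this particular project ran on MS SQL Server 2005, where the syntax differs; PROPERTY and PROPERTY_AUDIT are hypothetical tables.

  -- Row-level audit trigger: record who changed what, and when.
  CREATE OR REPLACE TRIGGER trg_property_audit
  AFTER INSERT OR UPDATE OR DELETE ON property
  FOR EACH ROW
  DECLARE
      v_action VARCHAR2(6);
  BEGIN
      v_action := CASE WHEN INSERTING THEN 'INSERT'
                       WHEN UPDATING  THEN 'UPDATE'
                       ELSE                'DELETE' END;
      INSERT INTO property_audit (property_id, action, changed_by, changed_at)
      VALUES (NVL(:NEW.property_id, :OLD.property_id), v_action, USER, SYSDATE);
  END;
  /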

Documented the functional specifications and application flow.

Worked on performance improvement of database applications.

Environment: MS SQL Server 2005, Visual Studio 2005, .NET


