
Sr. Informatica Developer

Location:
Columbus, OH
Posted:
January 10, 2018


Resume:

Preet Chouhan

Email: ***********@*****.*** Cell: 409-***-****

Summary

Over 8 years of experience in Information Technology, including Data Warehouse/Data Mart development using ETL/Informatica Power Center across industries such as Healthcare, Banking, Insurance, and Pharmaceutical.

Experience in all phases of the data warehouse life cycle, including requirement analysis, design, coding, testing, deployment, maintenance, and support of enterprise-level Data Integration, Enterprise Data Warehouse (EDW), and Business Intelligence (BI) solutions using Operational Data Store (ODS), Data Warehouse (DW)/Data Mart (DM), ETL, OLAP, ROLAP, and client/server and web applications on Windows and UNIX platforms.

Extensive experience in ETL/Informatica Power Center, data integration, and data masking; developed ETL mappings and scripts using Informatica Power Center 9.x/8.x/7.x, IDQ 9.6.1/9.0.1/8.6, and Power Mart 8.x/7.x.

Extensively worked on ETL mappings and the analysis and documentation of OLAP reporting requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.

Involved in the ETL technical design discussions and prepared ETL high level technical design document.

Extensively created mapplets, common functions, reusable transformations, and lookups for better usability.

Well versed in OLTP Data Modeling, Data warehousing concepts.

Good knowledge of applying rules and policies using ILM (Information Lifecycle Management).

Strong knowledge of Entity-Relationship concepts, fact and dimension tables, slowly changing dimensions, and Dimensional Modeling (Star Schema and Snowflake Schema). Good understanding of views, synonyms, indexes, joins, and sub-queries.

Extensively used SQL and PL/SQL: dynamic SQL, stored procedures/functions, triggers and packages, complex joins, correlated sub-queries, aggregate and analytic functions, materialized views, indexing, and partitioning, with performance tuning of the same using EXPLAIN PLAN and TKPROF analysis.
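
A minimal sketch of the kind of analytic query and EXPLAIN PLAN check described above; the table, column, and index names (sales_fact, region_id, sales_fact_region_ix) are hypothetical:

    -- Rank sales within each region using an analytic function
    SELECT region_id,
           sale_date,
           sale_amount,
           RANK() OVER (PARTITION BY region_id ORDER BY sale_amount DESC) AS amt_rank
    FROM   sales_fact;

    -- Standard Oracle approach: inspect the optimizer's plan when testing a hint
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(f sales_fact_region_ix) */ *
    FROM   sales_fact f
    WHERE  region_id = 10;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);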

Experience in using exception-handling strategies to capture error records and referential-integrity violations during load processes and to notify the source team of the exception records.
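
One common PL/SQL pattern for this kind of error capture is FORALL ... SAVE EXCEPTIONS; the following is a sketch under assumed table names (stg_orders, dw_orders, etl_error_log), not the exact production code:

    DECLARE
      TYPE t_rows IS TABLE OF stg_orders%ROWTYPE;
      l_rows    t_rows;
      bulk_errs EXCEPTION;
      PRAGMA EXCEPTION_INIT(bulk_errs, -24381);  -- ORA-24381: error(s) in array DML
    BEGIN
      SELECT * BULK COLLECT INTO l_rows FROM stg_orders;
      FORALL i IN 1 .. l_rows.COUNT SAVE EXCEPTIONS
        INSERT INTO dw_orders VALUES l_rows(i);
    EXCEPTION
      WHEN bulk_errs THEN
        -- Log each rejected row so the source team can be notified
        FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
          INSERT INTO etl_error_log (err_row_index, err_code)
          VALUES (SQL%BULK_EXCEPTIONS(j).ERROR_INDEX,
                  SQL%BULK_EXCEPTIONS(j).ERROR_CODE);
        END LOOP;
        COMMIT;
    END;
    /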

Over 5 years of Relational Modeling and Dimensional Data Modeling using Star and Snowflake schemas, normalization, denormalization, and aggregations. Designed databases using Erwin 4.5.

Executed OBIEE administrative tasks such as OBIEE server installation, migration, and user maintenance.

Experience in creating Reusable Transformations (Joiner, Sorter, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Aggregator, and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets

Experienced in UNIX work environment, file transfers, job scheduling and error handling.

Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.

Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).

Experience in writing, testing and implementation of the PL/SQL triggers, stored procedures, functions, packages.

Traveled to different companies to introduce and encourage the use of Python programming among computer technicians.

Upgraded to Informatica PowerCenter 8.6.1 from version 8.1.1. Installed Hotfixes, utilities, and patches released from Informatica Corporation.

Designed and developed Ab Initio graphs for loading into the data store.

Extensive experience in tuning and scaling procedures for better performance by running EXPLAIN PLAN and using approaches such as hints and bulk loads.

Experience working in Agile methodology and ability to manage change effectively.

Experience in IDQ (Informatica Data Quality), ILM (Information Lifecycle Management), and Data Archive.

Expert in Oracle 11g/10g, IBM DB2 8.1, Sybase, and SQL Server 2008/2005; SQL and PL/SQL stored procedures, functions, and exception handling using TOAD and PL/SQL.

Maintained and supported Assembly/Integration Test, QA, UAT, and Production environments for issues, bug fixes, and defects.

Experience in all phases of the agile System Development Life Cycle (SDLC).

Excellent communication and interpersonal skills; an enthusiastic, knowledge-hungry self-starter, eager to meet challenges and quickly assimilate the latest technologies, concepts, and ideas.

Education:

Bachelor’s in Production Engineering from Sri Guru Gobind Singhji IE&T, Nanded, Maharashtra, India.

Technical Skills:

Operating Systems

Windows 10/8/7/XP/NT, UNIX, MS-DOS

ETL Tools

Informatica Power Center 9.6/9.1/8.6 (Workflow Manager, Workflow Monitor, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, and Informatica Server), Informatica Metadata Manager, Ab Initio, OBIEE.

Databases

Oracle 11g/10g, DB2, Teradata 14/13/12, SQL*Loader.

Data Modeling

Erwin 4.5/7.3, Logical Modeling, Physical Modeling, Relational Modeling, ER Diagrams, Dimensional Data Modeling (Star Schema Modeling, Snowflake Schema Modeling, FACT and Dimensions Tables), Entities, Attributes, Cardinality, MS Visio

Languages

SQL, PL/SQL, UNIX Shell scripts, Perl, Java, C, COBOL

Scheduling Tools

Tivoli TWS, Control-M, ESP and Informatica scheduler

Front End Tool

Teradata SQL Assistant and DB Visualizer.

Packages

MS Word, MS Excel, MS Project, MS Visio, MS PowerPoint

Versioning tools

GIT

Ticketing Tools

JIRA, BMC Remedy

Methodologies

Agile and Waterfall

BI Tools

Business Objects 6.5/XI/XI R2, Cognos 6.0

Professional Experience

Monsanto, St. Louis, MO Feb 2017 – Present

Sr. Informatica Developer

Responsibilities:

Involved in business requirement analysis and prepared the functional requirements document

Involved in the ETL technical design discussions and prepared ETL high level technical design document

Responsible for performance tuning ETL process to optimize load and query Performance

Created complex transformations using connected / unconnected lookups / procedures

Performed analysis of the data transformations and data types for transforming business rules into transformation logic to be used in the ETL process.

Worked the full project life cycle, from analysis to production implementation, with emphasis on identifying sources, validating source data, developing transformation logic per the requirements, creating mappings, and loading the data into different targets.

Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, dynamic Lookup (for newly changed data), and Router transformations to extract, transform, and load data into the data mart area

Added and modified Informatica mappings and sessions to improve performance, accuracy, and maintainability of existing ETL functionality.

Provided suggestions to improve existing mappings.

Used Expression, Lookup, Router and Data Masking transformations to create the mappings.

Involved in design, development and testing of Ab Initio graphs for formulary management system.

Wrote SQL overrides in Source Qualifiers with appropriate schema names to extract only the required rows for optimal performance. Implemented complex expressions in Expression transformations. Tested the queries thoroughly in SQL Navigator for integrity and to identify data errors
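
A sketch of such a Source Qualifier override, with hypothetical schema and table names (edw_stg.customer, edw_stg.account); filtering at the source keeps the session from reading rows it would only discard later:

    -- Hypothetical SQL override placed in the Source Qualifier's SQL Query property
    SELECT c.customer_id,
           c.customer_name,
           a.account_id,
           a.balance
    FROM   edw_stg.customer c
    JOIN   edw_stg.account  a ON a.customer_id = c.customer_id
    WHERE  a.status = 'ACTIVE'
      AND  a.last_update_dt >= TRUNC(SYSDATE) - 1;  -- only the previous day's changes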

Skilled in website design, HTML, XML parsing with Perl/CGI and PHP scripting, along with MySQL databases

Extensively used SQL DML commands to retrieve, access, and manipulate data from the source databases.

Provide support in the development of OBIEE components and solutions

Preparation of Unit Test Plans and verification of functional specifications and review of deliverables

Set up error logic for streamlining and automating the data loads, cleansing and trapping incorrect data on staging servers before loading it to the data warehouse

Scheduled data load processes and monitored the ETL process

Used various data Sources like Flat Files, Relational Tables and COBOL Copybooks

Extensively used stored procedures for pre-session and post-session data loading processes
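
A sketch of typical pre- and post-session procedures; the names and the truncate-stage/gather-stats split are assumptions illustrating one common pattern, not the exact code used here:

    -- Pre-session: clear the staging table before the load
    CREATE OR REPLACE PROCEDURE pre_load_stage AS
    BEGIN
      EXECUTE IMMEDIATE 'TRUNCATE TABLE stg_daily_sales';
    END;
    /

    -- Post-session: refresh optimizer statistics once the load completes
    CREATE OR REPLACE PROCEDURE post_load_stage AS
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'STG_DAILY_SALES');
    END;
    /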

Developed Ab Initio graphs to unload required data from database.

Developed graphs to do complex calculations using the Normalize and Denormalize components.

Developed Ab Initio graph to load data into fact tables and update dimensional tables.

Designed, tested, and modified OBIEE security using roles and privileges in Enterprise Manager (EM) and the Administration Console

Responsible for writing Perl scripts to render the html page from the API dynamically based on the data returned on ticket search and asset validation.

Developed Perl modules to handle backend validation of the form fields and to export the displayed data to Excel.

Debugged the mappings using the Debugger facility in Informatica. Used the target load order feature for loading tables with constraints

Exposure to Informatica B2B Data Exchange, which supports the expanding diversity of customers and partners and their data with capabilities that surpass the usual B2B solutions

Worked with the B2B Operation console to create the partners, configure the Partner management, Event Monitors and the Events

Exposure to Informatica B2B Data Transformation, which supports transformation of structured, unstructured, and semi-structured data types while complying with the multiple standards that govern the data formats.

Provided production support by monitoring the processes running daily

Identified the bottlenecks and improved overall performance of the sessions

Environment: Informatica Power Center 9.6, Erwin 4, Oracle 11g/10g, PL/SQL, PVCS, SQL*Loader, TOAD, MS SQL Server 2005/2008, Business Objects XI R2, DB2 8.0, Windows XP

Union Bank, NYC, NY Nov 2015 – Jan 2017

ETL Developer

Responsibilities:

Creating the design document as per requirements.

Developing Informatica mappings to implement business logic, and creating and executing test cases to minimize defects.

Experience in resolving issues related to file transfer processes taking place through Actelion manager

Expertise in building Operational Data Store (ODS), Data Marts, and Decision Support Systems (DSS) using Multidimensional Model (Kimball and Inmon), Star and Snowflake schema design

Analyzed queue managers and queues to find potential problems and worked towards fixing them

Developing ETL components and loading data from numerous data sources, including Oracle databases, flat files, and mainframe systems.

Hands-on experience with the Master Data Management tool IBM Initiate Inspector.

Created PL/SQL functions and procedures to provide ETL solutions, and created change requests and database requests to move code to production.
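
A minimal sketch of such an ETL procedure, assuming hypothetical staging and dimension tables (stg_customer, customer_dim); a MERGE gives a compact set-based upsert:

    CREATE OR REPLACE PROCEDURE load_customer_dim AS
    BEGIN
      -- Upsert from staging into the dimension in one statement
      MERGE INTO customer_dim d
      USING stg_customer s
      ON (d.customer_id = s.customer_id)
      WHEN MATCHED THEN
        UPDATE SET d.customer_name = s.customer_name,
                   d.segment       = s.segment
      WHEN NOT MATCHED THEN
        INSERT (customer_id, customer_name, segment)
        VALUES (s.customer_id, s.customer_name, s.segment);
      COMMIT;
    END;
    /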

Used MQ Explorer to check the status of queues, delete messages in the DLQs, and browse messages in the queues, among other tasks.

Developed complex code based on PL/SQL (Procedure, Packages, Functions etc.) and UNIX Shell scripts for new enhancements

Analyzed Informatica job failures & connection issues and worked with DBA and network teams to resolve them

Developed data-sync ETL processes between SQL Server, MongoDB, HDFS, and Hive in Informatica Power Center, and converted existing stored procedures to ETL workflows utilizing methodologies such as data subsetting, data masking, and Change Data Capture.

Experience in code migration and folder refreshes across different environments.

Experience in analyzing the cause of long running jobs and implementing different performance tuning methods

Work with companies to implement Python programming and train on-site Python Programmers

Speak at conferences and make presentations regarding Python as well as its applications and features

Develop a readable and simple curriculum for the average user to get started with Python

Analyzed business requirements and file transfer errors using MQ FTE logs

Restarted agents and outboxes after analyzing the file transfer issues

Used Active Directory to check and modify groups' and users' permissions

Designed and developed ETL processes using Data Stage designer to load data from Oracle, MS SQL, Flat Files (Fixed Width) and XML files to staging database and from staging to the target Data Warehouse database.

Used DataStage stages, namely Hash File, Sequential File, Transformer, Aggregator, Sort, Dataset, Join, Lookup, Change Capture, Funnel, Peek, and Row Generator, to accomplish the ETL coding.

Developed job sequencers with proper job dependencies, job control stages, and triggers.

Used QualityStage to ensure consistency, removing data anomalies and spelling errors from the source information before it was delivered for further processing.

Analyzed Autosys job failures and worked with Autosys team to resolve the issues

Used BladeLogic to check the status of services running on Windows servers and to check the permissions of users and groups

Used ER/Studio to create and manage database designs, documents and reuse data assets.

Used admin console to create and modify different Informatica services

Worked with respective teams for the cleanup of the agents and outboxes in the decommissioned servers.

Environment: Informatica Power Center 9.5 (Repository Manager, Designer, Workflow Manager, Workflow Monitor and Repository Server Admin console), Autosys Scheduler, Oracle 10g/11g, SQL, SQL*Plus, MS SQL Server SSIS, ER/Studio, BladeLogic, UNIX, Putty, CA7 Automation Tool, Harvest, Toad for Oracle

Fidelity, Merrimack, NH Apr 2014 – Oct 2015

Informatica Developer

Description:

The main objective of this project is a customer information store. The solution is required to provide a suitable enterprise-wide information environment capable of serving as the future platform for information collation from various source systems and for performing complex calculations. The project provides different reports and supports ad hoc queries for making intelligent banking decisions based on data available in the various branches connected across locations over a period of time.

Responsibilities:

Proficient in understanding business processes / requirements and translating them into technical requirements.

Involved in performance tuning and optimization of Data Stage mappings using features like Pipeline and Partition Parallelism and data/index cache to manage very large volume of data.

Preparation of Technical Design Document.

Define and implement ETL standards and guidelines and ensure adherence to enterprise level policies.

Identified and tracked the slowly changing dimensions, heterogeneous Sources and determined the hierarchies in dimensions.

Focused on securing the confidential information existing in the non-production environments using the ILM tool through the Data Masking transformation.

Worked closely with architects, leads, and the project manager on application assessments for the Data Masking team on the proxy server, and provided support on the databases.

Good knowledge of applying rules and policies using the ILM (Information Lifecycle Management) workbench for the Data Masking transformation and loading into targets.

Involved in data integration for management decisions: pulled customer data from different data sources such as Oracle and SQL Server, integrated the data, and generated and submitted reports to the client.

Created various resources (Informatica, Teradata, Erwin, and reporting) and loaded them into the Metadata Manager warehouse using Informatica Metadata Manager.

Modified Informatica Mappings, Mapplets and Transformations to load data from relational and flat file sources into the data mart.

Used various transformations like Source Qualifier, Lookup, Update Strategy, Router, Filter, Sequence Generator and Joiner on the extracted source data according to the business rules and technical specifications.

Implemented a Data Quality solution, which includes standardization and matching of source data using IDQ.

Experienced in setting up Informatica Data Quality (IDQ), Informatica Data Exchange (IDE), and data masking software

Created reusable transformations and mapplets and used them with various mappings.

Created Connected, Unconnected and Dynamic lookup transformation for better performance and increased the cache file size based on the size of the lookup data.

Involved in creating workflows and Worklets for new mappings.

Developed and Implemented Informatica parameter files to filter the daily data from the source system.
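
For illustration, a parameter file of the kind described here scopes values to a workflow or session section; the folder, workflow, session, and parameter names below are hypothetical:

    [SalesFolder.WF:wf_daily_sales_load.ST:s_m_load_sales]
    $DBConnection_Src=ORA_SRC_CONN
    $$LOAD_DATE=2015-04-01
    $$SOURCE_SYSTEM=POS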

Efficient in all phases of the development lifecycle, including data cleansing, data conversion, performance tuning, and system testing.

Excellent in using the highly scalable parallel-processing infrastructure of DataStage Parallel Extender.

Efficient in incorporating various data sources such as Oracle, MS SQL Server, DB2, Sybase, XML, and flat files into the staging area.

Experience in mapping server/parallel jobs in DataStage to populate tables in the data warehouse and data marts.

Proven track record in addressing production issues like performance tuning and enhancement.

Excellent knowledge in creating and managing Conceptual, Logical and Physical Data Models.

Used Informatica debugging techniques to debug the mappings, and used session log files and bad files to trace errors that occurred while loading.

Involved in fine tuning of SQL queries in the transformations.

Performed regression testing and integration testing.

Experience in analyzing the data generated by the business process, defining the granularity, source to target mapping of the data elements, creating Indexes and Aggregate tables for the data warehouse design and development.

Data-processing experience in designing and implementing Data Mart applications, mainly transformation processes, using the ETL tool DataStage (v8.0/7): designing and developing jobs using DataStage Designer, DataStage Manager, DataStage Director, and DataStage Debugger.

Created databases, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in the Teradata database.
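
A short sketch of the kinds of Teradata objects listed above, with hypothetical names:

    -- MULTISET table with an explicit primary index
    CREATE MULTISET TABLE sales_txn
    (
      sale_id     INTEGER NOT NULL,
      store_id    INTEGER,
      sale_date   DATE,
      sale_amount DECIMAL(12,2)
    )
    PRIMARY INDEX (sale_id);

    -- View restricting the table to recent rows for reporting
    CREATE VIEW v_recent_sales AS
    SELECT sale_id, store_id, sale_date, sale_amount
    FROM   sales_txn
    WHERE  sale_date >= DATE '2015-01-01';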

Worked extensively in Performance Tuning with existing mappings by analyzing data flow.

Performed scheduling of ETL jobs using scheduling tools and pmcmd commands, based on the business requirements.

Developed Shell Scripts for getting the data from source systems to load into Data Warehouse.

Worked with VSS to maintain the versioning of various documents.

Environment: Informatica Power Center 9.1, Informatica Power Connect (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), Power Exchange, SuperGlue, ETL, Teradata V2R6.0, flat files, Oracle 11g

Comcast, Philadelphia, PA June '13 – Mar '14

ETL Developer

Comcast Corporation is an American global telecommunications conglomerate and the largest broadcasting and cable television company in the world by revenue. As an ETL developer, I was involved in creating various logical mappings for the data marts based on the business requirements, carrying customer data with information about bill payments, dues, plan details, etc.

Responsibilities:

Prepared the required application design documents based on functionality required

Designed the ETL processes using Informatica to load data from Oracle, DB2 and Flat Files to staging database and from staging to the target database.

Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.

Involved in migration of mappings and sessions from development repository to production repository

Extensively used Informatica and created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.

Designed and developed the logic for handling slowly changing dimension table loads by flagging records using the Update Strategy transformation to populate the desired target.
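
The set-based equivalent of that Type 2 flagging logic looks roughly like the following; the table and column names (cust_dim, stg_cust, plan_code) are hypothetical, not the actual production tables:

    -- Expire the current version of any customer whose tracked attribute changed
    UPDATE cust_dim d
    SET    d.current_flag = 'N',
           d.end_date     = SYSDATE
    WHERE  d.current_flag = 'Y'
      AND  EXISTS (SELECT 1 FROM stg_cust s
                   WHERE  s.customer_id = d.customer_id
                     AND  s.plan_code  <> d.plan_code);

    -- Insert a fresh current row for new and changed customers
    INSERT INTO cust_dim (customer_id, plan_code, start_date, end_date, current_flag)
    SELECT s.customer_id, s.plan_code, SYSDATE, NULL, 'Y'
    FROM   stg_cust s
    WHERE  NOT EXISTS (SELECT 1 FROM cust_dim d
                       WHERE  d.customer_id = s.customer_id
                         AND  d.current_flag = 'Y');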

Involved in cleansing and extraction of data and defined quality process using IDQ for the warehouse.

Experience with Perl for file handling, regular expressions for parsing sensitive information, and the DBI/DBD modules for Sybase connections from Perl scripts for information storage.

Designed code in Perl, with web front ends using HTML, CSS, and JavaScript, JavaScript libraries such as Sencha ExtJS and jQuery, and Perl frameworks such as Dancer and Catalyst.

Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions, index cache to manage very large volume of data.

Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various needs of the transformations while loading the data.

Translated Business specifications into PL/SQL code. Extensively developed and fine-tuned Oracle Stored Procedures and triggers.

Used Update Strategies for cleansing, updating and adding data to the existing processes in the warehouse.

Defects were logged and change requests submitted using the Defects module of TestDirector/HP Quality Center

Worked with different Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance.

Involved in Unit testing, User Acceptance testing to check whether the data is loading into target, which was extracted from different source systems per the user requirements.

Involved in migrating objects from DEV to QA and testing them and then promoting to Production

Involved in production support, working on various tickets created while users worked to retrieve data from the database.

Environment: Informatica Power Center, IDQ, Business Objects, Teradata, TOAD, Erwin, SQL, PL/SQL, XML, UNIX.

Benjamin Moore & Co., Montvale, NJ Dec '11 – May '13

ETL/ Informatica Developer

Benjamin Moore & Co. is a market leader in the North American paint and coatings industry. I worked on the project 'PRISM', the biggest migration implemented at Benjamin Moore. We maintained the historical data in the legacy database, Infinium. This migration project moved the data from the legacy system to SAP Data Services. The system maintains the contract information, restructures the payment procedure, and generates reports including monthly collection/income and a list of defaulters. The data warehouse is built using Informatica Power Center 9.5.1, extracting data from various sources including flat files, SAP ABAP, Teradata, and Oracle.

Responsibilities:

Formulate and define best practice programming standards that meet regulatory compliance requirements for Implementation, Support and Upgrade Projects.

Working closely with Functional Leads and Business Analysts to determine optimal programming solutions to functional requirements in the O2C, R2R, and P2P process areas.

Experience in ETL job design and development using SAP and non-SAP data sources, including but not limited to SQL Server, Exadata, SAP BW, SAP ECC, and SAP CRM.

Involved in creating the Greenplum framework for the security model, data layer, and schema creation.

Designed and developed Data Profiling in Informatica Designer to profile the Source data to meet business requirements.

Used Informatica Data Quality for Data Analysis, Cleansing, Matching, Reporting and Monitoring

Extensively worked with SCD Type-I, Type-II and Type-III dimensions and data warehousing Change Data Capture (CDC).

Uploading the Performance Test Plan, Test Scripts, Scenarios and Final Reports in the Quality Center for every application.

Prepared Traceability Matrices to track the requirements with the test cases and make sure none of them have been missed.

Used the Teradata external loading utilities MultiLoad, TPump, FastLoad, and FastExport to extract from and load effectively into the Teradata database

Involved in unit testing; interacted with the QA team for system/integration testing

Scheduled the Informatica sessions and batches using event-based scheduling

Worked on Informatica Power Center Designer: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.

Migrated sessions, workflows, mappings, and other objects from Development to QA and Production by exporting/importing them as XML files during deployment

Improved the data quality by understanding the data and performing Data profiling.

Worked on Teradata SQL Assistant querying the source/target tables to validate the BTEQ scripts.
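
Validation of this kind is often a simple source-to-target reconciliation; a sketch with hypothetical table names:

    -- Compare row counts and amount totals between the source and target tables
    SELECT 'SRC' AS side, COUNT(*) AS row_cnt, SUM(txn_amount) AS total_amt
    FROM   stg_transactions
    UNION ALL
    SELECT 'TGT', COUNT(*), SUM(txn_amount)
    FROM   dw_transactions;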

Experience in developing mappings according to business rules, migrating them to QA and production, and following naming conventions and mapping design standards; good knowledge of data warehousing, PL/SQL, ODBC connections, etc.

Environment: Informatica Power Center, Informatica Power Exchange, Oracle 11g, PL/SQL, DB2, Teradata TD13.10, Teradata Tools and Utilities, Autosys, SQL Server 2014, UNIX, Perl, MS Visio.

TTK Health, Hyderabad, India Aug 2009 – Nov 2011

Application ETL Developer

Responsibilities:

Responsible for design and development of Sales Data Warehouse.

Involved in the complete life cycle of the ETL process for data movement from the Concorde system to staging and finally to the database.

Extensively used Power Center/Mart to design multiple mappings with embedded business logic.

Analyzed critical/complex processes of application packages to design ETL processes and adopted strategy to prevent failures and acted swiftly to recover from application failures.

Implemented Shell Scripts to use FTP, SQL Loader utilities.

Used transformations such as Sequence Generator, Lookup, Joiner, and Source Qualifier in Informatica Designer to effectively utilize Informatica services.

Supported the Informatica developers in Database Server Performance, tracking user activity, troubleshooting errors, tracking server resources and activities, tracing server events.

Implemented industry best practices, e.g., mapplets.

Involved in performance tuning of the Informatica mapping using various components like Parameter files, Variables and Dynamic Cache.

Documenting the ETL design specifications and mappings and maintaining the version control.

Involved in migration of Informatica mapping from Development to Production environment.

Environment: Informatica Power Center 8.6, Oracle 9i, Teradata V2R6, SAP, SAP BI 7.0, SQL Server, Sun Solaris.


