
Informatica ETL Lead

Location:
St. Louis, MO
Salary:
90000
Posted:
July 04, 2017

Resume:

Atul Malhotra

+1-314-***-****

**.************@*****.***

Professional Summary

** ***** ** ********* ******** industry experience in the design, development, testing and implementation of Data Warehouses and Data Marts using the Informatica PowerCenter tool, and of web-based applications using Microsoft technologies.

Extensively worked on the Informatica Designer, Workflow Manager and Workflow Monitor tools.

Extensively worked on Informatica PowerCenter transformations, building complex mappings with Expression, Joiner, Router, Lookup, Update Strategy, Sorter, Sequence Generator, Source Qualifier, Stored Procedure, Rank, Filter, Transaction Control, Normalizer and Aggregator transformations to develop and load data into different target types.

Good knowledge of data warehouse concepts, Star and Snowflake schemas, fact and dimension tables, and physical and logical data modeling using E-R diagrams in the MS Visio tool.

Experience in creating Workflows, Worklets, Mappings, Mapplets and Reusable transformations, and in scheduling Workflows and Sessions using Informatica PowerCenter.

Worked extensively with Pushdown Optimization, Incremental Aggregation and Persistent Cache to improve performance.

Knowledge of the end-to-end process of Data Quality and MDM requirements and the related transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, Standardizer, Trust, User Exit and other significant transformations.

Good experience with data warehouse concepts such as dimension tables, fact tables, slowly changing dimensions, data marts, ODS and dimensional modelling schemas.

Good understanding of database development, Stored Procedures and Functions using SQL Server.

Experience working with both waterfall and agile methodologies to meet regular business needs as well as urgent production fixes and integrated business releases.

Knowledge and expertise in .NET Framework 1.1, 2.0 and 3.5, encompassing ASP.NET, Web Services, WCF, ADO.NET, C#, MVC, Entity Framework and Enterprise Library.

Knowledge of the Splunk and Tidal scheduler monitoring tools.

Understanding of Microsoft Project Plan, ITIL concepts and Incident Management.

Worked at client locations in the UK (Swindon), Peru (Lima), Ecuador (Quito) and the USA.

Certifications and Awards

PRINCE2 Practitioner.

ITIL v3.0 Foundation.

BI Data Visualization – Tableau Foundation and Agile Development Methodology – Internal (TCS)

Received a PRIDE award from the customer (Nationwide Building Society, Swindon, UK) for successful, bug-free deliverables.

Experience

Working as an Associate Consultant (Oct 2010 to date) with Tata Consultancy Services.

Worked as a Senior IT Consultant (From Apr 2010 to Oct 2010) with Fujitsu Consulting India.

Worked as a Senior Software Engineer (From June 2007 to Apr 2010) with Fiserv India Noida.

Worked as a Senior Software Engineer (From March 2006 to June 2007) with Birlasoft (India) Ltd. Noida.

Worked as a Software Engineer (From Aug 2005 to March 2006) with HCL Technology Ltd.

Education

M.Sc. (Master of Science) in Computer Science, Kurukshetra University, Haryana, India

Bachelor of Science in Computer Science, Kurukshetra University Ambala Cantt, India

VISA Status

EAD Valid till June 2019

Technical/Special Skills

Database: SQL, PL/SQL, SQL Server and Oracle 11g

ETL & Reporting Tools: Informatica Data Quality, Informatica PowerCenter, SSIS, Tableau, Splunk, Tidal scheduler and SQL Developer.

Languages: Classic ASP, ASP.NET 1.1/2.0/3.5, C#, VB.NET, Web Services, WCF, Java and Java Swing.

Scripting Languages: JavaScript, XML, XSD, LINQ, PowerShell.

Operating Systems: Linux, Windows 95/98/NT/2000/XP.

Project Planning: Microsoft Project Plan

Configuration Tools: Team Foundation Server 2010 and VSS

Management Methodology: Agile and Waterfall.

Relevant Experience

Zimmer Biomet, Warsaw, Indiana May. 2016 – Feb 2017

Informatica ETL Lead

Data Integration Hub (DIH) provides an effective mechanism to share data at Zimmer Biomet. In a DIH, the source application is called a “Publisher”, while consumers of the data are called “Subscribers”. DIH reduces the technical complexity of pulling data from multiple sources into a single subset of data. DIH also separates the distribution of data from the publishing system, allowing the source to send data once and DIH to handle its distribution to multiple subscribers at different frequencies.

Responsibility

Coordinating with the business stakeholders to understand the business requirements and working with SMEs to design the best possible solutions to meet those requirements.

Working with the technical team to develop custom solutions for automating various business processes and meeting specific client requirements.

Involved in design, development and implementation of enterprise data warehouse and Data-Mart models and Operational Data Store (ODS).

Based on the logic, used various transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner, XML and Stored Procedure in the mappings. Involved in extensive data profiling using IDQ (Analyst Tool) prior to data staging.

Review data marts / warehouse deliverables and suggest and co-create standards and guidelines.

Perform data quality and profiling activities, advise on data quality issues and provide analysis of data collection.

Develop mappings; define workflows and tasks, monitor sessions, export and import mappings and workflows in Dev, QA and Production environment.

Translating the requirements into design documents such as HLD, LLD and Mapping Integration and Transformation sheets.

Use the Data Validation Option to test the workflows and the respective data flow from source to target.

Involved in creating, monitoring, modifying, & communicating the project plan with other team members.

Technologies: Informatica PowerCenter, Informatica MDM, Oracle 11g, XML, XSD, DB2, Informatica PowerExchange.

Boston Scientific Corporation, St Paul, Minnesota and INDIA Aug. 2014 – May. 2016

Lead ETL Developer

Responsibility

Worked with Business users in gathering the requirements and created Functional and ETL specification documents.

Created mapping documents and performed source-to-target analysis.

Worked on PowerCenter Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Used various transformations like Filter, Aggregator, Source Qualifier, Router, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop mappings

Implemented slowly changing dimensions (SCD) Type 1 and Type 2 for some of the Tables as per user requirement.

Developed mapping parameters and variables to support SQL override and ongoing requirements

Loaded the transformed and aggregated data into downstream systems.

Involved in Performance tuning at source, target, mappings and sessions by finding the performance bottlenecks.

Provided end-to-end support to business users to correct any post-implementation issues.

Managed and led the offshore team single-handedly.

Generated and validated the reports in Tableau.

Performed data validation between source and target systems.

Monitored the scheduled batch jobs that execute the workflows for any issues or failures.

Designed complex SQLs against source systems and supported unit testing and QA validation

Prepared validation scripts and validated the data against the source systems (a minimal SQL sketch follows below).

Technologies: Informatica PowerCenter 9.5.1, Web Services, SQL, PL/SQL, SQL Server, XML, XSD and Oracle 11g.
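Illustrative sketch only (the schemas, tables and columns below are hypothetical, not taken from the project): the validation scripts mentioned above typically compare source and target row counts and flag rows that did not make it into the target, along these lines:

    -- Hypothetical names; illustrates the source-to-target validation pattern only.
    -- 1) Compare row counts between the source table and its target counterpart.
    SELECT (SELECT COUNT(*) FROM src_stage.orders)    AS src_cnt,
           (SELECT COUNT(*) FROM dw_mart.orders_fact) AS tgt_cnt
    FROM   dual;

    -- 2) Rows present in the source but missing (or differing) in the target.
    SELECT order_id, order_dt, order_amt
    FROM   src_stage.orders
    MINUS
    SELECT order_id, order_dt, order_amt
    FROM   dw_mart.orders_fact;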

BCP Migration Assessment (Banco de Crédito del Perú), Lima, Peru July 2013 – Aug 2014

Lead ETL Developer

Responsibility

Worked with Business Analysis team in gathering the requirements and created Functional and ETL specification documents.

Using Informatica PowerCenter Designer, analysed the source data and extracted and transformed it from various source systems (Oracle 10g and flat files) by incorporating business rules using the different objects and functions that the tool supports.

Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 for some of the tables as per user requirements (a minimal SQL sketch of the Type 2 logic follows below).

Performed data analysis of the ODS system and WebFOCUS reports.

Designed mappings between source systems to ODS.

Loaded the transformed and aggregated data into Salesforce.

Monitored the scheduled batch jobs that execute the workflows for any issues or failures.

Responsible for efforts estimation and allocation of activities to the offshore team members

Involved in use cases, unit testing and production support.

Prepared validation scripts and validated the data against source system

Supporting the System testing and User Acceptance Testing.

Coordination with offshore and onsite teams for status reporting.

Technologies: Informatica PowerCenter, SQL, PL/SQL, Oracle SQL Developer, Oracle, SQL Server and UNIX.
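The SCD Type 2 handling referenced above was built with Informatica transformations (Lookup, Expression, Update Strategy); as a hedged illustration only, with a hypothetical customer dimension and staging table, the equivalent Type 2 logic expressed in SQL is roughly:

    -- Hypothetical tables dim_customer / stg_customer; sketch of SCD Type 2 logic only.
    -- Step 1: expire the current version of any customer whose tracked attributes changed.
    UPDATE dim_customer d
    SET    d.current_flag = 'N',
           d.effective_end_dt = TRUNC(SYSDATE) - 1
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND   (s.address <> d.address OR s.segment <> d.segment));

    -- Step 2: insert a new current version for changed and brand-new customers.
    INSERT INTO dim_customer
           (customer_key, customer_id, address, segment,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.segment,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y'
                       AND    d.address = s.address
                       AND    d.segment = s.segment);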

Nationwide Building Society, Swindon, UK June 2012 – July 2013

EIT and 7DAS Account Switcher.

ETL Developer

Responsibility

Connected with BAs and came up with high-level design documents.

Designed mappings between source systems to ODS

Created tables, keys (unique and primary) and indexes in the MS SQL Server database (a minimal DDL sketch follows below).

Extracted data from SQL Server (source) to build an Operational Data Store. Applied business logic to load the data into the local operational Data Warehouse.

Worked on Source Analyzer, Target Designer, Mapping Designer and Transformation Designer.

Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop the mappings.

Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.

Involved in Unit, Integration, System, and Performance testing levels.

Wrote documentation to describe program development, logic, coding, testing, changes and corrections.

Migrated the code into QA (Testing) and supported QA team and UAT (User).

Worked as a fully contributing team member, under broad guidance, with independent planning and execution responsibilities.

Responsible for efforts estimation and allocation of activities to the offshore team members

Monitored the scheduled batch jobs that execute the workflows for any issues or failures.

Involved in use cases, unit testing and production support.

Technologies: Informatica PowerCenter, PL/SQL, SQL Server, Oracle 11g, flat files, MS Access.
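As a hedged sketch only (the table, constraint and index names are hypothetical, chosen for illustration), the kind of SQL Server DDL created for the project looked like:

    -- Hypothetical table with primary key, unique key and a supporting index (T-SQL).
    CREATE TABLE dbo.AccountSwitch (
        SwitchId      INT IDENTITY(1,1) NOT NULL,
        AccountNumber VARCHAR(20)       NOT NULL,
        SwitchDate    DATE              NOT NULL,
        StatusCode    CHAR(2)           NOT NULL,
        CONSTRAINT PK_AccountSwitch PRIMARY KEY CLUSTERED (SwitchId),
        CONSTRAINT UQ_AccountSwitch UNIQUE (AccountNumber, SwitchDate)
    );

    -- Non-clustered index to support lookups by status and date.
    CREATE NONCLUSTERED INDEX IX_AccountSwitch_Status
        ON dbo.AccountSwitch (StatusCode, SwitchDate);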

Banco Pichincha, Quito, Ecuador and India Oct 2010 – Apr 2012

Informatica Developer

Responsibility

Analysed design documents for Informatica mappings.

Developed complex mappings using Informatica PowerCenter Designer to transform and load the data from various source systems such as flat files (fixed-width and delimited) and SQL Server into the Oracle target database.

Extensively used various types of transformations such as Expression, Joiner, Update Strategy, Filter, Aggregator and Lookup (connected and unconnected) to load the data.

Efficiently used mapping parameters and parameter files across different mappings and sessions, based on various business requirements (a minimal sketch follows below).

Performed mapping optimizations on various transformations in the mappings to ensure maximum efficiency.

Implemented complex mappings such as Slowly Changing Dimensions (Type II) using a current flag and effective date range.

Effectively used various reusable and non-reusable tasks such as Command, Session, Decision, Event Raise, Event Wait and Email.

Performed unit testing and peer testing.

Participated in technical discussions with the client and senior management.

Regularly interacted with the client and attended status calls on a daily basis.

Technologies: Informatica PowerCenter, flat files, Oracle, SQL Server, UNIX, Windows NT.
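A minimal sketch of the mapping-parameter usage noted above, assuming a hypothetical mapping parameter $$LAST_EXTRACT_DATE and a hypothetical source table: the parameter is declared in the mapping, referenced in the Source Qualifier SQL override, and its value is supplied per run through the session's parameter file, so the same mapping serves different load windows without code changes.

    -- Source Qualifier SQL override (hypothetical table and column names).
    -- Informatica substitutes $$LAST_EXTRACT_DATE from the parameter file before execution.
    SELECT t.txn_id,
           t.account_id,
           t.txn_amount,
           t.txn_date
    FROM   src_transactions t
    WHERE  t.txn_date > TO_DATE('$$LAST_EXTRACT_DATE', 'YYYY-MM-DD')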

Fiserv, DCU (XP2 Platform) INDIA - Noida June. 2007 – March 2010

Sr. Software Engineer

Responsibility

Understanding workflow rules and system requirements; designing logical and physical system flows.

Responsible for producing a reproducible build from the files placed in the software configuration management system.

Incorporated enhancements and change requests.

Involved in system/integration testing.

Development in ASP.Net, C#, HTML, JavaScript and Unit testing

Unit testing, debugging and deployment of the system.

Documenting the processes as per standards.

Technologies: ASP.NET 2.0, C#, TFS, Web Services, WCF, SQL Server 2005, etc.

Monster – BCS Rewrite, Noida, INDIA March, 2006 – June 2007

Sr. Software Engineer

Responsibility

Analysed requirements, and designed and developed the web application.

Involved in Analysis, Requirements Gathering and documenting Functional & Technical specifications into development requirement document

Development in ASP.Net, C#, HTML, CSS and JavaScript.

Created database, tables and stored procedures.

Interacting with onsite team and end users.

Involved in Unit testing, debugging and deployment of the system in QA, UAT and PROD environments.

Documenting the processes as per standards.

Technologies: ASP, ASP.NET 2.0, C#, TFS, Web Services, SQL Server 2005, JavaScript, etc.


