
Data Manager

Location:
Hyderabad, Telangana, India
Posted:
June 24, 2016


Resume:

Tarun Kumar

Sr. ETL/Informatica Developer

Phone: 832-***-****

E-mail: ******@********.***

SUMMARY

Over 8 years of experience in Information Technology, with a strong background in database development, data warehousing, and ETL processes using Informatica Power Center 9.x/8.x/7.1.3/7.1.1/6.2 and Power Exchange, including the Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Metadata Manager, Mapplet Designer, Transformation Developer), Repository Manager, Repository Server, Workflow Manager, and Workflow Monitor.

Experience integrating various data sources with multiple relational databases such as DB2, Oracle, and SQL Server, and with flat files (fixed-width and delimited).

Extensive experience developing ETL processes supporting data extraction, transformation, and loading using Informatica Power Center.

Expertise in full life cycle of ETL (Extraction, Transformation and Loading) using Informatica Power Center (Repository Manager, Server Manager, Mapping Designer, Workflow Manager, Workflow monitor).

Involved in data analysis for source and target systems.

Implemented performance tuning techniques at the application, database, and system levels.

Experience in data extraction from heterogeneous sources using Informatica Power Center.

Experience in UNIX shell programming.

Solid understanding of Relational (ROLAP) and Multidimensional (MOLAP) modeling, broad understanding of data warehousing concepts, star and snowflake schema database design methodologies, and metadata management.

Expertise in working with Oracle Stored Programs, Packages, Cursors, Triggers, Database Links, Snapshots, Roles, Privileges, Tables, Constraints, Views, Indexes, Sequences, Synonyms, Dynamic SQL, and SQL*Loader in a distributed environment.

Expertise in working with different data sources like flat files, XML, Teradata, DB2, and Oracle.

Experience in SQL-tuning using Hints, Materialized Views.

Expertise in understanding fact tables, dimension tables, and summary tables.

Involved in designing and building Universes using Designer.

Good exposure to development, testing, debugging, implementation, documentation, user training & production support.

Ability to work effectively in supervisory and non-supervisory environments, as a team member as well as individually.

Excellent analytical, programming, written and verbal communication skills with ability to interact with individuals at all levels.

Good experience with onsite and offshore coordination.

TECHNICAL SKILLS

ETL Tools

Informatica (Power Center/Power Mart) 9.6/9.5/9.1/9/8.6.1/8.1/7.1.2/6.2/5.1, Power Exchange, Power Connect, SQL Server SSIS/ DTS, Datastage 8.x/7.x

Databases

Oracle 10g/9i/8i, IBM DB2 UDB, Sybase, MS SQL Server 2008/2005/2000, Teradata V2R12/V2R6, SAP HANA

Programming Languages

C, C++, SQL, PL/SQL, UNIX Shell Scripting, XML, Java, .NET, C#

BI Tools

Business Objects XI r3.1/r2/6.5.1/6.1a/5.1, Cognos

Web Technologies

JavaScript 1.2, HTML 4.0

Others

TOAD, SQL*Loader, MS Office, WinSCP (FTP), Autosys, Rational ClearCase/ClearQuest/ReqPro, Control-M, Tivoli (IBM), MS Visio, Harvest, Mercury Quality Center (defect tracking), ESP Scheduler

Operating Systems

Windows 2003/2000/NT, UNIX (Sun Solaris, Linux, HP-UX)

PROFESSIONAL EXPERIENCE

Merkle Inc., Denver, CO January 2016 – Present

Role: Sr. Informatica Consultant

Responsibilities:

• Ongoing development and maintenance of several client applications designed to provide a wide range of CRM-related functionality.

• Gathered requirements and applied data warehousing concepts, including dimensional models.

• Designed and developed new system modules based on client end-user business requirements, with a focus on application stability and performance.

• Applied development methodologies, techniques, and tools, including SVN, issue tracking, and software development lifecycles (waterfall, agile).

• Designed and developed ETL solutions using Informatica Power Center 9.x.

• Worked with Leads/BAs to understand requirements for project development.

• Analyzing NETEZZA stored procedures.

• Database performance-tuning expertise.

• RDBMS experience with Oracle and SQL Server.

• Expertise writing complex SQL.

• Reverse engineered NETEZZA stored procedures to design Informatica source-to-target mappings.

• Used Microsoft Visio to design Informatica mappings.

• Used various transformations in Informatica mappings to reflect the logic of the NETEZZA stored procedures.

• Worked with SQL tools such as SQL Developer, Management Studio, and Visual Studio to run SQL queries and validate the data.

• Created multiple views in SQL Server to reduce the load on Informatica and push more work to the database.

• Performed Unit testing and Integrated testing for the mappings created.

• Fine-tuned the existing Informatica mappings for better performance.

• Used Parameter files to specify DB Connection parameters for sources.

• Used debugger to test the mapping and fixed the bugs.

• Developed Unit Test Cases to ensure successful execution of the data loading processes.

Environment: Informatica Power Center 9.6.1, SQL Server, Oracle 11g, TOAD, Oracle Exadata (OLAP), PL/SQL Developer, NETEZZA, Aginity Workbench, Management Studio, Visual Studio, Windows 7, UNIX

BB&T Bank, Winston-Salem, NC November 2014 – December 2015

Role: Sr. ETL/Informatica Developer

Responsibilities:

• Attended workshops with users for requirements gathering, business analysis, and design of the enterprise data warehouse.

• Involved in requirement analysis, ETL design and development for extracting data from the heterogeneous source systems like Oracle, SQL server, flat files, and mainframe files and loading into Data Landing Zone (DLZ) and eventually to SAP HANA Database.

• Involved in data profiling prior to data loading.

• Worked with Informatica versions 9.5 and 9.6.

• Maintained warehouse metadata, naming standards and warehouse standards for future application development.

• Extensively used Informatica client tools Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager, Informatica Data validation (DVO), Informatica Data Quality (IDQ).

• Designed and developed ETL routines using Informatica Power Center; within mappings, made extensive use of Lookups, Aggregators, mapplets, connected and unconnected stored procedures/functions/lookups, and SQL overrides in Lookups.

• Created Excel mapping documents using Macros with the information gathered from DAW (Data Acquisition Worksheet)

• Extracted data from various source systems like mainframe files, flat files, DB tables etc.

• Created complex UNIX Wrapper scripts to call Workflows from UNIX to execute Informatica mappings.
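
A minimal sketch of such a wrapper is shown below; the service, domain, folder, and workflow names are placeholders, not taken from any actual project, and the real invocation is attempted only if pmcmd is installed:

```shell
#!/bin/sh
# Sketch of a UNIX wrapper that starts an Informatica workflow via pmcmd.
# All names below (service, domain, user, folder, workflow) are hypothetical.
INFA_SVC="is_dev"        # integration service name (assumed)
INFA_DOMAIN="dom_dev"    # domain name (assumed)
INFA_USER="etl_user"     # the password would normally come from an env variable
FOLDER="DLZ_LOADS"
WORKFLOW="wf_stg_customer"

# -wait makes pmcmd block until the workflow completes, so the exit code
# can drive downstream scheduling decisions.
CMD="pmcmd startworkflow -sv $INFA_SVC -d $INFA_DOMAIN -u $INFA_USER -f $FOLDER -wait $WORKFLOW"

if command -v pmcmd >/dev/null 2>&1; then
    $CMD
    RC=$?
    if [ "$RC" -ne 0 ]; then
        echo "ERROR: workflow $WORKFLOW failed with rc=$RC" >&2
        exit "$RC"
    fi
else
    echo "pmcmd not on PATH; would run: $CMD"
fi
```

In practice a wrapper like this would also capture the session log location and notify the scheduler on failure.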

• Extensively worked on control files (handshakes with data files), which are required for data validation.

• Scheduled UNIX scripts using CA Workload Automation ESP edition scheduler to run daily/monthly jobs.

• Loaded data into SAP HANA from Data landing zone (DLZ) for downstream teams who were trying to load data into bank analyzer.

• Used SAP HANA Studio for unit testing and to validate the loaded data using complex queries.

• Created several Connect:Direct (CDD) jobs to land flat-file sources from external vendors onto the Informatica server.

• Used HP ALM to track defects during the System Integration Testing and User Acceptance Testing phases.

• Conducted defect triage calls with the testing team for defects encountered during testing.

• Created Technical Specification documents for the ETL code that was designed and developed.

• Conducted daily workshops with the offshore team regarding defect triage and ESP scheduling.

• Performed match/merge and ran match rules to check the effectiveness of MDM process on data.

• Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.

• Involved in implementing the Land process of loading the customer/product data set into Informatica MDM from various source systems.

• Supported Informatica jobs after production deployment, providing 24x7 support for one month until the end of the warranty period.

Environment: Informatica Power Center 9.5/9.6, SAP HANA, MS Access, MS SQL Server 2008, SQL, SQL*Plus, TOAD, CA Workload Automation ESP Edition, Windows XP, UNIX, HP ALM

US Cellular, Chicago, IL February 2014 – November 2014

Role: Sr. ETL/Informatica Developer

Responsibilities:

• Collaborated with business analysts for requirements gathering, business analysis, and design of the enterprise data warehouse.

• Involved in requirement analysis, ETL design and development for extracting data from the heterogeneous source systems like Oracle, flat files, XML files and loading into Staging and Data Ware House Atomic and Info delivery layers.

• Involved in massive data cleansing prior to data staging.

• Maintained warehouse metadata, naming standards and warehouse standards for future application development.

• Designed Type 1 and Type 2 mappings.

• Extensively used Informatica client tools Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager.

• Designed and developed ETL routines using Informatica Power Center; within mappings, made extensive use of Lookup, Aggregator, XML, and Rank transformations, mapplets, connected and unconnected stored procedures/functions/lookups, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.

• Created complex mappings with shared objects/Reusable Transformations/ Mapplets using mapping/ mapplet Parameters/Variables.

• Configured workflows with Email tasks to send mail with the session log on session failure and for target failed rows.

• Used Server Manager to create schedules, monitor sessions, and send error messages to the concerned personnel in case of process failures.

• Created sequential/concurrent sessions and batches for data loading, and used pre- and post-session SQL scripts to meet business logic.

• Extensively used pmcmd commands at the command prompt and executed UNIX shell scripts to automate workflows and populate parameter files.
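
As an illustration of populating a parameter file from a shell script, a minimal sketch (the folder, workflow, and connection names are illustrative, not from the project) might be:

```shell
#!/bin/sh
# Sketch: generate an Informatica parameter file for a daily run.
# Folder, workflow, and connection names are placeholders.
PARAM_FILE="/tmp/wf_daily_load.param"
LOAD_DATE=$(date +%Y-%m-%d)

# \$\$ and \$ are escaped so the literal $$/$ parameter syntax reaches the file.
cat > "$PARAM_FILE" <<EOF
[Folder_EDW.WF:wf_daily_load]
\$\$LoadDate=$LOAD_DATE
\$DBConnection_Src=ORA_SRC_DEV
\$DBConnection_Tgt=ORA_TGT_DEV
EOF

echo "Parameter file written to $PARAM_FILE"
# The workflow would then be started with something like:
#   pmcmd startworkflow ... -paramfile "$PARAM_FILE" wf_daily_load
```

Regenerating the file on each run keeps run-specific values such as the load date out of the mappings themselves.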

• Worked with UNIX shell scripts extensively for job execution and automation.

• Scheduled Informatica workflows using Tivoli Workflow scheduler to run at regular intervals.

• Extensively worked on Data Indexing and Data Partitioning.

• Used SQL tools like Query Analyzer and TOAD to run SQL queries and validate the data.

• Tuned mappings for optimum performance, dependencies, and batch design.

• Designed and developed the logic for handling slowly changing dimension table loads by flagging records with an Update Strategy to populate the desired target records.

• Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.

• Involved in Unit testing, User Acceptance Testing to check whether the data is loading into target, which was extracted from different source systems according to the user requirements.

Environment: Informatica Power Center 9.1.0, Oracle 10g/11g, MS Access, MS SQL Server 2008, SQL, PL/SQL, T-SQL, SQL*Plus, TOAD, Tivoli Workflow Scheduler (TWS), Windows XP, UNIX, Oracle Applications 11i, Sun Solaris

Windstream Communications, Cedar Rapids, IA August 2012 – January 2014

Role: Sr. ETL/Informatica Developer

Responsibilities:

Responsible for gathering functional and technical requirements and for documentation.

Worked with the business analysts in requirement analysis to implement the ETL process.

Requirement analysis and documentation; developed functional and technical specifications, DW and ETL designs, detailed mapping specifications, DFDs, and scheduling charts.

Developing data models and designing data warehouse in view of the project requirements.

Designing, developing, testing, performance tuning and scheduling Datastage jobs.

Developing and implementing data masking, encoding, decoding measures using the Datastage and UNIX scripting.

Extensively worked on Datastage routines, custom stage and wrappers to handle complex transformations, calculations, encoding, decoding etc.

Configuring the Datastage server for enhanced performance and resolving memory scarcity issues.

Developed both batch and real time ETL and reporting process in Datastage.

Optimized SQL queries and streamlined the data flow processes.

Used IDE (Informatica Data Explorer) for data cleansing and data profiling.

Analyzed the business requirements and functional specifications.

Extracted data from Oracle databases and spreadsheets, staged it in a single place, and applied business logic to load it into the central Oracle database.

Extensively worked on Informatica Data Explorer (IDE) to profile data and monitor the data issues.

Used Informatica Power Center 9 for extraction, transformation and load (ETL) of data in the data warehouse.

Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence generator and Stored Procedure.

Expertise in developing high performance code using Informatica Power center and Teradata load scripts.

Developed complex mappings in Informatica to load the data from various sources.

Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.

Used FTP connections to write the target to a different remote location.

Parameterized the mappings and increased the re-usability.

Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.

Created procedures to truncate data in the target before the session run.

Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.

Used the PL/SQL procedures for Informatica mappings for truncating the data in target tables at run time.

Extensive knowledge in creating slowly changing dimensions (Type-1 and Type-2).

Extensively used Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.

Created a list of the inconsistencies in the data load on the client side to review and correct the issues on their side.

Created the ETL exception reports and validation reports after the data is loaded into the warehouse database.

Wrote documentation describing program development, logic, coding, testing, changes, and corrections.

Created Test cases for the mappings developed and then created integration Testing Document.

Followed Informatica recommendations, methodologies and best practices.

Environment: Informatica Power Center 8.6/9, Oracle 10g/11g, MS Access, MS SQL Server 2008, SQL, PL/SQL, T-SQL, SQL*Plus, TOAD, Erwin, Windows XP, UNIX, Oracle Applications 11i.

Sutter Health Support Services, Mather, CA March 2011 – Aug 2012

Role: Sr. ETL Developer

Responsibilities:

Analyzing the source data coming from different sources and working with business users and developers to develop the Model.

Involved in dimensional modeling to design and develop the star schema, using Erwin to design fact and dimension tables.

Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.

Worked with MLOAD, FASTLOAD, TPUMP and BTEQ utilities of Teradata for faster loading and to improve the performance.

Created customized MLOAD scripts on the UNIX platform for Teradata loads.
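
The general shape of a generated MultiLoad control script, with placeholder logon credentials, table names, and file paths (the mload call itself is only attempted if the utility is installed), is roughly:

```shell
#!/bin/sh
# Sketch: write a minimal Teradata MultiLoad control script and run it if
# the mload utility exists. All identifiers below are hypothetical.
MLOAD_SCRIPT="/tmp/ld_claims.mload"

cat > "$MLOAD_SCRIPT" <<'EOF'
.LOGTABLE etl_wrk.ld_claims_log;
.LOGON tdpid/etl_user,etl_pass;
.BEGIN MLOAD TABLES edw.claims;
.LAYOUT claims_layout;
  .FIELD claim_id   * VARCHAR(18);
  .FIELD claim_amt  * VARCHAR(14);
.DML LABEL ins_claims;
  INSERT INTO edw.claims (claim_id, claim_amt)
  VALUES (:claim_id, :claim_amt);
.IMPORT INFILE /data/in/claims.dat
  FORMAT VARTEXT ','
  LAYOUT claims_layout
  APPLY ins_claims;
.END MLOAD;
.LOGOFF;
EOF

if command -v mload >/dev/null 2>&1; then
    mload < "$MLOAD_SCRIPT"
else
    echo "mload not installed; control script written to $MLOAD_SCRIPT"
fi
```

A customized wrapper would typically substitute the data file name, date, and target table into this template per run.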

Created Test tables and worktables on development and production on Teradata.

Developed views on departmental and claims engine database to get the required data.

Developed application views in Teradata and using expression and router implemented the Change Data Capture (CDC) process.

Involved in source data profiling & source system analysis.

Involved in designing the Logical and Physical data models for the data warehouse

Developed SQL code for data validations and data computation process on source DB2 transaction system and on target warehouse

Involved in massive data cleansing prior to data staging.

Developed shell scripts, cron jobs for job execution and automation on server side.

Defined strategies for ETL processes, procedures, and operations.

Prepared a handbook of standards and Documented standards for Informatica code development.

Administered users and user profiles and maintained the repository server.

Tuned complex mappings at the source, target, mapping, and session levels.

Extensively used pmcmd commands at the command prompt and executed UNIX shell scripts to automate workflows and populate parameter files.

Extensively used transformations like lookup, router, Aggregator, sequence generator, filter, update strategy, joiner.

Handled slowly changing dimensions of Type1/ Type 2 to populate current and historical data to dimensions and fact tables in the Data Warehouse.

Document the process for further maintenance and support.

Environment: Informatica Power Center 8.5.1/7.1.4, Teradata v2r12, Business Object XI, Oracle 10g, MS SQL Server, Toad, PL/SQL, SQL, XML

Ecolab, St. Paul, MN Sep 2009 – Jan 2011

Role: Sr. Informatica Developer

Responsibilities:

Involved with requirement gathering and analysis for the data marts focusing on data analysis, data quality, data mapping between ODS, staging tables and data warehouses/data marts.

Designed and developed processes to support data quality issues and detection and resolutions of error conditions.

Working with the Business Analysts and the QA team for validation and verification of the development.

Extracted data from flat files, Oracle, DB2, mainframe files, and SQL Server 2008 to load into the target database.

Wrote T-SQL scripts to validate and correct inconsistent data in the staging database before loading data into databases.

Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions.

Implemented various scenarios related to slowly growing targets and slowly changing dimensions (Type 1, Type 2, Type 3).

Implemented various business rules of data transformation using Informatica transformations such as Normalizer, Source Qualifier, Update Strategy, Lookup (connected/unconnected, static/dynamic cached), Sequence Generator, Expression, Aggregator, XML (source and generator), and Stored Procedure.

Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).

Worked with newer Informatica transformations like Java transformation, Transaction Control.

Used Teradata as a source and a target for a few mappings. Worked with Teradata loaders within Workflow Manager to configure FastLoad and MultiLoad sessions.

Experience with Teradata as the target for the data marts; worked with BTEQ, FastLoad, and MultiLoad.

Provided administrative functions like creating repositories, backing up repositories, setting up users, assigning permissions and setting up folders in Repository manager.

Wrote shell script utilities to detect error conditions in production loads and take corrective actions, wrote scripts to backup/restore repositories, backup/restore log files.

Heavily involved with performance tuning of the Oracle database: used the TKPROF utility, worked with partitioned tables, implemented a layer of materialized views to speed up lookup queries, used bitmap indexes on dimension tables, updated statistics with the DBMS_STATS package, and applied SQL hints.

Wrote PL/SQL stored procedures/functions to read and write data for the Control Processes at ODS and CDM levels.

Extensively used the pmcmd command to invoke workflows from UNIX shell scripts.

Scheduled workflows using an AutoSys job plan.

Did QA of ETL processes, migrated Informatica objects from development to QA and production using deployment groups.

Provided production support and involved with root cause analysis, bug fixing and promptly updating the business users on day-to-day production issues.

Coordinated with the offshore teams and mentored junior developers.

Environment: Informatica Power Center 8.6, Oracle 10g/9i, Autosys, Erwin 4.5, CMS, MS PowerPoint, MS Visio, TOAD 9.0, PL/SQL, UNIX, SQL*Loader, MS SQL Server 2005/2008.

Citibank, NY Oct 2008 – Aug 2009

Role: Sr. DW Consultant

Responsibilities:

Worked with users to understand their reporting requirements and translate those requirements to extract data, and load data in the form of a report.

Tested Informatica 8.0 with all functionalities for the migration from Informatica 6.2/7.1.2 to 8.0.

Migrated repositories and folders from 7.1.2 to 8.0.

Analyzed business and systems specifications and developed logic flowcharts

Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Java, Update Strategy, joiner and Rank transformations.

Implemented Informatica Power Center for building Star Schema in Oracle Data Warehouse from different OLTP systems

Implemented a bulk load method with SQL*Loader for loading history data into the staging area.
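
A minimal sketch of such a history load, assuming a hypothetical staging table and data file (the sqlldr call runs only if the utility is present), could look like:

```shell
#!/bin/sh
# Sketch: SQL*Loader control file for a bulk (direct-path) history load.
# Table, column, and file names are placeholders.
CTL="/tmp/hist_load.ctl"

cat > "$CTL" <<'EOF'
LOAD DATA
INFILE '/data/hist/accounts_hist.dat'
APPEND
INTO TABLE stg_accounts_hist
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(account_id, open_dt DATE 'YYYY-MM-DD', balance)
EOF

if command -v sqlldr >/dev/null 2>&1; then
    # direct=true uses the direct path engine, which bypasses much of the
    # SQL layer and is the usual choice for large one-off history loads.
    sqlldr userid=stg_user/stg_pass control="$CTL" log=/tmp/hist_load.log direct=true
else
    echo "sqlldr not available; control file written to $CTL"
fi
```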

Defined UNIX Batch scripts for automation of execution of Informatica workflows.

Executed MultiLoad scripts for daily batch jobs.

Responsible for tuning ETL procedures and schemas to optimize load and query performance.

Implemented business rules by using database triggers.

Generated Drill Up, Drill Down and Drill Through reports using Business objects based on user requirements.

Improved application performance by fine-tuning with TKPROF and EXPLAIN PLAN.

Created several materialized views for reporting purpose and better performance.

Expertise in setting up UNIX cron jobs using crontab.
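
For reference, a crontab entry of the kind described (the script path and log location are placeholders) might look like:

```shell
#!/bin/sh
# Illustrative cron entry: run a nightly load wrapper at 02:30 and append
# stdout/stderr to a log. Paths are hypothetical.
CRON_ENTRY='30 2 * * * /apps/etl/bin/run_nightly_load.sh >> /apps/etl/logs/nightly.log 2>&1'
echo "$CRON_ENTRY"

# To install it without clobbering existing entries (not executed here):
#   (crontab -l 2>/dev/null; echo "$CRON_ENTRY") | crontab -
```

The five leading fields are minute, hour, day of month, month, and day of week; redirecting both streams into one log simplifies morning checks.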

Scheduled Informatica jobs using the AutoSys scheduler to run at regular intervals.

Experience in SQL tuning using hints and materialized views.

Extensively worked in the performance tuning for the mappings and ETL Procedures both at designer and session level.

Used Dynamic SQL and SQL*Loader in a distributed environment.

Expertise in working with Oracle Stored Programs, Packages, Cursors, Triggers, Database Link, Snapshot, Tables, Constraints, Views, Indexes, Sequences.

Coordinated successfully with the offshore clients.

Environment: Informatica Power Center 8.0/7.1.2, Oracle 10g, SQL, PL/SQL, Business Objects, UNIX, Shell Scripts, TOAD 8.0, AutoSys, Sun Solaris

Providian Financials, India Sept 2006 – Sept 2008

Role: DW Developer

Responsibilities:

Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various needs of the transformations while loading the data.

Used Informatica Designer for developing mappings with transformations including aggregation, update, lookup, and summation. Developed sessions using Server Manager and improved performance.

Created transformations like Aggregate, Expression, Filter, Sequence Generator, Joiner, and Stored procedure transformations.

Created reusable transformations (mapplets) and used them in mappings where transformations were reused across mappings.

Involved in the process design documentation of the Data Warehouse Dimensional Upgrades. Extensively used Informatica for loading the historical data from various tables for different departments.

Involved in creating Technical Specification Document (TSD) for the project.

Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.

Involved in the development of Data Mart and populating the data marts using Informatica.

Created sessions to run the mappings. Created mapplets to improve the Performance.

Worked with offshore clients and maintained a good relation with them.

Environment: Informatica 5.1, Windows 2000, Oracle 8i, PL/SQL, SQL, Sybase, Cognos


