ETL Informatica Developer

Location:
Hyderabad, Telangana, India
Posted:
October 20, 2021

Resume:

Mahesh

ETL Informatica Developer

ado3sv@r.postjobfree.com

630-***-****

Professional Summary:

8+ years of professional experience as a software developer in Information Technology, spanning analysis, design, development, testing, data warehousing, and ETL processes using Informatica PowerCenter 9.5.1/10.1/10.2/10.4, Snowflake, DB2, MS SQL Server 2008/12, and Oracle 11g.

Over 7 years of experience in ETL (Extract, Transform, Load) projects using data warehousing tools such as Informatica and databases including Oracle, MySQL, and SQL Server, with basic knowledge of Teradata.

Experience in change implementation, monitoring and troubleshooting of AWS Snowflake databases and cluster related issues.

Worked in all phases of the SDLC, from requirement gathering, design, development, and testing through production deployment, user training, and production support.

Good understanding of all phases of the SDLC (System Development Life Cycle), including Planning, Analysis, Design, Implementation, and Maintenance.

Experience integrating various data sources with multiple relational databases such as Oracle and SQL Server, and integrating data from flat files (fixed-width and delimited).

Worked with T-SQL to develop complex stored procedures, triggers, functions, views, indexes, cursors, SQL joins, and dynamic SQL queries.

Experience with ETL tool Informatica in designing and developing complex Mappings, Mapplets, Transformations, Workflows, Worklets, and scheduling the Workflows and sessions.

Experience using the Debugger to validate mappings and gain troubleshooting information about data and error conditions.

Developed mappings/Reusable Objects/Transformation/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica Power Center.

Worked extensively with complex mappings using different transformations like Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected / Connected Lookups and Aggregators and Normalizers.

MFT (managed file transfer) was used for the transmission of the extracted files to the vendor.

Set up SFTP connections to various vendors for receiving and sending of encrypted data as part of MFT Integration team support.

Implemented Slowly Changing Dimensions (SCD) and Change Data Capture (CDC) using Informatica; an illustrative Type 2 sketch follows this summary.

Experience in using the Informatica command line utilities like pmcmd to execute workflows in non-windows environments.

Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.

Strong analytical, problem-solving, communication, learning and team skills.

Experience in using automation and scheduling tools like AutoSys and Control-M.
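For illustration, a minimal T-SQL sketch of the expire-and-insert pattern behind the Type 2 slowly changing dimension loads noted above; the table and column names (STG_CUSTOMER, DIM_CUSTOMER, EFF_END_DT, CURRENT_FLG) are hypothetical placeholders, not objects from any specific project.

    -- Step 1: expire current dimension rows whose attributes changed in staging.
    UPDATE d
    SET    d.EFF_END_DT  = GETDATE(),
           d.CURRENT_FLG = 'N'
    FROM   DIM_CUSTOMER d
    JOIN   STG_CUSTOMER s
           ON s.CUSTOMER_ID = d.CUSTOMER_ID
    WHERE  d.CURRENT_FLG = 'Y'
      AND  (s.CUSTOMER_NAME <> d.CUSTOMER_NAME OR s.ADDRESS <> d.ADDRESS);

    -- Step 2: insert a new current version for new and changed customers.
    INSERT INTO DIM_CUSTOMER (CUSTOMER_ID, CUSTOMER_NAME, ADDRESS,
                              EFF_START_DT, EFF_END_DT, CURRENT_FLG)
    SELECT s.CUSTOMER_ID, s.CUSTOMER_NAME, s.ADDRESS,
           GETDATE(), '9999-12-31', 'Y'
    FROM   STG_CUSTOMER s
    LEFT JOIN DIM_CUSTOMER d
           ON d.CUSTOMER_ID = s.CUSTOMER_ID
          AND d.CURRENT_FLG = 'Y'
    WHERE  d.CUSTOMER_ID IS NULL;

In an Informatica mapping the same logic is typically expressed with Lookup, Expression, and Update Strategy transformations rather than hand-written SQL; the sketch only shows the underlying set logic.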

Technical Skills:

ETL Technology: Informatica PowerCenter 9.5.1/9.6.1/10.2, SSIS

Data Warehouse: Dimensional modeling, Facts, Star and Snowflake schemas, SCD types

Databases: T-SQL, Oracle 11g/10g, MS SQL Server 2008R2/2012/14

Scripts: UNIX, Linux, and shell scripting

Scheduling Tools: Control-M, Tidal, AutoSys

MFT: Informatica Managed File Transfer

Operating Systems: Linux, Windows 7/10

Professional Experience

SIGNET JEWELERS LIMITED, Coppell, TX April 2021 – Present

Application Development Analyst

Responsibilities:

Involved in the analysis of the user requirements and identifying the sources.

Created technical specification documents based on the requirements using source-to-target (S2T) documents.

Involved in the preparation of High-level design documents and Low-level design documents.

Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, ODS and Mart.

Administered the repository by creating folders and logins for the group members and assigning necessary privileges.

Designed and developed Informatica Mappings and Sessions based on business user requirements and business rules to load data from source flat files and Oracle tables to target tables.

Worked on various kinds of transformations like Expression, Aggregator, Stored Procedure, Lookup, Filter, Joiner, Rank, Router, Update Strategy and reusable Mapplets and Transformations.

Used debugger to debug mappings to gain troubleshooting information about data and error conditions.

Involved in monitoring the workflows and in optimizing the load times.

Used Change Data Capture (CDC) to simplify ETL in data warehouse applications; a minimal watermark-based sketch follows this list.

Involved in extensive performance tuning by identifying bottlenecks at various points such as targets, sources, mappings, sessions, and the system, which led to better session performance.

Interacted with vendors and set up SFTP connections to their FTP sites for transferring extracted files.

MFT (managed file transfer) was used for the transmission of the extracted files to the vendor.

Set up SFTP connections to various vendors for receiving and sending of encrypted data as part of MFT Integration team support.

Involved in Unit testing and system integration testing (SIT) of Informatica and MFT projects.

Created the mapping specification, workflow specification and operations guide for the Informatica projects and MFT run book as part of end user training.

Prepared UNIX shell scripts and scheduled them in AutoSys for automatic execution at specified times.

Used Rational ClearCase for version control of all files and folders (check-out, check-in).
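As a rough illustration of the timestamp-watermark style of change data capture mentioned above, the following T-SQL sketch uses an assumed ETL_CONTROL table and ORDERS source; all names are placeholders, not actual project objects.

    DECLARE @last_run DATETIME,
            @this_run DATETIME = GETDATE();

    -- Read the high-water mark left by the previous successful run.
    SELECT @last_run = LAST_EXTRACT_DT
    FROM   ETL_CONTROL
    WHERE  SOURCE_NAME = 'ORDERS';

    -- Extract only rows created or updated since that run.
    SELECT ORDER_ID, CUSTOMER_ID, ORDER_STATUS, LAST_UPDATE_DT
    FROM   ORDERS
    WHERE  LAST_UPDATE_DT >  @last_run
      AND  LAST_UPDATE_DT <= @this_run;

    -- Advance the watermark only after the downstream load succeeds.
    UPDATE ETL_CONTROL
    SET    LAST_EXTRACT_DT = @this_run
    WHERE  SOURCE_NAME = 'ORDERS';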

Environment: Windows, Informatica PowerCenter 10.2/10.4, Informatica Managed File Transfer (MFT), UNIX, Oracle 11g, T-SQL, SQL Developer, CA AutoSys, shell scripting, UAT and E2E testing.

Centene Corporation, St. Louis, MO Sept 2020 – April 2021

ETL DEVELOPER

Responsibilities:

Responsible for Business Analysis and Requirements Collection.

Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Parsed high-level design specification to simple ETL coding and mapping standards.

Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.

Prepared migration document to move the mappings from development to testing and then to production repositories.

Extracted the data from flat files and other RDBMS databases into the staging area and populated it into the data warehouse.

Maintained stored definitions, transformation rules, and target definitions using Informatica Repository Manager.

Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.

Developed mapping parameters and variables to support SQL overrides; a parameterized override sketch follows this list.

Developed mappings to load into staging tables and then to Dimensions and Facts.

Worked on different tasks in Workflows like sessions, event raise, event wait, decision, e-mail, command, worklets, assignment, timer, and scheduling of the workflow.

Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.

Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.

Used Debugger to test the mappings and fixed the bugs.

Wrote UNIX shell scripts and pmcmd commands for FTP of files from the remote server and backup of the repository and folders.
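To illustrate the parameter-driven SQL overrides mentioned above, a hedged sketch of a Source Qualifier override; the CLAIMS table, its columns, and the $$LOAD_START_DATE mapping parameter are assumed names, resolved from a session parameter file at run time.

    -- Source Qualifier SQL override (Oracle source assumed);
    -- $$LOAD_START_DATE is expanded by Informatica before execution.
    SELECT c.CLAIM_ID,
           c.MEMBER_ID,
           c.CLAIM_AMT,
           c.SERVICE_DT
    FROM   CLAIMS c
    WHERE  c.SERVICE_DT >= TO_DATE('$$LOAD_START_DATE', 'YYYY-MM-DD')

Changing the parameter file lets the same mapping run full or incremental loads without editing the override.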

Environment: Windows, Informatica PowerCenter 9.6/10.2, UNIX, Oracle 11g, SQL, PL/SQL, SQL Developer, AutoSys, shell scripting, UAT and E2E testing.

Bayview Asset Management, Coral Gables, FL Oct 2019 – Aug 2020

Informatica ETL Developer

Responsibilities:

Analyzed the business requirements and functional specifications.

Extracted data from the Oracle database and spreadsheets, staged it in a single location, and applied business logic to load it into the central Oracle database.

Used Informatica PowerCenter 10.1 for extraction, transformation, and load (ETL) of data in the data warehouse.

Built the logical and physical data model for snowflake as per the required changes.

Extensively used the Add Currently Processed Flat File Name port to load the flat file name, and the contract number derived from the file name, into the target.

Prepared various mappings to load the data into different stages such as Landing, Staging, and Target tables; a simplified landing-to-staging sketch follows this list.

Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy while designing and optimizing the mappings.

Developed Workflows using task developer, worklet designer, and workflow designer in Workflow manager and monitored the results using workflow monitor.

Designed workflows with many sessions using decision, assignment, event wait, and event raise tasks, and used the Informatica scheduler to schedule jobs.

Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions, and workflows.

Prepared UNIX shell scripts and scheduled them in AutoSys for automatic execution at specified times.

Created Test cases for the mappings developed and then created integration Testing Document.

Prepared the error handling document to maintain the error handling process.

Tidal Scheduler was implemented for scheduling of Informatica workflows.

Used pre- and post-session assignment variables to pass variable values from one session to another.

Performed unit testing at various levels of the ETL and actively involved in team code reviews.

Created jobs for automation of the Informatica Workflows and for DOS copy and moving of files using Tidal Scheduler.
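A simplified sketch of the landing-to-staging step described above, with the source file name carried as an audit column; the tables and columns are placeholders, not actual project objects.

    -- Landing holds the raw file rows; staging adds typing and audit columns.
    INSERT INTO STG_CONTRACT (CONTRACT_NO, CONTRACT_AMT, SRC_FILE_NAME, LOAD_DT)
    SELECT LTRIM(RTRIM(l.CONTRACT_NO)),
           CAST(l.CONTRACT_AMT AS DECIMAL(18,2)),
           l.SRC_FILE_NAME,        -- populated from the Currently Processed
           CURRENT_TIMESTAMP       -- Flat File Name port in the mapping
    FROM   LND_CONTRACT l
    WHERE  l.CONTRACT_NO IS NOT NULL;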

Environment: Informatica PowerCenter 10.1/9.6, Teradata 14/12, T-SQL, Oracle 11g, SQL, UNIX, PL/SQL, Tidal 3.1.11 jobs.

Union Bank, Los Angeles, CA Sep 2018 – Aug 2019

Informatica ETL Developer

Responsibilities:

Developed database objects such as user-defined functions and user-defined procedures using T-SQL scripts.

Developed high-performance T-SQL queries, complex joins, and advanced indexing techniques to optimize database operations; a representative procedure and index sketch follows this list.

Designed and developed ETL strategy to populate the Data Warehouse from various source systems such as Oracle, Flat files, XML, SQL server.

Used Informatica Designer to create complex mappings using different transformations like Source Qualifier, Expression, Lookup, Aggregator, and Update Strategy, Joiner, Filter and Router transformations to pipeline data to Data Warehouse/Data Marts.

Created and monitored Sessions/Batches using Informatica Server Manager/Workflow Monitor to load data into target Oracle database.

Worked on different tasks in Workflows like sessions, event raise, event wait, decision, e-mail, command, worklets, assignment, timer, and scheduling of the workflow.

Designed and developed Informatica PowerCenter mappings, sessions, and workflows as per requirements and migrated them to the production environment.

Used Source Analyzer and Target Designer to import the source and target database schemas, and Mapping Designer to develop mappings, applying transformations to the source data and mapping it to the target tables.

Developed Mappings to extract data from ODS to Data Mart, and monitored Daily, Weekly and Monthly Loads.

Implemented and created Slowly Changing Dimensions of Type 1 and Type 2 for storing historical data in the data warehouse using Informatica.

Worked along with UNIX team for writing UNIX shell scripts to customize the server scheduling jobs.

Involved in automation of batch processing to run Informatica Workflows using Autosys.
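A minimal, hypothetical example of the T-SQL procedure and indexing work described above; dbo.TRANSACTIONS and the procedure name are placeholders, not actual bank objects.

    -- Covering nonclustered index to support status-by-date reporting queries.
    CREATE NONCLUSTERED INDEX IX_TXN_STATUS_DT
        ON dbo.TRANSACTIONS (TXN_STATUS, TXN_DT)
        INCLUDE (ACCOUNT_ID, TXN_AMT);
    GO

    -- Parameterized procedure returning daily totals for one status.
    CREATE PROCEDURE dbo.usp_GetDailyTotals
        @TxnStatus VARCHAR(20),
        @FromDate  DATE,
        @ToDate    DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT TXN_DT,
               COUNT(*)     AS TXN_COUNT,
               SUM(TXN_AMT) AS TOTAL_AMT
        FROM   dbo.TRANSACTIONS
        WHERE  TXN_STATUS = @TxnStatus
          AND  TXN_DT BETWEEN @FromDate AND @ToDate
        GROUP  BY TXN_DT;
    END
    GO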

Environment: Informatica PowerCenter 9.5, Teradata 14/12, Oracle 11g, SQL, UNIX, T-SQL, PL/SQL, Control-M, AutoSys.

Questar Assessment, MN Jan 2017 – July 2018

ETL Developer

Responsibilities:

Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Parsed high-level design specification to simple ETL coding and mapping standards.

Extensively involved in designing the SSIS packages to export data of flat file source to SQL Server database.

Built efficient SSIS packages for processing fact and dimension tables with complex transforms and Type 1 and Type 2 slowly changing dimensions.

Used the Row Count transformation and event handlers in the control flow to populate special log tables with high-level SSIS package execution results; an illustrative audit-log sketch follows this list.

Created source and target table definitions using SSIS; source data was extracted from flat files, SQL Server, and DB2 databases.

Performed Extract, Transform, Load (ETL) development using SQL Server 2005/2008 Integration Services (SSIS).

Worked with various SSIS tasks including the Transform Data Task, Execute SQL Task, and ActiveX Script Task; migrated Informatica mappings, sessions, and workflows from Dev and QA to Prod environments.

Implemented mapping for slowly changing dimensions (SCD) to maintain current data as well as historical data.

Analyzed, designed, developed, implemented, and maintained moderate to complex initial load and incremental load mappings to provide data for enterprise data warehouse.
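For illustration, one way the package-level audit logging mentioned above can be persisted from an Execute SQL Task in an OnPostExecute event handler; the table, columns, and SSIS variable names are assumptions, not the actual project design.

    -- Assumed audit table for SSIS package execution results.
    CREATE TABLE dbo.ETL_PACKAGE_LOG (
        LOG_ID       INT IDENTITY(1,1) PRIMARY KEY,
        PACKAGE_NAME VARCHAR(128) NOT NULL,
        ROWS_LOADED  INT          NOT NULL,
        START_TIME   DATETIME     NOT NULL,
        END_TIME     DATETIME     NOT NULL,
        STATUS       VARCHAR(20)  NOT NULL
    );

    -- The ? parameter markers map to SSIS variables such as
    -- System::PackageName, a user variable filled by the Row Count
    -- transformation, and System::StartTime.
    INSERT INTO dbo.ETL_PACKAGE_LOG
           (PACKAGE_NAME, ROWS_LOADED, START_TIME, END_TIME, STATUS)
    VALUES (?, ?, ?, GETDATE(), 'SUCCEEDED');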

Environment: Teradata, SQL Server Integration Services (SSIS), Oracle, MS SQL Server 2008, T-SQL, Windows XP.

Humana, Louisville, KY May 2015 – Dec 2016

ETL Informatica Developer

Responsibilities:

Understood the business requirements based on functional specifications to design the ETL methodology in technical specifications.

Created mappings and sessions to implement technical enhancements for data warehouse by extracting data from sources like Oracle and Delimited Flat files.

Developed ETL processes using Informatica 9.6.1.

Prepared various mappings to load the data into different stages like Landing, Staging and Target tables.

Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy while designing and optimizing the mappings.

Developed workflows using the Task Developer, Worklet Designer, and Workflow Designer in Workflow Manager and monitored the results using Workflow Monitor.

Created various tasks like Session, Command, Timer and Event wait.

Tuned the performance of mappings by following Informatica best practices and applied several methods to achieve the best performance by decreasing workflow run times.

Prepared SQL queries to validate the data in both the source and target databases; a sample reconciliation query follows this list.

Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.

Worked extensively on PL/SQL as part of the process to develop several scripts to handle different scenarios.

Created Test cases for the mappings developed and then created integration Testing Document.

Prepared the error handling document to maintain the error handling process.

Automated the Informatica jobs using UNIX shell scripting.

Closely worked with the reporting team to ensure that correct data is presented in the reports.

Interacted with the offshore team daily on development activities.
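A sample of the source-versus-target reconciliation queries referred to above, written against assumed Oracle tables; the table names, column names, and load date are placeholders.

    -- Row counts and amount totals should match across both sides.
    SELECT 'SOURCE' AS SIDE,
           COUNT(*)       AS ROW_CNT,
           SUM(CLAIM_AMT) AS TOTAL_AMT
    FROM   SRC_CLAIMS
    WHERE  LOAD_DT = TO_DATE('2016-06-30', 'YYYY-MM-DD')
    UNION ALL
    SELECT 'TARGET',
           COUNT(*),
           SUM(CLAIM_AMT)
    FROM   DW_CLAIMS
    WHERE  LOAD_DT = TO_DATE('2016-06-30', 'YYYY-MM-DD');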

Environment: Informatica PowerCenter 9.6.1, Oracle 11g, delimited files, UNIX shell scripts, Windows 7, Toad for Oracle 11g, Teradata, SSIS, T-SQL, SQL Server 2008.

BHP Billiton, Houston, TX Mar 2014 – Jan 2015

ETL Developer

Responsibilities:

Extensively used Informatica to load data from Oracle 9 and flat files into the target Oracle 10g database.

Used various transformations like Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy and Stored Procedures.

Used Mapplets and reusable transformations to prevent redundant transformation usage and improve maintainability.

Created and scheduled workflows using Workflow Manager to load the data into the Target Database.

Involved in performance tuning of Targets, Sources, and Mappings. Improved performance by identifying performance bottlenecks.

Worked on different tasks in workflows such as sessions, event raise, event wait, e-mail, and timer for scheduling of the workflow.

Involved in meetings to gather information and requirements from the clients.

Involved in designing the ETL process to extract, transform, and load data from flat files to the warehouse database.

Used Debugger to validate mappings and to obtain troubleshooting information about data by inserting Breakpoints.

Documented the number of source/target rows, analyzed the rejected rows, and worked on re-loading them.

Created UNIX shell scripts and automated scheduling processes.

Wrote SQL Queries, PL/SQL Procedures, Functions, and Triggers for implementing business logic and for validating the data loaded into the target tables using query tool TOAD.

Environment: Informatica PowerCenter 9.6.1, Oracle 10g, Teradata, MS SQL Server 2000, T-SQL, SQL, SSIS, PL/SQL, SQL*Loader, UNIX shell scripts.

Lincoln Financial Group, Greensboro, NC Feb 2013 – Jan 2014

ETL Informatica Developer

Responsibilities:

Worked closely with business analysts and gathered functional requirements. Designed technical design documents for ETL process.

Developed ETL mappings, transformations using Informatica Power Center 9.0.1/8.6.1.

Implemented Change Data Capture (CDC) process to load into the staging area.

Extensively used Informatica client tools Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer, and Workflow Manager.

Worked on importing DB2 tables into Informatica PowerCenter.

Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, & Excel.

Developed reusable Mapplets, Transformations and user defined functions.

Extensively used Mapping Debugger to handle the data errors in the mapping designer.

Experience using transformations such as Normalizer, Unconnected/Connected Lookups, Router, Aggregator, Joiner, Update Strategy, Union, Sorter, and reusable transformations.

Created event wait, event raise, email, and command tasks in the Workflow Manager.

Responsible for tuning ETL procedures to optimize load and query Performance.

Good data modeling experience using dimensional data modeling, star schema and snowflake schema modeling, and fact and dimension tables.

Extensively worked with incremental loading using Parameter Files, Mapping Variables and Mapping Parameters.

Used Informatica Power Exchange for loading/retrieving data from mainframe system.

Tested the mapplets and mappings as per Quality and Analysis standards before moving to production environment.

Involved in writing shell scripts for file transfers, file renaming and concatenating files.

Created debugging sessions for error identification by creating break points and monitoring the debug data values in the mapping designer.

Developed Unit test cases and Unit test plans to verify the data loading process.

Environment: Informatica PowerCenter 9.5.1, Oracle 11g/10g, UNIX shell scripts, Windows XP, Toad for Oracle, SQL Server 2008.

IGATE, Hyderabad, India Aug 2012 – Dec 2012

ETL Informatica Developer

Responsibilities:

Assisted to prepare design/specifications for data Extraction, Transformation and Loading.

Developed Informatica mappings, enabling the extract, transport and loading of the data into target tables.

Created workflows, worklets, and tasks to schedule the loads at the required frequency using Workflow Manager.

Prepared reusable transformations to load data from operational data source to Data Warehouse.

Wrote complex SQL queries involving multiple tables with joins; a representative example follows this list.

Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.

Used debugger, session logs and workflow logs to test the mapping and fixed the bugs.

Analyzed the dependencies between the jobs and scheduled them accordingly using the Work Scheduler.

Improved the performance of the mappings and sessions using various optimization techniques.
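A representative example of the multi-table join queries mentioned above; the schema (ORDERS, ORDER_ITEMS, CUSTOMERS, PRODUCTS) is hypothetical, not from the actual engagement.

    SELECT   c.CUSTOMER_NAME,
             p.PRODUCT_CATEGORY,
             SUM(oi.LINE_AMT)           AS TOTAL_SALES,
             COUNT(DISTINCT o.ORDER_ID) AS ORDER_COUNT
    FROM     ORDERS o
    JOIN     CUSTOMERS   c  ON c.CUSTOMER_ID = o.CUSTOMER_ID
    JOIN     ORDER_ITEMS oi ON oi.ORDER_ID   = o.ORDER_ID
    JOIN     PRODUCTS    p  ON p.PRODUCT_ID  = oi.PRODUCT_ID
    WHERE    o.ORDER_DT >= DATE '2012-01-01'
    GROUP BY c.CUSTOMER_NAME, p.PRODUCT_CATEGORY
    HAVING   SUM(oi.LINE_AMT) > 10000
    ORDER BY TOTAL_SALES DESC;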

Environment: Informatica PowerCenter 9.5.1 (Designer, Workflow Manager, Workflow Monitor), Oracle 10g, flat files, UNIX, shell scripts, Toad 7.5.

EDUCATION

Bachelor’s in Computer Science, May 2012, JNTU Kakinada, India

Master’s in Computer Science, May 2016, Governors State University, IL


