Informatica Developer ETL

Location:
Reston, VA
Salary:
150
Posted:
March 27, 2023

Resume:

Arun Nakirikanti

PROFESSIONAL SUMMARY:

Around * years of extensive experience with Informatica PowerCenter in all phases of analysis, design, development, implementation, and support of data warehousing applications, using Informatica PowerCenter 10.x/9.x/8.x, IDQ, Informatica Developer, MDM, IDE, SSIS, and IDS.

Experience in analysis, design, and development of enterprise-level data warehouses using Informatica. Experience in the complete Software Development Life Cycle (SDLC): requirement analysis, design, development, and testing.

Experience in data modeling and data analysis using dimensional and relational data modeling, star/snowflake schema modeling, fact and dimension tables, and physical and logical data modeling.

Expertise in developing standard, reusable mappings using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Lookup, and Router.

Experienced in integrating various data sources such as Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005/2000, Teradata, Netezza, Sybase, DB2, flat files, XML files, and Salesforce into staging areas and different target databases.

Designed complex mappings, with expertise in performance tuning and in loading slowly changing dimension tables and fact tables.

Excellent working experience in the insurance industry, with strong business knowledge in the Auto, Life, and Health Care lines of business.

Worked on scheduling tools: Informatica Scheduler, Autosys, Tivoli/Maestro, and Control-M.

Experience in PL/SQL programming, including writing stored procedures and functions.

Experience in creating complex mappings using various transformations and in developing strategies for the Extraction, Transformation, and Loading (ETL) mechanism using Informatica 10.x/9.x/8.x/7.x/6.x.

Experience in source systems analysis and data extraction from various sources such as flat files, Oracle 11g/10g/9i/8i, IBM DB2 UDB, and XML files.

Extensively worked with Informatica Data Quality and Informatica PowerCenter throughout complete IDQ and MDM projects.

Designed and developed IDQ mappings for address validation/cleansing, doctor master data matching, data conversion, exception handling, and exception data reporting.

Documented source/target row counts, analyzed the rejected rows, and worked on re-loading them.

MDM developer with experience implementing, developing, maintaining, and troubleshooting Informatica MDM solutions, metadata management, data quality, and data integration.

Performed data profiling and analysis using Informatica Data Quality (IDQ).

Worked with Master Data Management concepts and methodologies, with the ability to apply this knowledge in building MDM solutions.

Experience in UNIX shell scripting, Perl scripting, and automation of ETL processes.

Designed, installed, and configured core Informatica/Siperian MDM Hub components such as the Informatica MDM Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, and IDD, along with data modeling.

Experience in support and knowledge transfer to the production team.

Prepared user requirement documentation for mapping and additional functionality.

Extensively used ETL to load data using PowerCenter/PowerExchange from source systems such as flat files and Excel files into staging tables, and loaded the data into the target Oracle database. Analyzed the existing systems and performed a feasibility study.

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 10.2/9.6/9.5/9.1/8.6/7.x/6.x, Salesforce, Informatica Cloud, Informatica PowerExchange 5.1/4.7/1.7, PowerAnalyzer 3.5, Informatica Data Quality (IDQ) 9.6.1/9.5.1, Informatica PowerConnect and Metadata Manager, Informatica MDM 10.1, Informatica Data Services (IDS) 9.6.1, DataStage

Databases: Oracle 12c/10g/9i/8i/8.0/7.x, Teradata 13, DB2 UDB 8.1, MS SQL Server 2008/2005.

Operating Systems: UNIX (Sun Solaris, HP-UX), Windows NT/XP/Vista, MS-DOS

Programming: SQL, SQL*Plus, PL/SQL, Perl, UNIX Shell Scripting

Reporting Tools: Business Objects XI R2/6.5/5.1/5.0, Cognos Impromptu 7.0/6.0/5.0, Informatica Analytics Delivery Platform, MicroStrategy.

Modeling Tools: Erwin 4.1 and MS Visio

Other Tools: SQL Navigator, Rapid SQL for DB2, Quest Toad for Oracle, SQL Developer 1.5.1, Autosys, Telnet, MS SharePoint, Mercury Quality Center, Tivoli Job Scheduling Console, JIRA, Netezza.

Methodologies: Ralph Kimball.

PROFESSIONAL EXPERIENCE:

Client: Highmark Health Insurance, Pittsburgh, PA

Role: MDM/IDQ/ETL Developer

Duration: Jan 2021 – Current

Project Description: The program has been designed to provide a strong value proposition to various stakeholders, including consumers, issuers, merchants, and other third parties that could benefit from the ease, convenience, and security the platform provides using Informatica.

Responsibilities:

Analyzed the business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica PowerCenter.

Experienced in parsing high-level design specs into simple ETL coding and mapping standards.

Worked on Agile Methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter/Data Quality (IDQ), peer reviewed their development works and provided the technical solutions. Proposed ETL strategies based on requirements.

Coded Teradata BTEQ SQL scripts to load and transform data and to fix defects such as SCD Type 2 date chaining and duplicate cleanup.

Worked with team to convert Trillium process into Informatica IDQ objects.

Extensively used DQ transformations such as Address Validator, Exception, Parser, and Standardizer. Solid experience in debugging and troubleshooting sessions using the Debugger and Workflow Monitor.

Extensively worked on UNIX shell scripts for server health-check monitoring, such as repository backup, CPU/disk-space utilization, Informatica server monitoring, and UNIX file-system maintenance/cleanup, and on scripts using Informatica command-line utilities.
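
For illustration only (not a script from this project — the path and threshold below are hypothetical, and the actual monitoring was done with UNIX shell scripts), the disk-space portion of such a health check can be sketched in Python:

```python
import shutil

def disk_usage_pct(path="/"):
    """Return the percentage of disk space used at the given path."""
    usage = shutil.disk_usage(path)
    return round(usage.used / usage.total * 100, 1)

# Hypothetical alert threshold for the Informatica file system.
THRESHOLD = 85.0
pct = disk_usage_pct("/")
if pct > THRESHOLD:
    print(f"WARNING: disk usage {pct}% exceeds {THRESHOLD}%")
```

In practice the shell equivalent would parse `df -k` output and mail the alert to the support group.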

Extensively worked on CDC to capture data changes in sources and for delta loads. Used the Debugger to validate mappings and gather troubleshooting information about data and error conditions.

Developed workflows with Worklets, Event Waits, Assignments, Conditional Flows, Email, and Command tasks using Workflow Manager.

Proficient in System Study, Data Migration, Data integration, Data profiling, Data Cleansing / Data Scrubbing and Data quality

Worked on Informatica Data Quality (IDQ) toolkit, analysis, data cleansing, data matching, data conversion, address standardization, exception handling, reporting, and monitoring capabilities of IDQ.

Defined, configured, and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups and packages using Informatica MDM Hub console.

Performed match/merge and ran match rules to check the effectiveness of the MDM process on data.

Used Teradata utilities to load data to/from tables.

Performed ETL code reviews and Migration of ETL Objects across repositories.

Developed ETLs for masking data made available to the offshore development team.

Developed UNIX scripts for dynamic generation of parameter files and for FTP/SFTP transmission.
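
As a sketch of the dynamic parameter-file idea (the folder, workflow, and parameter names below are invented; the assumed layout is the PowerCenter parameter-file format with a `[folder.WF:workflow.ST:session]` header and `$$` variables):

```python
from datetime import date

def build_param_file(folder, workflow, session, params):
    """Render an Informatica-style parameter file as a string.

    Emits a [folder.WF:workflow.ST:session] section header followed by
    one $$Name=value line per mapping parameter.
    """
    lines = [f"[{folder}.WF:{workflow}.ST:{session}]"]
    for name, value in params.items():
        lines.append(f"$${name}={value}")
    return "\n".join(lines) + "\n"

# Example: a daily delta-load window regenerated before each run.
content = build_param_file(
    "SALES", "wf_daily_load", "s_m_load_orders",
    {"LoadDate": date(2023, 3, 27).isoformat(), "SourceSystem": "ORDERS"},
)
print(content)
```

The real scripts wrote the file to a staging path and pushed it via FTP/SFTP before the workflow started.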

Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and provided support for ETL jobs running in production to meet the SLAs.

Integrated IDQ mappings through IDQ web service applications as cleanse functions in Informatica MDM using IDQ cleanse Adapters.

Involved in developing MultiLoad, FastLoad, BTEQ, TPump, and TPT scripts on the Teradata/UNIX platform.

Actively involved in the exception-handling process using the IDQ Exception transformation after loading data into MDM, notifying the Data Stewards of all exceptions.

Involved heavily in writing complex SQL queries based on the given requirements on Teradata.

Involved in implementing change data capture (CDC) and Type I, II, and III slowly changing dimensions.
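
The Type II pattern can be illustrated with a minimal sketch (plain Python standing in for the Informatica mapping; the field names are invented): when a tracked attribute changes, the open row is end-dated and a new current row is appended.

```python
from datetime import date

def apply_scd2(history, incoming, load_date):
    """Apply one incoming record to a Type II dimension history.

    history: list of dicts with keys key, attr, start, end, current.
    If the open row's attr changed, expire it and append a new version.
    """
    for row in history:
        if row["key"] == incoming["key"] and row["current"]:
            if row["attr"] == incoming["attr"]:
                return history          # no change: keep the open row
            row["end"] = load_date      # end-date the old version
            row["current"] = False
    history.append({"key": incoming["key"], "attr": incoming["attr"],
                    "start": load_date, "end": None, "current": True})
    return history

hist = [{"key": 1, "attr": "Reston", "start": date(2021, 1, 1),
         "end": None, "current": True}]
apply_scd2(hist, {"key": 1, "attr": "Pittsburgh"}, date(2023, 3, 27))
```

In PowerCenter this compare/expire/insert logic is typically built from Lookup, Expression, and Update Strategy transformations.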

Developed functions and stored procedures to aid complex mappings

Environment: Informatica PowerCenter 10.5/9.6, Informatica MDM 9.5, Informatica Data Quality (IDQ) 9.6, Oracle 11g, Teradata, PL/SQL, SQL Developer, TOAD, PuTTY, UNIX

Client: Lincoln Financial Group, Boston, MA

Role: MDM/IDQ/ETL Developer

Duration: Jun 2019 – Dec 2020

Project Description: The application is being built to support this business need. It will replace the current legacy reporting application and is used for Northland Insurance, which serves the trucking and public auto industries.

Responsibilities:

Developed complex mappings by efficiently using various transformations, Mapplets, Mapping Parameters/Variables, and Mapplet Parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence Generator, SQL, and Web Service transformations.

Demonstrated the ETL process Design with Business Analyst and Data Warehousing Architect.

Assisted in building the ETL source-to-target specification documents.

Built the Physical Data Objects and developed various mappings and mapplets/rules using Informatica Data Quality (IDQ), based on requirements, to profile, validate, and cleanse the data. Identified and eliminated duplicate datasets and performed column, primary-key, and foreign-key profiling using IDQ 9.5.1 for the MDM.

Redesigned current ETL/DW processes using Teradata utilities for best performance.

Worked on Master Data Management (MDM), Hub Configurations (SIF), Data Director, extract, transform, cleansing, loading the data onto the tables.

Worked on Agile Methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter/Data Quality (IDQ), peer reviewed their development works and provided the technical solutions. Proposed ETL strategies based on requirements.

Designed, developed, tested, reviewed, and optimized Informatica MDM and Informatica IDD applications.

Involved in match/merge and match rules to check the effectiveness of MDM process on data.

Worked on SQL coding for overriding the generated SQL query in Informatica.

Involved in unit testing for the validity of data from different data sources.

Performed application-level DBA activities creating tables, indexes, monitored and tuned Teradata scripts.

Involved heavily in writing complex SQL queries based on the given requirements on Teradata.

Integrated IDQ mappings through IDQ web service applications as cleanse functions in Informatica MDM using IDQ cleanse adapters. Implemented the Slowly Changing Dimension (SCD Type II) design for the data warehouse.

Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ.

Performed data conversion/data migration using Informatica PowerCenter.

Involved in performance tuning for a better data migration process.

Analyzed session log files to resolve errors in mappings, identified bottlenecks, and tuned them for optimal performance.

Created UNIX shell scripts for Informatica pre/post-session operations.

Automated the jobs using CA7 Scheduler.

Documented and presented the production/support documents for the developed components when handing the application over to the production support team.

Created Data Model for the DataMarts.

Used materialized views to create snapshots of the history of main tables and for reporting purposes.

Coordinated with users to migrate code from Informatica 8.6 to Informatica 9.5.

Contacted the Informatica tech support group regarding unknown problems.

Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and provided support for ETL jobs running in production to meet the SLAs.

Used various transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy while designing and optimizing mappings.

Prepared SQL Queries to validate the data in both source and target databases.

Environment: Informatica 10.0/9.6, Informatica Data Quality (IDQ) 9.6, Informatica MDM, Oracle 11g, SQL Server 2008 R2, SQL, T-SQL, PL/SQL, Toad 10.6, SQL*Loader, OBIEE, UNIX, flat files, Teradata.

Client: Travelers Insurance, Saint Paul, MN

Role: MDM/IDQ/ETL Developer

Duration: Jul 2017 – May 2019

Project Description: This project was created to design and develop a Data Mart for the improvement of the Agreement and Household Management System; it is built specifically to support analysis facilitating trending and forecasting using a wide range of customer data. This Data Mart provides single views across products, channels, and households, which are used in various enterprise metrics.

Responsibilities:

Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL specifications.

Involved in designing dimensional modeling and data modeling using Erwin tool.

Created high-level Technical Design Document and Unit Test Plans.

Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.

Tuned pre- and post-SQL syntax.

Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.

Created Teradata external loader connections such as MLoad Upsert, MLoad Update, and FastLoad while loading data into the target tables in the Teradata database.

Used Teradata utilities FastLoad, MultiLoad, and TPump to load data.

Wrote complex SQL override scripts at source qualifier level to avoid Informatica joiners and Lookups to improve the performance as the volume of the data was heavy.

Worked on different IDQ / Informatica Developer components/transformations like Case, Comparison, Key Generator, Parser, Standardizer, Weight, Exception, Rule Based Analyzer, Lookup, SQL, Expression etc. and created IDQ mappings.

Worked on IDQ file configuration at user’s machines and resolved the issues.

Used IDQ to complete initial data profiling and to remove duplicate data.
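
The duplicate-removal step can be sketched outside IDQ as well (hypothetical Python; IDQ's match rules support fuzzy matching and scoring, while this only shows exact-key deduplication after light standardization):

```python
def dedupe(records, key_fields):
    """Keep the first record for each standardized key; drop the rest."""
    seen, kept = set(), []
    for rec in records:
        # Light standardization: trim and lowercase the key fields.
        key = tuple(str(rec[f]).strip().lower() for f in key_fields)
        if key not in seen:
            seen.add(key)
            kept.append(rec)
    return kept

rows = [
    {"name": "John Smith ", "zip": "20190"},
    {"name": "john smith", "zip": "20190"},   # duplicate after cleanup
    {"name": "Jane Doe", "zip": "20191"},
]
unique = dedupe(rows, ["name", "zip"])
```

Real IDQ matching would also use tokenization and similarity thresholds rather than exact equality.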

Used Informatica EIC (Enterprise Information Catalog) to explore assets and verify data quality, such as by profiling the information.

Performed the tasks to view relationship between the assets by using Informatica EIC.

Extensively used ETL to load data using Informatica PowerCenter 10.1 from source systems such as flat files into staging tables, and loaded the data into the target Oracle database. Analyzed the existing systems and performed a feasibility study.

Created customer hub (MDM) – by consolidating and integrating various sources and storing the golden record as individual identity in the hub.

Designed, documented, and configured the Informatica MDM Hub to load, cleanse, match, merge, and publish MDM data.

Successfully completed Customer and Product centric Master Data Management initiatives using Informatica Master data management product.

Used IDQ's standardized plans for address and name clean-ups.

Extensively worked on IDQ admin tasks and worked as an IDQ developer.

Experience in performance tuning and optimization of SQL statements using SQL trace.

Designed and tested packages to extract, transform, and load data (ETL) using SSIS. Designed packages using tasks and transformations such as Execute SQL Task, Data Flow Task, Data Conversion, and For Each Loop Container, mapping the right and required data from source to destination.

Prepared the complete data mapping for all the migrated jobs using SSIS.

Created and monitored TWS (Tivoli Workload Scheduler) jobs so that jobs run and deliver data to users.

Supported the process steps in the development, test, and production environments.

Environment: Informatica PowerCenter 10.1/9.6.1/8.6.1, MDM and IDQ, Oracle 11g/10g, TOAD, Business Objects XI 3.5, Solaris 11/10, Teradata, ClearCase, PL/SQL, Tivoli Job Scheduler, SSIS.

Client: First Data Corporation, Omaha, NE

Role: Jr. Informatica Developer/ IDQ Developer

Duration: Feb 2016 – Jun 2017

Project Description: AT&T is an American provider of cable TV, broadband Internet, and VoIP telephone services in the U.S. In this project, after data discovery and data cleansing of legacy data, the master data was moved to an interim database. ETL tools were used to transform legacy data into the required formats. The master data was then loaded to the production systems and the reports were updated.

Responsibilities:

Prepared High-level Design and Low-Level Design based on Functional and Business requirements of the project.

Designed and documented the functional specs and prepared the technical design.

As a team, conducted gap analysis and discussions with subject matter experts to gather requirements, emphasize problem areas, and define deliverables.

Supported the development, optimization, and maintenance of Informatica mappings with various source data including Oracle and SQL.

Optimized and Tuned SQL queries and PL/SQL blocks to eliminate Full Table scans to reduce Disk I/O and Sorts.

Designed and developed UNIX shell scripts for creating and dropping tables, used for scheduling the jobs.

Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.

Developed several complex mappings in Informatica using a variety of PowerCenter 9.6 transformations, mapping parameters, mapping variables, mapplets, and parameter files.

Developed several complex IDQ mappings using a variety of transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer with Informatica PowerCenter 9.6.

Used IDQ to complete initial data profiling and to remove duplicate data.

Worked with IDQ on data quality: data cleansing, removing unwanted data, and ensuring data correctness.

Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter

Worked on IDQ file configuration at user’s machines and resolved the issues.

Involved in running and scheduling UNIX shell scripts and Informatica jobs using Tidal.

Extensively worked on IDQ admin tasks and worked as both an IDQ admin and an IDQ developer.

Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.

Created DataStage jobs (ETL processes) for constantly populating data into the data warehouse from different source systems such as ODS, flat files, and HDFS files, and scheduled them using DataStage Sequencer for system integration testing.

Developed and maintained programs for scheduling data loading and transformations using DataStage.

Developed mapping parameters and variables to support SQL override.

Worked on import and export of data from sources to staging and target using Teradata MLoad, FastExport, TPump, and BTEQ.

Developed workflows with Worklets, Event Waits, Assignments, Conditional Flows, Email, and Command tasks using Workflow Manager.

Expertise in Performance Tuning by identifying the bottlenecks at sources, targets, PowerCenter transformations and session’s level. Collected performance data for sessions and performance tuned by adjusting Informatica session parameters.

Handled all Hadoop environment builds, including design, capacity planning, cluster setup, performance tuning, and ongoing monitoring.

Worked with the third-party scheduler Autosys for scheduling Informatica PowerCenter workflows; worked with the scheduling team to create and schedule jobs in the Autosys workload scheduler.

Extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.

Environment: Informatica Power Center 9.6, Oracle 11g, SQL, IDQ, Teradata, DataStage, PL/SQL, TOAD, Microsoft Visio, Autosys, Unix, SQL Server 2008.

Client: GE, Bangalore, IN

Role: Informatica Developer

Duration: Mar 2014 – Nov 2015

Responsibilities:

Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL specifications.

Coordinated and worked closely with legal, clients, third-party vendors, architects, DBA’s, operations, and business units to build and deploy.

Developed mappings in PowerCenter 9.1 for job monitoring and data validation.

Developed different mapping logic using various transformations to extract data from different sources like flat files, IBM MQ series, Oracle, IBM DB2 UDB 8 databases that were hosted on HP UX 11i v2 RISC server.

Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities using IDQ.

Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.

Performed data profiling to understand the data pattern using Informatica IDQ.

Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, SQL, Update strategy, Salesforce Lookup and Sequence generator.

Actively implemented Informatica performance tuning by identifying and removing bottlenecks, and optimized session performance by tuning complex mappings.

Documented Informatica mappings, design and validation rules.

Created new SSIS packages to ETL data from Salesforce to SQL Server and vice versa.

Extensively involved in unit testing, integration testing and system testing of the mappings and writing Unit and System Test Plan.

Developed UNIX scripts for scheduling the delta loads and master loads using Autosys Scheduler.

Migrated objects from the development environment to the QA/testing environment to facilitate the testing of all objects developed and check their consistency end to end on the new environment.

Supported the process steps under development, test and production environment

Environment: Informatica PowerCenter 9.1.0/8.6.1, Informatica IDQ, MDM, Salesforce, JIRA, SQL Server 2008, Toad, SQL Developer, MS Access, ClearQuest, Autosys Job Scheduler.

Education: Bachelor's degree completed in 2012 at JNTUH.

Master's degree in Computer Science completed in 2017 at Bellevue University.


