
Informatica

Location:
Bell, CA, 90201
Posted:
May 11, 2020


Manideep Reddy

Email: adc55t@r.postjobfree.com

Mobile No. +1-205-***-****

Senior ETL/Informatica Developer

Professional summary:

Informatica PowerCenter (ETL) developer with 7+ years of experience in Data Warehousing and Business Intelligence.

Technical expertise in ETL methodologies and Informatica 9.x/10.x PowerCenter: client tools (Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager/Monitor), server tools (Informatica Server Manager, Repository Server Manager), and PowerExchange.

Expertise in Data Warehouse, Data Migration, Data Modeling, and Data Cleansing.

Develops PowerCenter workflows and sessions, and sets up PowerExchange connections to databases and mainframe files.

Develops logical and physical data flow models for ETL applications

Provides system and application administration for Informatica PowerCenter and PowerExchange.

Designs, develops, automates, and supports complex applications to extract, transform, and load data

Identifies and manages interfaces, service levels, standards and configurations

Converted business rules into technical specifications for ETL processes that populate the Fact and Dimension tables of the data warehouse.

Directly responsible for the Extraction, Transformation & Loading of data from multiple sources into Data Warehouse. Complete knowledge of data warehouse methodologies (Ralph Kimball, Inmon), ODS, EDW and Metadata repository.

Experienced in integrating various data sources such as Oracle 12c, IBM DB2, MS SQL Server, MySQL, Teradata, Netezza, XML files, and mainframe sources into staging areas and different target databases.

Hands on experience in Tuning mappings, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.

Data modeling experience with ERwin 7.1/4.5/4.0, dimensional modeling (Ralph Kimball approach), Star/Snowflake modeling, data marts, OLAP, Fact and Dimension tables, physical and logical data modeling, and Oracle Designer.

Move data from disparate data sources into a target Oracle Data Warehouse.

Created mappings for various sources like DB2, Oracle, and flat files to load and integrate details to warehouse.

Extensive experience in maintaining Informatica connections in all environments, including Relational, Application, and FTP connections.

Extensive database experience and highly skilled in SQL in Oracle, MS SQL Server, DB2, Teradata, Sybase, Mainframe Files, Flat Files, CSV files, MS Access

Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL processes.

Develop ETLs to extract, cleanse and load the data into the SQL Server.

Create Tables, Indexes, and Constraints etc.

Strong experience in data design/analysis, business analysis, user requirement gathering and analysis, gap analysis, data cleansing, data transformations, data relationships, source systems analysis, and reporting analysis.

Extensively worked on developing mappings & workflows as per the requirements of the project using Informatica.

Data modeling knowledge in dimensional data modeling, Star Schema, and Snowflake Schema.

Expertise in Master Data Management concepts, Methodologies and ability to apply this knowledge in building MDM solutions.

Experienced in UNIX work environment, File Transfers, Job scheduling and Error Handling.

Experience in writing PL/SQL, T-SQL procedures for processing business logic in the database. Tuning of SQL queries for better performance.

Worked on SAP Business Objects and Cognos to create and run reports using data loaded into the data warehouse, and monitored these reports on their daily runs.

Excellent testing and review techniques to prevent bugs from slipping through.

Technical Proficiencies:

ETL Tools

Informatica Power Center 10.2/9.6.1/9.5.1, Informatica Cloud, SSIS

RDBMS

Oracle 12c/11g/10g/9i/8i, Teradata 14/12, IBM DB2, SQL Server 2000/2005/2008, MySQL, Sybase, SAP, Netezza

Modeling

Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, ERwin, Microsoft Visio

QA Tools

Quality Center

Operating System

Windows 10/8/7/2000, UNIX, Linux

Reporting Tools

Cognos, Business Objects, SSRS

Languages

C, C++, JavaScript, XML, CSS, UNIX Shell Scripting, SQL, PL/SQL

Professional Experience:

Client: Paysafe Group, Westlake Village, CA Dec 2018 – Present

Role: Senior ETL/Informatica Developer

Description: Paysafe Group Limited is a multinational online payments company. The group offers services both under the Paysafe brand and subsidiary brands that have become part of the group through several mergers and acquisitions, most notably Neteller, Skrill, and paysafecard.

Responsibilities:

Created mappings and sessions to implement technical enhancements for data warehouse by extracting data from sources like Oracle and Delimited Flat files.

Created multiple types of tasks in workflows and automated data flows using Command, Assignment, and Event-Wait tasks.

Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions and workflows.

Created detailed Technical specifications for Data Warehouse and ETL processes.

Used Informatica as the ETL tool to pull data from source systems and files, then cleanse, transform, and load the data into Teradata using Teradata utilities.

Used most of the transformations, such as Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookup, Joiner, Update Strategy, and Stored Procedure.

Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
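The session-level Pre-SQL and Post-SQL referred to above are typically short statements of the following kind; the index and table names here are hypothetical, and the exact statements depend on the target database:

```sql
-- Pre-SQL (sketch): prepare the target before the bulk load
ALTER INDEX idx_sales_fact_dt UNUSABLE;
TRUNCATE TABLE stg_sales_fact;

-- Post-SQL (sketch): restore the index after the load completes
ALTER INDEX idx_sales_fact_dt REBUILD;
```

Disabling and rebuilding indexes around a large load is a common way to keep the load itself fast while still ending with a usable index.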

Worked extensively with different Caches such as Index cache, Data cache and Lookup cache (Static, Dynamic and Persistence) while developing the Mappings.

Worked on IDQ file configuration at user’s machines and resolved the issues.

Created Reusable transformations, Mapplets, Worklets using Transformation developer, Mapplet Designer and worklet Designer.

Used IDQ’s standardized plans for addresses and names clean ups.

Used IDQ to complete initial data profiling and removing duplicate data.

Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.

Tuned mapping performance by following Informatica best practices and applied several methods to reduce workflow run times.

Worked extensively on Informatica Partitioning when dealing with huge volumes of data and also partitioned the tables in Teradata for optimal performance.

Implemented CDC in Informatica.

Extract data from Oracle using SQL for use in Informatica.

Design, Test, Automate and Schedule workflows.

Environment: Informatica Power Center 10.2/9.6.1, SQL Server, TOAD, Quality Center, Windows, AWS, AutoSys, MS Office Suite, Oracle 12c/11g, Teradata 12/14, Control-M, TWS, UNIX, XML.

Client: AAA Auto Club Group, Tampa, FL March 2017 – Dec 2018

Role: Informatica Developer

Description: The Auto Club operates an auto repair facility. The company provides insurance, travel, maintenance and repair, auto loan, and related services for its members. The Auto Club Group is one of the largest independent AAA clubs in North America, serving more than 13 million members across the American Midwest, Southeast, Puerto Rico, and the U.S. Virgin Islands.

Responsibilities:

Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.

Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.

Extensively used Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.

Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.

Created ETL documents (ETL Standards, Migration Manual, and Unit Testing) for the development.

Moving data from disparate data stores, flat files, SQL Server and Oracle to a Target Oracle DW using Informatica and PL/SQL.

Designed Data Quality using IDQ and developed several Informatica mappings.

Designed reference data and data quality rules using IDQ and was involved in cleansing data in the Informatica Data Quality 9.1 environment.

Created Informatica mappings to load data from different sources into the data warehouse.

Designed technical and functional specs based on requirements provided by the business and finalized the data model.

Used Parameter files to specify DB Connection parameters for sources.
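A PowerCenter parameter file of the kind referred to here might look like the following sketch; the folder, workflow, session, and connection names are hypothetical:

```
[ETL_Folder.WF:wf_load_customers.ST:s_m_load_customers]
$DBConnection_Source=ORA_SRC_DEV
$DBConnection_Target=ORA_DW_DEV
$$LoadDate=2018-12-01
```

Keeping connection names in a parameter file lets the same mapping run unchanged across development, test, and production environments.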

Worked on the installation and setup of ETL (Informatica Cloud) applications on Linux servers.

Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision.

Extensively worked in Informatica cloud using DRT (data Replication Tasks) and DST (Data Synchronization Task).

Extensively used SQL and PL/SQL to retrieve the data from databases to perform data validations and comparisons for regression testing.

Well versed with ETL Methodologies, Data Warehouse, Data Visualization and Business Intelligence toolset.

Extracted data from oracle database and spreadsheets and staged into a single place and applied business logic to load them in the central oracle database.

Parameterized the mappings and increased the re-usability.

Environment: Informatica Power Center 9.6.1, Informatica Cloud, UNIX, TOAD, SQL Developer, PeopleSoft, Citrix, OBIEE, Oracle 12c/11g, CSV files, Teradata 12, Tivoli Workload Scheduler (TWS).

Client: Sprint, Overland Park, KS Oct 2015 - Feb 2017

Role: Informatica Developer

Description: Multi-channel customer management, information sharing, field reporting, and analytics, all within an easy-to-use mobile application. The purpose of this project was to maintain a data warehouse that enables the home office to make corporate decisions. A decision support system was built to compare and analyze the company's products against competitor products, along with sales information at the territory, district, region, and area levels.

Responsibilities:

Designed and developed the end-to-end ETL process from various source systems to the staging area, and from staging to the data marts.

Developed high-level and detailed technical and functional documents, including detailed design documentation, functional test specifications with use cases, and unit test documents.

Worked on Teradata SQL Assistant, Teradata Administrator, Teradata Viewpoint, and BTEQ scripts. Created user exit features extending the functionality of the MDM Hub.

As an ETL SME, involved in preparing the ETL specification document by working closely with the business team.

Provided Guidance in Performance tuning of Informatica mappings and DB2 SQL.

Prepared the Code Review Checklist, Standards document and Reusable components to be used across multiple projects.

Used Teradata SQL Assistant, Teradata Administrator, PMON and data load/export utilities like BTEQ, Fast Load, Multi Load, Fast Export, Tpump, TPT on UNIX/Windows environments and running the batch process for Teradata.
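As a rough illustration of how such utilities are typically driven from UNIX, the sketch below generates a BTEQ control script and runs it only where the Teradata client is installed; the server, credentials, and table names are hypothetical placeholders, not values from any actual project.

```shell
#!/bin/sh
# Sketch: generate and run a BTEQ script for a nightly batch load.
# All names (tdprod, etl_user, stage_db, landing_db) are hypothetical.
set -e

BTEQ_SCRIPT=load_sales.bteq

# Generate the BTEQ control script. In practice, credentials would come
# from a secured logon file rather than being written in clear text.
cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
DELETE FROM stage_db.stg_sales;
INSERT INTO stage_db.stg_sales
SELECT * FROM landing_db.sales_raw;
.QUIT;
EOF

# Run BTEQ only where the Teradata utilities are actually installed.
if command -v bteq >/dev/null 2>&1; then
    bteq < "$BTEQ_SCRIPT"
else
    echo "bteq not found; generated $BTEQ_SCRIPT only"
fi
```

The same wrapper pattern applies to FastLoad, MultiLoad, and TPump: the shell script generates or stages the control file and the scheduler invokes the wrapper.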

Involved in performance tuning (both database and Informatica), thereby decreasing the load time.

Exposure to cloud services such as AWS and Azure.

Hands-on experience with AWS while extracting the data from the cloud to Oracle Database.

Implemented session partitioning and used the Debugger to analyze mappings and improve performance.

Worked with business SMEs on developing the business rules for cleansing. Applied business rules using Informatica Data Quality (IDQ) tool to cleanse data.

Used the Teradata FastLoad/MultiLoad utilities to load data into tables.

Presented Data Cleansing Results and IDQ plans results to the OpCos SMEs.

Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.

Extensively worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads from various sources such as Teradata, Oracle, ASCII delimited Flat Files, EBCDIC files, COBOL & DB2.

Documented Cleansing Rules discovered from data cleansing and profiling.

Implemented optimization techniques such as pushdown optimization in Informatica.

Involved in Design review, code review, Performance analysis.

Updated job schedules in AutoSys to maintain business functionality.

Created SQL for data migrations and handled Re-Org of DB2 tables.

Analyzed the data and provided resolutions by writing analytical/complex SQL in cases of data discrepancies.

Created Informatica mappings to load data, using transformations such as Stored Procedure, Connected and Unconnected Lookup, Source Qualifier, Expression, Sorter, Aggregator, Joiner, Filter, Sequence Generator, Router, Update Strategy, and XML.

Cleansed the source data, extracted and transformed data with business rules, and built reusable mappings (Mapplets) using Informatica Designer. Involved in design review, code review, and test review, and gave valuable suggestions.

Worked with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic and Persistence) and Join cache while developing the Mappings.

Created partitions for parallel processing of data and worked with DBAs to enhance the data load during production.

Performance tuned Informatica session, for large data files by increasing block size, data cache size, and target-based commit.

Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions) Type 1 and Type 2.
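The SCD Type 2 pattern described here can be sketched on a small flat-file example: when a tracked attribute changes, the current dimension row is end-dated and a new current row is inserted. The file layout and all names below are hypothetical, chosen only to illustrate the technique.

```shell
#!/bin/sh
# Sketch of SCD Type 2 on a pipe-delimited flat file.
# Layout (hypothetical): cust_id|city|eff_date|end_date|curr_flag
set -e

cat > dim_customer.psv <<'EOF'
101|Tampa|2016-01-01|9999-12-31|Y
102|Miami|2016-01-01|9999-12-31|Y
EOF

# Change captured by CDC: customer 101 moved to Orlando.
cat > cdc_delta.psv <<'EOF'
101|Orlando
EOF

LOAD_DATE=2017-02-01

awk -F'|' -v OFS='|' -v load_date="$LOAD_DATE" '
    NR == FNR { delta[$1] = $2; next }     # first file: CDC delta by key
    $1 in delta && $5 == "Y" && $2 != delta[$1] {
        print $1, $2, $3, load_date, "N"   # expire the old version
        print $1, delta[$1], load_date, "9999-12-31", "Y"  # new current row
        next
    }
    { print }                              # unchanged rows pass through
' cdc_delta.psv dim_customer.psv > dim_customer_new.psv

cat dim_customer_new.psv
```

In a mapping, the same logic is expressed with a Lookup on the dimension, an Update Strategy that flags the expiring row, and an insert of the new version; Type 1 simply overwrites the attribute instead of versioning it.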

Developed automated and scheduled load processes using the Control-M scheduler and TWS.

Involved in writing a procedure to check the up-to-date statistics on tables.

Used the Informatica Command task to transfer files to the bridge server for delivery to a third-party vendor.

Environment: Informatica Power Center 9.5.1/9.6, Oracle 11g/10g, DB2, SQL Server, TOAD, AWS, Service Center 2.0, AutoSys, Control-M, TWS, UNIX, XML, Teradata 12, flat files, Business Intelligence (BI) Launch Pad.

Client: Reliance Industries Limited, Hyderabad, India June 2014 - May 2015

Role: Informatica Developer

Description: Reliance Industries Limited (RIL) is an Indian multinational conglomerate company headquartered in Mumbai, Maharashtra. Reliance owns businesses across India engaged in energy, petrochemicals, textiles, natural resources, retail, and telecommunications.

Responsibilities:

Extensively used ETL to load data from flat files and MS Excel, involving both fixed-width and delimited files, as well as from a relational database (Oracle 10g).

Developed and tested all the Informatica mappings, sessions and workflows - involving several Tasks.

Worked on Dimension as well as Fact tables, developed mappings and loaded data on to the relational database.

Designed and developed the data transformations for source system data extraction; data staging, movement and aggregation; information and analytics delivery; and data quality handling, system testing, performance tuning.

Created several stored procedures to update tables and insert into audit tables as part of the process.

Loaded reformatted data from relational, flat-file, and XML sources using Informatica (ETL).

Worked with mapping variables, mapping parameters, and variable functions such as SetVariable, SetCountVariable, SetMinVariable, and SetMaxVariable.

Involved in exporting databases, tablespaces, and tables using Data Pump (11g) as well as traditional export/import.

Scheduled various daily and monthly ETL loads using Control-M.

Involved in writing UNIX shell scripts to run and schedule batch jobs.
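A batch-job wrapper of the kind described here might look like the following sketch. The workflow, domain, and service names are hypothetical, and the pmcmd call is only executed where the PowerCenter client is actually installed; elsewhere the script logs a dry run.

```shell
#!/bin/sh
# Sketch of a scheduler-friendly wrapper that runs an ETL workflow,
# logs the outcome, and propagates failures back to the scheduler.
# Names (wf_daily_sales_load, INT_SVC, DOMAIN_DEV) are hypothetical.
WORKFLOW=wf_daily_sales_load
LOG_FILE=${WORKFLOW}_$(date +%Y%m%d).log

run_workflow() {
    if command -v pmcmd >/dev/null 2>&1; then
        # Credentials are expected in the environment, not hard-coded.
        pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$PM_USER" \
              -p "$PM_PASS" -f ETL_FOLDER -wait "$WORKFLOW"
    else
        echo "pmcmd not available; dry run for $WORKFLOW"
    fi
}

if run_workflow >> "$LOG_FILE" 2>&1; then
    echo "$WORKFLOW completed OK" >> "$LOG_FILE"
else
    rc=$?
    echo "$WORKFLOW FAILED with exit code $rc" >> "$LOG_FILE"
    exit "$rc"
fi
```

Returning a nonzero exit code on failure is what lets a scheduler like Control-M or cron detect the failure and trigger alerts or reruns.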

Involved in unit testing and documentation of the ETL process.

Environment: Informatica Power Center 9.5, Oracle 11g, SQL Server, Teradata 14.0, flat files, XML, UNIX shell scripting.

Client: Hinduja Global Solutions, Hyderabad, India Oct 2012 – March 2013

Role: Software Engineer

Description: Hinduja Global Solutions is a business process management organization. HGS combines technology-powered services in automation, analytics and digital focusing on back office processing, contact centers and HRO solutions to deliver transformational impact to clients.

Responsibilities:

Converted the data mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.

Performed Source System Data Profiling using Informatica Data Explorer (IDE).

Developed reusable transformations and Mapplets to use in multiple mappings.

Created pre-session and post-session shell scripts and email notifications.

Involved in complete cycle from Extraction, Transformation and Loading of data using Informatica best practices.

Designed the mappings according to OBIEE specifications such as SDE (Source Dependent Extract) and SIL (Source Independent Load).

Involved in all the phases of SDLC.

Extensively used SQL, PL/SQL, and T-SQL scripts.

Responsible for code migration, code review, test Plans, test scenarios, test cases as part of Unit/Integrations testing, UAT testing.

Involved as a part of Production support.

Designed and developed UNIX Shell scripts for creating, dropping tables.

Worked closely with onshore and offshore application development leads.

Developed ETL mappings, Mapplets, workflows, and worklets using Informatica Power Center 9.x.

Designed and developed Informatica mappings for data loads and data cleansing.

Designed and documented the functional specs and prepared the technical design.

Worked in a fast-paced environment under minimal supervision, providing technical guidance to team members.

Environment: Informatica Power Center 9.x, Informatica Developer Client, IDQ, Power Exchange, SAP, Oracle 10g, PL/SQL, TOAD, SQL SERVER 2005/2008, XML, UNIX, OBIEE.
