
Data Integration Specialist / Informatica ETL Developer

Location: Houston, TX
Posted: March 02, 2021


Shashi

adkk9n@r.postjobfree.com

+1-832-***-****

Sr. Data Integration Specialist

PROFESSIONAL SUMMARY

Over 8 years of extensive experience with Informatica Power Center in all phases of analysis, design, development, implementation, and support of data warehousing applications, using Informatica Power Center 9.x/8.x/7.x.

Extensively worked on developing XML code and extracting data from PDF files using Informatica.

Extensive experience using tools and technologies such as Oracle 11g/10g/9i/8i, SQL Server 2000/2005/2008, DB2, Teradata 13/14/15, SQL, PL/SQL, SQL*Plus, Erwin 4.0/3.5, TOAD, stored procedures, and triggers.

Experience in dimensional data modeling techniques, Slowly Changing Dimensions (SCD), the Software Development Life Cycle (SDLC) (requirement analysis, design, development, and testing), and data warehouse concepts: Star Schema/Snowflake modeling, FACT and dimension tables, and physical and logical data modeling.

Extensively worked on analyzing data and fixing data issues using database tools and Informatica.

Extensively worked on parsing PDF and Excel files and converting them into XML using the Informatica Developer and Data Transformation tools.

Experience in designing, developing, and interfacing data elements in an upstream production operations, daily production reporting environment, applying educational and technical skills to troubleshoot and support the database interfaces.

Experience in providing status on the issues and progress of key business projects.

Extensively worked on Informatica Data Quality, including one-to-one mappings.

Experience in data warehousing and data modeling, including dimensional data modeling, Star Schema modeling, Snowflake modeling, and FACT and dimension tables.

Experience in using Teradata utilities such as TPump, FastLoad, and MultiLoad. Created BTEQ scripts.

Experience in designing, developing, testing, and maintaining BI applications and ETL applications.

Strong business understanding of the Oil & Gas, Banking, Automobile, and Healthcare domains.

Experienced in integrating various data sources such as Oracle, MS SQL Server 2005/2000, Teradata, flat files, and XML files into a staging area and different target databases.

Experience in Informatica mapping specification documentation and in tuning mappings to increase performance; proficient in creating and scheduling workflows, with expertise in automating ETL processes using scheduling tools such as Autosys and Tidal.

Experienced in the development, modification and testing of UNIX Shell scripts.

Experienced in Performance Tuning, identifying and resolving performance bottlenecks at various levels in Business Intelligence applications. Applied the Mapping Tuning Techniques such as Pipeline Partitioning to speed up data processing. Conducted Session Thread Analysis to identify and fix Performance Bottlenecks.

Experienced in creating reports using Tibco Spotfire.

Experienced in working with Agile and Waterfall methodologies.

Performance tuning at the mapping, session, and database levels.

Worked with the Business Analysts to gather requirements.

Outstanding communication and interpersonal skills, with the ability to quickly learn and grasp new concepts, both technical and business-related, and apply them as needed.

Experienced with coordinating cross-functional teams, project management and presenting technical ideas to diverse groups and proven ability to implement technology-based solutions for business problems.

TECHNICAL SKILLS

Tools:

Informatica Power Center 10.1, Informatica Developer Tool, Data Transformation Tool, Informatica Data Quality.

Reporting Tools:

Spotfire, Power BI.

Databases:

Oracle 12c/11g/10g, MS SQL Server 2008, DB2, Teradata 15.0.

Data Modeling:

Data Modeling, Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling.

DB Tools:

TOAD, SQL Developer, SQL Assistant.

Schedulers:

Control-M, Autosys, Tidal.

Languages:

SQL, PL/SQL, UNIX Shell Scripting

Operating Systems:

UNIX, Linux, Windows 7/Vista/Server 2003/XP/2000/9x/NT/DOS

Domain Expertise:

Oil & Gas, Banking, Automobile and Healthcare.

PROFESSIONAL EXPERIENCE

Client: ConocoPhillips, Houston, TX Jan 18 – Current

Role: Data Integration Specialist IV

Project Description:

ConocoPhillips is a leading independent oil and natural gas exploration and production company. In this project, data was extracted from Teradata, SQL Server, and Oracle, and Type 1 and Type 2 mappings were developed.

Responsibilities:

Design and implement interfaces for IO Operations application integration according to user requirements.

Extracted the raw data from Microsoft Dynamics CRM to staging tables using Informatica Cloud.

Participate in meetings with business users to understand the business requirements.

Analyze project requirements and provide estimates to the Project Manager.

Develop, design, and interface data elements in an upstream production operations, daily production reporting environment, applying educational and technical skills to troubleshoot and support the database interfaces.

Experience in extracting data from PDF files using Informatica.

Design and develop integration processes for WellView, CygNet, daily production reporting, operations, Health, Safety and Environment (HSE), and capital projects.

Experience in working with AWS cloud.

Built S3 buckets, managed S3 bucket policies, and used S3 and Glacier for storage and backup on AWS; a sketch of this setup follows.
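
A minimal sketch of such a setup using the AWS CLI; the bucket name, policy files, and data paths below are hypothetical placeholders rather than actual project values:

#!/bin/sh
# Hypothetical bucket, policy, and path names; creates a bucket,
# attaches an access policy, adds a Glacier lifecycle rule, and
# copies a nightly extract to S3 as a backup.
BUCKET=etl-backup-example

aws s3 mb "s3://$BUCKET"
aws s3api put-bucket-policy --bucket "$BUCKET" --policy file://policy.json

# lifecycle-glacier.json would define a rule transitioning objects
# to Glacier after a set number of days.
aws s3api put-bucket-lifecycle-configuration --bucket "$BUCKET" \
  --lifecycle-configuration file://lifecycle-glacier.json

aws s3 cp /data/exports/daily_extract.dat "s3://$BUCKET/backups/"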

Worked with DevOps tooling for defect/issue logging and tracking, and documented all work.

Develop ETL process to integrate Teradata Analytics for SAP data into Data Warehouse.

Co-ordinate with business, Data Architects and Database Administrators to resolve complex technical issues.

Design data conversions from wide variety of source using Informatica PowerCenter, SQL Server, Oracle, Teradata.

Experience in collecting, analyzing, and translating user requirements into effective solutions.

Extracted data from PDF and Excel (XLSX) files and converted it into XML using the Informatica DQ and Data Transformation tools.

Experience working with JSON files using the Informatica BDM tool; also converted JSON files into CSV format and imported them into Informatica Power Center as a source using an Amazon Athena ODBC connection (see the sketch below).
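
A minimal sketch of one way the JSON-to-CSV step could be scripted with jq before the flat file is registered as a Power Center source; the file and field names are hypothetical:

#!/bin/sh
# Hypothetical file and field names; flattens an array of JSON
# records into a CSV with a header row.
echo '"well_id","report_date","oil_volume"' > wells.csv
jq -r '.[] | [.well_id, .report_date, .oil_volume] | @csv' wells.json >> wells.csv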

Received hands-on IICS Cloud training at ConocoPhillips from the Informatica University team and gained a good understanding of the IICS platform.

Good understanding of, and hands-on training in, data integration using Snowflake, Salesforce, and AWS cloud.

Experience working in Azure, using Informatica to integrate data from Teradata into Microsoft Azure.

Experience analyzing data using Spotfire and fixing data issues using Informatica.

Used Deployment group to move Informatica objects from DEV to TEST and from TEST to PROD.

Perform unit, integration testing of each interface and provide support in User Acceptance Testing.

Client: Devon Energy, Oklahoma City, OK Jan 17 – Dec 17

Role: Data Integration Specialist

Project Description:

Devon is a leading independent oil and natural gas exploration and production company. In this project, data was extracted from the DB2 and Oracle relational databases, and Type 1 and Type 2 mappings were developed. The debugger was used to test the mappings and fix bugs, and Informatica session performance for large data files was tuned by increasing the block size, data cache size, sequence buffer length, and target-based commit interval.

Responsibilities:

Developing and maintaining enterprise data services that enable users to access business-entity data through an API layer and ODBC connections.

Implemented exception-handling mappings and data validation using Informatica Data Quality.

Involved in the installation and configuration of Informatica Power Center 10.1 and evaluated Partition concepts in Power Center 10.1

Expertise in Informatica Data Quality; performed profiling, scorecarding, and the creation of rules, mapplets, and mappings as part of the development process.

Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.

Used Informatica DQ to perform Unit testing and create Mapplets that are imported and used in Power center Designer.

Continuously developing a variety of data quality rules for departments to support their business data and activities.

Worked on Performance Tuning Informatica Targets, Sources, Mappings and Sessions.

Working in UNIX work environment, File Transfers, Job Scheduling and Error Handling.

Loaded data into the interface tables in Oracle from multiple data sources using Informatica.

Created Indexes to speed up the queries.
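
As an illustration of the kind of index involved, a minimal sketch run through SQL*Plus; the table and column names are hypothetical:

#!/bin/sh
# Hypothetical table and column names; adds an index on a frequently
# filtered column to speed up lookups.
sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_TNS" <<EOF
CREATE INDEX idx_cust_load_date ON stg_customer (load_date);
EXIT
EOF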

Creating Technical Specifications and Mapping Documents.

Responsible for optimizing the ETL mappings and monitoring the load process to validate the data.

Working on Oracle PL/SQL programming: stored procedures, triggers, and sequence generators on target tables.

Wrote SQL, PL/SQL, stored procedures for implementing business rules and transformations.

Wrote PL/SQL procedures, functions, and packages to perform database operations and pre- and post-session commands; a sketch of a typical post-session call follows.
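
A minimal sketch of how such a procedure might be invoked as an Informatica post-session command via a shell script; the connect string, package, and workflow names are hypothetical:

#!/bin/sh
# Hypothetical credentials, package, and workflow names; intended to
# run as a post-session command that logs load completion in PL/SQL.
sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_TNS" <<EOF
WHENEVER SQLERROR EXIT FAILURE
EXEC audit_pkg.log_load_complete('wf_daily_load');
EXIT
EOF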

Developed and modified complex SQL Queries and Stored Procedures as per business requirements.

Extracted data (Unstructured data parsing) from PDFs using Informatica DQ tool.

Environment: Informatica Power Center 10.1, Oracle 12c, IDQ 9.6.1, SQL Server 2008, Toad, Tidal, UNIX, PL/SQL, DB2.

Client: Bank of America, Charlotte, NC Jan 16 – Dec 16

Role: ETL/Application Developer

Project Description:

The finance department of the bank categorized customers based on a significant set of portfolio services, including personal loans and credit cards. The purpose of the project was to obtain data about customers from different regions and different databases and aggregate it within the data warehouse using Informatica Power Center and the Informatica Developer tools.

Responsibilities:

Gathered requirements and developed an understanding of the functional business processes and requirements provided by the Business Analyst.

Involved in designing high-level technical documentation based on specifications provided by the Manager.

Experience in Agile methodologies.

Converted business requirements into highly efficient, reusable and scalable Informatica ETL processes.

Scheduled Informatica sessions to automate loads in Autosys (a sample job definition follows).
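
A minimal sketch of an Autosys job definition fed to the jil utility; the job, machine, and script names are hypothetical:

#!/bin/sh
# Hypothetical job, machine, and script names; registers a command
# job that runs an Informatica workflow wrapper script at 02:00 daily.
jil <<EOF
insert_job: wf_daily_load_job
job_type: c
command: /opt/infa/scripts/run_wf_daily_load.sh
machine: etl_host
owner: etluser
start_times: "02:00"
description: "Starts the daily Informatica load workflow"
EOF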

Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, SQL Server, Teradata etc.

Derived requirements and designs, working with business and technical teams.

Performed tuning of sessions in Target, Source, Mappings and Session areas.

Created Complex SQL queries for reporting against tables.

Designed and developed Informatica mappings for data loads and data cleansing.

Worked with various Informatica transformations, including Joiner, Expression, Lookup, Aggregator, Filter, Update Strategy, Stored Procedure, Router, and Normalizer.

Involved in dealing with performance issues at various levels such as target, sessions, mappings and sources.

Created UNIX shell scripts for scheduling data-cleansing scripts and automating workflow execution.

Used pmcmd commands to start, stop, and ping the Integration Service from UNIX, and created shell scripts to automate the process (see the sketch below).
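
A minimal sketch of such a wrapper script; the service, domain, folder, and workflow names are hypothetical placeholders:

#!/bin/sh
# Hypothetical service, domain, folder, and workflow names; pings the
# Integration Service and starts a workflow only if the ping succeeds.
pmcmd pingservice -sv IS_DEV -d Domain_Dev || exit 1
pmcmd startworkflow -sv IS_DEV -d Domain_Dev \
  -u "$INFA_USER" -p "$INFA_PASS" -f FIN_LOADS -wait wf_daily_load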

Provided production support as required.

Worked on special assignments such as targeted data loads for business partner needs.

Used mapplets and reusable transformations to prevent redundant transformation logic and improve maintainability.

Managed the metadata associated with the ETL processes used to populate the data warehouse.

Independently performed complex troubleshooting, root-cause analysis, and solution development.

Involved in end-to-end system testing, performance and regression testing, data validation, and unit testing.

Used Deployment group to move Informatica objects from DEV to TEST and from TEST to QA/PROD.

Suggested best practices in Informatica and ETL processes to improve efficiency of the ETL process.

Environment: Informatica Power Center 9.6.1, Teradata 14.0, Oracle 11g, SQL Server 2008, SSIS, DB2, PL/SQL, Visual Studio 2012, UNIX, TOAD, XML.

Client: Volkswagen of America, Auburn Hills, MI Aug 14 – Dec 15

Role: Informatica Developer

Project Description:

Volkswagen is one of the leading automobile companies and one of the largest automotive systems suppliers in the world, selling to the 19 largest vehicle manufacturers and operating 205 facilities in 25 countries across six continents. Volkswagen spans various domains, including MP&L (Manufacturing Planning & Logistics), PD, Purchasing, and Finance.

Responsibilities:

Prepared Conceptual Solutions and Approach documents and provided ballpark estimates.

Extracted data from DB2 and Oracle relational databases, flat files, and mainframe sources.

Performed profiling, scorecarding, and the creation of rules, reference data tables, mapplets, and mappings as part of the development process.

Developed mapping logic using various transformations like Expression, Lookups, Joiner, Filter, Sorter, Update strategy and Sequence generator.

Built the Logical Data Objects and developed various mapping, mapplets/rules using the Informatica Data Quality based on requirements to profile, validate and cleanse the Data.

Identified and eliminated duplicate datasets and performed column, primary key, and foreign key profiling using Informatica.

Prepared Business Requirement Documents, Software Requirement Analysis and Design documents (SRD), and a Requirement Traceability Matrix for each project workflow, based on information gathered from the Solution Business Proposal document.

Expert in writing SQL and PL/SQL stored procedures using Toad. Experience with the Teradata utilities FastLoad, MultiLoad, and TPump, and with BTEQ scripting; a sample BTEQ load script follows.
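
A minimal sketch of a BTEQ script wrapped in a shell heredoc; the TDPID, credentials, and table names are hypothetical:

#!/bin/sh
# Hypothetical logon and table names; loads a staging table from a
# landing table and reports the resulting row count.
bteq <<EOF
.LOGON tdprod/etl_user,etl_pass;
INSERT INTO stg.customer_daily
SELECT * FROM stg.customer_landing;
SELECT COUNT(*) FROM stg.customer_daily;
.LOGOFF;
.QUIT;
EOF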

Assisted in upgrading the Informatica system from PowerCenter 8.6 to PowerCenter 9.1.

Extensively worked on the performance tuning of the Informatica Mappings as well as the tuning of the sessions.

Performed data modeling and design of the data warehouse and data marts in star schema methodology, with conformed and granular dimensions and FACT tables.

Coordinated and worked closely with legal, clients, third-party vendors, architects, DBAs, operations, and business units to build and deploy.

Involved in Unit, System integration, User Acceptance Testing of Mapping.

Worked with connected and unconnected Stored Procedure transformations for pre- and post-load sessions.

Designed and Developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.

Used pmcmd commands to start, stop, and ping the Integration Service from UNIX, and created shell scripts to automate the process.

Worked on production tickets to resolve the issues in a timely manner.

Prepared Test Strategy and Test Plans for Unit, SIT, UAT and Performance testing.

Environment: Informatica Power Center 9.5, Informatica IDQ, Oracle 10g, Teradata, SQL Server 2008, DB2, PL/SQL, Shell Scripting, Tivoli Job Scheduler.

Client: Kaiser Permanente, Oakland, CA Feb 13-Jun 14

Role: Informatica ETL Developer

Responsibilities:

Documented user requirements, translated them into system solutions, and developed an implementation plan and schedule.

Involved in the data modeling and designed the Data Warehouse using Star Schema methodology. Also, responsible for all the ongoing data model design decisions and database implementation strategies.

Extracted data from different source systems such as Oracle, SQL Server, MS Access, DB2, mainframes, XML, and flat files.

Worked on several transformations in Informatica, including Filter, Joiner, Sequence Generator, Aggregator, Source Qualifier, Expression, Lookup (connected and unconnected), Router, Web Services, XML, Normalizer, and Java transformations.

Assisted the QC team in carrying out its QC process of testing the ETL components.

Set up batches and sessions to schedule loads at the required frequency using Informatica Workflow Manager and an external scheduler.

Developed Complex transformations, Mapplets using Informatica to Extract, Transform and Load Data into DataMart, Enterprise Data warehouse (EDW) and Operational data store (ODS).

Used Metadata Manager to manage the Metadata associated with the ETL processes.

Used error-handling mapplets to capture error data into PMERR tables for null handling, analysis, and the error remediation process.

Troubleshot issues in TEST and PROD, performed impact analysis, and fixed the issues.

Defined the test strategy, created unit test plans and supported UAT and regression testing using HP Quality Center.

Used Informatica Developer client to set up Address validation, name matching using Match transformation, etc.

Identified slowly running ETL jobs and reduced the run time drastically by applying performance tuning techniques.

Advocated best practices for ETL Processes and conducted Data Integration meetings.

Environment: Informatica Power Center 9.5/9.1.1/8.6, Oracle 11g, Informatica Data Quality, SQL SERVER 2005/2008, PL/SQL, MS Access, XML, T-SQL, Shell Scripts, Control-M, Teradata.

EDUCATION:

Bachelor of Engineering in Computer Science – Graduated

Master’s in Computer Science and Engineering – Graduated


