
ETL Developer

Location:
Birmingham, AL
Posted:
January 07, 2018

Contact this candidate

Resume:

SASI

********@*****.***

980-***-****

Sr. Informatica ETL Developer

PROFESSIONAL SUMMARY

Over 8 years of extensive experience with Informatica Power Center in all phases of analysis, design, development, implementation, and support of data warehousing applications, using Informatica Power Center 9.x/8.x/7.x and Informatica Data Quality (IDQ).

Extensive experience using tools and technologies such as Oracle 11g/10g/9i/8i, SQL Server 2000/2005/2008, DB2 UDB, Sybase, Teradata 13/12/V2R5/V2R4, MS Access, SQL, PL/SQL, SQL*Plus, Sun Solaris 2.x, Erwin 4.0/3.5, SQL*Loader, TOAD, stored procedures, and triggers.

Experience in dimensional data modeling techniques, Slowly Changing Dimensions (SCD), the Software Development Life Cycle (SDLC) (requirement analysis, design, development, and testing), and data warehouse concepts: Star Schema/Snowflake modeling, fact and dimension tables, and physical and logical data modeling.

Expertise in developing standard and reusable mappings using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Lookup, Router, and Filter.

Extensively worked on Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, and Standardizer, among other significant transformations.

Experience in Data warehousing, Data Modeling like Dimensional Data Modeling, Star Schema Modeling, Snow-Flake Modeling, Data Vault Modeling, FACT and Dimensions Tables.

Experience in using Teradata Utilities such as Tpump, Fast Load and Mload. Created BTEQ scripts.
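
As a minimal sketch of how a BTEQ load of this kind is typically driven from shell — the TDP id, credentials, database, and table names below are placeholders, not details from any engagement listed here:

```shell
#!/bin/sh
# Sketch only: generate a BTEQ script for a staging load.
# Logon string and table names are hypothetical.
BTEQ_SCRIPT=load_stage.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.SET ERROROUT STDOUT;

DELETE FROM stage_db.customer_stg;

INSERT INTO stage_db.customer_stg (cust_id, cust_name, load_dt)
SELECT cust_id, cust_name, CURRENT_DATE
FROM   src_db.customer;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# A real run would submit the script with:
#   bteq < "$BTEQ_SCRIPT" > load_stage.log 2>&1
echo "Generated $BTEQ_SCRIPT"
```

For very large volumes, the same pattern applies with a FastLoad or MultiLoad control file in place of the BTEQ script.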

Experience in designing, developing, testing, and maintaining BI applications and ETL applications.

Strong business understanding of the Banking, Retail, Finance, Healthcare, and Telecom sectors.

Experienced in integration of various data sources such as Oracle 11g/10g/9i/8i, MS SQL Server 2005/2000, Teradata, Netezza, Sybase, DB2, flat files, XML files, and Salesforce into the staging area and different target databases.

Strong data modeling experience using Star/Snowflake schemas, Re-engineering, Dimensional Data modeling, Fact & Dimension tables, Physical & logical data modeling.

Experience in Informatica mapping specification documentation, tuning mappings to increase performance, proficient in creating and scheduling workflows and expertise in Automation of ETL processes with scheduling tools such as Autosys and Tidal.

Experienced in the development, modification and testing of UNIX Shell scripts.

Experienced in Performance Tuning, identifying and resolving performance bottlenecks at various levels in Business Intelligence applications. Applied the Mapping Tuning Techniques such as Pipeline Partitioning to speed up data processing. Conducted Session Thread Analysis to identify and fix Performance Bottlenecks.

Proficient in various Data Quality transformations, such as Standardizer, Parser, Merge, Case Converter, and Match, in mappings.

Experienced in creating reports using Qlikview, Oracle Analytic functions, Tableau.

Experienced in working with Agile and Waterfall methodologies.

Performance tuning at the mapping, session, and database levels.

Advanced knowledge of Oracle PL/SQL programming: stored procedures and functions, indexes, views, materialized views, triggers, cursors, and SQL query tuning.

Worked with the Business Analysts to gather requirements.

Outstanding communication and interpersonal skills, with the ability to quickly grasp new concepts, both technical and business-related, and apply them as needed.

Experienced in coordinating cross-functional teams, managing projects, and presenting technical ideas to diverse groups, with a proven ability to implement technology-based solutions for business problems.

TECHNICAL SKILLS

ETL Tools:

Informatica Power Center 10.1/9.6/9.5/8.6/8.1/7.1, Metadata Manager, IDQ, SSIS, DataStage.

Reporting Tools:

Business Objects XIR2/6.1/5.0, Qlikview, OBIEE, Microstrategy, Oracle Analytics, etc.

Databases:

Oracle 12c/11g/10g, MS SQL Server 2008/2005/2000, MS Access, IBM DB2, Teradata 14.0, Netezza.

Data Modeling:

Data Modeling, Dimensional Data Modeling, Star Schema Modeling, Snow Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling.

DB Tools:

TOAD, SQL Developer, SQL Assistant, Visio, ERWIN, Tivoli Job Scheduler, Control-M, Tidal.

Languages:

C, C++, Java, SQL, PL/SQL, Unix Shell Scripting

Operating Systems:

UNIX, Windows 7/Vista/Server 2003/XP/2000/9x/NT/DOS

Domain Expertise

Oil and Natural Gas, Finance, Automobile, Insurance, Telecom, Health Care

PROFESSIONAL EXPERIENCE

Client: Devon Energy Corporation, Oklahoma City, OK Jan 17 – Current

Role: Sr. Informatica ETL Developer

Project Description:

Devon Energy Corporation is a leading independent oil and natural gas exploration and production company. In this project, data was extracted from the DB2 and Oracle relational databases, and Type 1 and Type 2 SCD mappings were implemented. Used the Debugger to test the mappings and fix bugs. Tuned Informatica session performance for large data files by increasing the block size, data cache size, sequence buffer length, and target-based commit interval.

Responsibilities:

Developing and maintaining enterprise data services that enable users to access business-entity data through an API layer and ODBC connections.

Built web services that make data available over the internet, allowing a client to connect to a web service to access, transform, and deliver data.

Implemented exception-handling mappings using Data Quality and performed data validation using Informatica Analyst.

Involved in the installation and configuration of Informatica Power Center 10.1 and evaluated Partition concepts in Power Center 10.1

Expertise in Informatica Data Quality (IDQ); performed profiling, scorecarding, and creation of rules, reference data tables, mapplets, and mappings as part of the development process.

Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.

Implemented SSIS Package Configuration using XML configuration files and SQL Server configurations.

Used IDQ to perform Unit testing and create Mapplets that are imported and used in Power center Designer.

Continuously and consistently developing a variety of data quality rules for departments to support their business data and activities.

Worked on Performance Tuning Informatica Targets, Sources, Mappings and Sessions.

Working in UNIX work environment, File Transfers, Job Scheduling and Error Handling.

Loaded data into interface tables in Oracle from multiple data sources using Informatica.

Created Custom plans for product name discrepancy check using IDQ and incorporated the plan as a Mapplet into Power Center.

Created Indexes to speed up the queries.
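
For illustration, the kind of index DDL involved can be scripted as below; the table, column, and index names are hypothetical, and the real choices were driven by the predicates used in the mappings and queries:

```shell
#!/bin/sh
# Sketch only: emit index DDL for query speed-up.
# orders, cust_id, order_dt, and status are placeholder names.
cat > perf_indexes.sql <<'EOF'
-- Composite B-tree index matching a frequent lookup predicate
CREATE INDEX ix_ord_cust_dt ON orders (cust_id, order_dt);

-- Bitmap index for a low-cardinality reporting filter
CREATE BITMAP INDEX bx_ord_status ON orders (status);
EOF
echo "Wrote perf_indexes.sql"
```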

Creating Technical Specifications and Mapping Documents.

Responsible for optimizing the ETL mappings and monitoring the load process in order to check the data.

Working on Oracle PL/SQL programming, stored procedures, and creating triggers and a sequence generator on the target table.

Wrote SQL, PL/SQL, stored procedures for implementing business rules and transformations.

Wrote PL/SQL procedures, functions, and packages to perform database operations and pre- and post-session commands.
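
A minimal sketch of such a procedure, as it might be deployed from shell — the schema, table, and procedure names are illustrative only:

```shell
#!/bin/sh
# Sketch only: write out a PL/SQL audit procedure of the kind used
# in pre/post-session commands. All object names are placeholders.
SQL_FILE=load_audit_proc.sql

cat > "$SQL_FILE" <<'EOF'
CREATE OR REPLACE PROCEDURE log_session_run (
    p_workflow  IN VARCHAR2,
    p_row_count IN NUMBER
) AS
BEGIN
    -- Audit entry, committed independently of the load itself
    INSERT INTO etl_audit (workflow_name, row_count, run_ts)
    VALUES (p_workflow, p_row_count, SYSTIMESTAMP);
    COMMIT;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
END log_session_run;
/
EOF

# A real deployment would apply it with something like:
#   sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" @"$SQL_FILE"
echo "Wrote $SQL_FILE"
```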

Developed and modified complex SQL Queries and Stored Procedures as per business requirements.

Enforced data governance to provide external bulk loading option & controlled access to users.

Experienced in massive data profiling using IDQ prior to data staging.

Extracted data (Unstructured data parsing) from Image PDFs using Informatica DQ tool.

Developing and supporting jobs in Tidal enterprise scheduler for application workflow automation.

Environment: Informatica Power Center 10.1, Oracle 12c, IDQ 9.6.1, SQL Server 2012, TOAD, Control-M, Netezza, UNIX, PL/SQL, SSIS, DB2.

Client: Bank of America, Charlotte, NC Jun 16 – Dec 17

Role: Sr. ETL/Application Developer

Project Description:

The finance department of the bank categorized customers based on a significant set of portfolio services, including personal loans, credit cards, etc. The purpose of the project was to obtain data about customers from different regions and different databases and aggregate it within the data warehouse using Informatica Power Center and Informatica Developer tools.

Responsibilities:

Requirement gathering and Understanding of the Functional Business processes and requirements given by the Business Analyst.

Involved in Designing High level Technical Documentation based on specification provided by the Manager.

Experience in Agile methodologies.

Converted business requirements into highly efficient, reusable and scalable Informatica ETL processes.

Developing, executing & monitoring Autosys (Process Scheduler) jobs.

Experience in Scheduling Informatica sessions for automation of loads in Autosys.

Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, SQL Server, Teradata etc.

Derived requirements and designs and worked with business and technical teams.

Performed tuning of sessions in Target, Source, Mappings and Session areas.

Created Complex SQL queries for reporting against tables.
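
A hedged sketch of the kind of analytic reporting query involved — the portfolio table and columns are hypothetical stand-ins, not the bank's actual schema:

```shell
#!/bin/sh
# Sketch only: top-10 customers by balance per region, using an
# analytic RANK. acct_portfolio and its columns are placeholders.
cat > top_customers.sql <<'EOF'
SELECT *
FROM (
    SELECT region,
           cust_id,
           SUM(balance)                             AS total_bal,
           RANK() OVER (PARTITION BY region
                        ORDER BY SUM(balance) DESC) AS bal_rank
    FROM   acct_portfolio
    GROUP  BY region, cust_id
)
WHERE  bal_rank <= 10;
EOF
echo "Wrote top_customers.sql"
```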

Worked on Teradata BTEQs and Teradata Load Utilities like FLoad, MLoad and Tpump.

Designed and developed Informatica mappings for data loads and data cleansing.

Worked with various Informatica Transformations like Joiner, Expression, Lookup, Aggregate, Filter, Update Strategy, Stored procedure, Router and Normalizer etc.

Involved in dealing with performance issues at various levels such as target, sessions, mappings and sources.

Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.

Used PMCMD command to start, stop and ping server from UNIX and created Shell scripts to automate the process.
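
The shell automation around pmcmd typically looks like the sketch below; the domain, service, folder, workflow, and credential values are placeholders, and DRY_RUN only prints the commands that would be issued:

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper. Service, domain, folder, workflow,
# and credentials are hypothetical; DRY_RUN=1 echoes instead of
# executing, so the script can be exercised without Informatica.
DRY_RUN=${DRY_RUN:-1}
INFA_SERVICE="-sv IntSvc_Dev -d Domain_Dev"
INFA_USER=${INFA_USER:-etl_user}
INFA_PASS=${INFA_PASS:-etl_password}

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "pmcmd $*"
    else
        pmcmd "$@"
    fi
}

# Ping the Integration Service, then start a workflow and wait on it
run pingservice $INFA_SERVICE
run startworkflow $INFA_SERVICE -u "$INFA_USER" -p "$INFA_PASS" \
    -f FINANCE_FLD -wait wf_daily_load
```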

Provided production support as required.

Worked on special assignments such as targeted data loads for WCDST and business partner needs.

Mapplets and Reusable Transformations were used to prevent redundancy of transformation usage and maintainability.

Managed the Metadata associated with the ETL processes used to populate the data warehouse

Independently perform complex troubleshooting, root-cause analysis, solution development

Involved in end to end system testing, performance and regression testing and data validations and Unit Testing.

Used Deployment group to move Informatica objects from DEV to TEST and from TEST to QA/PROD.

Suggested best practices in Informatica and ETL processes to improve efficiency of the ETL process.

Environment: Informatica Power Center 9.6.1, Teradata 14.0, Oracle 11g, SQL Server 2008, SSIS, DB2, PL/SQL, Visual Studio 2012, UNIX, TOAD, XML.

Client: Volkswagen of America, Auburn Hills, MI Apr’15 – Jun’16

Role: Sr.Informatica ETL/IDQ Developer

Project Description:

Volkswagen is one of the leading automobile companies and one of the largest automotive systems suppliers in the world, selling to the 19 largest vehicle manufacturers and operating 205 facilities in 25 countries across six continents. Volkswagen spans various domains such as MP&L (Manufacturing Planning & Logistics), PD, Purchasing, and Finance.

Responsibilities:

Prepared conceptual solution and approach documents and provided ballpark estimates.

Extracted data from the DB2 and Oracle relational databases, flat files, and mainframe sources.

Performed profiling, scorecarding, and creation of rules, reference data tables, mapplets, and mappings as part of the development process.

Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.

Built the Logical Data Objects (LDO) and developed various mapping, mapplets/rules using the Informatica Data Quality (IDQ) based on requirements to profile, validate and cleanse the Data.

Identified and eliminated duplicate datasets and performed Columns, Primary Key, Foreign Key profiling using IDQ.

Used Labeler, Match, Address validator, Parse Transformation, Join Analysis Transformation to perform the Informatica IDQ development activities.

Prepared Business Requirement Documents, Software Requirement Analysis and Design Documents (SRD) and Requirement Traceability Matrix for each project workflow based on the information gathered from Solution Business Proposal document.

Expert in writing SQL and PL/SQL stored procedures using TOAD. Experienced with the Teradata utilities FastLoad, MultiLoad, FastExport, and TPump, as well as BTEQ scripting and SQL Assistant.

Assisted in the upgrade of the Informatica platform from PowerCenter 8.6 to PowerCenter 9.1.

Extensively worked on the performance tuning of the Informatica Power Center Mappings as well as the tuning of the sessions.

Data modeling and design of the data warehouse and data marts in star schema methodology with conformed and granular dimensions and fact tables.

Created and managed virtual machines in Windows Azure, set up communication with the help of endpoints, and performed VM migrations from transactional hosts.

Worked on Inbound, Outbound and Carve-out Data Feeds and developed mappings, sessions, workflows, command line tasks etc. for the same.

Coordinated and worked closely with legal, clients, third-party vendors, architects, DBA’s, operations and business units to build and deploy.

Used DataStage to manage the metadata repository and to import/export jobs.

Involved in Unit, System integration, User Acceptance Testing of Mapping.

Worked with Connected and Unconnected Stored Procedure for pre & post load sessions

Designed and Developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.

Used PMCMD command to start, stop and ping server from UNIX and created Shell scripts to automate the process.

Worked on production tickets to resolve the issues in a timely manner.

Prepared Test Strategy and Test Plans for Unit, SIT, UAT and Performance testing.

Environment: Informatica Power Center 9.5, Informatica IDQ, Oracle 10g, Teradata, SQL Server 2008, Toad, SQL Plus, SQL Query Analyzer, SQL Developer, PL/SQL, MS Access, Windows NT, Shell Scripting, Clear Quest, Tivoli Job Scheduler.

Client: Blue Cross Blue Shield, Buffalo, NY Jun’13-Mar’15

Role: Sr. Informatica ETL/IDQ Developer

Project Description:

Blue Cross Blue Shield's nearly three million members have access to the high-quality, affordable health care they expect and deserve. A health care leader in New York for more than 75 years, it provides access to high-quality, affordable medical insurance for individuals, families, and businesses.

Responsibilities:

Documented user requirements, translated requirements into system solutions and developed implementation plan and schedule.

Involved in the data modeling and designed the Data Warehouse using Star Schema methodology. Also, responsible for all the ongoing data model design decisions and database implementation strategies.

Extracted data from different source systems such as Oracle, SQL Server, MS Access, DB2, mainframes, XML, and flat files.

Worked on several transformations in Informatica, including Filter, Joiner, Sequence Generator, Aggregator, Source Qualifier, Expression, Lookup (connected and unconnected), Router, Web Services, XML, Normalizer, and Java transformations.

Developed Complex transformations, Mapplets using Informatica to Extract, Transform and Load Data into DataMart, Enterprise Data warehouse (EDW) and Operational data store (ODS).

Used Metadata Manager to manage the Metadata associated with the ETL processes.

Used Power Center Data Masking Options (Random Masking, Blurring, SSN, and Credit Card) in mappings.

Used error-handling mapplets to capture error data into PMERR tables for null handling, analysis, and the error remediation process.

Defined the test strategy, created unit test plans and supported UAT and regression testing using HP Quality Center.

Used IDQ to do data profiling of the master data. Used Informatica Analyst to get an overview of the accuracy of the data and percentage populated.

Used Informatica Developer client to set up Address validation, name matching using Match transformation, etc.

Identified slowly running ETL jobs and reduced the run time drastically by applying performance tuning techniques.

Advocated best practices for ETL Processes and conducted Data Integration COP meetings.

Created slice and dice and drill down Reports, Dashboards, and Scorecards using OBIEE 11g for various subject areas.

Environment: Informatica Power Center 9.5/9.1.1/8.6, Informatica Developer client, Power Exchange CDC, Oracle 11g, Erwin, Informatica IDQ, Quality Center, SQL SERVER 2005/2008, PL/SQL, MS Access, XML, T-SQL, Shell Scripts, Control-M, Teradata.

Client: Cable One Inc, Kansas, KS Aug’12-May’13

Role: Sr. ETL Developer

Project Description:

Cable One is an American provider of cable TV, broadband Internet, and VOIP telephone services in the U.S. In this project after Data Discovery and Data Cleansing of legacy data, the master data was moved to an interim database. ETL tools were used to transform legacy data into required formats. Then the master data was loaded to the production systems and the reports were updated.

Responsibilities:

Analyzed the functional specifications provided by the data architect and created Technical System Design Documents and Source to Target mapping documents.

Converted the data mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.

Performed Source System Data Profiling using Informatica Data Explorer (IDE).

Involved in designing Staging and Data mart environments and built DDL scripts to reverse engineer the logical/physical data model using Erwin.

Extracted data from SAP using Power Exchange and loaded data into SAP systems.

Informatica Power Exchange for Mainframe was used to read/write VSAM files from/to the Mainframe.

Performed basic Informatica administration, such as creating folders, users, privileges, and deployment groups, and optimizing server settings.

Designed Audit table for ETL and developed Error Handling Processes for Bureau Submission.

Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.

Used Informatica IDQ to do data profiling of the source and check for the accuracy of data using dashboard.

Managed Change Control Implementation and coordinating daily, monthly releases and reruns.

Responsible for Code Migration, Code Review, Test Plans, Test Scenarios, Test Cases as part of Unit/Integrations testing, UAT testing.

Used Teradata Utilities such as Mload, Fload and Tpump.

Used UNIX scripts for automating processes.

Environment: Informatica Power Center 9.1.1, Informatica Developer Client, IDQ, Power Exchange, SAP, Oracle 11g, PL/SQL, TOAD, SQL SERVER 2005/2008, XML, UNIX, Windows XP, OBIEE, Teradata.

Client: Kaiser Permanente, Pleasanton, CA Mar’11-Aug’12

Role: Sr. Informatica ETL Developer

Project Description:

MIA-IDR is one of the data warehouse projects at Kaiser Foundation Health Plan, Inc. MIA stands for Management Information & Analysis and IDR stands for Information Delivery and Reporting. The scope of this project is to get all of Kaiser's inpatient and outpatient data from a single uniform data repository known as CLARITY (Teradata). The Clarity database is a standard Epic system used by many health care organizations.

Responsibilities:

Translated the business processes/SAS code into Informatica mappings for building the data mart.

Used Informatica power center to load data from different sources like flat files and Oracle, Teradata into the Oracle Data Warehouse.

Implemented pushdown, pipeline partition, persistence cache for better performance.

Developed reusable transformations and mapplets to use in multiple mappings.

Implementing Slowly Changing Dimensions (SCD) methodology to keep track of historical data.
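
The SQL behind a Type 2 SCD load can be sketched as below. In Power Center this logic typically lives in a Lookup on the dimension plus an Update Strategy transformation; the standalone statements here, with hypothetical table and column names, just illustrate the expire-and-insert pattern:

```shell
#!/bin/sh
# Sketch only: Type 2 SCD expire/insert SQL. dim_customer,
# stg_customer, and the tracked column (address) are placeholders.
cat > scd2_customer.sql <<'EOF'
-- Expire the current version when a tracked attribute changes
UPDATE dim_customer d
SET    d.eff_end_dt   = CURRENT_DATE - 1,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND EXISTS (SELECT 1 FROM stg_customer s
            WHERE  s.cust_id = d.cust_id
            AND    s.address <> d.address);

-- Insert the new version as the current row
INSERT INTO dim_customer
       (cust_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT s.cust_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                   WHERE d.cust_id = s.cust_id
                   AND   d.current_flag = 'Y'
                   AND   d.address = s.address);
EOF
echo "Wrote scd2_customer.sql"
```

A Type 1 load is the simpler degenerate case: overwrite in place with no expire step and no effective-date columns.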

Assisted the QC team in carrying out its QC process of testing the ETL components.

Created pre-session and post-session shell scripts and email notifications.

Involved in complete cycle from Extraction, Transformation and Loading of data using Informatica best practices.

Created mappings using Data Services to load data into SAP HANA.

Involved in data quality checks by interacting with the business analysts.

Performed unit testing and tuned the mappings for better performance.

Maintained documentation of ETL processes to support knowledge transfer to other team members.

Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.

Involved as a part of Production support.

Environment: Informatica Power Center 9.1, Oracle11g, PL/SQL, Teradata, SAP HANA, UNIX Shell Scripts.

Client: ApolloMD, Atlanta, GA Aug’10-Feb’11

Role: Informatica ETL Developer

Project Description:

This project deals with building a real time view of enterprise level data. A decision support system is built to compare and analyze product prices, their quantities and patient profiles. This Enterprise Data Warehouse (EDW) is used to deliver reports and information to sales and marketing management.

Responsibilities:

Responsible for requirement definition and analysis in support of Data Warehousing efforts.

Developed ETL mappings, transformations using Informatica Power Center 8.6.

Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, SQL Server, Teradata etc.

Developed data Mappings between source systems and target system using Mapping Designer.

Responsible for Test and Production issue resolutions.

Developed shared folder architecture with reusable Mapplets and Transformations.

Extensively worked with the Debugger for handling the data errors in the mapping designer.

Created events and various tasks in the work flows using workflow manager.

Responsible for tuning ETL procedures to optimize load and query Performance.

Set up batches and sessions to schedule loads at the required frequency using Informatica Workflow Manager and an external scheduler.

Extensive Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.

Tested the mapplets and mappings as per Quality and Analysis standards before moving to production environment.

Took part in Informatica administration; migrated development mappings as well as hot fixes to the production environment.

Involved in writing shell scripts for file transfers, file renaming and several other database scripts to be executed from UNIX.
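
A minimal sketch of such a landing-zone script — renaming incoming files with a date stamp and moving them to an archive after load; directory names and the file pattern are placeholders:

```shell
#!/bin/sh
# Sketch only: stamp and archive incoming flat files.
# inbox/archive paths and the .dat pattern are hypothetical.
INBOX=./inbox
ARCHIVE=./archive
STAMP=$(date +%Y%m%d)

mkdir -p "$INBOX" "$ARCHIVE"
: > "$INBOX/sales.dat"           # demo file so the loop has work to do

for f in "$INBOX"/*.dat; do
    [ -e "$f" ] || continue      # glob matched nothing
    base=$(basename "$f" .dat)
    mv "$f" "$ARCHIVE/${base}_${STAMP}.dat"
done
ls "$ARCHIVE"
```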

Migrating Informatica Objects using Deployment groups.

Troubleshot issues in TEST and PROD, performing impact analysis and fixing the issues.

Worked closely with business analysts and gathered functional requirements. Designed technical design documents for ETL process.

Developed Unit test cases and Unit test plans to verify the data loading process and Used UNIX scripts for automating processes.

Environment: Informatica Power Center 8.6, Power Exchange, Teradata, SQL Assistant, Oracle 10g, PL/SQL, MS Access, UNIX, Windows NT/2000/XP, SQL Server 2008, SSIS, OBIEE, Qlikview, Netezza, DB2.

Client: First Citizens Bank, Chapel Hill, NC Mar’09-Jun’10

Role: ETL Developer

Responsibilities:

Requirement gathering and Understanding of the Functional Business processes and Requirements given by the Business Analyst.

Involved in Designing High level Technical Documentation based on specification provided by the Manager.

Created Complex SQL from Technical requirement document and optimized it.

Created Complex SQL queries for reporting against tables.

Designed and developed Informatica mappings for data loads and data cleansing.

Performance tuned ETL processes at the mapping, session and Database level.

Integrated sources from different databases and flat files.

Mapplets and Reusable Transformations were used to prevent redundancy of transformation usage and maintainability.

Independently perform complex troubleshooting, root-cause analysis, solution development

Involved in end to end system testing, performance and regression testing and data validations and Unit Testing.

Extensively used SQL and PL/SQL Scripts.

Extracted data from SAP using Power Exchange and loaded data into SAP systems. Used SAP BO as the reporting tools.

Used Deployment group to move Informatica objects from DEV to TEST and from TEST to QA/PROD.

Suggested best practices in Informatica and ETL processes to improve efficiency of the ETL process.

Environment: Informatica Power Center 8.1/8.6, Oracle 10g/9i, SQL Server 2000/2005, SSIS, PL/SQL, SQL, Visio 2003, UNIX, TOAD, Datastage.

EDUCATION:

Bachelor of Engineering in Computer Science


