
ETL DEVELOPER

Location:
Worthington, OH, 43085
Salary:
65/hour
Posted:
April 21, 2022


Resume:

GANGADARI V

Sr ETL Informatica Consultant

Phone: +1-609-***-****

Email: *****@******************.***

Professional Summary

●7+ years of experience in the IT industry across data integration and data warehousing, using ETL tools such as Informatica PowerCenter 10.2/9.6/9.1/8.6, Informatica PowerExchange 10.2/9.6, Microsoft SSIS, and Informatica Intelligent Cloud Services (IICS)

●Experience integrating data to/from on-premises databases and cloud-based database solutions using Informatica Intelligent Cloud Services

●Experience working with cloud-based database solutions including Azure Synapse, Azure Data Lake Store, Amazon Redshift, and Snowflake

●Experience working with traditional on-premises databases including Oracle, SQL Server, and Teradata

●Experience working on an ETL conversion project from SSIS to Informatica Cloud

●Expert in all phases of Software development life cycle (SDLC) – Project Analysis, Requirements, Design Documentation, Development, Unit Testing, User Acceptance Testing, Implementation, Post Implementation Support and Maintenance

●Worked with non-relational data sources such as flat files, XML files, and mainframe files

●Worked extensively with data migration, data cleansing, and extraction, transformation, and loading of data from multiple sources to the data warehouse

●Instrumental in setting up ETL naming standards and best practices throughout the ETL process (transformations, sessions, mappings, workflow names, log files, bad files, and input, variable, and output ports)

●Worked through Informatica performance tuning issues at the source, target, mapping, transformation, and session levels, and fine-tuned transformations to make them more efficient in terms of session performance

●Experience implementing complex business rules by creating reusable transformations, developing complex mapplets and mappings, and writing PL/SQL stored procedures and triggers

●Experience creating ETL design documents; strong experience with complex PL/SQL packages, functions, cursors, indexes, views, and materialized views

●Excellent communication, presentation, and project management skills; a good team player and self-starter with the ability to work independently and as part of a team

●Extensive experience in UNIX Shell Scripting, AWK and file manipulation techniques

●Demonstrated ability in defining project goals and objectives, prioritizing tasks, developing project plans, and providing a framework for effective communication while maximizing responsiveness to change

●Possess experience in working on concurrent projects in very demanding and high-pressure situations

Technical Skills:

Package & ETL Tools: Informatica PowerCenter 10.x/9.x/8.x/7.x, IDQ, TOAD, OLAP, OLTP

Databases: SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), Teradata, AQT v10 (Advanced Query Tool) (Oracle/Netezza), DB Artisan 9.0 (Sybase), SQL Browser (Oracle/Sybase), Visio, ERWIN

Languages: SQL, PL/SQL, T-SQL, Teradata, UNIX Shell Scripting

Data Modeling: Star Schema Modeling, Snowflake Schema Modeling

Methodologies: Agile, Waterfall

Operating Systems: UNIX, Windows NT, MS-DOS, Solaris

PROFESSIONAL EXPERIENCE:

KeyBank, Cleveland, OH Nov 2019 – Present

Sr Informatica Developer

Project Description:

KeyCorp is a bank-based financial services holding company. It is the parent holding company for KeyBank National Association (KeyBank), its principal subsidiary, through which most of its banking services are provided.

The project's goal is the migration of an on-premises data warehouse to a cloud-based data warehouse solution. Multiple source systems, including Oracle, SQL Server, Snowflake, and flat files, were identified to push data into Azure SQL Data Warehouse for reporting and analytics.

Responsibilities:

●Worked with the IT architect and program managers on requirements gathering, analysis, and project coordination

●Developed data integration platform components/processes using Informatica Cloud Platform, Azure SQL Data Warehouse, Azure Data Lake Store, and Azure Blob Storage technologies

●Analyzed existing ETL data warehouse processes and ERP/non-ERP application interfaces and created design specifications based on the new target cloud data warehouse (Azure Synapse) and Data Lake Store

●Created ETL and Datawarehouse standards documents - Naming Standards, ETL methodologies and strategies, Standard input file formats, data cleansing and preprocessing strategies

●Created mapping documents with detailed source to target transformation logic, Source data column information and target data column information

●Designed, Developed and Implemented ETL processes using IICS Data integration

●Created IICS connections using various cloud connectors in IICS administrator

●Extensively used performance tuning techniques while loading data into Azure Synapse using IICS

●Extensively used cloud transformations – Aggregator, Expression, Filter, Joiner, Lookup (connected and unconnected), Rank, Router, Sequence Generator, Sorter, Update Strategy, Union Transformations

●Extensively used cloud connectors for Azure Synapse (SQL DW), Azure Data Lake Store V3, Azure Blob Storage, Oracle, Oracle CDC, and SQL Server

●Converted legacy SSIS packages into Informatica mappings

●Analyzed the SSIS code for Informatica conversion and reviewed the design solution with the team lead; developed the STM (source-to-target mapping) based on the approved solution

●Developed cloud integration parameterized mapping templates (DB and table object parameterization) for Stage, Dimension (SCD Type 1, SCD Type 2, CDC, and incremental load), and Fact load processes

●Extensively used parameters (input and IN/OUT parameters), expression macros, and source partitioning

●Extensively used the Pushdown Optimization option to push processing into Azure Synapse (SQL DW) and take advantage of its compute power

●Extracted data from Snowflake and pushed it into the Azure data warehouse instance to support reporting requirements

●Performed loads into Snowflake instance using Snowflake connector in IICS for a separate project to support data analytics and insight use case for Sales team

●Created Python scripts to create on-demand cloud mapping tasks using the Informatica REST API

●Created Python scripts to start and stop cloud tasks using Informatica Cloud REST API calls (see the sketch at the end of this list)

●Developed a CDC load process for moving data from PeopleSoft to the SQL data warehouse using “Informatica Cloud CDC for Oracle Platform”

●Developed complex Informatica Cloud Task flows (parallel) with multiple mapping tasks and task flows

●Developed Mass Ingestion tasks (file ingestion) to ingest large datasets from on-premises sources to Azure Data Lake Store

●Designed a data integration audit framework in Azure SQL DW to track data loads, manage data platform workloads, and produce automated reports for SOX compliance (see the audit-logging sketch at the end of this section)

●Worked with a team of 4 onshore and 6 offshore developers and prioritized project tasks

●Involved in Development, Unit Testing, SIT and UAT phases of project
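
As context for the Python/REST bullets above, the following is a minimal illustrative sketch (not the candidate's actual script) of starting and stopping an IICS task through the Informatica Cloud v2 REST API; the POD login URL, credentials, and task ID are placeholders, and the endpoints should be verified against your org's API documentation.

```python
# Illustrative sketch only: placeholder URL, credentials, and task ID.
# Assumes the Informatica Cloud (IICS) v2 REST API: log in, then POST a job request.
import requests

LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"  # POD-specific URL

def login(username, password):
    """Authenticate and return (serverUrl, icSessionId) used by subsequent calls."""
    resp = requests.post(LOGIN_URL, json={"@type": "login",
                                          "username": username,
                                          "password": password})
    resp.raise_for_status()
    body = resp.json()
    return body["serverUrl"], body["icSessionId"]

def start_task(server_url, session_id, task_id, task_type="MTT"):
    """Start a mapping task (taskType 'MTT'); other taskType values cover other task kinds."""
    resp = requests.post(f"{server_url}/api/v2/job",
                         headers={"icSessionId": session_id},
                         json={"@type": "job", "taskId": task_id, "taskType": task_type})
    resp.raise_for_status()
    return resp.json()

def stop_task(server_url, session_id, task_id, task_type="MTT"):
    """Request a stop for a running task (verify the stop endpoint for your org)."""
    resp = requests.post(f"{server_url}/api/v2/job/stop",
                         headers={"icSessionId": session_id},
                         json={"@type": "job", "taskId": task_id, "taskType": task_type})
    resp.raise_for_status()

if __name__ == "__main__":
    server_url, session_id = login("svc_etl_user", "********")    # placeholder credentials
    start_task(server_url, session_id, "0001ABCD000000000002")    # placeholder task ID
```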

Environment: Informatica Intelligent Cloud Services, Informatica PowerCenter 10.2, Informatica Power Exchange 10.2, SSIS, Windows Secure Agent, Teradata v1310, Azure Synapse (Azure SqlDW), Azure Data Lake Store, SQL Database, Power BI reporting
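
The audit-framework bullet above describes recording each data load for SOX reporting. The sketch below is illustrative only; the etl_audit_log table, its columns, and the connection string are hypothetical, and it simply shows the idea of writing one audit row per load run so compliance reports can be automated.

```python
# Illustrative sketch only: hypothetical audit table, columns, and connection string.
# Each ETL run writes one row recording what ran, when, how many rows moved, and the outcome.
from datetime import datetime
import pyodbc

INSERT_AUDIT = """
INSERT INTO etl_audit_log (load_name, start_time, end_time, rows_loaded, status, error_message)
VALUES (?, ?, ?, ?, ?, ?)
"""

def record_load(conn_str, load_name, start_time, rows_loaded, status, error_message=None):
    """Write one audit row per load run into the (hypothetical) etl_audit_log table."""
    conn = pyodbc.connect(conn_str)
    try:
        conn.cursor().execute(INSERT_AUDIT, load_name, start_time,
                              datetime.now(), rows_loaded, status, error_message)
        conn.commit()
    finally:
        conn.close()

# Example usage after a load completes:
# record_load(conn_str, "DIM_CUSTOMER_SCD2", start_ts, 12345, "SUCCESS")
```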

Comcast, Philadelphia, PA Sep 2018 – Nov 2019

Informatica Developer

Project Description:

Comcast Corporation is an American multinational telecommunications conglomerate headquartered in Philadelphia, Pennsylvania. The project supports extending the business in North America by gathering data from customers and performing analysis on top of it; Oracle and Salesforce are used to store the customer data. The Virgin Media maintenance project involves support and maintenance activities on the Informatica applications that perform ETL for the data warehouse, as well as resolution of defects raised through HP QC.

Responsibilities:

●Developed a standard ETL framework to enable reuse of similar logic across the board; involved in system documentation of data flow and methodology

●Extensively developed low-level designs (mapping documents) by understanding the different source systems

●Designed complex mappings, sessions, and workflows in Informatica PowerCenter to interact with MDM and EDW

●Designed and developed mappings to implement full/incremental loads from source systems

●Designed and developed mappings to implement Type 1/Type 2 (slowly changing dimension) loads (illustrated in the sketch after this list)

●Responsible for ETL requirement gathering and development with end-to-end support

●Responsible for coordinating the DB changes required for ETL code development

●Responsible for ETL code migration, DB code changes, and scripting changes to higher environments

●Responsible for supporting the code in the production and QA environments

●Developed complex IDQ rules which can be used in Batch Mode

●Developed an Address Validator transformation in IDQ to be invoked from Informatica PowerCenter mappings

●Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter

●Worked closely with MDM team to identify the data requirements for their landing tables and designed IDQ process accordingly

●Extensively used transformations like router, lookup, source qualifier, joiner, expression, sorter, XML, Update strategy, union, aggregator, normalizer, and sequence generator

●Created reusable mapplets, reusable transformations and performed Unit tests over Informatica code

●Responsible for providing daily status reports for all Informatica applications to the customer, and for monitoring and tracking critical daily applications and code migrations during deployments

●Responsible for reloads of Informatica applications data in production and closing user tickets and incidents

●Identified performance issues and bottlenecks
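
The Type 1/Type 2 bullet above refers to slowly changing dimension loads, which this project built as Informatica mappings. Purely as an illustration of the Type 2 logic (hypothetical staging and dimension tables, generic SQL run from Python rather than the actual mapping), the sketch below shows the usual expire-then-insert pattern.

```python
# Illustrative sketch only: hypothetical stg_customer / dim_customer tables.
# SCD Type 2 in two steps: expire the current row of changed customers,
# then insert a new current version for new or changed customers.
import pyodbc

EXPIRE_CHANGED = """
UPDATE dim_customer
   SET effective_end_date = CURRENT_TIMESTAMP,
       is_current = 0
 WHERE is_current = 1
   AND customer_id IN (
        SELECT s.customer_id
          FROM stg_customer s
          JOIN dim_customer d
            ON d.customer_id = s.customer_id AND d.is_current = 1
         WHERE s.customer_name <> d.customer_name OR s.address <> d.address)
"""

INSERT_NEW_VERSIONS = """
INSERT INTO dim_customer
       (customer_id, customer_name, address, effective_start_date, effective_end_date, is_current)
SELECT s.customer_id, s.customer_name, s.address, CURRENT_TIMESTAMP, NULL, 1
  FROM stg_customer s
  LEFT JOIN dim_customer d
    ON d.customer_id = s.customer_id AND d.is_current = 1
 WHERE d.customer_id IS NULL                                         -- brand-new customer
    OR s.customer_name <> d.customer_name OR s.address <> d.address  -- changed attributes
"""

def load_scd_type2(conn_str):
    """Run the expire and insert steps in one transaction against a hypothetical target DB."""
    conn = pyodbc.connect(conn_str)
    try:
        cur = conn.cursor()
        cur.execute(EXPIRE_CHANGED)
        cur.execute(INSERT_NEW_VERSIONS)
        conn.commit()
    finally:
        conn.close()
```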

Environment: Informatica Big Data Management, Informatica PowerCenter 10.2, Informatica Power Exchange 10.2, SSIS, Windows Secure Agent, Teradata v1310, Azure Synapse (Azure SqlDW), Azure Data Lake Store, SQL Database, Power BI reporting

Tata Consultancy Services, India Jun 2015 – Mar 2018

ETL Developer

Project Description:

This project captures data for the annual Budget Plan cycle and the monthly/quarterly forecasting cycle. The Hyperion Planning platform requires data from the company's various source systems to be extracted, cleansed, validated, and transformed into the model required by Oracle Hyperion Planning. It involves creating ETL code to read data from the source systems and load it into Oracle, which is then consumed by the Oracle Hyperion Planning team for reporting.

Responsibilities:

●Led and/or participated in gathering and evaluating requirements, working with the application/data warehouse team and project managers to provide solutions to end users

●Developed technical design and reporting solutions to influence business results; oversaw the performance of the project throughout the life cycle from initiation to completion

●Proficient in translating users' statements of needed system behavior and functionality into business and functional requirements

●Involved in data modeling using ER, star schema, and dimensional modeling; excellent understanding of OLTP/OLAP system study and analysis and of developing database schemas such as star and snowflake schemas; exposure to the reporting tools OBIEE and BI Publisher

●Developed ETL mappings from the given requirements and unit tested them accordingly

●Created technical design documents from business requirements

●Created data maps in Informatica Power Exchange to read and write data to the mainframe datasets

●Created batch scripts for different project requirements such as file validation, moving files from SharePoint to the Informatica server, and archiving files with date-time stamps (see the sketch after this list)

●Performance-tuned mappings, sources, targets, and transformations by optimizing caches for the lookup, joiner, rank, aggregator, and sorter transformations; tuned Informatica session performance for data files by increasing buffer block size, data cache size, and sequence buffer length, and used an optimized target-based commit interval and pipeline partitioning to speed up mapping execution time

●Reviewed Informatica ETL mappings/workflows and the SQL that populates data warehouse and data mart dimension and fact tables to ensure the accuracy of business requirements

●Created Informatica source and target instances and maintained shared folders so that shortcuts could be used across the project

●Responsible for Unit Testing and Integration testing of mappings and workflow
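
The batch-script bullet above describes file validation and timestamped archiving. The original scripts were batch scripts; the following is an illustrative Python equivalent with placeholder directories and hypothetical file names, showing the same validate-then-archive idea.

```python
# Illustrative sketch only: placeholder paths and hypothetical file names.
# Validate that the expected source files arrived, then archive them with a date-time stamp.
import shutil
from datetime import datetime
from pathlib import Path

SRC_DIR = Path("/informatica/srcfiles")               # placeholder landing directory
ARCHIVE_DIR = Path("/informatica/archive")            # placeholder archive directory
EXPECTED_FILES = ["budget_plan.csv", "forecast.csv"]  # hypothetical file names

def validate_and_archive():
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    missing = [name for name in EXPECTED_FILES if not (SRC_DIR / name).exists()]
    if missing:
        raise FileNotFoundError(f"Missing source files: {missing}")
    for name in EXPECTED_FILES:
        src = SRC_DIR / name
        dest = ARCHIVE_DIR / f"{src.stem}_{stamp}{src.suffix}"
        shutil.move(str(src), str(dest))   # e.g. budget_plan_20220421_093000.csv

if __name__ == "__main__":
    validate_and_archive()
```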

Environment:

Informatica Power Center, Oracle, MS SQL SERVER, SQL, PL/SQL, SQL*Loader, UNIX Shell Script.


