
Data Developer

Location:
Louisville, KY
Posted:
October 05, 2020


Resume:

Satish Gaddam

Email: adgoau@r.postjobfree.com

Phone: 502-***-****

SUMMARY

8+ years of IT experience in databases and ETL tools such as Oracle, SQL Server, Netezza, and Informatica in data warehouse environments, and in Tableau for visualization.

Knowledge of the Hadoop ecosystem and its frameworks (HDFS, Hive, Sqoop, Oozie, Impala) and of reporting tools such as Tableau and Business Objects

Experience in data migration from various OLTP/OLAP databases to Oracle, Netezza, Hadoop Hive

Good domain knowledge in Health Care and certified in AHM 250

Very strong experience in RDBMS and analytical (OLAP) databases.

Experience in developing visualizations using Tableau and Business Objects

Worked on all phases of data warehouse development lifecycle, from gathering requirements to testing, implementation and support

Strong knowledge of dimensional modeling and Star and Snowflake schemas. Designed fact and dimension tables per reporting requirements and for ease of future enhancement.

Extensive experience in designing and developing complex mappings applying various transformations such as lookup, source qualifier, update strategy, router, sequence generator, aggregator, rank, stored procedure, filter, joiner, and sorter transformations.

Extensive experience in developing the Workflows, Worklets, Sessions, Mappings using Informatica Power Center.

Excellent knowledge in identifying performance bottlenecks and in tuning Mappings and Sessions using techniques such as partitioning and pushdown optimization.

Experience across the data mart life cycle, performing ETL processes to load data from different sources into data warehouses and data marts using various ETL tools.

Good knowledge of customization of knowledge modules using PL/SQL, T-SQL and shell scripting.

Extensively used Informatica PowerCenter and Data Integration Hub (DIH) to facilitate data flow between heterogeneous databases.

Experience in developing and customizing PL/SQL packages, procedures, functions, SQL*Loader scripts, triggers, and reports

Experience converting database objects such as stored procedures and functions between T-SQL, PL/SQL, and NZSQL

Built data marts by extracting data from RDBMS sources into Hadoop for reporting

Good knowledge of Amazon Web Services, creating instances and hosting applications on EC2

Experience in multiple projects on Waterfall and Agile methodologies

Experience interacting with business users and authoring high-level requirements, high-level design, and detailed design documents.

Developed and implemented Slowly Changing Dimension (SCD) methodology for accessing the full history of account and transaction information

Experience in implementing historical loads, incremental loads, and Change Data Capture.

Good Exposure to AWS EC2, S3 and Azure Data Factory and BLOB Storage

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter and Data Integration Hub, Tableau
Databases: Oracle 8i/9i/10g/11g/12c, SQL Server 2008, Netezza (Striper and Mako), Hadoop HDFS, Hive, Sqoop
Languages: SQL, PL/SQL, Shell scripting, Python
Database Tools: SQL Developer, Toad for MySQL & Oracle, SSMS, SQL*Loader, Aginity Workbench for Netezza
Operating Systems: Windows & Red Hat Linux
Defect Tracking: Squids
Big Data and Cloud: Hadoop ecosystem, AWS, Azure
Methodologies: Agile
Marketing Tools: HCL (IBM) Unica Campaign and Interact

Certifications

1. IBM Certified Specialist - PureData System for Analytics v7.1

2. Oracle PL/SQL Developer Certified Associate

3. Informatica PowerCenter Developer

PROFESSIONAL EXPERIENCE

Universal Resorts/K-Force

Orlando, FL

Sr. ETL Developer Mar 2020 – Present

Project description: Build a data mart for consumption by marketing tools and maintain the Unica Interact and Campaign tools for smooth functioning.

Involved in requirement analysis, ETL design, and development for extracting data from source systems such as SQL Server, Oracle, and flat files and loading it into Hadoop Hive

Migrated marketing data to the Hadoop ecosystem using Sqoop and Azure Data Factory, enabling Campaign to connect from the Azure data warehouse.

Worked extensively with Impala for executing ad-hoc queries

Strong experience loading data to and from RDBMS and analytical databases such as Oracle 12c and SQL Server, and creating PL/SQL scripts and objects.

Implemented Informatica ETL designs and processes for loading data from the sources to the target warehouse.

Troubleshot connectivity issues between Interact runtime and design-time servers.

Responsible for developing Tableau reports for Campaign and Interact Contacts and Responses

Responsible for identifying and fixing bottlenecks through performance tuning of the SQL Server database.

Created PL/SQL scripts/procedures to test the data loads

Created stored procedures and triggers in SQL Server for business applications.

Extensive experience in debugging and troubleshooting Sessions using the Debugger and Workflow Monitor

Involved in preparing ETL design documents and Unit Test Plans for Mappings.

Prepared the code migration document and worked with release team in Migrating the code from Development to Test and Production environments

Involved in addressing change requests as part of integration testing

Responsible for Unit testing and Integration testing of mappings and workflows.

Attended daily status calls with the internal team and weekly calls with the client, and updated the status report

Extensive knowledge of Unica Campaign and Interact channels and tool functionality.

Worked on Amazon Web Services (AWS) EC2 instances for hosting the Interact application

Experience with pre-session and post-session SQL commands to drop indexes on the target before session runs, and then recreate them when the session completes.
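A minimal sketch of what such pre- and post-session SQL might look like, assuming a hypothetical fact table and index name:

    -- Pre-session SQL: drop the index so the bulk insert is not slowed by index maintenance
    DROP INDEX idx_fact_sales_cust_key;

    -- Post-session SQL: rebuild the index once the session finishes loading
    CREATE INDEX idx_fact_sales_cust_key ON fact_sales (customer_key);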

Tuned various SQL queries in the system for better performance.

Worked with Session Logs and Workflow Logs for Error handling and troubleshooting in DEV environment.

Worked on a POC to migrate data to the Azure data warehouse using Azure Data Factory and Azure Blob Storage

Tools/Languages: HCL Unica Campaign, Interact, Azure data warehouse, Informatica PowerCenter, AWS, PL/SQL, SQL Server, Tableau, Sqoop

Humana Inc

Louisville, KY

Sr. ETL Informatica/Hadoop Developer Nov 2016 – Mar 2020

Project description: Build Campaign management data marts in Netezza and Hadoop, and develop campaigns and reporting per business needs.

Involved in requirement analysis, ETL design, and development for extracting data from source systems such as SQL Server, Oracle, and flat files and loading it into Netezza and Hive

Implemented EMM Unica Campaign and Interact inbound and outbound campaigns

Migrated marketing data to the Hadoop ecosystem using Sqoop, enabling Campaign to connect from the Azure data warehouse.

Extensively used Sqoop to import/export data between RDBMS and Hive tables, performed incremental imports, and created Sqoop jobs that track the last saved value

Created Tableau reports on Business Intelligence datamart

Worked extensively with Impala for executing ad-hoc queries

Used various Informatica transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.

Configured Interactive Channels and Interaction Strategies to build real-time campaigns.

Created and configured new audiences, segmentation, and global audience suppressions.

Worked in troubleshooting and resolving Interact and Campaign product issues and getting fixes for product from IBM/HCL.

Worked extensively on loading data to and from RDBMS and analytical databases such as Oracle 12c, Netezza, and SQL Server, creating PL/SQL scripts and objects.

Worked in tandem with the business teams to gather requirements, execute campaigns using IBM Unica, and perform post-campaign analysis

Created campaign triggers, offers and configured them for campaigns, automated Interact white list processes for ease of deployments

Troubleshot connectivity issues between Interact runtime and design-time servers.

Responsible for identifying and fixing bottlenecks through performance tuning of the Netezza, SQL Server, and Oracle databases.

Created Oracle/Netezza SQL scripts to verify that tables loaded correctly

Worked on Amazon Web Services (AWS) EC2 instances for hosting the Interact application.

Developed code for bulk loads and incremental loads.

Tuned various SQL queries, mappings, targets, and sources to improve performance.

Used session and workflow logs for debugging the session.

Used debugger wizard to remove bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.

Involved in preparing ETL design documents and Unit Test Plans for Mappings.

Prepared the code migration document and worked with release team in Migrating the code from Development to Test and Production environments

Involved in addressing change requests as part of integration testing

Responsible for Unit testing and Integration testing of mappings and workflows.

Attended daily status calls with the internal team and weekly calls with the client, and updated the status report

Tools/Languages: HCL Unica, Informatica PowerCenter v10.1, Azure data warehouse, AWS, PL/SQL, NZSQL, Tableau, Sqoop, SQL Server, Oracle, Hadoop HDFS, Hive

Tata Consultancy Services/Humana Inc

Hyderabad, India

Sr. ETL Informatica Developer Feb 2015 – Oct 2016

Project Description: Build Analytics data mart for running Health Care Predictive models

Worked with Informatica PowerCenter, Oracle 11g, Netezza, SQL Server, flat files, and other sources to integrate data, and worked across the data warehouse development life cycle.

Key player on the team, assigned to design extracts/mappings involving millions of rows and complex logic.

Technical consultant for the team, providing optimal solutions, design reviews, code reviews, and support for any Informatica-related issues.

Worked with different mappings, such as sourcing data from Oracle tables, SQL Server, and flat files, reading dimension and fact tables from two separate data marts, and loading Oracle target tables. Also used pre-load stored procedures.

Used Informatica features to implement Type I and Type II changes in slowly changing dimension tables, and developed complex mappings to facilitate daily, weekly, and monthly data loads.
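As an illustration only (table, column, and sequence names are hypothetical), a Type II change is usually applied by expiring the current dimension row and inserting a new version, while a Type I change simply overwrites the attribute in place:

    -- Type II: expire the current row for the member, then insert the new version
    UPDATE dim_member
       SET eff_end_dt = SYSDATE, current_flag = 'N'
     WHERE member_id = :member_id
       AND current_flag = 'Y';

    INSERT INTO dim_member (member_key, member_id, plan_code, eff_start_dt, eff_end_dt, current_flag)
    VALUES (dim_member_seq.NEXTVAL, :member_id, :new_plan_code, SYSDATE, DATE '9999-12-31', 'Y');

    -- Type I: overwrite a non-historized attribute in place
    UPDATE dim_member
       SET email_addr = :new_email
     WHERE member_id = :member_id;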

Depending on the transformation logic, used various transformations such as Sorter, Aggregator, Lookup, Expression, Filter, Update Strategy, Joiner, and Router in mappings.

Configured reusable Command tasks for sending triggers and invoking scripts, and reusable Email tasks for sending success mails and failure mails with session details and appended logs. Also used Event Wait, Event Raise, and Scheduler options in the workflows.

Designed and developed many Oracle stored procedures, triggers, views, indexes, and queries for enhancements and maintenance of various database modules

Improved performance of Stored Procedures in Oracle PL/SQL while converting from SQL Server

Created a mapplet for the error-handling process to support data audits

Developed code for the landing environment, then for staging, and finally developed incremental loads to populate the target tables for the atomic model.

Tuned various SQL queries, mappings, targets, and sources to improve performance.

Used session and workflow logs for debugging the session.

Used debugger wizard to remove bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.

Configured workflows to run concurrently and used various migration techniques to move code from the Development environment to the Test environment.

Written complex SQL queries for validating the data against different kinds of reports generated by Business Objects

Performed performance tuning on sources, targets, and mappings, along with SQL (optimization) tuning.

Provided 24/7 Active Data Warehouse Production Support as part of on-call rotation for both Incremental and Complete refresh.

Served as SME for the application, providing assistance to the support team during production issues

Led and guided both on-site and off-shore teams on various EDW initiatives, establishing design and coding standards and ETL best practices for normalized, de-normalized, and dimensional data structures, as well as best practices for design, development, deployment, change management, and support. Presented the ETL data integration efforts to external departments to gain their appreciation and support for the projects.

Worked with the QA team to resolve and fix the defects found

Tools/Languages: Informatica PowerCenter, Toad, PL/SQL, NZSQL, Sqoop, DIH, SQL Server

Tata Consultancy Services/Humana Inc

Hyderabad, India

ETL Developer Mar 2014 - Jan 2015

Project Description: Migrate data from various SQL Server databases and flat files to Oracle for Medicare Risk Adjustment analytics

Designed & developed ETL processes based on business rules, job control mechanism using Informatica Power Center 9X and Oracle stored procedures/scripts

Responsible for converting SQL server stored procedures and functions to Oracle PL/SQL

Re-engineered existing mappings to support new/changing business requirements.

Good experience with data migration from Oracle to SQL Server

Worked extensively on ref cursors, external tables, materialized views, and SQL*Loader
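For example, a hedged sketch of an external table defined over a flat file (the directory object, file, and columns are hypothetical), which can then be queried or merged like a regular table:

    CREATE TABLE ext_member_stage (
        member_id   NUMBER,
        member_name VARCHAR2(100),
        plan_code   VARCHAR2(10)
    )
    ORGANIZATION EXTERNAL (
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY etl_data_dir  -- assumes this DIRECTORY object already exists
        ACCESS PARAMETERS (
            RECORDS DELIMITED BY NEWLINE
            FIELDS TERMINATED BY ','
            MISSING FIELD VALUES ARE NULL
        )
        LOCATION ('member_stage.csv')
    )
    REJECT LIMIT UNLIMITED;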

Expertise in Dynamic SQL, Collections and Exception handling.

Experience in SQL performance tuning using Cost-Based Optimization (CBO).

Good knowledge of key Oracle performance related features such as Query Optimizer, Execution Plans and Indexes.

Experience with Performance Tuning for Oracle RDBMS using Explain Plan and HINTS.
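A typical pattern, with hypothetical table and column names: inspect the optimizer's plan for a slow query and, where its choice is poor, nudge it with a hint:

    -- Review the execution plan chosen by the cost-based optimizer
    EXPLAIN PLAN FOR
    SELECT m.member_id, c.claim_amt
      FROM members m
      JOIN claims  c ON c.member_id = m.member_id
     WHERE c.service_dt >= DATE '2014-01-01';

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- If a full scan of CLAIMS is cheaper for this date range, suggest it with hints
    SELECT /*+ FULL(c) PARALLEL(c, 4) */ m.member_id, c.claim_amt
      FROM members m
      JOIN claims  c ON c.member_id = m.member_id
     WHERE c.service_dt >= DATE '2014-01-01';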

Used Mapping, Sessions Variables/Parameters, and Parameter Files to support change data capture and automate workflow execution process to provide 24x7 available data processing.

Tuned SQL Statements, Mappings, Sources, Targets, Transformations, Sessions, Database, Network for the bottlenecks, used Informatica parallelism options to speed up data loading to target.

Extensively worked on Expressions, Source Qualifier, Union, Filter, Sequence Generator, sorter, Joiner, Update Strategy Transformations.

Developed SFTP/FTP re-usable processes to pull the files from External Systems.

Developed Informatica mappings to populate dimension and fact tables, providing data classifications for end developers.

Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions and workflows.

Created synonyms for copies of time dimensions, used the sequence generator transformation type to create sequences for generalized dimension keys, stored procedure transformation for encoding and decoding functions and Lookup transformation to identify slowly changing dimensions (SCD).

Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy for Designing and optimizing the Mapping.

Created various tasks like Pre/Post Session, Command, Timer and Event wait.

Tuned mapping performance by following Informatica best practices and applied several methods to reduce workflow run times.

Prepared SQL Queries to validate the data in both source and target databases.
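A minimal sketch of such validation queries, assuming hypothetical staging and fact tables: compare row counts, then look for keys present in the source but missing from the target:

    -- Row-count comparison between source staging and target fact
    SELECT (SELECT COUNT(*) FROM stg_claims)  AS src_cnt,
           (SELECT COUNT(*) FROM fact_claims) AS tgt_cnt
      FROM dual;

    -- Keys extracted from the source but not loaded into the target
    SELECT claim_id FROM stg_claims
    MINUS
    SELECT claim_id FROM fact_claims;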

Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.

Prepared the error handling document to maintain the error handling process.

Validated the Mappings, Sessions & Workflows, Generated & Loaded the Data into the target database.

Monitored batches and sessions for weekly and Monthly extracts from various data sources to the target database.

Tools/Languages: Informatica Power Center, Toad, PL/SQL, DIH, SQL Server, SQL Server Management Studio

Tata Consultancy Services/Humana Inc

Hyderabad, India

ETL Developer Oct 2012 - Feb 2014

Project Description: Migrate new-member predictive model data from Oracle 11g/SAS to Netezza for faster performance.

Implemented complex SQL logics for generating health parameters from clinical data

Involved in daily standup calls for updates and issues

Coordinated with the business to gather requirements

Converted SAS code to Netezza procedures.

Integrated all health parameters to create scoring tables using analytical and aggregate functions.

Loaded data from multiple sources such as Oracle, SQL Server, flat files, and SAS into the Netezza database

Converted PL/SQL procedures to Netezza NZPLSQL procedures to migrate data to Netezza
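A hedged sketch of the target form, with hypothetical procedure and table names: Netezza stored procedures are written in NZPLSQL, take positional arguments referenced via ALIAS FOR, and usually replace row-by-row PL/SQL cursor logic with set-based SQL:

    CREATE OR REPLACE PROCEDURE load_member_score(INTEGER)
    RETURNS INTEGER
    LANGUAGE NZPLSQL
    AS
    BEGIN_PROC
    DECLARE
        p_run_id ALIAS FOR $1;  -- positional argument, unlike named PL/SQL parameters
    BEGIN
        -- set-based insert in place of the original PL/SQL cursor loop
        INSERT INTO member_score (member_id, score, run_id)
        SELECT member_id, SUM(param_weight), p_run_id
          FROM member_health_params
         GROUP BY member_id;
        RETURN 0;
    END;
    END_PROC;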

Loaded flat files into Netezza using the NZLOAD utility

Generated scoring tables for predictive analysis of member data

Prepared Unit test cases and performed Unit testing for validating the data to meet expected results.

Re-engineered existing mappings to support new/changing business requirements.

Used Mapping, Sessions Variables/Parameters, and Parameter Files to support change data capture and automate workflow execution process to provide 24x7 available data processing.

Tuned SQL Statements, Mappings, Sources, Targets, Transformations, Sessions, Database, Network for the bottlenecks, used Informatica parallelism options to speed up data loading to target.

Developed Informatica mappings to populate dimension and fact tables, providing data classifications for end developers.

Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions and workflows.

Created synonyms for copies of time dimensions, used the sequence generator transformation type to create sequences for generalized dimension keys, stored procedure transformation for encoding and decoding functions and Lookup transformation to identify slowly changing dimensions (SCD).

Created various tasks like Pre/Post Session, Command, Timer and Event wait.

Prepared SQL Queries to validate the data in both source and target databases.

Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.

Prepared the error handling document to maintain the error handling process.

Validated the Mappings, Sessions & Workflows, Generated & Loaded the Data into the target database.

Monitored batches and sessions for weekly and Monthly extracts from various data sources to the target database.

Tools/Languages: Informatica PowerCenter, Toad, PL/SQL, NZSQL, Sqoop, DIH, SQL Server, SAS

Tata Consultancy Services

Hyderabad, India

ETL/BI Developer Trainee Jul 2012 - Oct 2012

Training mainly focused on Java, web basics, PL/SQL, Informatica, and Business Objects

Built a web project in the first phase with HTML, CSS, Java, and JavaScript

The second phase covered PL/SQL and the Informatica PowerCenter tool, and built reports using Business Objects

Created a data mart for a retail chain with branches in multiple locations using PL/SQL and Informatica PowerCenter

Performance and stock reports were generated from the data mart using Business Objects

Presented the project at the end of the training.

EDUCATION:

Bachelor of Technology in Computer Science and Engineering, SASTRA University, Thanjavur, India, 2008-2012


