
Data Warehouse Business Intelligence

Location:
Scottsdale, AZ
Salary:
100000
Posted:
January 13, 2024


Resume:

Name : Karthik Kuragayala

Email : ad2qp8@r.postjobfree.com

Phone : +1-978-***-****

Objective:

To pursue a growth-oriented and challenging career in the IT industry, applying my strong analytical and communication skills to deliver outstanding and timely solutions.

Professional Summary:

7+ years of professional experience in Business Intelligence solutions, developing large-scale data warehouse and client/server applications using Informatica PowerCenter, Teradata, IBM DB2, and Oracle.

Worked with PowerCenter Designer tools such as Source Analyzer and Target Designer; imported data from various sources, transformed it, and loaded it into data warehouse targets using PowerCenter.

Involved in requirement analysis, high- and low-level design, project/release planning, software development and testing of ETL processes, onsite-offshore coordination, implementation, and warranty support.

Experience working in Agile and SDLC Methodologies.

Strong Design & Implementation experience in Informatica PowerCenter 10.

Good knowledge of Informatica transformations such as Aggregator, Expression, Filter, Joiner, Lookup, Router, Sequence Generator, and Sorter.

Expertise in working with Tables, Views, Joins and Indexes in IBM DB2, Teradata, Oracle Databases.

Scheduled and maintained daily, weekly, and monthly packages using the Control-M tool.

Experienced in manual testing, writing unit test cases, and generating reports.

Prepared documentation describing application development, logic, coding, testing, changes, and corrections.

Extensively used SQL to run queries against databases.

Good understanding and working knowledge of BI reporting tools – MicroStrategy and Tableau.

Worked extensively on implementing slowly changing dimensions (SCD) Types I and II in different mappings as per requirements.

Good exposure to development, testing, debugging, implementation, documentation, end user training and production support.

Good experience writing technical specification documents, ETL Design documents, Test plans and Deployment plans.

Aptitude for analyzing and identifying problems and coming up with out-of-the-box solutions.

Ability to achieve organizational integration, assimilate job requirements, employ new ideas, concepts, methods, and technologies.

More than a year of experience on the Teradata platform using Teradata SQL Assistant and load utilities such as Teradata Parallel Transporter (TPT), FastLoad, and FastExport.

Worked with various inbound sources (XML, flat files (.csv, .dat), Sybase, Oracle, DB2).

Generated various data extracts for outbound flat files (.csv, .dat).

Worked on dimensional modeling: star and snowflake schemas, data marts, OLAP, fact and dimension tables, and physical and logical data modeling.

Experience building mappings and task flows using AWS S3 connectors.
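The SCD Type II pattern mentioned above can be sketched in generic SQL. This is an illustrative sketch only: the table and column names (dim_customer, stg_customer, address) are hypothetical and not taken from any project listed here. In PowerCenter the same logic is typically built with a Lookup and an Update Strategy transformation.

```sql
-- Hypothetical SCD Type II maintenance for a customer dimension.
-- Step 1: expire the current row when a tracked attribute changes.
UPDATE dim_customer d
SET    eff_end_date = CURRENT_DATE,
       current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.address <> d.address);

-- Step 2: insert a new current row for changed or brand-new customers.
INSERT INTO dim_customer
       (customer_id, address, eff_start_date, eff_end_date, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
WHERE  d.customer_id IS NULL OR d.address <> s.address;
```

The expire-then-insert order matters here: step 2 relies on changed customers no longer having a current-flagged row after step 1.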

Achievements and Accomplishments

Awarded Annual Team Excellence (Best Performer) Award 2019 for HIX in Legato.

Received R&R Award for quality deliveries.

Professional Qualification:

Completed B.Tech (ECE) from Anurag College of Engineering.

Pursuing a Master of Computer Science at Rivier University, Nashua, NH.

Technical Skills:

ETL Tool : Informatica PowerCenter 10.4

Database and Tools : IBM DB2, Teradata, Oracle, SQL Server, Microsoft Azure

Scheduling Tool : Control-M, Autosys

Reporting Tool : Business Objects, MicroStrategy and Tableau

Operating System : UNIX, Windows 7/10

Utility Tools : ClearCase, Putty

DevOps Apps : Bitbucket, Jira

Test Methodology : Agile using Scrum

Professional Experience:

Client: Capital Group, Irvine, CA (Mar 2023 – Present)

Informatica ETL Developer

Title : Investment Data warehouse

Location : Irvine, CA

Platform : Oracle, Autosys, Microsoft Azure, SQL server, Informatica 10.4, UNIX, Airflow, AWS

Project Description

As part of this project, we process data for all funds coming from different upstream systems. IDW is the warehouse, or book of records, for all fund holdings; this data is used to generate the reports that business users and the sales team use to show fund performance over time.

Responsibilities:

Preparation of mapping documents, ETL specification documents and unit test case documents.

Handling the ad hoc requests for ETL changes, new ETL mapping development in Informatica PowerCenter.

Design, Develop and Implementation of Informatica Mappings and Workflows and changing existing Informatica mapping as per client’s requirements.

Creation of mappings using different transformations such as Source Qualifier, Expression, Aggregator, Router, Sorter, Lookup, and Update Strategy.

Hands-on experience in tuning mappings and identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions.

Design and Develop Informatica mappings, workflows to collect data from multiple data sources based on mapping specifications and load into Teradata database.

Create Tables, Views, stored procedures, other database objects on Teradata, write SQL scripts.

Prepared ETL Strategy document and Unit test plan.

Analyzed and tuned long-running SQL using Teradata query performance tuning techniques such as collecting statistics, identifying data skew, and remediating spool space issues.

Scheduled and Monitored jobs using Autosys scheduling tool.

Migrated all Informatica, UNIX, and DB objects from lower environments to higher environments.

Prepared various data flow diagrams and high-level design documents in Microsoft Word.

Raised/closed defects, executed test cases, and created status decks in JIRA.

Involved in giving internal and external reviews to the client for developed projects.

Loaded data from various data sources into Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, and FastLoad.

Created mappings to extract data from different source files, transform the data using Aggregator, Expression, Filter, Joiner, Lookup, Normalizer, Router, Update Strategy Transformations and loading into data warehouse tables.

Worked with various input sources (XML, flat files (.csv, .dat), Sybase, Oracle, DB2).

Generated various data extracts for outbound flat files (.csv, .dat).

Performed data cleansing and performance tuning using Informatica PowerCenter.

Experience building mappings and task flows using AWS S3 connectors.

Implemented Slowly Changing dimension type2 methodology for accessing the full history of accounts and transaction information.

Developed various Mapplets that were included into the mappings.

Unit Test Case Preparation, unit testing, deployment group creation and ETL code migration.

Impact analysis and data sampling.

Used Autosys Job Scheduler to schedule jobs.
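The Teradata tuning steps listed above (collecting statistics, checking for data skew) can be illustrated with generic Teradata SQL. The database and object names here (idw.fund_holdings, fund_id, as_of_date) are hypothetical, chosen only for illustration.

```sql
-- Refresh optimizer statistics on commonly joined/filtered columns.
COLLECT STATISTICS ON idw.fund_holdings COLUMN (fund_id);
COLLECT STATISTICS ON idw.fund_holdings COLUMN (as_of_date);

-- Check data skew: count rows per AMP. A large gap between the
-- busiest AMP and the average suggests a poorly distributed
-- primary index, which also drives spool space problems.
SELECT HASHAMP(HASHBUCKET(HASHROW(fund_id))) AS amp_no,
       COUNT(*)                              AS row_count
FROM   idw.fund_holdings
GROUP  BY 1
ORDER  BY 2 DESC;
```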

Freddie Mac, VA (April 2022 – Dec 2022)

ETL Developer

Title : BI-Operations

Platform : IBM DB2, Oracle, Informatica 10.4, UNIX, Control-M.

Responsibilities:

Created mappings to extract data from different source files, transform the data using Aggregator, Expression, Filter, Joiner, Lookup, Normalizer, Router, Update Strategy Transformations and loading into data warehouse tables.

Performed data cleansing and performance tuning using Informatica PowerCenter.

Implemented Slowly Changing dimension type2 methodology for accessing the full history of accounts and transaction information.

Developed various Mapplets that were included into the mappings.

Unit Test Case Preparation, unit testing, deployment group creation and ETL code migration.

Impact analysis and data sampling.

Used Control-M Job Scheduler to schedule jobs.

Design and Develop Informatica mappings, workflows to collect data from multiple data sources based on mapping specifications and load into Teradata database.

Create Tables, Views, stored procedures, other database objects on Teradata, write SQL scripts.

Prepared ETL Strategy document and Unit test plan.

Analyzed and tuned long-running SQL using Teradata query performance tuning techniques such as collecting statistics, identifying data skew, and remediating spool space issues.

Ivy Comptech, India (June 2021 – August 2021)

ETL Developer

Title : BI-Team

Client : Entain

Platform : IBM DB2, Teradata, Informatica 10.4, UNIX.

Project Description

As part of the project, we help Entain reach millions of users through a platform that supports end-to-end online gaming, transaction processing, real-time CRM, and data intelligence services. We offer fully customized solutions that enable Entain to connect with millions of users and, through big sporting events such as the English Premier League, the football World Cup, Formula One, and the NFL, create great digital experiences.

Responsibilities:

Developed mappings using the required transformations in Informatica according to technical specifications.

Resolved the existing bugs and developed the mappings, sessions, and workflows within SLA.

Unit testing and maintaining of Informatica components.

Legato Health Technologies, India (July 2019 – June 2021)

Informatica ETL Developer

Title : Health Insurance Exchange

Client : Anthem

Platform : IBM DB2, Teradata, Informatica 10.2, UNIX, Control-M.

Project Description

As part of the project, we process eligible/non-eligible members and submit them to CMS in accordance with the ACA. Also handled enhancements and maintenance related to the comprehensive health care reform law enacted in March 2010.

Responsibilities:

Preparation of mapping documents, ETL specification documents and unit test case documents.

Handling the ad hoc requests for ETL changes, new ETL mapping development in Informatica PowerCenter.

Design, Develop and Implementation of Informatica Mappings and Workflows and changing existing Informatica mapping as per client’s requirements.

Creation of Mappings using Different Transformations like Source Qualifier, Expression, Aggregator, Router, Sorter, Lookup and Update Strategy Transformation.

Hands-on experience in tuning mappings and identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions.

Scheduled and monitored jobs using the Control-M scheduling tool.

Migrating all Informatica, UNIX, DB objects from lower Environment to higher environments.

Prepared various data flow diagrams and high-level design documents in Microsoft Word.

Raised/closed defects, executed test cases, and created status decks in JIRA.

Involved in giving internal and external reviews to the client for developed projects.

Loaded data from various data sources into Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, and FastLoad.

Innova Solutions, India (Aug 2015 – April 2019)

Informatica ETL Developer

Title : AMS

Client : Hyundai

Platform : Oracle, Informatica 9.6, UNIX.

Project Description:

AMS is a production support project in which we supported 18+ applications to enhance the business.

Responsibilities:

Developed mappings using the required transformations in Informatica according to technical specifications.

Experience developing SCD Type I and Type II.

Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklets Designer.

Extracted/loaded data from/into diverse source/target systems like Oracle, DB2, Mainframe and Flat Files.

Resolved raised tickets within SLA.

Extensive implementation of Incremental/Delta loads with the help of various concepts like mapping variable, mapping parameter and parameter table concept.

Developed mappings to load data into landing layer, staging layer and ODS layer with extensive usage of SCD Type I and SCD Type II development concept.

Unit testing and maintaining of Informatica mappings.
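The incremental/delta-load pattern with a parameter table described above can be sketched in SQL as follows. The names (etl_param, src_orders, m_load_orders) are hypothetical; in PowerCenter the filter would normally live in the Source Qualifier, driven by a mapping parameter or variable.

```sql
-- A parameter table holds the high-water mark of the last successful run.
-- The source qualifier filter in the mapping would resemble:
SELECT *
FROM   src_orders o
WHERE  o.last_update_ts >
       (SELECT last_run_ts
        FROM   etl_param
        WHERE  mapping_name = 'm_load_orders');

-- After a successful session, advance the high-water mark so the
-- next run picks up only rows changed since this point.
UPDATE etl_param
SET    last_run_ts = CURRENT_TIMESTAMP
WHERE  mapping_name = 'm_load_orders';
```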

Title : KMA CDM

Client : Hyundai

Platform : Oracle, Informatica 9.6, UNIX.

Project Description:

KMA has many business scenarios in which providing a complete view of a customer's journey is valuable. Consumer Affairs agents need more context about the customer's experience with Kia when handling customers, and knowing a customer's loyalty (owns multiple vehicles, goes to a Kia dealer for service) provides enhanced understanding. The CDM project focuses on improving data quality, enriching with third-party data, and matching and linking customer data to create Golden Records and 360 views of the customer that organize all transactions. This in turn allows call center agents, and other areas such as marketing, to have a complete customer profile.

Responsibilities:

Created an Operational Data Store (ODS) to integrate customer/VIN/dealer data from multiple sources into a single database.

Developed the framework for creation of customer golden record from various sources after applying necessary Data quality rules.

Implemented the framework to provide VIN 360 and Customer 360 views, which include all the transactions that a given customer/VIN has encountered. Call center agents gain visibility into a complete customer profile and touch points, providing insight into the current customer profile and enabling agents to deliver better customer service.

Involved in migration of code and code testing.

Optimizing the mapping to load the data in slowly changing dimensions.


