Data Warehouse Production Support

Location:
Irving, TX
Posted:
February 07, 2024


Upendra Reddy Chinthala Cell: +1-945-***-**** Email: ad3ghc@r.postjobfree.com

Skills                    Years of Exp.   Last time used

Total IT Experience       12+             2023

ETL                       12+             2023

Informatica PowerCenter   12+             2023

Oracle, DB2               11+             2023

PL/SQL, Snowflake         5+              2023

Unix Shell Scripting      10              2023

Autosys, Control-M        6+              2023

Summary

Having 12+ years of experience in designing, developing, implementing, and providing production support and maintenance for Data Warehouse business applications.

Continually enhancing technical expertise and applying Informatica 9.1/9.6/10.2/10.5, Oracle 11g, PL/SQL, DB2, UNIX, Snowflake, data modelling, and transferable skill sets.

Experience with Snowflake virtual warehouses, databases, schemas and table structures.

Performed loads into a Snowflake instance using the Snowflake connector in IICS to support data analytics.

Strong knowledge of UNIX shell scripting, Autosys and support tools.

Good knowledge of ETL concepts such as lookup caches, mapping parameters and variables, Informatica functions, and user-defined functions.

Experience in extraction, transformation and loading of data from heterogeneous source systems such as flat files, mainframe (VSAM) files and Oracle.

Extensively worked on Informatica Designer, Workflow Manager and Monitor tools to load data.

Strong problem-solving and analytical skills; a highly motivated professional and team player with the ability to work with, communicate with, and mentor people effectively.

Good knowledge of dimensional data modelling.

Quick learner and adaptive to new and challenging technological environments.

Expertise in production support, handling production issues and Onsite-Offshore coordination.

Education & Certifications

Master of Computer Applications (MCA) from S.V University, Tirupati. 2009

Professional Experience

Client: HMS, USA June 2023 to Present

August 2018 to June 2023

Senior ETL Consultant

Project Name: HMS VERIFICATIONS

Environment: Languages: SHELL SCRIPT

Database: DB2, Oracle

Tools: INFORMATICA 10.5, IICS, Snowflake, IDMC

O/s: LINUX

HMS Verification validates policy information with the carrier through an automated process and authenticates the eligibility coverage of a policyholder and/or his or her dependents. The EDI Console application is responsible for validating policy information with the carrier automatically: based on the input record, it creates an EDI 270 file and sends it to the carrier; once the carrier returns a response in EDI 271 format, the application reads and processes the records.

Responsibilities:

Analyze business requirements assigned by business stakeholders through the ServiceNow tool and raise queries for any further clarification.

Prepare low-level and high-level specification documents, such as design/technical documentation and unit test case documents, and conduct walkthroughs of the documents with business users to obtain approval.

Analyzed the existing ETL data warehouse process and created design specifications based on the new target cloud data warehouse.

Redesigned the existing Informatica PowerCenter logic to IICS with target as Snowflake.

Provided break fixes and permanent resolutions for repetitive production failures.

Keep abreast of new jobs going to production by attending weekly ETL Implementation review meetings.

Involved in developing the ETL (Extract, Transform and Load) strategy to populate the Data Warehouse from various source systems (DB2) and mainframe feeds using Informatica PowerCenter.

Designed various mappings for extracting data from various sources involving flat files, Mainframes and relational tables.

Validating the Data before loading into the target.

Checked and tuned the performance of Informatica Mappings.

Responsible for investigating production issues and resolving them in a timely manner.

Involved in creating Sessions & Workflows and execution of Unit Test Conditions.

Perform unit testing & integration testing and provide the results to Quality Assurance for their review.

Involved in the creation of PL/SQL stored procedures and triggers.

Support data analysis and fixes in production environment whenever needed.

Perform code review of development tasks.

Client: SINGTEL, Singapore March 2016 to July 2018

Senior ETL Consultant

Project Name: BCC- Billing and Customer Care

Environment: Languages: SHELL SCRIPT

Database: ORACLE 11g

Tools: INFORMATICA 9.6

O/s: LINUX

SingTel is one of the biggest telecommunication service providers in Singapore. The Billing and Customer Care project is an end-to-end build of a new data warehouse for SingTel by Tech Mahindra, serving their analytics needs to understand the business and providing the feed for business reporting. The main objective of the project is to maintain sales and explore avenues for new business opportunities. The Data Warehouse team is responsible for building the enterprise data warehouse to maintain and track customers' orders and bill payments. The data is extracted, transformed with business logic applied, and loaded into the data warehouse using Informatica PowerCenter 9.6.1.

Responsibilities:

Interacted with the client regularly for a better understanding of the requirements, provided possible solutions, and developed code accordingly.

Created mappings using the transformations such as the Source qualifier, Aggregator, Expression, Router, Filter, Sequence Generator and Joiner.

Implemented SCD TYPE-2 logic while loading data into dimension tables.
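
The SCD Type-2 load mentioned above can be sketched as a shell wrapper that emits the expire-and-insert SQL for a dimension table. All table and column names (dim_customer, stg_customer, cust_id, and so on) are hypothetical placeholders, not the project's actual schema; the two-step update/insert shape is the standard Type-2 pattern, not necessarily the exact mapping logic used.

```shell
#!/bin/sh
# Minimal SCD Type-2 sketch. Table/column names are illustrative only.
# The function emits the SQL; in a real run it would be piped to sqlplus.
scd2_sql() {
cat <<'SQL'
-- Step 1: expire the current row when a tracked attribute has changed
UPDATE dim_customer d
   SET d.end_date = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_customer s
                WHERE s.cust_id = d.cust_id
                  AND (s.cust_name <> d.cust_name OR s.city <> d.city));

-- Step 2: insert a fresh current row for new and changed business keys
INSERT INTO dim_customer (cust_id, cust_name, city,
                          start_date, end_date, current_flag)
SELECT s.cust_id, s.cust_name, s.city,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                    WHERE d.cust_id = s.cust_id
                      AND d.current_flag = 'Y');
SQL
}

scd2_sql    # in a real run: scd2_sql | sqlplus -s "$DB_CREDENTIALS"
```

Expiring before inserting keeps exactly one current_flag = 'Y' row per business key, which is what makes point-in-time dimension lookups work.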

Involved in creating sessions and workflows using Workflow Manager.

Worked on creation of Unix shell scripts for control framework programs.
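
A typical control-framework check validates an incoming feed before the ETL picks it up. The sketch below assumes a pipe-delimited feed whose last line is a trailer of the form "TRL|<record_count>"; that layout is an assumption for illustration, not the project's actual control format.

```shell
#!/bin/sh
# Control-framework sketch: verify a feed exists, is non-empty, and that
# its data row count matches the trailer record. Trailer layout assumed.
validate_feed() {
    feed="$1"
    [ -s "$feed" ] || { echo "ERROR: $feed missing or empty"; return 1; }
    expected=$(tail -1 "$feed" | cut -d'|' -f2)     # count from trailer
    actual=$(( $(wc -l < "$feed") - 1 ))            # data rows, excl. trailer
    if [ "$actual" -ne "$expected" ]; then
        echo "ERROR: $feed has $actual rows, trailer says $expected"
        return 1
    fi
    echo "OK: $feed validated ($actual rows)"
}

# Example with a throwaway file:
tmp_file=$(mktemp)
printf 'row1|a\nrow2|b\nTRL|2\n' > "$tmp_file"
validate_feed "$tmp_file"          # prints an OK line with the row count
rm -f "$tmp_file"
```

Returning a non-zero status on mismatch lets the calling job fail fast, so the scheduler alerts before bad data reaches the warehouse.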

Supported the application, resolving issues within agreed timelines and providing solutions to the client for further enhancements.

Worked on creating PL/SQL stored procedures.

Worked on Informatica PowerCenter tools: Source Analyzer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Designer and Workflow Monitor.

Involved in Test case preparation and maintained the Quality Standards throughout the project.

Involved in production support for monitoring jobs and production issues.

Provide post release/production support.

Client: GE Corporate, US April 2014 to February 2016

ETL Developer

Project Name: IBS- Internal Billing System

Environment: Languages: SHELL SCRIPT

Database: ORACLE 11g, DB2

Tools: INFORMATICA 9.6, Mainframes

O/s: LINUX

The IBS process runs on a daily, weekly and monthly basis. In the case of IBS, CIA does not get a bookkeeping file directly from IBS. Instead, CIA extracts the invoices from the IBS tables on the Blue mainframe, processes them, and sends the bookkeeping file to the DXL server, from which it is distributed to various businesses as requested. CIA has three extract groups defined in IBS: ESHEL1, ESHELW and ESHEL5. Data belonging to ESHEL1 is extracted daily and stored on the server; at the end of the month, all the GDGs for the month are extracted and consolidated into a single file used for further processing. ESHELW, which has the weekly BUCs, is processed every week, and the bookkeeping file is created every week. ESHEL5 also has weekly BUCs and is processed the same way; the ESHEL5 process was created specifically for GE Money.

Responsibilities:

Interacted with the client regularly for a better understanding of the requirements, provided possible solutions, and developed code accordingly.

Assigned tasks to the team based on change requests received and enhanced the application for better performance.

Extracted source data from the mainframe server, transformed it to an intermediate file structure using Informatica, and converted it to CCL format.

Created mappings using the transformations such as the Source qualifier, Aggregator, Expression, Router, Filter, Sequence Generator and Joiner.

Checked and tuned the performance of Informatica Mappings.

Involved in creating sessions and workflows using Workflow Manager.

Used Autosys to schedule production runs and trigger UNIX shell scripts that validate source data and start Informatica workflows.
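
An Autosys-invoked wrapper of that kind might look like the sketch below. The service, domain and folder names are placeholders, and only pmcmd's standard startworkflow invocation form is assumed; a DRY_RUN switch is added here purely so the command can be inspected without a PowerCenter install.

```shell
#!/bin/sh
# Sketch of a scheduler-invoked wrapper that starts an Informatica
# workflow via pmcmd. INT_SVC, DOM_PROD and BCC_FOLDER are placeholders.
start_workflow() {
    wf="$1"
    set -- pmcmd startworkflow -sv INT_SVC -d DOM_PROD \
           -u "$INFA_USER" -p "$INFA_PASS" -f BCC_FOLDER -wait "$wf"
    if [ "${DRY_RUN:-0}" = "1" ]; then
        echo "$@"    # show the command that would run, without running it
    else
        "$@"         # real invocation; requires pmcmd on PATH
    fi
}

# Typical Autosys job command:
#   start_workflow wf_m_load_orders
```

The -wait flag makes pmcmd block until the workflow finishes, so the script's exit status reflects the workflow result and Autosys can mark the job FAILURE on a bad run.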

Supported the application, resolving issues within agreed timelines and providing solutions to the client for important enhancements.

Involved in Test case preparation and maintained the Quality Standards throughout the project.

Client: GE Corporate, US January 2013 – March 2014

ETL Developer

Project Name: Corporate Freight (C&F)

Environment: Languages: SHELL SCRIPT

Database: ORACLE 10g

Tools: INFORMATICA 9.1, Mainframes

O/s: LINUX

Corporate Freight provides GE businesses with the financial data necessary to book to their ledgers. Sometimes this booking is done by Corporate Freight on behalf of the business, depending on the business's preference, and each business has a set of instructions for how the process should be done. Files received from the vendors all use a standard JE file layout. Once the JE files (Entered, Paid) are received by Corporate Freight, validations are performed to ensure compliance of format, data and structure before processing them per the individual business rules. Corporate Freight then creates the bookkeeping file and sends it to CCL through the DXL server.

Responsibilities:

Requirement analysis and design: understood the client's requirements and acted as the sole functional resource.

Designed and developed PowerCenter mappings, sessions and workflows.

Automated job runs using UNIX shell scripting.
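
One common shape for such job automation is a small runner that timestamps each step and aborts on the first failure, so the scheduler can alert with the failing step's exit code. The step names below are illustrative only, not the project's actual job names.

```shell
#!/bin/sh
# Sequential job-runner sketch: log each step, stop on first failure.
run_step() {
    name="$1"; shift
    echo "$(date '+%Y-%m-%d %H:%M:%S') START $name"
    if "$@"; then
        echo "$(date '+%Y-%m-%d %H:%M:%S') OK    $name"
    else
        rc=$?
        echo "$(date '+%Y-%m-%d %H:%M:%S') FAIL  $name rc=$rc"
        exit "$rc"
    fi
}

# Typical use (step commands are hypothetical):
#   run_step "extract"   sh extract_src.sh
#   run_step "transform" sh run_workflow.sh wf_m_transform
#   run_step "load"      sh load_target.sh
```

Exiting with the failed command's own return code preserves the failure reason for the scheduler's alerting and for rerun decisions.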

Unit tested the DWH data in DEV, QA and Production, ensuring everything worked as expected.

Doing Performance Testing and enhancing the existing ETLs for better performance.

Documentation of the Design which included preparation of Mapping docs and Low Level Designs (LLD).

Client: GE Corporate, US November 2011 to December 2012

ETL Developer

Project Name: Travel & Living (T&L)

Environment: Languages: SHELL SCRIPT

Database: ORACLE 10g

Tools: INFORMATICA 9.1, Mainframes

O/s: LINUX

The Travel and Living process runs weekly as well as monthly. T&L sends two types of files: a common-format file and an accrual file, both of which are bookkeeping files. The common-format files are checked in every week; the accrual file is received on the last fiscal Friday. All the common-format files and the accrual file are then processed to create the monthly bookkeeping for T&L, after which the file is converted to the standard bookkeeping format using Informatica and sent to the respective businesses.

Responsibilities:

Understanding existing business model and customer requirements.

Involved in developing the ETL (Extract, Transform and Load) strategy to populate the Data Warehouse from various source systems (DB2) and mainframe feeds using Informatica PowerCenter.

Designed various mappings for extracting data from various sources involving flat files, Mainframes and relational tables.

Validating the Data before loading into the target.

Involved in creating Sessions & Workflows and execution of Unit Test Conditions.

Created Mapping, Transformations and Monitored Sessions using Informatica.

Scheduled sessions to update the target data using Informatica workflow manager.


