
Ponnaganti Gopi

Email: adj4t4@r.postjobfree.com

Mobile: +1-469-***-****

Location: Irving, TX

ETL Developer

Experience Summary:

Over 8 years of experience in Information Technology and data warehousing, mainly emphasizing Extraction, Transformation and Loading (ETL) using Ab Initio in maintenance and development projects.

Analyze functional and business requirements and prepare high-level and low-level design documents.

Integrate multiple systems by building ETL procedures using Ab Initio.

Define and design data transformations using various Ab Initio components such as Input and Output (Table/File), Filter by Expression, Rollup, Scan, Normalize, Sort within Groups, Dedup Sorted, Reformat, Split, Read JSON and Join.

Good knowledge of building generic graphs and parameter sets.

Good knowledge of working with Conduct>It plans.

Sound knowledge of reading and parsing JSON data using Ab Initio components.

Good knowledge of implementing both batch and continuous flows with web services.

Knowledge of Big Data and cloud technologies.

Implemented parallel applications by replicating components and processing modules across a number of partitions.

Strong working experience with Scrum/Agile and Waterfall project methodologies.

Good knowledge of implementing service graphs using continuous flows to respond to requests from client applications.

Good experience working with various heterogeneous source systems (Oracle, Teradata and DB2) and with scheduling tools such as TWS Master and Control Center.

Good knowledge of the Technical Repository Management Console.

Tuned graphs by removing unnecessary sorts and using lookup files to enhance performance.

Good knowledge of ETL and Data warehousing concepts.

Ability to handle process-related tasks such as code migration, tag creation and access requests.

Good knowledge of the Ab Initio Metadata Hub (MDH) domain.

Excellent knowledge and experience in SQL and UNIX shell scripting.

Good knowledge of writing complex Teradata recursive SQL queries.
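For illustration, a minimal sketch of the kind of recursive Teradata query described, walking a manager hierarchy (all table and column names here are hypothetical):

    WITH RECURSIVE emp_chain (emp_id, mgr_id, lvl) AS (
      -- anchor rows: employees at the top of the hierarchy
      SELECT emp_id, mgr_id, 1
      FROM hr.employee
      WHERE mgr_id IS NULL
      UNION ALL
      -- recursive step: attach each employee to their manager's chain
      SELECT e.emp_id, e.mgr_id, c.lvl + 1
      FROM hr.employee e
      JOIN emp_chain c ON e.mgr_id = c.emp_id
      WHERE c.lvl < 20  -- depth guard against cycles and runaway recursion
    )
    SELECT emp_id, mgr_id, lvl
    FROM emp_chain;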

Have a good understanding of Teradata utilities (FastLoad, FastExport, TPT Load and MultiLoad).

Hands-on experience with NoSQL databases such as MongoDB and AWS DynamoDB.

Handled all aspects of the software development lifecycle: development, production support and QA.

Expertise in the bug tracking process; familiar with bug reporting and tracking using tools such as Mantis.

Education:

Bachelor of Technology from JNT University, Kakinada, INDIA

Technical Skills:

ETL Tools : Ab Initio GDE 3.x/3.1.5/3.2.5 with Co>Operating System 3.x/3.1.5/3.2.5, Pentaho PDI, Apache NiFi, Alteryx

Database : Oracle 10g/11g, Teradata, Redshift, MongoDB

Operating System : UNIX, Windows XP Professional.

Scheduler : IBM TWS Master, Control Center

Language : UNIX shell scripting and Python

Achievements:

Certified professional in the data analytics tool Alteryx.

Project summary:

Project #5 May 2020 to Present

Project : Next Gen: HEDIS Gaps In Care

Client : BCBSA, USA

Description:

The term “Gaps in Care” refers to any discrepancy between recommended best practices and the care actually provided to an individual. An example of a gap in care is an individual missing an age-based screening such as a mammogram or colonoscopy. Identifying gaps in care as early as possible and taking proactive measures to close them is critical to improving the overall health of members and reducing future medical costs for BCBSA.

Responsibilities:

Developed a batch graph to create an Indexed Compressed Flat File (ICFF) by applying the transformation rules.

Built continuous-flow web services that return gaps-in-care information for a given member within 300 milliseconds.

Good knowledge of developing online RESTful web services using components such as RPC Subscribe and RPC Publish.

Good knowledge of micrograph development.

Wrote shell scripts to automate project setup, including creation of the data directories.

Attended scrum ceremonies such as stand-ups and planning sessions.

Involved in the high-level design discussions of the project.

Good knowledge of using air commands for version checks, tag creation and identifying object differences.

Worked on MDH for reference data storage and lineage tracking.

Migrated objects to higher environments using the CI/CD process.

Used Control Center as the scheduler for the batch jobs.

Extensively worked on Conduct>It plans to sequence the steps.

Project #4 August 2018 to April 2020

Project : Crew Pay Analytics

Client : Southwest Airlines, USA.

Description:

The Crew Pay Analytics project aims to provide accurate, stable and consistent pay data to Crew Analytics and other end users for analytical and reporting purposes.

To overcome the issues in the existing EDW IF/FO People Productivity application, the SWA Crew Analytics business team initiated the Crew Pay Analytics project, whose improved visibility into current costs allows for:

Better contractual negotiations

Higher confidence planning future growth strategy

Analyzing data to make informed decisions regarding Crew.

Responsibilities:

Created the mapping document as per the functional requirements, coordinating with the team.

Attended business meetings to discuss complex problems and gather requirements.

Attended scrum ceremonies such as stand-ups and planning sessions.

Extracted, transformed and loaded data from source to staging and staging to target according to the business requirements.

Involved in writing Shell Scripts to transfer files from FTP servers to ETL servers with some validations.

Extensive knowledge of parsing JSON messages and converting data into JSON messages using Ab Initio.

Extensively used Multifile System across the project.

Implemented ICFFs for history data storage.

Developed a generic procedure to compare database DDLs in Ab Initio.

Analyzed code issues and provided break fixes on short timelines.

Good knowledge of creating tags for migrations to higher environments.

Good knowledge of the Technical Repository Management Console.

Involved in code migrations to higher environments.

Wrote the technical design document and ETL design document as per the business requirements.

Good at understanding and implementing recursive loop logic in Teradata.

Implemented CDC (change data capture) logic in Teradata.
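As a hedged sketch of such CDC logic (not the project's actual code; table, column and hash names are hypothetical, and the pattern assumes a per-row hash column for change detection):

    -- Step 1: update target rows whose attributes changed in staging
    -- (Teradata joined-UPDATE syntax)
    UPDATE tgt
    FROM dw.customer_dim tgt, stg.customer_stage src
    SET name = src.name,
        city = src.city,
        row_hash = src.row_hash,
        updt_ts = CURRENT_TIMESTAMP
    WHERE tgt.customer_id = src.customer_id
      AND tgt.row_hash <> src.row_hash;

    -- Step 2: insert keys present in staging but not yet in the target
    INSERT INTO dw.customer_dim (customer_id, name, city, row_hash, updt_ts)
    SELECT src.customer_id, src.name, src.city, src.row_hash, CURRENT_TIMESTAMP
    FROM stg.customer_stage src
    WHERE NOT EXISTS (
      SELECT 1 FROM dw.customer_dim t WHERE t.customer_id = src.customer_id
    );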

Wrote Tivoli jobs/job streams to automate the graph runs.

Wrote Teradata SQL queries and shell scripts as per functional requirements.

Built complex SQL queries joining multiple input Teradata tables; verified that appropriate indexes existed and, where missing, coordinated with the business architects/modelers to have them added.
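For illustration only, a sketch of such a multi-table join, with statistics collected on the join columns so the Teradata optimizer can plan it efficiently (all object names are hypothetical):

    COLLECT STATISTICS COLUMN (crew_id) ON dw.pay_event;
    COLLECT STATISTICS COLUMN (crew_id) ON dw.crew_member;

    -- aggregate pay by crew member over a reporting window
    SELECT c.crew_id,
           c.base_code,
           SUM(p.pay_amt) AS total_pay
    FROM dw.crew_member c
    JOIN dw.pay_event p ON p.crew_id = c.crew_id
    JOIN dw.pay_period d ON d.period_id = p.period_id
    WHERE d.period_end >= DATE '2019-01-01'
    GROUP BY c.crew_id, c.base_code;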

Project #3 Feb 2017 to July 2018

Project : Reactor CX

Client : 7-Eleven, USA.

Description:

Reactor CX is a real time, cloud-based, multi-channel customer engagement and loyalty platform. It is built on a scalable big data technology stack and leverages an adaptive integration layer to enable quick and easy multi-channel solutions that integrate well within client infrastructure.

Responsibilities:

Designed and implemented many features of the data migration from 711 to RCX.

Coordinated and executed the loads to move data from 711 to RCX.

Identified and addressed gaps between the client system and RCX, and provided solutions for those gaps.

Extensively worked with SQLite3 to convert CSV files into JSON objects during the migration.
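A minimal sketch of that kind of conversion in the sqlite3 shell (the dot-commands are sqlite3 CLI directives, json_object comes from SQLite's built-in JSON1 functions, and all file, table and column names are hypothetical):

    .mode csv
    .import members.csv members

    -- emit one JSON object per imported CSV row
    SELECT json_object('member_id', member_id,
                       'email', email,
                       'points', points)
    FROM members;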

Conducted user testing through interactive sessions for the UAT and CRP phases.

Performed requirements analysis and prepared the design document.

Performed delta and full refresh loads.

Extensively used a Git repository for the code migrations.

Wrote UNIX shell scripts to enable checkpoint features in the Pentaho ETL tool.

Wrote SQL queries to load data from stage to target tables in the Redshift DB.
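As a hedged sketch of a typical stage-to-target load in Redshift, using the common delete-then-insert upsert pattern in a single transaction (table and column names are hypothetical):

    BEGIN;

    -- remove target rows that this batch replaces
    DELETE FROM rpt.loyalty_txn
    USING stg.loyalty_txn s
    WHERE rpt.loyalty_txn.txn_id = s.txn_id;

    -- append the full staged batch
    INSERT INTO rpt.loyalty_txn (txn_id, member_id, amount, txn_dt)
    SELECT txn_id, member_id, amount, txn_dt
    FROM stg.loyalty_txn;

    END;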

Wrote complex SQL statements to solve business problems.

Migrated DB objects and ETL code from Development through PROD.

Prepared design documents and Visio diagrams that describe inputs, outputs and logical operations, and converted them into deliverable code.

Project #2 Mar 2015 to Jan 2017

Project : CDW Phase2 2016

Client : Southwest Airlines, USA.

Description:

CDW 2016 includes multiple business requirement documents.

BNPS SURVEY: The BNPS survey will recruit Customers from the CDW to participate in the survey. Customers will need to be flagged in the CDW so that they do not receive other research surveys. Every month, when Customers respond, a list of respondents and the accompanying data will be loaded into the CDW.

This project will import Brand NPS (BNPS) score data from Hall & Partners (H&P) into the CDW, enabling additional insights from SWA Customers’ NPS scores.

NUBILL: The purpose of this document is to detail the epics and acceptance criteria that will enable bringing in WiFi purchase data from the vendor, Nubill. Transaction-level data from Nubill will be stored in the CDW. In the CDW, transaction-level data will be matched up to Customer-level data wherever matches can be made.


Responsibilities:

Created the mapping document as per the functional requirements, coordinating with the onsite team.

Created generic graphs and parameter sets.

Extracted, transformed and loaded data from source to staging and staging to target according to the business requirements using Ab Initio.

Good knowledge of using transform components such as Sort, Dedup Sorted, Filter by Expression, Rollup, Scan, Normalize, etc.

Extensively used partition and departition components.

Actively participated in migration tasks from DEV to PROD.

Involved in writing Shell Scripts as per the business requirement.

Involved in creating Technical design document and ETL design document.

Involved in creating DOU document for EPS team.

Involved in performance tuning for the Graphs and SQL queries.

Created tags and migrated code from one environment to another.

Created and scheduled jobs using TWS Master.

Built complex SQL queries joining multiple input Teradata tables; verified that appropriate indexes existed and, where missing, coordinated with the business architects/modelers to have them added.

Wrote complex SQL statements to solve business problems.

Project #1 May 2012 to Feb 2015

Project : Customer Data Warehouse (CDW)

Client : Southwest Airlines, USA.

Description:

The CDW project loads Siebel Loyalty transaction, member, partner and SR data into Enterprise Data Warehouse 2.0 (commonly known as IDW).

Responsibilities:

Created the mapping document as per the functional requirements, coordinating with the onsite team.

Extracted, transformed and loaded data from source to staging and staging to target according to the business requirements.

Developed a number of Ab Initio graphs based on the mapping and business requirements using various Ab Initio components.

Created a customized component that is used in all the graphs of this project.

Prepared technical design document as well as test case document.

Involved in writing Teradata SQL queries as per functional requirements.

Involved in performance tuning for the Graphs and SQL queries.

Created and scheduled jobs and job streams from the Tivoli UI and UNIX.

Involved in resolving the issues found during Unit Testing and User Acceptance Testing.


