Resume

Data Project

Location:
McLean, Virginia, United States
Posted:
February 13, 2018

Ravi Kanneboina ac4gp4@r.postjobfree.com

952-***-****

Professional Summary

**+ years of experience in IT as an application developer, spanning the full software development life cycle: requirements gathering, design, development, and testing.

5+ years working as a Solution Architect with depth in data integration and data architecture for Enterprise Data Warehouse implementations.

11 years of in-depth experience in Extraction, Transformation and Loading (ETL) processes using IBM InfoSphere Information Server DataStage 11.3, 8.5, and 8.0.x (Parallel Extender), Oracle, Teradata, AutoSys, TWS (Tivoli), and Control-M.

5 years of experience with data warehousing tools such as Informatica 6.1/9.6.1 and Oracle.

Good experience in Data Warehouse Development, Enhancement, Migration, Maintenance and Production support projects.

13 years of experience in the retail and health care domains.

Coordinated with offshore teams.

Upgraded DataStage from 8.5/8.7 to 11.3.1.2.

Worked on upgrade projects involving migration from DataStage 8.0.1 to DataStage 11.3 and from Oracle 10g to Teradata.

4 Years of experience in Agile Methodology.

Involved in full Software Development Life Cycle (SDLC) of application software on multiple platforms including system study, analysis, design, development, unit, integration, acceptance testing and implementation.

Extensively involved in the development of Datastage ETL process for extracting data from different data sources, data transformation and loading the data into data warehouse for data mart operations.

Extensively worked with various transformation stages (Join, Lookup, Sort, Filter, Funnel, Merge, Aggregator, Remove Duplicates, ODBC) and file stages such as Sequential File and Data Set for storing intermediate data between transformations.

Designed ETL processes using Informatica to load data from source systems into the target Oracle database.

Developed mappings and sessions using Informatica PowerCenter for data loading.

Developed mappings to load data into slowly changing dimensions.

Created various transformations such as Joiner, Aggregator, Expression, Filter, Update Strategy, and Lookup.

Involved in Unit Testing.

Experience in Designing, Testing, Scheduling and Running DataStage jobs.

Designed Mapping documents, ETL architecture documents and specifications, analyzed the Source Data and designed the source system documentation.

Setting up development, QA & Production environments. Migrated jobs from development to QA to Production environments.

Good experience with version control systems such as CVS.

Proven track record in troubleshooting of DataStage jobs and addressing production issues like performance tuning and enhancement.

Developed Datastage jobs in Server and Mainframe environment.

Worked and extracted data from various database sources like Oracle 10/9i/8i/8.x/7.x, DB2, Sequential files and Flat files.

Strong working knowledge with Oracle database Design and Teradata.

Experienced in UNIX shell scripting (Korn, Bash, and C shell) for file manipulation, scheduling, and text processing.
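As a hedged illustration of the kind of file-validation shell work described above (the file names and control-file layout are hypothetical, not taken from any specific project):

```shell
#!/bin/sh
# Minimal sketch of a pre-load file validation step. The data file and
# control file names below are assumptions for illustration only; the
# control file is assumed to carry the expected record count on line 1.

DATA_FILE="members_20180213.dat"
CTL_FILE="members_20180213.ctl"

# Create sample input so the sketch is self-contained.
printf 'rec1\nrec2\nrec3\n' > "$DATA_FILE"
printf '3\n' > "$CTL_FILE"

# 1. The data file must exist and be non-empty.
[ -s "$DATA_FILE" ] || { echo "ERROR: $DATA_FILE missing or empty"; exit 1; }

# 2. The record count must match the control file.
actual=$(wc -l < "$DATA_FILE")
expected=$(head -1 "$CTL_FILE")
if [ "$actual" -eq "$expected" ]; then
    echo "VALIDATION OK: $actual records"
else
    echo "VALIDATION FAILED: expected $expected, got $actual"
    exit 1
fi
```

In a scheduled batch, a wrapper like this would typically run before the DataStage job is triggered, so a short or missing feed fails fast rather than loading partial data.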

Excellent analytical, communication and interpersonal skills. Versatile team player with proven problem solving skills and the ability to interact with individuals at all levels.

Monitoring daily ETL jobs and taking actions based on failures.

Supported production and non-production jobs based on tickets and priority.

Knowledge of QualityStage.

Education:

MCA (Master of Computer Applications) from Bangalore University in 2002.

Bachelor of Science (Mathematics), Kakatiya University, Warangal.

EMPLOYMENT HISTORY:

Freddie Mac, October 2016 to date, through CompuGain LLC.

UnitedHealth Group, July 2006 to October 2016, through Cognizant Technology Solutions.

Ivector Technologies, December 2002 to June 2006.

Technical Skills:

Tools

DataStage EE 7.5.2, IBM Information Server 8.0.x and 11.3, Informatica 6.1/9.6.1

Databases

Oracle 10g/11g, DB2, Teradata V13.0/14.0, SQL Server, MS Access

Languages

UNIX Shell Scripting, SQL, PL/SQL

Tools/ Utilities

Erwin, SQL Developer, TOAD, Mercury Quality Center, Rapid SQL

Job Schedulers

Autosys 4.5, Cron, IBM Tivoli

Version control

WinCVS, TortoiseSVN

Operating Systems

Linux, Windows XP/2000/NT/98, AIX, Unix (Sun Solaris, HP/UX)

PROFESSIONAL EXPERIENCE:

Freddie Mac October 2016 – Present

McLean, VA

Senior ETL/Data Warehouse Developer

Project: Corporate Data Warehouse

Involves application development, prototyping, modeling, and high-level technical consulting. Performs "what-if" analysis to simulate network stress tests and survivability, predicts the effects of configuration changes, and identifies application bottlenecks and opportunities to optimize performance. Provides resolutions to an extensive range of complicated problems; solutions are innovative, thorough, and practicable. Works under limited direction and independently determines and develops the approach to solutions; work is evaluated upon completion for adequacy in satisfying objectives. Represents the organization as the principal customer contact on contracts and often performs a project leadership role, interacting with senior customer personnel on significant technical matters that frequently require coordination across organizational lines.

Roles and Responsibilities:

Analyze, design, develop, and test ETL applications using DataStage, DB2, mainframe, UNIX, and Oracle.

Involved in Software Development Life Cycle (SDLC) and end to end testing of the code.

Analyze and enhance existing applications, problem solving and root cause analysis.

Translate business requirements into integration specifications.

Software development, code walkthrough, test, deployment and supporting the jobs.

Test and debug the application to ensure compliance with project specification.

Working with ETL developers to oversee the implementation of ETL designs.

Working on processing the load cycles to have the test data ready for SIT and UAT.

Working with the technical and data architect group to ensure jobs are developed per the agreed standards.

Performed standard and complex data conversions to flat files using DataStage.

DataStage upgrade from 8.5 to 11.3.

UNIX scripting for batch processes, file validations, and monitoring scripts.

Followed DataStage job standards of avoiding persistent datasets in the process, using temporary tables for intermediate processing instead.

Developed DataStage jobs to load data into slowly changing dimensions.

Environment: IBM WebSphere DataStage 8.x/9.1/11.3, Informatica 9.6.1, DB2, Oracle 10g, Teradata V14.0, PL/SQL, SQL*Plus, UNIX Shell Scripts, TOAD, mainframe, Rapid SQL, Cygwin, MS Access.

Optum Technology Solutions Jan 2013 – October 2016

Minneapolis, MN

Solution Architect / DataStage Lead / Senior ETL/Data Warehouse Developer

Project: BRONX

The Bronx Regional Health Information Organization (Bronx RHIO) is a health information exchange (HIE) that brings together health care stakeholders within a defined geographic area and governs health information exchange among them to improve health and care in that community. The Bronx RHIO, in partnership with its member organizations, received an award to create the Bronx Regional Informatics Center (BRIC).

This initiative will develop data registries and predictive systems that will proactively encourage early care interventions and enable providers to better manage care for high-risk, high-cost patients. The project will improve patient outcomes, improve overall health for Bronx residents and reduce the cost of care for Medicare and Medicaid.

Roles and Responsibilities:

Experience in requirements engineering, solution architecture, design, development, and deployment roadmaps.

Experience in designing and implementing data architecture (conceptual, logical, physical, and dimensional models).

Experience in hardware, software, systems, and solutions development across more than one technical domain.

Experience in relational and dimensional data modeling and database design.

Experience in ERwin tool.

Analyze, design, develop, and test ETL applications using DataStage and Oracle.

Analyze and enhance existing applications, problem solving and root cause analysis.

Translate business requirements into integration specifications.

Software development, code walkthrough, test, deployment and supporting the jobs.

Test and debug the application to ensure compliance with project specification.

Working with ETL developers to oversee the implementation of ETL designs.

Working on processing the load cycles to have the test data ready for SIT and UAT.

Working with the technical and data architect group to ensure jobs are developed per the agreed standards.

Performed standard and complex data conversions to flat files using DataStage.

DataStage upgrade from 8.5 to 11.3.

UNIX scripting for batch processes, file validations, and monitoring scripts.

Followed DataStage job standards of avoiding persistent datasets in the process, using temporary tables for intermediate processing instead.

Coordinated with the offshore team.

Designed ETL processes using Informatica to load data from source systems into the target Oracle database.

Developed mappings and sessions using Informatica PowerCenter for data loading.

Developed mappings to load data into slowly changing dimensions.

Created various transformations such as Joiner, Aggregator, Expression, Filter, Update Strategy, and Lookup.

Involved in Unit Testing.

He is involved in day-to-day production activities: loading data into two environments (Oracle and SQL Server), logging defects in HPSM, fixing production job failures within the service-level agreement, analyzing and fixing production data issues, prioritizing end-user requests, and assigning tasks to offshore team members and delivering them on schedule. He is well versed in the onsite-offshore model.

His assignments include working with the business team to gather requirements, analyzing upstream sources for changes and impact, providing technical solutions, and designing, developing, testing, and implementing changes, along with post-load validation support and participation in internal audits. As the most experienced member on this project, he serves as its technical lead: he analyzes the source systems and business requirements, prepares functional requirements, and produces high-level and low-level technical designs. He is also involved in performance improvements to reduce the load cycle.

He has reduced the total number of defects from 50 to fewer than 12 tickets per month.

He manages an offshore team engaged in building the application based on user requirements, tracks change requests raised by the business and end users, and incorporates them into the existing phases of the project. He manages and tracks deliverables per PMO guidelines, providing monthly governance reports and weekly PMO status reports. He identifies and manages data quality issues and oversees the configuration, administration, and monitoring of data warehouse processes, scheduling, ETL data loading, and server resources.

He is responsible for designing technical solutions for an ETL DataStage environment; improving performance using parallel technology and multi-instance modes; maintaining and reporting QA activities in HPSM; creating and executing processes via ITG; creating and maintaining HPSM tickets for enhancements, break-fix tasks, and job failures; deploying changes to TWS; integrating new changes into the DeCIDE tool; and validating and profiling source system data using TOAD and SQL*Plus. He creates UNIX scripts for data cleansing, passing job-level parameters, and sending automated emails; executes scripts to monitor server bottlenecks and reports performance issues to management; and develops UNIX productivity tools for user-data and project-data archive/restore automation per legal and audit needs.
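A minimal sketch of the sort of server-bottleneck monitoring script described above; the 90% threshold, output file name, and alerting path are assumptions for illustration, not details from the project:

```shell
#!/bin/sh
# Hypothetical sketch: flag filesystems whose usage exceeds a threshold.
# The threshold and file names are illustrative assumptions.

THRESHOLD=90    # alert when a filesystem exceeds 90% used

# df -P gives POSIX one-line-per-filesystem output; column 5 is use%.
df -P | awk -v limit="$THRESHOLD" 'NR > 1 {
    use = $5; sub(/%/, "", use)
    if (use + 0 > limit)
        printf "WARNING: %s at %s%% used (mount %s)\n", $1, use, $6
}' > fs_alerts.txt

if [ -s fs_alerts.txt ]; then
    # A real job would mail this report; here it is just printed.
    cat fs_alerts.txt
else
    echo "All filesystems below ${THRESHOLD}% used"
fi
```

Run from cron or a TWS job stream, a check like this turns silent disk-full failures into an early warning before ETL loads start failing.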

He reviews production tickets each month and creates preventive measures to avoid recurring tickets. He also oversees all processes and procedures to conform to quality audit requirements.

Environment: IBM WebSphere DataStage 11.3, Informatica 9.6.1, DB2, Oracle 10g, PL/SQL, SQL*Plus, Teradata V13.0, UNIX Shell Scripts, TOAD, EDI formats (HL7), SQL Server, TWS, Pentaho 5.0, Agile methodology.

UnitedHealth Group June 2011 – Dec 2012

Minneapolis, MN

Solution Architect / Senior ETL/Data Warehouse Developer

Project: NHI (Release-5)

The National Health Informatics database (NHI) houses claims, membership, provider, and ancillary data for over 25 million current members and includes records going back to 1994. NHI includes data from both UnitedHealth Group as well as non-UnitedHealth Group sources such as Medical Mutual of Ohio (MMOH). This data can be used for a multitude of purposes including benchmarking, extraction of data for products, and research. It is subject to HIPAA privacy protections and any requests for PHI or identifiable data must be approved. The data mart is an Ingenix asset and any access or sale of data to a UnitedHealth Group company must be pre-approved.

This project is for the integration of multiple data source code data into NHI repository. This implementation will consist of the extraction of the requested data from the sources identified below (Galaxy), and the loading of the data into the appropriate NHI tables.

Roles and Responsibilities:

Experience in requirements engineering, solution architecture, design, development, and deployment roadmaps.

Experience in designing and implementing data architecture (conceptual, logical, physical, and dimensional models).

Experience in hardware, software, systems, and solutions development across more than one technical domain.

Experience in relational and dimensional data modeling and database design.

Experience in ERwin tool.

Strong working experience in data analysis, SQL design and development, and the implementation and testing of data warehousing using extraction, transformation and loading (ETL) tools and Teradata.

Expertise in Teradata table data manipulation using SQL queries and UNIX commands/UNIX scripts.

Expertise in Teradata Tools & Utilities: Teradata Parallel Transporter (TPT) using the Load, Update, and Stream operators, FastLoad, MultiLoad, FastExport, and BTEQ.
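BTEQ batch steps like those named above are commonly driven from a shell wrapper. Below is a hedged sketch that only generates the BTEQ input file; the logon file path, database, and table names are hypothetical, and a real job would follow this with `bteq < load_step.bteq > load_step.log`:

```shell
#!/bin/sh
# Hypothetical sketch: generate a BTEQ input script for a batch load step.
# The logon file, database, and table names are illustrative assumptions;
# bteq itself is not invoked here.

LOAD_DATE=${1:-2018-02-13}    # business date, defaulted for the sketch

cat > load_step.bteq <<EOF
.RUN FILE = /home/etl/.tdlogon;

-- Stage-to-target insert for the daily load
INSERT INTO edw.claim_fact
SELECT * FROM stg.claim_stage
WHERE  load_dt = DATE '${LOAD_DATE}';

.IF ERRORCODE <> 0 THEN .QUIT 8;
.QUIT 0;
EOF

echo "Generated BTEQ script for load date ${LOAD_DATE}"
```

The `.IF ERRORCODE ... .QUIT 8` pattern lets the scheduler (AutoSys, TWS) distinguish a failed SQL step from a clean exit by return code.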

Expertise in data mart analysis and optimization techniques in the Teradata data mart.

Responsible for analyzing the existing application process, preparing the DataStage design document, and developing DataStage jobs based on the functional specification document.

Responsible for constructing the ETL development using DataStage Enterprise Edition.

Involved in analysis, design, and development of end-to-end data warehousing solutions.

Involved in integration testing and the preparation of test cases and test data.

Experienced in developing low-level design and technical specification documents.

Involved in the staging and loading process.

Coordination with offshore team.

Environment: IBM WebSphere DataStage 8.5, Oracle 9i to Teradata V13.0 migration, PL/SQL, SQL*Plus, UNIX Shell Scripts, TOAD, AutoSys.

UnitedHealth Group May 2008 – May 2011

Chennai, Tamil Nadu

Senior ETL/Data Warehouse Developer

Project: NHI (Releases 1, 2, 3 and 4)

The National Health Informatics database (NHI) houses claims, membership, provider, and ancillary data for over 25 million current members and includes records going back to 1994. NHI includes data from both UnitedHealth Group as well as non-UnitedHealth Group sources such as Medical Mutual of Ohio (MMOH). This data can be used for a multitude of purposes including benchmarking, extraction of data for products, and research. It is subject to HIPAA privacy protections and any requests for PHI or identifiable data must be approved. The data mart is an Ingenix asset and any access or sale of data to a UnitedHealth Group company must be pre-approved.

This project is for the integration of multiple data source code data into NHI repository. This implementation will consist of the extraction of the requested data from the sources identified below (Galaxy), and the loading of the data into the appropriate NHI tables.

Roles and Responsibilities:

Responsible for analyzing the existing application process, preparing the DataStage design document, and developing DataStage jobs based on the functional specification document.

Responsible for constructing the ETL development using DataStage Enterprise Edition.

Involved in analysis, design, and development of end-to-end data warehousing solutions.

Involved in integration testing and the preparation of test cases and test data.

Experienced in developing low-level design and technical specification documents.

Involved in the staging and loading process.

Environment: IBM WebSphere DataStage 7.5, Oracle 9i, PL/SQL, SQL*Plus, UNIX Shell Scripts, TOAD, AutoSys.

UnitedHealth Group April 2007 – April 2008

Chennai, Tamil Nadu

ETL Developer

Project: Research data mart

The Research Data Mart is a longitudinal data mart containing 10 years of historical data, consisting of provider, member, and medical claims for specific plans. The application is critical to the business and to external users (I3 personnel and others).

This project is for the integration of multiple data source code data into Research Data mart repository. This implementation will consist of the extraction of the requested data from the sources identified below (Galaxy), and the loading of the data into the appropriate Research Data mart tables.

Roles and Responsibilities:

Responsible for analyzing existing DataStage Server, SyncSort, and CoSort code, extracting the logic, preparing the DataStage design document, and developing the DataStage jobs.

Responsible for constructing the ETL development using DataStage Enterprise Edition.

Involved in analysis, design, and development of end-to-end data warehousing solutions.

Involved in integration testing and the preparation of test cases and test data.

Experienced in developing low-level design and technical specification documents.

Environment: DataStage 7.5, Oracle 9i, PL/SQL, SQL*Plus, UNIX Shell Scripts, TOAD, AutoSys.

UnitedHealth Group July 2006 – March 2007

Chennai, Tamil Nadu

ETL Developer

Project: Next Gen 2.2.5

NextGen 2.2.5 is an internal Ingenix system that loads data into the main data mart, Galaxy, from several of its subsystems. This data has to be validated, cleansed, and conformed according to various rules before being loaded into the final data mart. Since Galaxy is a data mart with 300 terabytes of data, one has to be careful even when querying that system.

The information we gather from a database like Galaxy is only as good as the quality of the data it contains. Ingenix's Shared Data Warehouse team takes a proactive, comprehensive approach to assessing, researching, and monitoring the data continuously flowing into Galaxy from over 350 source input files from UnitedHealth Group and external vendor systems.

Roles and Responsibilities:

Responsible for the entire ETL development using DataStage version 7.5.1A.

Designed technical specifications and low-level design documents.

Performed job classification and assigned tasks accordingly.

Prepared test cases and test data for testing data quality checks and DataStage code preprocessing.

Environment: DataStage 7.5, Oracle 9i, PL/SQL, SQL*Plus, UNIX Shell Scripts, TOAD, AutoSys.

Ivector Technologies Jan 2005 – June 2006

Bangalore, Karnataka

ETL Developer

Project: RMS

Gap Inc. is one of the leading retailers in the USA, with 3,500 stores worldwide. The RMS system involves a data acquisition process that connects to different legacy sources and ODBC systems across instances, transforms the data into a homogeneous format, and loads it into the target system, making it available for business users to analyze the business and make decisions.

Roles and Responsibilities:

Responsible for the entire ETL development using DataStage version 7.5.1A.

Designed technical flow diagrams and technical design documents.

Developed jobs according to the FD requirements and constructed shared containers.

Used job control and the Job Sequencer to run jobs with parameters.

Performed unit testing using the Gap standard templates.

Environment: DataStage 7.5, Oracle 9i, PL/SQL, SQL*Plus, UNIX Shell Scripts, TOAD, AutoSys.

Ivector Technologies Dec 2002 – Dec 2004

Bangalore, Karnataka

ETL Developer

Project: PROVIDENT BANK BUSINESS PERFORMANCE

Provident Bank is a $15 billion financial services company that provides full-service retail and commercial banking operations regionally and nationally. This project developed a customer data warehouse to understand customer response to marketing programs and identify critical customers, in support of improved decision-making across the organization. Involved in extracting, transforming, and loading data into the customer data warehouse from flat files and relational sources.

Roles and Responsibilities:

Designed ETL processes using Informatica to load data from source systems into the target Oracle database.

Developed mappings and sessions using Informatica PowerCenter for data loading.

Developed mappings to load data into slowly changing dimensions.

Created various transformations such as Joiner, Aggregator, Expression, Filter, Update Strategy, and Lookup.

Involved in Unit Testing.

Environment: Informatica 6.1, Oracle 8i, Windows 2000.


