Venkata Babu
************@*****.***
CAREER OBJECTIVE:
Energetic and knowledgeable Computer Science graduate seeking a challenging Software Development position where I can apply my technical and interpersonal skills, grow professionally, and contribute strongly to organizational goals.
EXPERIENCE SUMMARY:
DW/ETL consultant with 2+ years of IT experience who has delivered multiple assignments in ETL and Data Warehousing using tools such as Ab Initio, UNIX, and Oracle.
Good experience in requirements gathering, source system analysis, and ETL development.
Has worked extensively on Ab Initio (Co>Op 3.0.4, GDE 3.1.3/1.15).
Has worked with different source/target systems and written complex transformations using PDL and the metaprogramming techniques available in higher versions of Ab Initio.
Has developed Ab Initio graphs using components such as Join, Lookup, Reformat, Filter by Expression, the various Sorts, Rollup, Dedup Sorted, Normalize, Denormalize, and the Partition components.
Has good exposure to performance tuning and troubleshooting methods in Ab Initio, and created reusable Ab Initio code for handling CDC and validation of source files.
Good knowledge of the data, pipeline, and component parallelism techniques available in Ab Initio.
Experience writing wrapper scripts in UNIX; also created reusable scripts to execute Ab Initio code.
Extensively used EME air utilities and is familiar with environment settings.
Hands-on experience with Ab Initio BRE.
Proficient in Data Warehousing concepts such as data marts and EDW, and worked on creating an EDW.
Proficient in Oracle SQL development (Toad, SQL Navigator), UNIX, and Korn shell scripting.
Tested UNIX shell scripts written for ETL processes to schedule workflows on Autosys.
Familiar with creating secondary indexes and join indexes in Teradata.
Strong hands-on experience using Teradata utilities (FastExport, MultiLoad, FastLoad, TPump, BTEQ, and Queryman).
Proficient in Teradata TD12.0/TD13.10/14 database design (conceptual and physical), query optimization, and performance tuning.
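The reusable wrapper-script pattern mentioned above can be sketched in plain shell. The graph name, log location, and `run_graph` body are all illustrative: a real wrapper would invoke the deployed Ab Initio `.ksh` script, which is stood in for here by a placeholder so the sketch runs anywhere.

```shell
#!/bin/sh
# Minimal sketch of a reusable ETL wrapper script (illustrative names).
GRAPH_NAME="${1:-sample_graph}"        # hypothetical graph name
LOG_DIR="${LOG_DIR:-/tmp}"             # illustrative log location
LOG_FILE="$LOG_DIR/${GRAPH_NAME}_$(date +%Y%m%d).log"

run_graph() {
    # A real wrapper would call the deployed script, e.g. "$AI_RUN/$GRAPH_NAME.ksh".
    echo "running $GRAPH_NAME"          # placeholder for the graph run
}

if run_graph >> "$LOG_FILE" 2>&1; then
    echo "$GRAPH_NAME succeeded"
else
    rc=$?
    echo "$GRAPH_NAME failed with rc=$rc" >&2
    exit "$rc"
fi
```

The same wrapper can be reused for any graph by passing a different name as the first argument, which is what makes the pattern reusable across jobs.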
RELEVANT EXPERIENCE:
Comcast Center, Philadelphia, PA. Jan 2017 – Present
Ab Initio Developer/ETL Developer
Platform Environment: Ab Initio (GDE 3.0.6, Co>Operating System 3.1.5), Teradata 14/13.10, Oracle 10g/9i, UNIX, Windows XP/2000, Control-M.
Responsibilities:
Involved in regular interaction with the client team to understand client requirements and prepare design documents.
Converted user-defined functions and complex business logic of an existing application process into Ab Initio graphs, using components such as Reformat, Join, Transform, Sort, and Partition to facilitate the subsequent loading process.
Extensively used the partition components Broadcast, Partition by Key, Partition by Range, and Partition by Round-robin, and the de-partition components Concatenate, Gather, and Merge.
Developed ETL code based on business requirements using various Ab Initio components.
Created reusable graphs and scripts that can be used across projects.
Created a design document template to present to the client.
Developed various graphs for different source and target transformations using Ab Initio.
Took part in peer reviews of the code.
Handled code transition to the client team.
Fixed defects found during assembly test.
Handled critical scenarios and minimized risk by suggesting alternative approaches to the client for achieving the target.
Involved in gathering business requirements, logical modeling, physical database design, data sourcing, data transformation, data loading, SQL, and performance tuning. Involved in all phases of the SDLC for the different ETL projects.
Defining the schema, staging tables, and landing zone tables, configuring base objects, foreign-key relationships, complex joins, and building efficient views.
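The partition/de-partition flow described in the bullets above can be illustrated as a plain-shell analogy (not Ab Initio itself): records are dealt round-robin across N partition files, then a Gather-style step recombines them without preserving order. All file names are hypothetical.

```shell
#!/bin/sh
# Plain-shell analogy of Partition by Round-robin followed by Gather.
N=3
printf 'a\nb\nc\nd\ne\n' > records.txt

# Partition by round robin: record i goes to partition (i mod N).
awk -v n="$N" '{ print > ("part." NR % n) }' records.txt

# Gather: recombine the partitions; like the Gather component,
# record order is not preserved, only the full set of records.
cat part.0 part.1 part.2 > gathered.txt
wc -l < gathered.txt
```

The point of the analogy is that partitioning trades record order for parallel throughput, which is why a Gather (order-agnostic) is cheaper than a Merge (order-preserving) on the way back.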
Johnson & Johnson, Raritan, NJ May 2014 – July 2015
Ab Initio Developer/ETL Developer
Platform Environment: Ab Initio (GDE 1.15, Co>Operating System 2.15), Teradata 12, UNIX, Control-M, UNIX shell scripts.
Responsibilities:
Met with business groups to understand the business process and gather requirements.
Extracted and analyzed the sample data from operational systems (OLTP system) to validate the user requirements.
Participated in data model (Logical/Physical) discussions with Data Modelers and created both logical and physical data models.
Extensively used Ab Initio components such as Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sorted, Rollup, Scan, FTP, Lookup, Normalize, and Denormalize.
Responsible for performance tuning of Ab Initio graphs; wrote UNIX shell scripts for batch scheduling.
Used Ab Initio features such as MFS (8-way partitioning), checkpoints, and phases.
Extensively used Teradata utilities such as BTEQ, FastLoad, MultiLoad, and TPump, along with DDL and DML (SQL) commands.
Wrote complex SQL queries based on the given requirements, created a series of Teradata macros for various applications in Teradata SQL Assistant, and tuned Teradata SQL statements using the EXPLAIN command.
Created several SQL queries and reports against the above data mart for UAT and user reporting, using SQL features such as GROUP BY, ROLLUP, CASE, UNION, subqueries, EXISTS, COALESCE, and NULL handling.
Involved in post-implementation support, user training, and data model walkthroughs with business/user groups.
Coded and tested Ab Initio graphs to extract the data from Oracle tables and MVS files.
Collected and analyzed the user requirements of the existing application and designed logical and physical data models.
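The BTEQ-driven summary loads described above typically follow a generate-then-run pattern in shell. This sketch only generates the BTEQ script; the table, columns, host, and user are hypothetical, and the actual `bteq` invocation is left commented out because it needs a live Teradata system.

```shell
#!/bin/sh
# Sketch of generating a BTEQ script for a monthly summary query
# (GROUP BY ROLLUP with COALESCE for the grand-total row).
cat > monthly_summary.bteq <<'EOF'
.LOGON tdprod/etl_user;
SELECT COALESCE(region, 'ALL') AS region,
       SUM(sales_amt)          AS total_sales
FROM   sales_dm.monthly_sales
GROUP  BY ROLLUP (region);
.LOGOFF;
.QUIT;
EOF
# bteq < monthly_summary.bteq > monthly_summary.log 2>&1
echo "generated monthly_summary.bteq"
```

COALESCE turns the NULL grouping key produced by ROLLUP's total row into a readable 'ALL' label, which is the usual trick for report-friendly rollup output.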
WellPoint Health Networks, Indianapolis, IN Aug 2013 – Dec 2013
Jr. Ab Initio Developer Intern
Environment: Ab Initio (GDE 1.13, Co>Op 2.13), DB2, UNIX, SQL, UNIX shell programming
Responsibilities:
Developed Ab Initio graphs for daily and monthly cycles to load, partition, cleanse, and populate data based on legal and business requirements.
Worked with partition components such as Partition by Range, Partition by Round-robin, and Partition by Expression, making efficient use of the multifile system, which underpins data parallelism.
Performed transformations of source data with transform components such as Replicate, Denormalize, Redefine Format, Reformat, Filter by Expression, and Rollup.
Used lookups with the Reformat component to fetch matching records for the downstream process.
Used the Sort component to sort data and Dedup Sorted to remove duplicate records.
Used the Rollup component to populate monthly, quarterly, and annual summary tables.
Used data, pipeline, and component parallelism in graphs, where huge data files are partitioned into multifiles and each partition is processed simultaneously.
Worked in a sandbox environment while interacting extensively with the EME to maintain version control on objects, using sandbox features such as check-in and check-out.
Worked with de-partition components such as Gather and Merge, which recombine partitioned files after parallel processing.
Used Ab Initio components such as Sort, Partition, Rollup, Reformat, and Merge to build complex graphs.
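The Sort → Dedup Sorted → Rollup chain used for the summary tables above can be illustrated with a plain-shell analogy (not Ab Initio itself): sort the records, drop exact duplicates, then aggregate per key. The file and key names are hypothetical.

```shell
#!/bin/sh
# Plain-shell analogy of Sort -> Dedup Sorted -> Rollup on key,amount records.
printf 'east,10\nwest,5\neast,10\neast,7\n' > txns.csv

sort txns.csv | uniq |                       # Sort + Dedup Sorted
awk -F, '{ sum[$1] += $2 }
         END { for (k in sum) print k "," sum[k] }' |
sort > rollup.csv                            # Rollup: one summary row per key

cat rollup.csv
```

As with the real components, dedup only works after the sort because duplicates must be adjacent, and the rollup collapses each key group to a single summary record.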
HIGHER EDUCATION:
Master’s in Computer Science
Rivier University, New Hampshire, USA
Bachelor of Science & Technology in Computer Science
Koneru Lakshmaiah University, Vijayawada, India
TECHNICAL ACUMEN:
Primary Tools:
Ab Initio (Co>Op 3.0.3.9/2.15/2.14/2.13/2.10, GDE 3.0.4/1.15/1.14/1.13/1.10), Application Configuration Environment (ACE 0.12.3), BRE, Teradata SQL, Teradata Tools and Utilities, Oracle 10g/9i, MS SQL Server 6.5/7.0/2000, DB2
Languages:
Teradata SQL
Teradata Utilities:
BTEQ, FastLoad, MultiLoad, TPump, SQL Assistant
Databases:
Teradata 13/12/V2R6, Oracle 10g/9i
Operating Systems:
Windows 95/98/NT/2000/XP, UNIX, Linux
Scheduling tools:
Control M, Autosys
References: Provided upon request.