KIRAN | 408-***-**** | abinvh@r.postjobfree.com
Sr. ETL Consultant
Professional Summary
. 7 years of IT experience in Requirements Analysis, Data Analysis,
Application Design, Application Development, Implementation and Testing
of Client-Server, Internet and Data Warehousing business systems.
. Solid experience in developing Ab Initio applications for Extraction,
Transformation, Cleansing and Loading into Data Warehouses/Data marts.
Used Ab Initio with Very Large Database Systems (VLDB) on Massively
Parallel Processing (MPP) platforms, utilizing the concepts of multifiles
and partitioning and departitioning components, and fine-tuned the
applications for optimal performance.
. Sound Knowledge on Dimensional Modeling concepts and design of Data
warehouse with the elements of STAR and SNOWFLAKE schemas.
. Involved in designing the Data model.
. Fair understanding of business flow in Data warehouse projects.
. Worked on large Enterprise Data Warehouse databases for clients such as
FedEx and Walgreens.
. Developed graphs to fetch data from sources: DB2, Oracle, Teradata, excel
flat files, XML, MQ, and SAS.
. Technology forte is Ab Initio, Teradata V2R5/V2R6/13.0 and Oracle
8i/9i/10g.
. Data warehousing domain exposure in Pharmaceutical, Manufacturing,
Retail, Supply chain and Telecommunication industries.
. Sound knowledge of Teradata Architecture; hands-on in writing Teradata
SQL, BTEQ scripts and Macros, tuning Teradata DB queries and making use
of Teradata utilities like FastLoad, FastExport, MultiLoad and TPump.
. Extensive experience in writing quality SQL, Oracle PL/SQL procedures
and functions, and SQL*Loader scripts.
. Proficient in performance tuning of ETL process.
. Worked with Conduct>It for the planning framework, scheduling and custom
scripts.
. Thoroughly versed in UNIX Shell Scripting to support ETL process
automation and data warehousing tasks.
. Hands-on experience with scheduling tools like Control-M, Autosys and
$Universe.
. Involved in Warehouse Production Support, addressing business- and
time-critical issues.
. Experienced in the offshore/onsite model. Received good customer
feedback for timely delivery and for fulfilling expectations.
. Demonstrated ability in understanding and grasping new concepts in the
Data Warehousing space (both technical and business) and utilizing them
as needed.
. Highly motivated team player with strong communication, organizational
and analytical skills, a passion for working in challenging environments
and adaptability to varying circumstances.
. Strong technical writing and user documentation skills.
Technical Knowledge
ETL Ab Initio GDE 1.15, Co>Operating System 2.15
Databases Oracle 10G/9i/8i, TeradataV2R6, DB2, MS-Access
OLAP Cognos, Microstrategy, SAS
Languages/Scripting C, C++, Java, SQL, PL/SQL, HTML, JavaScript, UNIX
Shell Script, XML
Scheduling Control-M, Autosys, $Universe
GUI Visual Basic 6
Operating Systems Windows NT/98/2000, UNIX, LINUX, Sun Solaris 9/8/7
Version Control Ab Initio EME, VSS
Tools BTEQ, TOAD, Telnet.
Data Modeling Tools Erwin.
Data Mining Tools Intelligent Miner.
Education & Achievements
. Bachelor of Engineering in Computer Science.
. Master of Science in Computer Science.
Professional Experience
FEDEX, Memphis, TN Dec '09
- Present
Sr. Ab Initio Developer
Client: FedEx provides customers and businesses worldwide with a broad
portfolio of transportation, e-commerce and business services. We offer
integrated business applications through operating companies competing
collectively and managed collaboratively, under the respected FedEx brand.
Consistently ranked among the world's most admired and trusted employers,
FedEx inspires its more than 275,000 employees and contractors to remain
absolutely, positively focused on safety, the highest ethical and
professional standards and the needs of their customers and communities.
Reporting Data Warehouse (RDW): Duty tax messages are published to EDW by
DLRS over the Java Message Service (JMS) backbone through the PubData
interface, using the ADTtoEDW message schema. EDW subscribes to the
messages on the JMS queue and loads them into Teradata tables. GMD
functionality is added to the JMS subscriber to acknowledge each
transaction back to the publishing server via a Java method.
Responsibilities as Sr. Developer
. Analyzed Business and Accounting requirements from the Accounting and
Business Detail level Process design.
. Involved in understanding the Requirements of the end Users/Business
Analysts and Developed Strategies for ETL processes.
. Responsible for the detailed design and documentation. Provided
technical solutions for the Process requests raised by Data team to fix
the issues in the existing system.
. Designed, developed and Unit tested Ab Initio graphs using GDE for
Extraction, Transformation and Loading of data from source to target.
. Extracted data from Oracle legacy Data source tables and created various
loan lookups, commitment type lookups, and security type lookups.
. Created and modified various Loan, Property and Asset graphs based on
the business rule enhancements.
. Involved in writing test cases to validate the code changes.
. Involved in designing the Data model.
. Developed graphs to fetch data from sources: DB2, Oracle, Teradata,
excel flat files, XML, MQ, and SAS.
. Worked with Conduct>It for the planning framework, scheduling and custom
scripts.
. Extensively used Database and Dataset components like Input File, Input
Table and Output Table, Transform components like Join, Rollup, Scan,
Filter by Expression and Reformat, and other components like Merge,
Lookup and Sort.
. Extensively used continuous components like JMS Subscribe, Multipublish
and Batch Subscribe.
. Implemented component level, pipeline and data parallelism using Ab
Initio for ETL process. Extensively involved in performing EME
dependency analysis.
. Used Partition components like partition by expression, partition by
key, etc., to run the middle layer processing parallel.
. Extensively used various inbuilt transform functions like
string_substring, string_lpad, string_index, lookup functions, date
functions, error functions.
. Extensively used Ab Initio Co>Operating System commands like m_ls, m_wc,
m_dump, m_copy, m_mkfs, etc. Utilized the multifile system (MFS) to
execute graphs in parallel.
. Worked on improving performance of Ab Initio graphs by using various Ab
Initio performance techniques like using lookups, in memory joins and
rollups to speed up various Ab Initio graphs. Designed and developed
parameterized generic graphs.
. Closely monitored the Autosys batch jobs in ETL batch run during System,
Integration and Acceptance test runs.
. Developed, Tested and Validated Teradata-BTEQ scripts and Teradata
Macros.
. Created Staging and Target Teradata/Oracle objects.
. Extensively worked with the FastExport and FastLoad utilities to unload
and load data.
. Worked closely with CM team to migrate the ETL code changes from
development environment to System, Integration and Acceptance
environments.
. Used Rational ClearCase to migrate non-Ab Initio code by creating a
baseline and checking in the scripts.
. Extensively worked in the UNIX environment using shell scripts. Created
test cases and performed unit testing for the Ab Initio graphs.
Documented unit testing. Logged and resolved defects in the rollout
phase. Responsible for supporting the CM team and troubleshooting
production issues.
. Created a Production support document and documented the Test case work
book, High level Design and Detail Design documents.
Environment: Ab Initio GDE 1.15.2, Co>Operating System 2.15.8, Oracle 10g,
Teradata 13.0, $Universe, TOAD, UNIX Shell Scripting, Sun OS 5.8.
eBay, San Jose, CA
Oct '2008 - Nov '2009
Ab Initio Developer
Corporate Data warehousing
Bringing together the different source systems in the staging environment,
then processing and loading the data into the Data Warehouse environment.
Responsibilities:
. Performed analysis and design, and prepared the functional and technical
design documents and code specifications.
. Developed and supported the extraction, transformation and load (ETL)
process for a Data Warehouse from OLTP systems using Ab Initio, and
provided technical support and hands-on mentoring in its use.
. Responsible for all pre-ETL tasks on which the Data Warehouse depends,
including managing and collecting the various existing data sources.
. Involved in developing UNIX Korn Shell wrappers to run various Ab Initio
Scripts.
. Developed Ab Initio XFRs to derive new fields and meet various business
requirements.
. Developed a number of Ab Initio graphs based on business requirements,
using components such as Partition by Key, Partition by Round Robin,
Reformat, Rollup, Join, Scan, Normalize, Gather, Broadcast and Merge.
. Worked on improving the performance of Ab Initio graphs using various
performance techniques, such as using lookups instead of Joins.
. Implemented Lookups, lookup local, in-memory Joins and Rollups to speed
up various Ab Initio graphs.
. Prepared design documentation for the developed graphs.
. Created UNIX shell scripts to automate and schedule the jobs.
. Created the migration scripts, test scripts for testing the applications,
creating and supporting the Business Objects reports.
. Provided 24x7 production support, including monitoring batch jobs and
investigating and resolving problems.
. Performed physical data modeling and regular refreshes of the
development and test database environments using the Export/Import
utility.
Environment: Ab Initio GDE 1.14, Co>Operating System 2.14, Oracle 10g,
Teradata V2R6, Control-M, TOAD, UNIX Shell Scripting, Cognos, Sun
Solaris 9.
Client : AT&T, NY
Nov '2007 - Sep '2008
Position : Ab Initio Developer
AT&T is the largest communications company in the United States and the
world: an industry leader in wireless service, high-speed internet
access, local and long distance voice, and directory publishing and
advertising services across the US, and is developing next-generation
television services with its new AT&T U-verse(SM) TV.
Worked on AT&T Telecom Business Solution (TBS) to keep repository of all
customer data, including rate plan, contracts, features, account balance,
payments, adjustments, invoices etc.
Responsibilities:
> Gathered Business requirements and Mapping documents.
> Built multiple graphs to unload all the data needed from different
source databases by configuring the .dbc file in the Input Table
component.
> Created the Ab Initio multifile system using the m_mkfs command, which
allows multiple partitions of data to be processed at the same time via
the Ab Initio m_type and mp_type shell commands.
> Developed graphs in the GDE using partition by round robin, partition
by key, rollup, sort, scan, dedup sorted, reformat, join, merge, gather
and concatenate components.
> Used the Run Program and Run SQL components to run UNIX and SQL
commands from Ab Initio, as well as the filter by expression, partition
by expression, replicate, partition by key and sort components.
> Worked with Departition components like Gather and Interleave to
departition and repartition data from multifiles as needed.
> Performed transformations of source data with Transform components
like Join, Match Sorted, Dedup Sorted, Denormalize, Reformat and Filter
by Expression.
> Created summary tables using Rollup, Scan and Aggregate.
> Deployed and test-ran graphs as executable Korn shell scripts on the
application system.
> Modified Ab Initio component parameters to utilize data parallelism,
improving overall performance and fine-tuning execution times.
> Wrote wrapper scripts to run the Ab Initio scripts.
> Used Data Parallelism, Pipeline Parallelism and Component Parallelism
in Graphs, where huge data files are partitioned into multi-files and
each segment is processed simultaneously.
> Used built-in Ab Initio functions to build custom components that help
implement complex business logic.
> Wrote complex queries involving large volumes of data, some of which
run in parallel against the same table to improve performance.
> Developed UNIX Korn shell wrapper scripts to accept parameters and
scheduled the processes using Control-M.
> Designed and wrote Teradata macros.
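A wrapper of the kind described above can be sketched as follows. This is
illustrative only: the directory layout, log naming and graph script names
are assumptions, and the scheduler (Control-M) simply reads the wrapper's
exit code:

```shell
#!/bin/sh
# Illustrative wrapper sketch: runs a deployed graph script with
# parameters, logs the run, and reports success/failure via exit code.
# GRAPH_DIR, LOG_DIR and the graph script names are hypothetical.

GRAPH_DIR=${GRAPH_DIR:-/apps/abinitio/run}
LOG_DIR=${LOG_DIR:-/tmp}

run_graph() {
    # $1 = deployed graph script (.ksh); remaining args passed through
    graph=$1
    shift
    log="$LOG_DIR/$(basename "$graph" .ksh).$(date +%Y%m%d%H%M%S).log"

    echo "Starting $graph $*" > "$log"
    "$GRAPH_DIR/$graph" "$@" >> "$log" 2>&1
    rc=$?
    echo "Finished $graph rc=$rc" >> "$log"

    # The scheduler treats a non-zero exit code as a failed job
    return "$rc"
}
```

Keeping the logging and exit-code handling in one wrapper means every
scheduled graph reports failures to the scheduler the same way.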
Environment: Teradata V2R5, Ab Initio GDE 1.13, SQL Server, DB2, Oracle,
Sybase, UNIX, XML, HTML, Sun Solaris.
KeyBank, OH
Dec'2006- Oct '2007
Ab Initio Developer
Building Data Mart for their Accounts division
The bank's goal is to build the capability to better analyze the bank's
accounts. One of the bank's major objectives is to market more effectively
by offering additional products to households that already have one or more
accounts with the bank. Users want the ability to slice and dice individual
accounts, as well as the residential household groupings to which they
belong. We created BDW and developed a reporting system and scheduling of
the Reports generation as per the requirement of the users.
Responsibilities:
. Involved in Logical & Physical Data modeling using Erwin tool.
. Created Process Data Flow diagrams in Ab Initio GDE for data
conditioning, transformation, and loading.
. Generated configuration files, DML files and XFR files specifying the
record formats used in components when building Ab Initio graphs.
. Involved in creating Flat files using dataset components like Input file,
Output file, Intermediate file in Ab Initio graphs.
. Extensively Used Transform Components: Aggregator, Match sorted Join,
Denormalize sorted, Reformat, Rollup and Scan Components.
. Implemented component-level, pipeline and data parallelism in Ab Initio
for the Data Warehouse ETL process.
. Extensively used Partition components (Broadcast, partition by key,
partition by range, partition by round robin) and Departition components
(Concatenate, Gather and Merge) in Ab Initio.
. Worked on Multi file systems with extensive parallel processing.
. Responsible for the automation of Ab Initio graphs using Korn shell
scripts.
. Created Ab Initio data flow processes for loading data into Data
marts.
. Created documentation for ETL Ab Initio standards, procedures and naming
conventions.
. Developed Ab Initio scripts for data conditioning, transformation,
validation and loading.
. Oracle PL/SQL functions, Procedures, Korn Shell scripts were used for
staging, transformation and loading of the Data into Data mart.
. Implemented the business rules by writing Oracle PL/SQL Functions and
Procedures.
. Developed various Business Objects reports using PL/SQL Stored
Procedures.
. Designed and created the Universes in Business Objects.
. Created contexts and aliases for resolving loops in the Universes.
. Extensively used calculations, variables, sorting, drill down, slice and
dice, alerts for creating Business Objects reports.
. Scheduled BO reports through Broadcast Agent.
. Monitored the reports scheduled through the Broadcast Agent console.
Environment: Ab Initio GDE 1.13, Teradata V2R5, Oracle 9i, Control-M, Toad,
Red Hat Linux Advanced Server
Client : Vodafone, UK
Project : Vodafone Business Intelligence Programme - Cromwell
Oct'2005-Nov'2006
Vodafone is a leading provider of mobile telecommunications services,
including voice and data communications in UK. Vodafone has engaged
Accenture as the solution provider for the 'Cromwell Phase 2 programme' as
part of the overall Vodafone Business Intelligence Programme.
Cromwell Data Warehouse is one source of customer information that reduces
the existing large customer information gap for marketing purposes and
replaces the multitude of information systems in existence around IT.
Cromwell is used for storing a variety of customer information from various
source systems throughout Vodafone, allowing Vodafone to provide a higher
quality of service to its customers and a range of business users dealing
with customers, with the ability to enhance marketing, profiling, analysis
and reporting systems.
Responsibilities as Developer
. Understood and analyzed the Data Model and the requirements in the
Mapping document.
. Designed and developed highly scalable Ab Initio graphs to process heavy
loads of data.
. Used Partition components like Partition by Key, Partition by
Expression and Partition by Round Robin; Departition components like
Gather, Concatenate and Merge.
. Applied the business logic on source data using components like Join,
Normalize/Denormalize, Reformat, Rollup, Filter by Expression, Sort and
Dedup Sorted.
. Created Summary tables using Rollup, Scan, Assign Keys components.
. Worked with Ab Initio EME for version control, checking the project in
and out of the EME.
. Created temporary tables and views for intermediate calculations in
Teradata.
. Developed Teradata BTEQ scripts and Macros to load data into
Incremental/Staging tables and then move data from staging into Base
tables.
. Wrote and modified application-specific config scripts in UNIX to pass
the environment variables.
. Reviewed, modified and executed DDL scripts.
. Designed and developed operational document for the production support.
. Prepared unit test documents and test data.
. Unit tested the application; performed code review.
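An application config script of the sort described in the bullets above
might look like this sketch; the variable names, TDPIDs and paths are
illustrative assumptions, not the actual Vodafone configuration:

```shell
#!/bin/sh
# Illustrative application config script, sourced (". project.cfg") by
# job scripts so every graph sees the same environment variables.
# All names, TDPIDs and paths below are hypothetical.

export AI_PROJECT=${AI_PROJECT:-cromwell}
export AI_ENV=${AI_ENV:-dev}            # dev | sit | uat | prod

# Per-environment settings derived from AI_ENV
case "$AI_ENV" in
    prod) export TD_TDPID=tdprod; export AI_SERIAL=/prod/data/serial ;;
    uat)  export TD_TDPID=tduat;  export AI_SERIAL=/uat/data/serial ;;
    *)    export TD_TDPID=tddev;  export AI_SERIAL=/dev/data/serial ;;
esac

export AI_MFS=$AI_SERIAL/../mfs         # multifile system root
export AI_LOG=${AI_LOG:-/tmp/$AI_PROJECT/log}
```

Because every value defaults with `${VAR:-default}`, a job can override a
single variable (for example `AI_ENV=uat`) without editing the script.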
Environment: Ab Initio GDE 1.13, Teradata V2R5, Oracle 9i, Red Hat Linux
Advanced Server
Client: DHL
Oct '2004 - Sep' 2005
Position: Oracle Developer
Designed and developed the Order Processing system, which handles all
orders from dealers, from dealer order through invoicing.
Responsibilities :
. Created several Procedures, Functions, Packages and Database Triggers
to implement the functionality in PL/SQL, and performed database
administration.
. Designed the User Interfaces using Forms.
. Performed coding and testing of the Application
. Interacted with the users during testing phase.
. Involved in Training of application to Users.
. Created Tables and relationship among the tables so as to maintain the
Referential Integrity.
. Worked with Table constraints, Views and Sequences.
. Wrote Stored Procedures, Database triggers.
. Created Menus and indexes.
. Prepared User Manual.
Environment: Windows 98, Oracle 7.1, Forms 3.0, SQL*Plus 8.0, PL/SQL,
Developer 2000
Client : Work Life Solutions, Inc.; CMP Media LLC; Employment.COM,
Inc.
Project : Job Board Manager Software
Jun'2003-Sept'2004
Job Board Manager Software is an Oracle application for recruitment web
systems. It can be used by a portal site to run a job board for multiple
recruiters. The system consists of three basic areas: a front end where
candidates can register their CVs, find the right job quickly using a
powerful search tool, apply for jobs, and set up search-agent profiles
for use with the optional email agent; a restricted area where
recruitment consultants and companies can log in, post jobs, search
candidate CVs, view stats and carry out other activities; and the back
end, the system administrator's area, from where the site owner can set
up companies with access, allocate contracts to them, view stats,
'clean' the CV database and perform other essential duties.
XML Auto Migrate: This application is a user-friendly GUI tool targeted
at B2B, B2C and e-Business applications involving large-scale data
exchange. It handles the exchange of data between different applications
and databases on any platform using XML, and can be used to migrate data
from MDB, XLS and CSV files to XML, and from XML to an Oracle database.
Responsibilities as Developer
. Developed the interfaces, prepared cascading style sheets, embedded
JavaScript code and unit tested the developed pages.
. Responsible for writing migration procedures.
. Loaded data into database tables from Excel sheets using SQL*Loader.
. Responsible for writing stored procedures for Searches, using
Concatenated Data store
. Implemented the look-id concept, through which the same schema can be
used with different looks for child sites.
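The SQL*Loader work above can be illustrated with a small sketch that
writes a control file for a CSV extract; the table and column names are
hypothetical, not the actual project schema:

```shell
#!/bin/sh
# Illustrative sketch: writes a SQL*Loader control file of the kind used
# to load spreadsheet extracts (saved as CSV) into a staging table.
# Table and column names are hypothetical.

CTL="${TMPDIR:-/tmp}/candidates.ctl"

cat > "$CTL" <<'EOF'
LOAD DATA
INFILE 'candidates.csv'
APPEND INTO TABLE stg_candidates
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  candidate_id,
  full_name,
  email,
  created_dt DATE "YYYY-MM-DD"
)
EOF

# Then invoked as, e.g.:
#   sqlldr userid=app/$APP_PWD control=$CTL log=candidates.log
echo "Wrote $CTL"
```

`OPTIONALLY ENCLOSED BY '"'` and `TRAILING NULLCOLS` handle the quoting
and ragged rows typical of spreadsheet exports.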
Environment: Oracle 8i, PL/SQL, Toad, Oracle 9i AS, Java, JavaScript, HTML,
XML, AWT, SAX, JDBC