
Ab Initio

Location:
Atlanta, GA, 30339
Salary:
Negotiable
Posted:
April 27, 2010


Resume:

Ram Reddy

Cell: 317-***-****

Email: ************@*****.***

Professional Summary:

 Nine+ years of IT experience in software development, design, and implementation, with a major focus on data warehousing and database applications.

 8 years of experience using SQL.

 6+ years of experience with relational database models, stored procedures, and views.

 6 years of strong data warehousing experience using the ETL tool Ab Initio.

 4 years of experience in Korn shell script programming.

 3+ years of proven experience in data modeling, data management, data warehousing, data transformation, metadata and reference data management, and Erwin tools.

 3+ years of experience in designing, developing, implementing, and maintaining applications in risk management, fraud control, and online bill payment.

 3+ years of experience using Teradata and DB2.

 1 year of strong data warehousing experience using the ETL tool Informatica.

 Experience working with various Heterogeneous Source Systems like Oracle, SQL Server, Teradata, DB2 UDB and Flat files.

 Experience in coding SQL, PL/SQL, and T-SQL. Good knowledge of relational database utilities like TOAD, SQL Assistant, and SQL*Loader.

 Worked with continuous components and XML components in Ab Initio.

 Strong knowledge of the batch job automation tool Maestro.

 Knowledge of EME check-ins and check-outs and sandbox creation.

 Worked with EME for version control, impact analysis, dependency analysis for common projects, and higher-environment migrations.

 Extensive experience using Korn shell scripting to maximize Ab Initio parallelism capabilities; developed numerous Ab Initio graphs using data parallelism and multifile system (MFS) techniques.

 Experience in designing, developing, and testing applications using Ab Initio software.

 Sound knowledge of data warehouse concepts, star schema and snowflake schema designs, and dimensional modeling.

Technical Skills:

Operating Systems: DOS, UNIX, Windows 95/98/2000, NT 4.0, XP

Database Management Systems: Oracle 8i/9i, SQL Server 2000/2005, MS Access, DB2, Teradata

Programming Languages: C, C++, JavaScript, PL/SQL

GUI and Web Technologies: Visual Basic 5.0/6.0, VB.NET, HTML, XML, Crystal Reports 9.0/11.0

ETL Tools: Informatica 5i, Ab Initio GDE 1.13/1.14/1.15.6, Co>Operating System 2.13/2.14/2.15.6, Conduct>It

Project Profile

WMS, Home Depot, Atlanta, GA APR ’09 – Present

Sr. Ab Initio Developer:

The warehouse management system (WMS) is a key part of the supply chain and primarily aims to control the movement and storage of materials within a warehouse and to process the associated transactions, including shipping, receiving, and picking. The system also directs and optimizes stock based on real-time information. The warehouse management system collects item information through barcode scanners in the Home Depot sales centers and the network source system. The objective of a warehouse management system is to provide a set of computerized procedures to handle the receipt of stock and returns into a warehouse facility, model and manage the logical representation of the physical storage facilities (e.g. racking), manage the stock within the facility, and enable a seamless link to order processing and logistics management in order to pick, pack, and ship product out of the facility.

Functionality of the WMS: The primary purpose of a WMS is to control the movement and storage of materials within a warehouse; it can be described as the legs at the end of the line, automating store, traffic, and shipping management.

Responsibilities:

 Developed high-level and low-level design documents for processing each feed and documented the various implementations done during the course of the project.

 Developed the ETL data flow process for the POS project using Microsoft Visio.

 Involved in source-to-target mapping discussions and participated in various data cleansing and data quality exercises.

 Developed various graphs that extract XML files and load them into the database.

 Used phasing and checkpointing in the graphs to avoid deadlock and to recover completed stages of a graph in case of failure.

 Used various Ab Initio commands such as m_ls, m_cp, m_mv, m_db, and m_touch extensively to operate on multifiles (see the sketch after this list).

 Performed data cleansing and data validation using various Ab Initio functions such as is_valid, is_defined, is_error, is_null, and string_trim.

 Involved in code reviews and performance tuning strategies at the Ab Initio and database levels.

 Involved in moving applications from AIX to Linux servers and modified the applications' shell scripts to run on the Linux servers.

 Involved in migration of code from DEV to QA and from QA to PROD using Heats (a Home Depot utility).

 Involved in building large-scale data processing systems using Ab Initio Conduct>It.

 Created HDDTM (Home Depot ETL process Dynamic Task Manager) plans, Go Scripts, and child plans to run the jobs in sequence.

 Involved in production support, monitoring jobs and schedules through the IBM Maestro 8.5 Tivoli Workload Scheduler.

 Involved in unit testing of the graphs and prepared the test case document.
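
The sketch below illustrates the multifile housekeeping referred to in the list above. It is a minimal Korn shell example; the MFS root, directory layout, and file names are assumptions made for illustration, not details from the actual project, and the m_db line assumes the common "test a .dbc configuration" usage.

#!/bin/ksh
# Illustrative multifile housekeeping with Ab Initio m_* commands.
# $AI_MFS is assumed to point at the multifile system root; all paths are placeholders.

m_ls $AI_MFS/pos                                         # list multifiles in the pos directory
m_touch $AI_MFS/pos/pos_feed.dat                         # create an empty multifile placeholder
m_cp $AI_MFS/pos/pos_feed.dat $AI_MFS/pos/pos_feed.bak   # copy all partitions in parallel
m_mv $AI_MFS/pos/pos_feed.bak $AI_MFS/archive/pos_feed.bak   # move/rename across directories
m_db test $AI_DB/teradata.dbc                            # test a database configuration (.dbc) file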

Environment: Ab Initio GDE 1.15.7, Co>Operating System 2.15, Teradata, UNIX, DB2, SQL Assistant, PuTTY, SQL/PL-SQL, Windows 2000/XP, MS Office, Visio, shell scripts, XML, XSD, Maestro 8.5.

CASH, FedEx, Memphis, TN AUG ’07 – MAR ’09

Sr. Ab Initio Developer:

CASH: Phase 1 is the first of four phases of a tool suite that will include sales definition of customers, potential entry, sales segmentation, and territory alignments.

Functionality of the CASH includes:

 Integration of web-based alignments including account, entity, facility and territory building blocks that support existing alignment rules and functionality currently implemented for sales.

 Retirement of the existing (QUASAR) alignment system.

 A utility to centrally store and manage alignment files by legal entities’ sector, segment, account, territory, etc.

 Mass upload of alignment entry, mass upload of alignment change.

Responsibilities:

 Development of source data profiling and analysis; review of data content and metadata facilitated data mapping and validated assumptions made in the business requirements.

 Created the mini-specs for different applications.

 Involved in reviewing data analysis and best practices.

 Developed various Ab Initio graphs to validate data using the data profiler, comparing current data with the previous month's data.

 Used different Ab-Initio components like partition by key and sort, dedup, rollup, scan, reformat, join and fuse in various graphs.

 Also used the Run Program and Run SQL components to run UNIX and SQL commands in Ab Initio.

 Wrote several UNIX control scripts, specific to each application, in order to pass the environment variables.

 Responsible for extracting daily text files from the FTP server and historical data from DB2 tables, cleansing the data, applying transformation rules, and loading it to the staging area.

 Inserted data into the Teradata data warehouse using the FastLoad and MultiLoad utilities and BTEQ scripts (see the sketch after this list).

 Used Ab Initio as the ETL tool to pull data from source systems and to cleanse, transform, and load data into databases.

 Involved in design, coding, and documentation best practices.

 Wrote the .dbc files for the development, testing, and production environments.

 Performed unit and system testing using sample data, generated and manipulated data, and verified the functionality, data quality, and performance of graphs.

 Performed transformations of source data with transform components like Join, Match Sorted, Dedup Sorted, Denormalize, Reformat, and Filter by Expression.

 Made wide use of lookup files when getting data from multiple sources where the data volume is limited.

 Developed UNIX Korn shell scripts to automate job runs and support the redaction infrastructure, and SQL and PL/SQL procedures to load the data into the database.

 Involved in project promotion from development to UAT and from UAT to production.

 Involved in production implementation best practices.

 Used different EME air commands in project promotion, such as air tag create, air save, air load, and air project export.
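
The Teradata load step mentioned earlier in this list followed the standard utility pattern. Below is a minimal Korn shell sketch of that pattern using BTEQ; the logon string, database, table, and file names are assumptions made for illustration, not details from the actual project.

#!/bin/ksh
# Illustrative Teradata staging load driven from BTEQ.
# tdpid, credentials, database, table, and file names are placeholders.

STAGE_FILE=/data/stage/alignment_daily.txt

bteq <<EOF
.LOGON tdpid/etl_user,etl_password
DATABASE stage_db;

DELETE FROM stage_db.alignment_stg;

.QUIET ON
.IMPORT VARTEXT '|' FILE = $STAGE_FILE
.REPEAT *
USING (entity_id VARCHAR(18), territory_cd VARCHAR(10), load_dt VARCHAR(8))
INSERT INTO stage_db.alignment_stg (entity_id, territory_cd, load_dt)
VALUES (:entity_id, :territory_cd, :load_dt);

.QUIT
EOF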

Environment: Ab Initio GDE 1.14, Co>Op 2.14, UNIX, Oracle 10.x, TOAD, DB2, SQL Server 2005, Windows XP, Teradata, Maestro

CBG - Citigroup, Harrison, NY & Irving, TX APR ’06 – JUL ’07

Ab Initio Developer:

Customer Hub: Customer Hub will allow for the population and de-duplication of commercial customer demographic data within the source systems from the leasing back office, including InfoLease Dallas, InfoLease Harrison, and ELMR, as well as commercial banking via the sector CID data warehouse and commercial real estate systems. The Intranet-based user interface is customized for CBG administrators, who will maintain the data, and for relationship managers, who will view the data.

Functionality of the CBG Customer Hub includes:

 Integration of customer information from multiple source systems into one single data store.

 Cleansing, matching, linking, and identifying master data.

 A utility to centrally store and manage multiple hierarchies by legal customer entity, product, relationship manager etc.,

 A system of record for all master customer data.

 A service-oriented architecture to add new channels, data sources, and touch points.

Responsibilities:

 Involved as designer and developer for commercial business group data warehouse (CBGDWH).

 Development of source data profiling and analysis; review of data content and metadata facilitated data mapping and validated assumptions made in the business requirements.

 Created the mini-specs for different applications.

 Automated the development of Ab Initio graphs and functions utilizing the metadata from EME to meet SB data redaction requirements.

 Developed various Ab Initio graphs to validate data using the data profiler, comparing current data with the previous month's data and applying the AMS standards.

 Involved in AMS installation in DEV, testing the AMS, and promoting it to SIT, from SIT to UAT, and from UAT to production.

 Used different Ab Initio components like partition by key and sort, dedup, rollup, scan, reformat, join, and fuse in various graphs.

 Also used the Run Program and Run SQL components to run UNIX and SQL commands in Ab Initio.

 Used the USPS address mapping system to correct customer addresses.

 Performed transformations of source data with transform components like Join, Match Sorted, Dedup Sorted, Reformat, and Filter by Expression.

 Made wide use of lookup files when getting data from multiple sources where the data volume is limited.

 Involved in project promotion from development to UAT and from UAT to production.

 Involved in production implementation best practices.

 Modified the Ab Initio EME to house the required redaction metadata.

 Used different EME air commands in project promotion, such as air tag create, air save, air load, and air project export (see the sketch after this list).
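
A sketch of the EME promotion flow referred to above, written as Korn shell. The repository paths, tag names, and save-file names are assumptions made for illustration, and the exact option spellings of the air commands should be confirmed against the EME command reference for the installed Co>Operating System version.

#!/bin/ksh
# Illustrative EME promotion using air commands; all names and paths are placeholders.

export AB_AIR_ROOT=/eme/dev/repo                 # source (development) EME repository

# Tag the release objects, then save them to a transportable file.
air tag create CBG_REL_1_4 /Projects/cbg_customer_hub
air save cbg_rel_1_4.save /Projects/cbg_customer_hub

# Point at the target repository and load the saved objects.
export AB_AIR_ROOT=/eme/uat/repo
air load cbg_rel_1_4.save

# Export the project into a sandbox for verification after the load.
air project export /Projects/cbg_customer_hub -basedir /sandboxes/cbg_customer_hub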

Environment: Ab Initio GDE 1.13, Co>Op 2.13, UNIX, Oracle 9i, SQL Navigator 5.0, SQL Server 2000, Cygwin, AMS software, Maestro, DB2, Teradata, Windows 2000, Crystal Reports 9.0

AT&T, Plano,TX APR ’05 – FEB ‘06

Ab Initio Developer:

Description:

This project consists of AT&T products information, sales information and billing information.

All accounts within a given billing product subscription must be billed to the same customer.

Responsibilities:

 Involved with business users to prepare functional specification and technical design documents for ETL process for complete data warehouse cycle for AT&T wireless customer support and sales.

 Developed jobs and used Ab Initio as an ETL tool to perform the final loads.

 The project centralized the reporting system, which previously used a hard-coded program.

 Developed various Ab-Initio graphs for customer credit, contact, account detail, account XRef, data inserts & updates, data validate graphs, TN inserts & updates, calling plan, network service XRef graphs and refresh graphs.

 Used various Ab-Initio components of partition, de-partition, database, datasets, transform, FTP and sort to build graphs in GDE

 Developed Ab-Initio graphs for data validation using validate components.

 Created Ab-Initio multi file systems (MFS) to take advantage of partition parallelism.

 Developed several stored procedures and functions to make the existing code more dynamic; this is now considered the centralized program for the existing running programs and the new EDW project.

 Developed several database triggers to maintain the data integrity.

 Developed and loaded parameters called during the execution of the EDW program for dynamic execution.

 Wrote several SQL*Loader programs to load small data sets into temporary tables, since the data may need to move among different UNIX servers and databases while the program is running (see the sketch after this list).

 Developed several Korn shell (ksh) programs, functions, and packages to perform pre-inspection, post-inspection, and pre-extraction inspection of the data loaded every week and to populate errors, which are in turn used to generate error reports; also responsible for the automation of Ab Initio graphs using Korn shell scripts.

 Performed unit and system testing using sample data, generated and manipulated data, and verified the functionality, data quality, and performance of graphs.

 Documented complete graphs and their components.
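
The SQL*Loader work mentioned above follows the standard sqlldr pattern. The Korn shell sketch below shows that pattern; the control file contents, connect string, directories, and table names are assumptions made for illustration, not details from the actual project.

#!/bin/ksh
# Illustrative SQL*Loader run: generate a control file and load a weekly extract
# into a temporary table. Connect string, paths, and table names are placeholders.

CTL=/tmp/weekly_feed.ctl

cat > $CTL <<'EOF'
LOAD DATA
INFILE '/data/extracts/weekly_feed.dat'
APPEND
INTO TABLE tmp_weekly_feed
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
( cust_id,
  acct_no,
  load_dt  DATE 'YYYYMMDD' )
EOF

sqlldr userid=etl_user/etl_pwd@EDWDEV control=$CTL \
       log=/logs/weekly_feed.log bad=/logs/weekly_feed.bad errors=50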

Environment: Ab Initio GDE 1.12, Co>Op 2.12, UNIX, Oracle 9i, SQL Navigator 5.0, DB2, Teradata, Windows 2000

Nationwide Auto Insurance, Columbus, OH APR ’04 – FEB ’05

Ab Initio Developer:

Description:

The largest publicly held personal auto insurer in the U.S. The Nationwide Insurance financial pilot module mainly converts existing PL/SQL procedures to Ab Initio graphs. The applications pull data from the legacy data source (EDW) to the landing zone.

Responsibilities:

 Extracted data from the Oracle database; the extracted data was used to populate the data warehouse tables.

 Associated with the financial data mart.

 Used Ab Initio as an ETL tool to pull data from source systems and to cleanse, transform, and load data into databases.

 Developed transformation logic and business rules for ETL purpose.

 Involved in Ab Initio graph design and performance tuning of the graph load process.

 Converted date formats from yymmdd to the Oracle standard date format.

 Used Ab-Initio repository to store the developed graphs, for future dependency analysis when needed.

 Uncompressed the source data using the GUNZIP component and translated the data from EBCDIC to ASCII.

 Responsible for creating test cases to ensure the data originating from the source made it into the target properly and in the right format.

 Created Ab-Initio multi file systems(MFS) to take advantage of partition parallelism.

 Implemented a 4-Way multi file system that was composed of the individual files on different nodes that were partitioned and stored in distributed directories.

 Designed and developed Ab Initio graphs using different components such as Reformat, Rollup, Scan, and Join. Performed functional testing of graphs.

 Created summary tables using ROLLUP and SCAN components and verified graph accuracy based on business logic.

 Responsible for making sure that data is clean, consistent, and synchronized across platforms.

 Identified ways to speed up current Ab-Initio graphs to maximize performance.

 Automated both monthly and weekly refresh (data load) using the cron utility.

 Created crontab jobs to run the different applications at scheduled times (see the sketch after this list).
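
The cron-based scheduling mentioned in the last two items above typically looks like the Korn shell sketch below; the schedules, script paths, and log locations are assumptions made for illustration, not details from the actual project.

#!/bin/ksh
# Illustrative cron setup for the weekly and monthly refresh jobs.
# Schedules, script paths, and log files are placeholders.

crontab -l > /tmp/etl_crontab 2>/dev/null        # start from the current crontab, if any

cat >> /tmp/etl_crontab <<'EOF'
# Weekly refresh: 02:30 every Sunday
30 2 * * 0 /apps/etl/bin/weekly_refresh.ksh >> /apps/etl/logs/weekly_refresh.log 2>&1
# Monthly refresh: 03:00 on the 1st of each month
0 3 1 * * /apps/etl/bin/monthly_refresh.ksh >> /apps/etl/logs/monthly_refresh.log 2>&1
EOF

crontab /tmp/etl_crontab                         # install the updated schedule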

Environment: Ab Initio GDE 1.12, Co>Op 2.12, UNIX, PL/SQL, Oracle 9i, SQL Server 7.0, Windows NT

Visa International, San Francisco, CA FEB ’03 – FEB ‘04

Ab Initio Developer:

Description:

Visa International is a brand of credit card and debit card operated by the Visa International Service Association of San Francisco, California, USA, an economic joint venture of 21,000 financial institutions that issue and market Visa products.

Verified by Visa can provide merchants with significant savings in fraud-related costs. Merchants who use Verified by Visa are protected from fraud-related chargebacks on all personal Visa credit cards.

Responsibilities:

 Created Ab Initio graphs to load large volumes of data, ranging from several GB to TB.

 Used the Ab-Initio Web Interface to Navigate the EME to view graphs, files and datasets and examine the dependencies among objects.

 Extracted data from Oracle and used them to populate Teradata Data Warehouse tables associated with Data Mart.

 Created Korn Shell scripts and cron jobs to refresh the load on weekly basis.

 Developed complex Ab Initio XFR’s to derive new fields and solve various business requirements.

 Created test scenarios that were used to validate the Ab-Initio graphs.

 Designed and developed complex Ab Initio graphs using Aggregate, Join, Rollup, Scan, and Lookup transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions. Developed complex graphs to normalize 3-D data in Excel spreadsheets into 2-D flat files; these flat files were later loaded into Oracle data marts.

 Created various Ab-Initio Multi File Systems (MFS) to run graphs in parallel.

 Responsible for cleansing the data from source systems using Ab-Initio components such as reformat and filter by expression.

 Used the sub graphs to increase the clarity of graph and to impose reusable business restrictions and tested various graphs for their functionalities.

 Developed several partition based Ab-Initio Graphs for high volume data warehouse.

Environment: Ab Initio GDE 1.12, Co>Op 2.12, UNIX, PL/SQL, Oracle 9i, SQL Server 7.0, Windows NT

Mesniaga Group of Industries, Hyderabad, India APR ’02 – JAN ’03

Informatica Developer:

MLM information system description:

An e-business system based on the MLM sites and product sales OLTP systems. The main objective of this project is to provide integrated information on subscription and product sales data for better decision-making. The system was developed using Oracle as the RDBMS.

Responsibilities:

 Interacted with business analysts to translate business requirements into technical specifications and involved in extraction, transformation and loading of data.

 Extracted source data from Sybase, flat files, XML, Teradata, Oracle, IMS, and VSAM and loaded it to a relational warehouse.

 Implemented SCD methodology including type 1, type 2 and type 3 changes to keep track of historical data.

 Implemented Aggregator, Sorter, Router, Filter, Joiner, Expression, Lookup, Update Strategy, Normalizer, and Sequence Generator transformations.

 Extensively worked on mapping variables, mapping parameters, workflow variables and session parameters.

 Used the workflow manager to create workflows, worklets and tasks.

 Created UNIX shell scripts and automated ETL processes using crontab.

 Used debugger to test the mapping and fixed the bugs.

 Involved in the design of logical and physical data models using Erwin; designed and developed Impromptu distributed catalogs (involving joins, conditions, and calculations).

 Created multi-dimensional cubes using PowerPlay Transformer.

 Performed query performance tuning using ANALYZE TABLE and SQL Trace (see the sketch after this list).

 Documented Informatica mappings, design, and validation rules.
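
The query tuning item above relies on standard Oracle 8i/9i facilities; the Korn shell sketch below shows that workflow, with the connect string, schema, and table names assumed purely for illustration.

#!/bin/ksh
# Illustrative tuning session: refresh optimizer statistics, then trace a slow query.
# Connect string, schema, and table names are placeholders.

sqlplus -s etl_user/etl_pwd@MLMDB <<'EOF'
-- Refresh statistics so the optimizer sees current data volumes.
ANALYZE TABLE sales_fact COMPUTE STATISTICS;
ANALYZE TABLE subscription_dim ESTIMATE STATISTICS SAMPLE 20 PERCENT;

-- Trace the query under investigation for this session.
ALTER SESSION SET SQL_TRACE = TRUE;
SELECT prod_id, SUM(sale_amt) FROM sales_fact GROUP BY prod_id;
ALTER SESSION SET SQL_TRACE = FALSE;
EXIT;
EOF

# The resulting trace file (in user_dump_dest) can then be formatted with tkprof.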

Environment: Informatica PowerCenter 5.0, Cognos Impromptu 6.6, PowerPlay Transformer 6.6, Erwin 4.0, Oracle 8i/9i, MS SQL Server 2000, PL/SQL, TOAD 7.0, UNIX 5.0.5, Windows 2000

Redland Inc., Hyderabad, India DEC ’98 – MAR ’02

Developer:

Real Estate Information System Description:

This is financial software aimed at automating the complex business of lending home loans. It is a three-tier client-server design based on Microsoft's Component Object Model. The front end is designed in Visual Basic 5.0/6.0. It has OCXs designed for each module, which are then invoked from an ActiveX control container called Central Park, developed in Visual C++ 5.0/MFC; these contain all the business logic of home loans. The database used is Oracle 8i. The product aims to perform all operations in the life cycle of a home loan, from pre-qualification through funding of the loan.

Responsibilities:

 Physical database design, development, and implementation.

 Responsible for creating Database objects like Tables, Cluster/Non-cluster index, Unique/Check Constraints, Stored Procedures, Triggers, Rules, Defaults and Views.

 Responsible for database Backups and Recovery.

 Responsible for Query optimization and Performance tuning.

 Used SQL Server Performance Monitor to identify I/O and CPU issues.

 Monitored the NT event log for system and application errors and for CPU and memory usage; backed up the registry, kept the emergency repair disk current, and monitored available server disk space.

 Worked extensively with BCP to import and export data from/to flat files (see the sketch after this list).

 Responsible for SQL Server Standard security implementation, managing SQL Server connections and granting permissions for user(s) and objects.

 Used DBCC to check the physical and logical consistency of the database and resolved page allocation errors and table corruption.

 Used Database object Transfer utility to transfer the data and database objects from local to remote server.
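
A sketch of the BCP and DBCC usage mentioned above, written in the same Korn shell style as the other sketches (on NT these lines would normally run from a command prompt or batch file). The server, database, table, login, and file names are assumptions made for illustration, not details from the actual project.

#!/bin/ksh
# Illustrative BCP export/import and a DBCC consistency check.
# Server, database, table, login, and file names are placeholders.

# Export the loans table to a comma-delimited flat file (character mode).
bcp realestate..loans out /data/out/loans.txt -c -t, -S REALSVR -U etl_login -P etl_pwd

# Import a flat file of new applications into a staging table.
bcp realestate..loan_apps_stage in /data/in/loan_apps.txt -c -t, -S REALSVR -U etl_login -P etl_pwd

# Check the physical and logical consistency of the database.
isql -S REALSVR -U etl_login -P etl_pwd -Q "DBCC CHECKDB(realestate)"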

Environment: MS SQL Server 7.0, MS Access 97, Visual Basic 6.0, MS Windows 95, Windows NT 4.0, PL/SQL, TOAD 7.0, UNIX 5.0.5, Windows 2000

Education

Degree, University, Year of Passing

Master of Computer Applications, Kavitha Memorial PG Center, Kakatiya University, 1997

Bachelor of Science, SSRJ College, Kakatiya University, 1992

Achievements:

 Microsoft Certified Professional in VB.

Trainings

Course Description - Duration - Year

Informatica 5i (Repository Manager, Informatica Designer, Server Manager) - 15 days - 2002

Ab Initio - 10 days - 2003

PL/SQL - 10 days - 2002

Oracle DBA - 5 days - 1998

JavaScript - 10 days - 1998

Data Warehousing Concepts - 2 days - 2002

C - 30 days - 1995

C++ - 10 days - 1996

Visual Basic 6 - 15 days - 1998

Oracle 8i - 10 days - 1998


