
Sr Ab Initio Developer

Location:
Fairfax, VA
Posted:
April 16, 2015


Resume:

Ravi Raju

Tel No: 571-***-****

Around ** years of comprehensive experience in the IT industry, specializing in the analysis, design, development, and implementation of relational database (OLTP) and data warehousing (OLAP) systems using Ab Initio, UNIX (Solaris, HP-UX, IBM AIX), Oracle 7.x, 8.1.5, 9i, 10g, and 11g, Teradata, and SQL Server 2008. Worked on EDW (Enterprise Data Warehouse) projects from inception to implementation and delivered high-quality solutions in a timely manner. Received awards for successfully completing projects in Agile. Hands-on experience implementing ETL solutions in UNIX environments for transactional and data warehouse projects using Agile. Expertise in the Ralph Kimball data warehouse methodology, and a good understanding of stochastic calculus.

Technical background:

Operating Systems: UNIX (Solaris, HP-UX, IBM AIX), Windows XP, MVS (basic knowledge)

Scripting Languages: Korn Shell, awk, sed, Perl, and Python

ETL Tools: Ab Initio

Job Schedulers: Autosys, Maestro

CASE Tools: ERwin

Methodologies: Ralph Kimball, UML, Object-Oriented Design

Languages: SQL, PL/SQL, COBOL, C, C++, Java, XML (strong XML processing experience: XSD, XPath, XSL, XSLT, etc.), SAS

Data Integration Tools: Tibco

Modeling: ERwin

Databases: Oracle 11g, 10g, 9i, MS SQL Server

Data Warehouse Appliances: Teradata, Greenplum, and Netezza

Awards:

Citi Star Player award: For delivering a complex application from inception to successful deployment in 1,300 branches, while simultaneously working on the next generation of software deliveries.

Best Team Player award (twice): For working across global teams in an Agile environment.

Galaxy of Thanks: For incredible dedication and hard work in resolving production issues.

Technical Trainings:

. ETL architecture in depth: By Ralph Kimball

. EMC Greenplum appliance administration and implementation: By EMC

Greenplum Inc.

. Agile project management in depth: By Accenture and Citigroup Inc.

. Managing Critical project deliveries: By Citigroup Inc.

. Hadoop and Big Data Management: By Citigroup Inc.

. XML A Comprehensive Hands-On: By Learning Tree

. ClearCase and ClearQuest Administrator: By IBM Inc.

. Netezza Performance server implementation: By Netezza Inc.

. Object Modeling: By Citigroup Inc.

Education

* BS in Electronics and Communication Engineering

Professional Summary

Geico Inc

Chevy Chase MD

May 2013 to Present

Geico Insurance built its warehouse on the Inmon methodology of EDW design and later moved to the Kimball data mart and data warehouse bus architecture. The EDW holds all the historical data for Policy, Claims, Transactions, etc. The source systems are traditionally mainframe files, DB2, XML, .NET, and other database systems. The data is hosted in DB2 and Netezza. This large warehouse uses Ab Initio as the ETL platform, and the EDW platform is integrated with multiple systems to serve the users.

. As a Sr. Developer, designed the XML file processing and developed the code for the MSI process.

. Developed code to normalize the Carma XML/XSD file for customer codes.

. Involved in the architectural design of the Claims process.

. Developed Ab Initio code for Policy enhancements.

. Worked on the Sales Data Mart Ab Initio code.

. Analyzed and developed code for the Rating Data Mart.

. Worked on the design to add the NoSQL database MongoDB as the document database.

. As a member of the ETL group, performed production support duties.

Environment: Solaris, DB2, Netezza, Oracle 11g/12c, SQL Server 2008, Ab Initio, ERStudio, CA7, Perl, Korn Shell scripts, .NET, XML, Mainframes (MVS, DB2), MicroStrategy

Citigroup

Baltimore MD

Aug 2007 to April 2013

Citi decided to design and build a next-generation customer self-service and lead management system that leverages 15 years of experience with legacy applications and centralizes system functions from various aspects: Core Sales Initiatives, Cross-Sell Initiatives, Performance Management, and Strategic Initiatives. The project is called Symphony. The Symphony project has a Multi-Generation Plan (MGP): Generation 1a, Generation 1b, Generation 2, Generation 3a, and Generation 3b. This scope covers the integration architecture for Generation 1a (Gen 1a). The project focuses on the following basic self-service capabilities: Infrastructure: Integration Services; Origination: Lead Management, Web-based Origination for Lenders; Servicing: Branch Access to Private Label Payoffs, Web-based Payments and Account Maintenance.

The project is managed in Agile.

. As the main ETL Architect, worked with the steering committee on deciding the ETL platform (Ab Initio).

. Analyzed the data extensively to come up with a solution for the Marketing team, and also analyzed Risk SAS datasets.

. Worked with program management on pointing work to different CRP

teams.

. Defined ETL delivery schedule of use cases for each sprint.

. Interacted with business users on a regular basis in deciding the use cases and their functions.

. Presented the ETL design to the business users, Architecture, CRP teams, modelers, and PMO for each sprint.

. Worked with Scrum Masters on delivering the use cases and completed

code for the urgent tasks.

. Successfully developed and deployed one generation of Ab Initio code

myself and simultaneously worked with Architecture, frontend

development team, Modelers and DBA team.

. Interviewed and setup the Off-shore Ab Initio development team.

. Monitored the Off-shore development work and gave them the

requirements and design document for each use case.

. When a new solutions team was created, I was involved daily from the start with architects from the different specializations to come up with a solution for each business requirement.

. Worked with UNIX admin team for carving out different environments.

. Developed special code for unique ID generation.

. Interacted with North American Scheduling team for setting up Autosys

Jobs.

. Involved in conflict resolution between multiple teams to create amicable solutions.

. Worked daily with Marketing, Risk and business users for finding Leads

elements and Risk indicators for different kinds of offers.

. Involved in all the generations of the project.

. Analyzed and designed a custom list application on Teradata; worked with modelers and the DBA team on data classification and table creation, and delivered the Ab Initio code.

. Once the project was in production, I was involved as the member of

command center.

. Supported the project from the first branch deployment to the final branch, and later until it was stabilized.

. The system is now used by all the business users, branch employees, and more than a million customers.

CitiFinancial business groups (Risk, Marketing, Legal, Finance, Compliance, and BI) currently access data from legacy systems and the present transactional systems. But there is no integrity to data extracted from the different sources, as these are sometimes manual tasks. Hence a common data environment was proposed as the source for all the data needs of the business groups. The various CFNA business groups have a stake in this environment. A consolidated data warehouse (EDW) will be built from these source systems. The business groups will then prepare their own data marts out of this Enterprise Data Warehouse and use them in their respective businesses.

. Analyzed the existing systems and business requirements.

. Advised the Modeling team on the star schema, given my knowledge of the business processes and elements.

. Provided solution for loading historical data from Archive area, because

the transactional system was purging data after certain period.

. Executed the jobs 24 hours a day for 30 days and loaded the historical data into the warehouse, since it required a lot of data verification.

. Explained to business users how the warehouse can be used and provided

queries.

. Worked with DBA and UNIX team to setup COB environment.

. Involved in the COB testing.

. The consolidated warehouse is successfully deployed, and business users are using it for decision making, Risk, and Financial Analysis.

Environment: Solaris, Oracle 11g, Teradata, SQL Server 2008, Ab Initio, Chordiant, CDM engine, ERStudio, Autosys (JIL), Perl, Korn Shell scripts, Java (JMS, JBoss), XML, and Tibco.

Fannie Mae

April 2006 to July 2007

Washington DC

Total Return Infrastructure (TRI) covers the total risk management process end to end: from data acquisition, to the modeling assumptions, and finally the production reports. TRI captures selected asset data on a daily basis, aggregates this data, and makes portfolio data accessible via an API. TRI captures data specifically for settled and unsettled whole loans and mortgage-backed pass-through securities.

. Involved in the mathematical modeling of derivations.

. Developed a derivations prototype using Ab Initio XFRs.

. Involved in writing System design spec.

. Developed scripts and graphs to pick up data from different

production servers.

. Developed graphs (XFR) and conditional DML to perform complex

mathematical computation.

. Implemented code in the start scripts for running graphs simultaneously.

. Implemented Scheduling Process for all JOBS in Autosys.

. Developed promotion and SCP scripts.

. Worked with the requirements team on a daily basis.

. Worked with UAT team to complete the user acceptance test.

. Supported and fixed production issues.

Environment: Solaris, Oracle 9i, 10g, Sybase, Ab Initio, ERStudio, Autosys (JIL), Perl, Korn Shell scripts, Java (JMS, JBoss), SAS, Bloomberg data, Lehman data

Citibank

March 2005 to April 2006

New York NY

The Profitability data mart processes profitability data for the leasing business received from the Fast Finance, Aqua, and ELMR product processors, and for the CRE and CMG businesses received from the RBNA data warehouse. Accounts/loans not retained by the business (i.e., sold accounts/loans) are excluded from the Expense, Cost of Funds, and Cost of Credit calculations; only the servicing-fee portion of revenues for sold contracts is captured as reconciling items. Outstanding balances for sold contracts need not be considered for any calculations and reporting. The mart also calculates revenue for the leasing business.

Involved in the requirements gathering from business

Allocation of work to the team members and Mentoring the team

Setup and maintained Ab Initio Environment

Developed Ab Initio standard Environment Scripts

Wrote control scripts for running the graphs from the command line

Responsible for designing the HLD (High Level Design) in Ab Initio

Wrote Design Spec for the Profitability data mart

Wrote mini spec for Cost of Fund and Cost of Credit calculations

Along with the business team, developed formulas for Revenue

Developed complex graphs for performing calculations on product processors

Developed custom components for managing logs and mailing address system

Created graphs for performing Cost of Credit calculations

Developed the MDE (Metadata Environment) process using components like Generate Records, Reformat, and Run Program

Implemented parallelism on UNIX side using commands like m_touch, m_mkdir,

m_cp etc

Used partition components on the GDE side extensively

Implemented and Designed promotion process from Dev to UAT and Production

Implemented the ClearCase and ClearQuest tools into the DWH process

Wrote scripts using air commands for promoting the data

Implemented compliance requirements in the promotion scripts

Worked on the Maestro Composer scheduler for scheduling the jobs
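The control scripts mentioned above wrap deployed graphs so they can be run from the command line with logging and exit-status checking. A minimal sketch of that pattern, assuming a graph is exposed as a deployed .ksh script; the function name, log locations, and placeholder command are illustrative assumptions, not the original code:

```shell
#!/bin/sh
# Hypothetical control-script pattern: run a deployed graph from the command
# line with logging and exit-status checking. Names and paths are assumptions.

run_graph() {
    graph="$1"; shift
    log="${LOGDIR:-/tmp}/${graph}.log"
    echo "starting $graph at $(date)" > "$log"
    # The real script would invoke the deployed graph, e.g. "$AI_RUN/$graph.ksh";
    # here the command to run is passed in as the remaining arguments.
    "$@" >> "$log" 2>&1
    rc=$?
    echo "finished $graph rc=$rc" >> "$log"
    return $rc
}

# Example: run a placeholder command in place of a deployed graph.
run_graph demo_graph true && echo "demo_graph OK"
```

The wrapper's exit status mirrors the graph's, so a scheduler such as Maestro or Autosys can detect failures directly.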

Environment: AIX, Windows 2000, Oracle 9i, Ab Initio, ERwin, Maestro, Perl, Shell scripts, ClearCase, ClearQuest

Fannie Mae

Aug 2004 to Feb 2005

Washington DC

ETL Consultant

The E-Business Data Warehouse application consists of loan information. The warehouse is used to evaluate standards and the loans coming into Fannie Mae. The results data from the E-Business Data Mart Reporting DB will be integrated into FMIS RDW.

. Designed a document for integration of Autosys for Deal Factory

. Designed the Logical and Physical Model using ER Studio

. Developed a number of ODS objects, such as the case file summary

. Developed complex graphs in Ab Initio for Data Mart re-engineering

. Trained the Team members in Ab Initio

. Worked as Production support 24/7

. Created graphs for version logic

. Wrote code to reference a previously created driver file which identifies reopened deals

Environment: Solaris, Windows NT, Oracle 9i, 10g, Sybase, Ab Initio, ERwin, Business Objects, Autosys (JIL), Perl, Shell scripts

AOL

Jan 2004 to Jul 2004

Customer Connect Data Mart / ETL Developer

Tools: Ab Initio and Business Objects

The CCDM Project combines certain sub-account-level demographic information with impression and click-through data in order to present a more meaningful view of the AOL audience, both for advertising and programming purposes. Completion of the CCDM Project makes AOL demographic information and impression and click-through data available primarily for advertising purposes, but also for programming purposes.

* Designed HLD Mezzo for the project

* Developed the TRD along with team members

* Developed dependent process jobs

* Wrote code for the ccdm dep_auto recovery job

* Wrote code for the daily wait adlkup job

* Developed CCDM ORDemo process Autosys Flow

* Developed complex graphs such as update_household, update line item, and update contract

* Designed and developed custom component for loading data into Netezza

performance server

* Developed CCDM dependent process Autosys flow diagram

* Involved in standards team for integration of Ab Initio and Netezza

* Wrote the product support document (PSD)

Environment: Solaris, Windows NT, Oracle 9i, Sybase, Red Brick, Ab Initio, ERwin, Business Objects, Autosys (JIL), Netezza, Perl, Shell scripts

AOL

Jul 2003 to Dec 2003

20/20 / ETL Developer

The 20/20 Operational Optimization initiative provides an enhanced method of developing a standard production transformation of consolidated plus group and web stats systems data. This system, currently maintained by a small group of analysts without any ongoing operational support from AOL technical teams, supports the delivery of a data infrastructure that improves access to currently available data and enhances the analytical capabilities of the RPM organization.

. Developed SDD design document

. Involved in design review

. Developed graphs using complex transforms using Rollup packages

. Created EME standards for repository creation using Air commands

. Involved in web stats configuration with warehouse data

. Integrated existing project into 20/20 architecture

Environment: Solaris, Windows NT, Oracle 9i, Sybase, Red Brick, Ab Initio, ERwin, Business Objects, Autosys (JIL), Perl, Shell scripts

Volvo (Mack Trucks)

Oct 2000 to May 2003

Data Warehousing Designer & Developer

This system is a data warehouse which provides data for analysis and reporting to the Finance, Sales, and HR departments. CSO (Customer Sales Orders) information is stored in this system. Data is extracted from different systems like SAP, PeopleSoft, Oracle ERPs, etc., and loaded into the data warehouse, with Oracle 8 as the back end, Oracle client as the middle tier, and Business Objects as the front end.

Responsibilities:

* Coded and maintained Finance Backlog, CSO and B2C modules.

* Created and Maintained UTF Testing database (managing space, creating

tables, indexes, db links and other changes).

* Designed and documented flow diagrams using Visio 2000.

* Maintained user security and access with Business Objects Supervisor.

* Extensively worked on Web Intelligence to create reports.

* Extensively worked on Broadcast Agent and scheduling reports.

* Extensively worked on SQL Loader, Export and Import utilities for loading

data.

* Extensively worked on PL/SQL to code and maintain packages and stored procedures for the Finance Backlog and CSO modules.

* Scheduled and monitored jobs via the Maestro Remote Console.

* Tuned SQL queries and performed code debugging using SQL Navigator.

* Developed transformations to modify the data.

* Created batches and sessions for server manager.

* Worked on performance and tuning for transformation and memory process.

* Worked on repository administration for security, folders, repository backup and copy, and metadata exchange.

* Created update strategies for row operations and target table options,

aggregations, refresh strategies, and changing dimensions.

* Conducted code walk-throughs of all ETL work units and provided feedback

to ensure code is written efficiently, meets the design specifications,

and is of high quality.

* Configured the Development, Acceptance, and Production environments for Informatica sessions
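The SQL*Loader work described above is driven by a control file. A minimal hypothetical example of the kind of control file involved; the file, table, and column names are illustrative assumptions, not the originals:

```
-- Hypothetical SQL*Loader control file; file, table, and columns are illustrative
LOAD DATA
INFILE 'cso_orders.dat'
APPEND
INTO TABLE cso_backlog
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(order_id, customer_id, order_date DATE "YYYY-MM-DD", amount)
```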

Environment: Solaris, Windows 95, Oracle 8i, Business Objects (Broadcast Agent, Business Objects Auditor, Web Admin tool, Supervisor, Designer, Web Intelligence/InfoView), SQL Navigator, SQL*Net8, PL/SQL, Server Manager, ERwin, K-Shell programming, Brio Query Designer (data warehousing tool), Maestro Remote Console (job scheduler), QVT/Term and Reflection for Windows (used for Telnet, FTP, and remote connections), Lotus Notes, and Tomcat Application Server

Sprint

Jun 2000 to Sep 2000

Customer Relationship Management

Systems Analyst & Developer

This system, Customer Relationship Management (CRM), builds a chain between customers and suppliers. It provides service by phone, fax, voicemail, and email. CRM's basic responsibility is to store customer information, such as contacts and other customer details, and to provide information to customers about the services available. It provides both software and hardware support to its customers.

Responsibilities:

* Responsible for the System Study & Business Requirements Analysis &

Documentation.

* Designed LLDs using Visio 2000.

* Provided the logical model after detailed discussions with the end users and application developers. Also did the physical implementation of the logical model.

* Designed the schema using ERwin and did table sizing.

* Designed and maintained business objects universe.

* Created load specifications for ETL development using Informatica

* Performed data analysis and data mapping between file/relational sources and targets

* Designed mappings and transformations

* Configured the Development, Acceptance, and Production environments for Informatica sessions

* Maintained UNIX shell scripts to execute Informatica Batches/Sessions.

* Coordinated production turnovers; prepared documentation and turnover procedures

* Coordinated with business analysts, source focal points, and the reporting team to address data-related issues

* Monitored and continuously improved the ETL processes

* Developed PL/SQL packages and stored procedures for the Customer Orders and Billing System front end.

* Coded the shell scripts and SQL scripts; scheduled and monitored the cron jobs.

* Wrote Functional Specs for packages and stored procedures.

* Coded, compiled and implemented packages, DB Triggers and Stored

procedures.

* Worked with Toad for code debugging.

* Worked with the version control and issue tracking system Razor
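The cron scheduling mentioned above uses the standard five-field crontab format (minute, hour, day of month, month, day of week). A hypothetical fragment; the script paths and times are illustrative assumptions:

```
# Hypothetical crontab entries -- paths and times are illustrative
# min hour dom mon dow  command
30 1 * * 1-5  /apps/crm/bin/load_customer_orders.ksh >> /apps/crm/logs/load.log 2>&1
0  6 * * *    /apps/crm/bin/check_sessions.ksh
```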

Environment: HP-UX, Windows NT, Oracle 8i, SQL*Net 8, DBA Studio, PL/SQL, Oracle Enterprise Manager (OEM GUI), Server Manager, ERwin, Toad, Razor (version control and issue tracking system), Informatica PowerMart, and K-Shell programming.

Lucent Technologies

Jul 1999 to May 2000

Global Regulatory Compliance System

Data Warehousing Specialist

This system, GRCS, provides material safety and hazard information to employees and customers of GE Plastics. The system consists of three stages. The first stage consists of extract and load programs, which extract data from the legacy system and load it into the GRCS Oracle database. The second stage generates reports in Business Objects to retrieve data from the back end on user request. The third stage makes GRCS more customer-friendly by using Web technology: the GRCS Web page is designed to handle GRCS customer requests more efficiently and quickly.

Responsibilities:

* Analyzed and designed the data warehouse schema using ERwin.

* Created the database objects and did table sizing.

* Maintained the user access and document access using Supervisor.

* Design Universe using Business Objects Designer.

* Developed reports using Business Objects for GRCS Reporting System.

* Extensively worked on designing the procedures to cleanse data and extract it from all source systems into the data warehouse. The data was standardized to store the various business units in tables.

* Designed mappings and transformations

* Configured the Development, Acceptance, and Production environments for Informatica sessions

* Maintained UNIX shell scripts to execute Informatica batches/sessions.

* Monitored and continuously improved the ETL processes

* Coded and maintained PL/SQL procedures/functions to build business rules

to load data.

* Prepared scripts for day-to-day maintenance of the database.

* Wrote shell scripts for maintenance of the databases.

* Coded, tested and implemented Packages, Stored Procedures and DB Triggers

* Analyzed the data and streamlined extracting and loading process.

* Used Import and Export utility and SQL*Loader for data loading.

Environment: Win NT, Solaris, Oracle 8.0.1, Server Manager, PL/SQL, ODBC,

ERWin, Business Objects (Supervisor, Universe Designer and Client-Server

Front end), Informatica Powermart and MS SQL Server.

Lucent Technologies

May 1998 to Jun 1999

HR Data Warehouse

Data Warehousing Specialist

This system was developed in view of the increasing demand for accurate and faster retrieval of data from huge databases. It efficiently serves the needs of corporate managers who want to access the corporate database for planning effective strategies. The data is loaded into Oracle HR ERP tables using SQL*Loader. Data is queried and reports are generated whenever required using Business Objects.

Responsibilities:

* Involved in Design Documentation and Software Requirement Specifications.

* Designed the Data Warehouse using Oracle HR ERP functionality

* Extensively worked on ERwin to design the schema

* Extensively worked on the Business Objects Universe Designer to design the universe.

* Extensively worked with the Business Objects client-server front end to develop complex reports.

* Coded, tested and implemented Packages, Stored Procedures and DB Triggers

* Responsible for testing and preparing test cases and implementation.

Environment: Win NT, Solaris, Oracle 8.0.1, PL/SQL, PRO*C, ERWin, Oracle HR

ERP, Business Objects (Universe Designer & Client-Server Front end) and

Informix 7.0.

Lucent Technologies

Sep 1997 to Apr 1998

Telecommunications Switch Maintenance

Systems Analyst/Programmer

This system closely monitors the collection of data and the building of base blocks. Raw data received from the telecommunications switch is processed into good and bad data. Good data is stored in flat files and processed. This data is used to build the base blocks, which are used in the generation of reports.

Responsibilities:

* Involved in System Study & Business Requirements Analysis &

Documentation

* Maintained C Helper files.

* Coded Packages and Stored Procedures.

* Coded C programs to maintain FFI system.

* Coded C programs to monitor and maintain FFI and Monitor module.

* Coded Pro*C programs to load data into the Oracle database.

* Coded stored procedures.

* Worked with Build team to prepare a testing environment.

* Used the version control tool Visual SourceSafe.

Environment: C, C++, Oracle 7.3, SCO UNIX, Windows NT, Visual SourceSafe, and Microsoft Visual InterDev

Core Technologies

Aug 1995 to Jul 1997

Roles & Responsibilities:

. Involved in the generation of different types of reports for different sets of data to target customer segments, which can be sliced and diced and drilled through for further multi-dimensional OLAP analysis.

. Defined, created, maintained and published Power-Cube, Impromptu

reports on Enterprise Server.

. Extensively used Transformer to move relative data mart data into

Power-Cubes and cube groups by designing Dimensional Maps and

Transformer Model to perform On-Line analytical processing and data

mining through PowerPlay reports.

. Configured multiple user access using Authenticator and defined

dimension views and user class views for different user groups to

provide high security.

. Responsible for requirements study, systems analysis, design, and creation of the USS Catalog, user classes, drill-through reports, sub-reports, and complex queries, and for publishing the reports using Impromptu Web Reports.

. Involved in deploying the cube.

Environment:

Cognos Impromptu, PL/SQL, ERwin, Cognos Visualizer, Cognos Query, Cognos IWR, Oracle 7, 8, Access Manager, UNIX scripting, C, C++


