
EXPERTISE SUMMARY

> Innovative professional with more than 10 years of progressive experience in the IT industry, with a prime focus on the analysis, design, development, customization, and maintenance of business applications to deliver leading-edge software solutions.

> Direct experience in implementing enterprise data management

processes, procedures, and decision support.

> Hands-on experience with data architecting, data mining, large-scale

data modeling, and business requirements gathering/analysis.

> Executed data warehousing / OLTP database projects through the full software life cycle, from requirements gathering, data profiling, data modeling, and technical specifications to design, development, and implementation.

> Extensive experience gathering business requirements, implementing business processes, identifying risks, performing impact analysis, and creating use-case diagrams in Rational Rose to depict business requirements.

> Performed data analysis and database design in addition to testing and implementation, including system-focused requirements analysis, data profiling, and quantitative assessment of business needs.

> Extensively involved with end users and developers to facilitate optimum data warehouse design, development, and enhancement.

> Hands-on data modeling experience designing OLTP databases and Data Vault, star schema, and snowflake schema models in Power Designer 15.

> Strong working experience with Informatica PowerCenter 8.1 (Repository Manager, Server Manager, Workflow Manager, Mapping Designer, Transformation Designer, and Mapplet Designer) to extract, transform, and load data.

> In-depth exposure to and understanding of DWH concepts and the ETL process using Informatica.

> Worked on transformations such as Stored Procedure, Source Qualifier, Expression, Filter, Lookup, Update Strategy, Joiner, Router, and Aggregator.

> Developed Informatica 8.1 mappings to load DWH tables from OLTP.

> Made recommendations for ETL design and performance enhancements for large-volume data loads and processing.

> Provided in-depth technical consultation to ensure development of efficient ETLs by utilizing established methodologies and best practices.

> Managed medium to large-sized teams in software development and maintenance projects across onsite-offshore models.

> Defined best practices and standards across the organization for data modeling and ETL implementation.

> Defined, established, and communicated data architecture guidelines, standards, and best practices.

> Experience in backend programming, including schema and table design, stored procedures, triggers, views, and indexes.

> Possesses strong performance-tuning experience: identifying bottlenecks, rebuilding indexes, using query hints, using clustered tables, partitioned tables, and large tablespaces, and creating materialized views (see the SQL sketch after this list).

> Conducted system focused requirements analysis and quantitative

assessment of business needs.
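
A minimal sketch of the tuning techniques named above, assuming an Oracle-style SQL dialect; the table, index, and view names are illustrative, not taken from any project in this resume:

    -- Range-partition a large fact table so date-bounded queries prune partitions.
    CREATE TABLE order_fact (
        order_id    NUMBER       NOT NULL,
        order_date  DATE         NOT NULL,
        amount      NUMBER(12,2)
    )
    PARTITION BY RANGE (order_date) (
        PARTITION p2009 VALUES LESS THAN (DATE '2010-01-01'),
        PARTITION p2010 VALUES LESS THAN (DATE '2011-01-01')
    );

    -- Index the filter column; rebuild the index when it fragments.
    CREATE INDEX idx_order_fact_date ON order_fact (order_date);
    ALTER INDEX idx_order_fact_date REBUILD;

    -- Precompute an expensive aggregate as a materialized view.
    CREATE MATERIALIZED VIEW mv_monthly_sales
    REFRESH COMPLETE ON DEMAND AS
    SELECT TRUNC(order_date, 'MM') AS order_month,
           SUM(amount)             AS total_amount
    FROM   order_fact
    GROUP  BY TRUNC(order_date, 'MM');

    -- Hint the optimizer toward the index when it picks a poor plan.
    SELECT /*+ INDEX(f idx_order_fact_date) */ f.order_id, f.amount
    FROM   order_fact f
    WHERE  f.order_date >= DATE '2010-01-01';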

EDUCATIONAL QUALIFICATIONS

Master of Computer Applications from D.A.V.V. University

CERTIFICATIONS

Six Sigma Green Belt certified.

Sun Java certified.

PROFESSIONAL TRAINING UNDERGONE

-Informatica

-Cognos

-OOAD

TECHNICAL SKILLS

-Informatica 8.1

-Data Modeling

-Power Designer 12, 15

-DB2 V9.1

-Data Vault

-Data Warehouse

-Data Analysis

-Database Design

-ESP

-JCL

-SAS

-Siebel Analytics 7.8

-COGNOS

-Visio 2003

-UML

-J2EE

-STRUTS 2

PROFESSIONAL EXPERIENCE

YASH Technologies

John Deere, Moline, IL    ETL Architect & Data Architect    Jul '09 to Present

Dealer Inventory Tracking

For years, dealers have had inadequate processes and software to accurately

track and locate their attachments/bundles inventory. This includes

checking in inventory when it arrives at a dealership, moving inventory

within a dealership and checking out inventory when it leaves the

dealership. This has resulted in a general lack of confidence in the

inventory availability shown in the dealer systems, which leads to

unnecessary inventory orders, lost inventory, and inventory write-offs.

Upon successful completion of this project, in conjunction with related

project(s), dealers will have processes to follow and systems to use so

that they can accurately track their inventory. This will add efficiency to the dealer's ordering and customer order fulfillment processes, resulting in

improved customer satisfaction and a reduction in lost inventory and

inventory write-offs. The dealer inventory being tracked will be centrally

stored at Deere. When enough dealers adopt the system, this additional data

will enhance the dealer inventory locator and the factory inventory build

scheduling processes.

Responsibilities:

. Provided leadership and strategic direction on data management and

analytics.

. Designed logical and physical data models in Power Designer to put the database structure in place for Orders & Products and Dealer Inventory Tracking data, so that web interface and ETL implementation could begin.

. Led data model reviews with business analysts and business users, explaining the logical data model in depth to ensure it aligned with business requirements.

. Analyzed the business requirements to design, architect, develop and

implement highly efficient, highly scalable ETL processes for

Inventory data.

. Prepared, maintained, and published ETL documentation, including source-to-target mappings and business-driven transformation rules.

. Designed functional and technical data solutions following the John Deere System Delivery process.

. Worked closely with application teams, business analysts, data modelers, database administrators, and reporting teams to ensure the data solution met business requirements.

. Designed the COMAR order data extraction and load process for EIM order data requirements, using SAS and web components to load current and historical order data into the PI Order data mart.

. Designed and reviewed Informatica ETL processes, ensuring smooth development and testing against the ETL requirements.

. Produced the EIM Reports requirements and report designs, including source-to-target mappings, which served as baseline documents for EIM Reports development.

. Designed and developed the Operational Data Store for EIM report requirements, supporting business users in their decision-making.

Tools/Technologies:

Informatica 8.1, DB2 V9.1, Power Designer 15, MQ Series, SAS, ESP, J2EE, Shell Script, AIX, Mainframe, IMS database

Logan Britton

John Deere, Moline, IL    ETL Architect & Data Analyst    Mar '09 to Jun '09

PIN Expansion

This project delivers the flexibility to accommodate changes to Product Identification Numbers (PINs) with little to no detrimental impact on existing JD IT systems.

It provides such a flexible environment through an internal standard for the identity of a machine that is not subject to, or dependent on, a PIN format. Using this approach, existing IT systems would not be impacted by a PIN format change if they had no mandate to adopt it.

This implementation would allow existing IT systems that have no business need to move to a 17-character PIN to continue using a 13-character PIN. Systems that have a business need to utilize the 17-character PIN would be able to do so. Interoperability between systems utilizing different PIN formats would be possible using an enterprise shared service; a SQL sketch of one such crosswalk follows.
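
A minimal sketch of how a format-independent machine identity with a PIN crosswalk might look in SQL. This is an assumed, illustrative design, not the actual Deere implementation; all object names and the sample PIN are hypothetical:

    -- Internal surrogate identity, independent of any PIN format.
    CREATE TABLE machine_identity (
        machine_id   INTEGER     NOT NULL PRIMARY KEY
    );

    -- Each machine can carry one PIN per format; systems read the format they need.
    CREATE TABLE machine_pin (
        machine_id   INTEGER     NOT NULL REFERENCES machine_identity,
        pin_format   CHAR(2)     NOT NULL,           -- '13' or '17'
        pin_value    VARCHAR(17) NOT NULL,
        PRIMARY KEY (machine_id, pin_format)
    );

    -- The shared service resolves a 13-character PIN to its 17-character form.
    SELECT p17.pin_value
    FROM   machine_pin p13
    JOIN   machine_pin p17
           ON  p17.machine_id = p13.machine_id
           AND p17.pin_format = '17'
    WHERE  p13.pin_format = '13'
    AND    p13.pin_value  = 'CD6068G123456';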

Responsibilities:

. Communicated with business users, translating business requirements into technical specifications.

. Analyzed source-system data and current client-side reports to gather requirements for the Data Vault and data model.

. Performed data analysis to design functional and technical solutions

including data and process flow diagrams.

. Actively designed the PIN Expansion data model to achieve interoperability between systems utilizing different PIN formats.

. Defined the data migration strategy and led the data migration activities from the old database to the new database.

. Analyzed the business requirements to design, architect, develop, and implement highly efficient, highly scalable ETL processes to load product data in different PIN formats.

. Designed and implemented a product Data Vault to extract, transform, and load product data along with a history of changes to product attributes.

. Worked closely with application teams, business analysts, data designers/architects, database administrators, and reporting teams to ensure the ETL solution met business requirements.

. Prepared, maintained, and published ETL documentation, including source-to-target mappings and business-driven transformation rules.

. Designed and reviewed Informatica ETL processes, ensuring smooth development and testing of the solution being implemented.

. Supported the project teams during user acceptance testing and assisted in developing the post-deployment data maintenance plan.

Tools/Technologies:

Informatica 7.1, DB2 UDB V9.1, Microsoft Visio 2003, Power Designer 12, ESP, JCL, AIX, Mainframe

Logan Britton

John Deere, Moline, IL    ETL Architect & Data Analyst    Nov '08 to Mar '09

Products Identity V4

The following requirements and problem solutions were accomplished, addressed, and delivered as part of this implementation.

Agreements implementation will allow us to finally answer questions like "What dealers can sell/order/service this particular product?" This is needed to make it easier for customers to interact with Deere and find a place that can sell them the product they want. It will also enable Product Identity to become the source system for product-to-agreement mappings. When this is done, we will be able to lessen the administrative overhead of maintaining these mappings by assigning agreements to whole categories of products rather than one manufacturing model at a time.

We need to be able to show current, historical, and future products to different audiences at different times. Adding start and end dates to categories, and the ability for the component to filter based on date, provides this capability. Future product is needed for work with products in field testing. Other applications, such as Deere.com and Configurator, need to show only the current products we are selling to customers. Some applications, such as JDAIM and JDQuote, need all current and historical models that a customer could be using. Series also change over time, and we need to be able to show the current view of how we are marketing our products.

Factory codes will be used immediately to help applications show applicable lists of models or detail machine codes to people. Keeping those lists short makes it easier and faster for people to find what they need and get the job done, and it improves application performance.

Currently, a Detail Machine Code and a Base Machine Code are unique to a Decal Model; a PCI Model is not unique to a Decal Model. Today the relationship structure in Product Identity is DMC -> BMC -> PCI -> Decal Model, meaning that if you start with a Detail Machine Code you cannot always reach a unique Decal Model, because we do not maintain a direct relationship. This causes issues for analytical systems and JDLink, which use the Detail Machine Code to get to the Decal Model and now get multiple options, as the query sketch below illustrates.
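
A hedged illustration of that fan-out as a SQL query over hypothetical tables mirroring the DMC -> BMC -> PCI -> Decal Model chain (not the actual Product Identity schema):

    -- Because a PCI model maps to more than one decal model, some detail
    -- machine codes resolve to several decal models instead of exactly one.
    SELECT d.dmc_code,
           COUNT(DISTINCT dm.decal_model_id) AS decal_models_reached
    FROM   detail_machine_code d
    JOIN   base_machine_code   b  ON b.bmc_id  = d.bmc_id
    JOIN   decal_model         dm ON dm.pci_id = b.pci_id
    GROUP  BY d.dmc_code
    HAVING COUNT(DISTINCT dm.decal_model_id) > 1;   -- the ambiguous DMCs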

Responsibilities:

. Designed and documented the system As-Is and To-Be Architecture.

. Provided strategic direction for data integration to build processes that deliver the required business data.

. Analyzed the business requirements to design, architect, develop, and implement highly efficient, highly scalable ETL processes.

. Worked closely with application teams, business analysts, designers, database administrators, and reporting teams to ensure the ETL solution met business requirements.

. Designed and developed the database changes required for the proposed To-Be system.

. Prepared, maintained, and published ETL documentation, including source-to-target mappings and business-driven transformation rules.

. Designed and reviewed Informatica ETL processes, ensuring smooth development and testing for the project.

. Improved database performance by identifying bottlenecks, tuning, rebuilding indexes, and using clustered tables.

Tools/Technologies:

Informatica 7.1, DB2 V8.1, AIX, Shell Script, Microsoft Visio 2003, ESP, Mainframe

Logan Britton

John Deere, Moline, IL    Data Analyst    Sept '08 to Nov '08

Products Identity Globalization

This implementation enables users to know which models are allowed to be sold in which countries. Business rules around this have been incorporated into the logic of the ETL process, which loads the Product Identity system with models and countries to feed a web application that users consult to determine a model's availability for sale in a country.
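
A minimal sketch of the kind of model-country availability lookup such a web application might run; the table, column names, and sample values are illustrative assumptions, not the actual Product Identity schema:

    -- One row per model per country in which it may be sold.
    CREATE TABLE model_country_availability (
        model_code    VARCHAR(20) NOT NULL,
        country_code  CHAR(2)     NOT NULL,   -- ISO 3166-1 alpha-2
        PRIMARY KEY (model_code, country_code)
    );

    -- The web application's core question: can this model be sold in this country?
    SELECT COUNT(*) AS is_sellable
    FROM   model_country_availability
    WHERE  model_code   = '6068G'
    AND    country_code = 'DE';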

Responsibilities:

. Executed data profiling steps on source data to derive the implementation logic for the ETL process.

. Conducted system-focused requirements analysis and quantitative assessment of business needs to facilitate optimum database design, development, and enhancement.

. Analyzed the business requirements to design, architect, develop, and implement highly efficient, highly scalable ETL processes.

. Worked closely with application teams, business analysts, data designers, database administrators, and reporting teams to ensure the ETL solution met business requirements.

. Prepared, maintained, and published ETL documentation, including source-to-target mappings and business-driven transformation rules.

. Designed and reviewed Informatica ETL processes, ensuring smooth development and testing for the project.

. Actively involved in data model design to ensure the right database model was in place before ETL design began.

Tools/Technologies:

Informatica 7.1, DB2 V8.1, AIX, Shell Script, Microsoft Visio 2003, Power Designer 12, ESP

Logan Britton

John Deere, Moline, IL    Data Analyst & ETL Tech Lead    Mar '08 to Sept '08

Construction & Forestry Data Integration

The purpose of this project is to utilize an enterprise data solution for accessing product information. As we proceed to implement JDLink across the enterprise, there is a need to retrieve product information from a centralized location. Product Identity (PI) has been defined as the data source for a standard categorization of John Deere product offerings. This standard consists of a defined taxonomy of categorized product offerings, used by users and systems alike, so that we have a consistent categorization of John Deere product offerings that improves the effectiveness and efficiency of business operations.

The project consisted of:

. Necessary changes will be made to the JDLink applications to use PI

as a standard data source to retrieve information on products.

. Necessary changes will be made to the ETL processes that loaded the

PI database to include C&F products.

. Necessary changes will be made to the component that will be used within the JDLink applications to access the PI data.

Responsibilities:

. Performed data profiling steps and analyzed source-system data to define the extraction and transformation logic, as part of the Data Analyst role.

. Analyzed the business requirements to design, architect, develop, and implement highly efficient, highly scalable ETL processes to load C&F product information into the PI database.

. Worked closely with application teams, business analysts, data designers/architects, database administrators, and reporting teams to ensure the ETL solution met business requirements.

. Prepared, maintained, and published ETL documentation, including source-to-target mappings and business-driven transformation rules.

. Designed and reviewed Informatica ETL processes, ensuring smooth development and testing for the project.

Tools/Technologies:

Informatica 7.1, DB2 V8.1, AIX, Shell Script, Microsoft Visio 2003, Power Designer 12, JCL, ESP

Logan Britton

John Deere, Moline, IL    ETL Architect & Tech Lead    Jan '07 to Mar '08

Enterprise Dashboards

The Enterprise Dashboards ETL system is designed to support the flexibility users are looking for and to meet their requirement for an integrated system that analyzes business data in the form of metrics through a common tool. The ultimate goal is to provide a common reporting interface, used across the whole organization, as a common platform for slicing and dicing the metrics data, while the source data is entered through different external systems in the form of raw elements that can be associated with the metrics defined in the MMS system.

Data can be entered either through the MMS application interface or through an automated source-data extraction process. Data is extracted from different sources at different frequencies and loaded into a staging area, where validation and cleansing happen before further processing. Any record entered into the staging area is flagged as not processed, and the next ETL process that sources data from the staging area picks up only those records that have not yet been processed; a SQL sketch of this pattern follows.
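
A minimal SQL sketch of this not-processed flag pattern; the staging table and column names are illustrative assumptions, not the actual MMS schema:

    -- New arrivals land in the staging area flagged as not processed.
    INSERT INTO stg_metric_element (element_id, metric_code, element_value, processed_flag)
    VALUES (1001, 'ON_TIME_DELIVERY', 98.7, 'N');

    -- The next ETL run sources only the unprocessed rows...
    SELECT element_id, metric_code, element_value
    FROM   stg_metric_element
    WHERE  processed_flag = 'N';

    -- ...and flips the flag once those rows are loaded onward, so the
    -- following run will not pick them up again.
    UPDATE stg_metric_element
    SET    processed_flag = 'Y'
    WHERE  processed_flag = 'N';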

Responsibilities:

. Analyzed the business reporting requirements to design, architect, develop, and implement a highly efficient, highly scalable reporting system.

. Designed and implemented the data architecture and business intelligence solution for the business data requirements.

. Designed and created conceptual, logical, and physical data models with commonly used data modeling tools such as Sybase Power Designer.

. Worked closely with application teams, business analysts, data modelers, database administrators, and reporting teams to ensure the ETL solution met business data requirements.

. Prepared, maintained, and published ETL documentation, including source-to-target mappings and business-driven transformation rules.

. Designed and reviewed Informatica ETL processes, ensuring smooth development and testing for the project.

Tools/Technologies:

Informatica 7.1, DB2 (host), AIX, Shell Script, QMF for Windows 8.1, WinSQL 3.5, Microsoft Visio 2003, Power Designer 12, ESP, JCL, SAS

Logan Britton

John Deere, Moline, IL    ETL Tech Lead    Sept '06 to Dec '06

Product Identity

This application provides business users with an easy way to categorize equipment. The desired categorization scheme requires that each piece of equipment (John Deere and competitors) be assigned a three-tiered categorization made up of Make-Machine-Model.

Establishing an authoritative data source for equipment categorization has significantly improved the current environment by reducing redundant individual development efforts, lowering overall costs, and improving SVA. Data integration efforts commonly acquire data from a number of external data sources and perform an integration function. With regard to equipment data, it is almost always required to categorize the equipment prior to use. The categorization of equipment data is used to enable interactive navigation and search capabilities. This data integration process establishes a standard for equipment categorization: the integrated equipment categorization scheme.
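
A minimal sketch of a three-tiered Make-Machine-Model categorization in SQL; the DDL is an assumed illustration, not the actual Product Identity schema:

    -- Tier 1: make (John Deere and competitors).
    CREATE TABLE make (
        make_id    INTEGER     NOT NULL PRIMARY KEY,
        make_name  VARCHAR(40) NOT NULL
    );

    -- Tier 2: machine type within a make.
    CREATE TABLE machine (
        machine_id    INTEGER     NOT NULL PRIMARY KEY,
        make_id       INTEGER     NOT NULL REFERENCES make,
        machine_name  VARCHAR(40) NOT NULL
    );

    -- Tier 3: model within a machine type; every piece of equipment maps here.
    CREATE TABLE model (
        model_id    INTEGER     NOT NULL PRIMARY KEY,
        machine_id  INTEGER     NOT NULL REFERENCES machine,
        model_name  VARCHAR(40) NOT NULL
    );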

Responsibilities:

. Analyzed the business requirements to design, architect, develop, and implement highly efficient, highly scalable ETL processes.

. Worked closely with application teams, business analysts, data designers/architects, database administrators, and reporting teams to ensure the ETL solution met business requirements.

. Prepared, maintained, and published ETL documentation, including source-to-target mappings and business-driven transformation rules.

. Designed and reviewed Informatica ETL processes, ensuring smooth development and testing for the project.

. Developed the Informatica ETL process for Product Identity.

. Performed unit and integration testing.

Tools/Technologies:

Informatica 7.1, DB2 8, AIX, Shell Script, QMF for Windows 8.1, WinSQL 3.5, Microsoft Visio 2003, Power Designer 12, ESP, JCL

WIPRO

Pitney Bowes, Stamford, CT    ETL Tech Lead    Oct '05 to Aug '06

DMT Analytical Reporting

The DMT Field Service application is a transactional system that offers field service solutions for DMT users, so that servicing requirements are managed proficiently. The DMT application lets users work with the various features of asset, order, product, part, and inventory processing, and provides a strong reporting interface that can meet all aspects of their servicing requirements. The DWH load involves net-change record extraction from the transactional system, with validation and processing performed prior to populating the data mart (see the net-change sketch below). The ETL process that populates the DMT DWH is scheduled on a daily basis so that fresh data is available in the DWH to be picked up by the Siebel Analytics reports/dashboards. As part of this project, various analytics reports were developed that enable business users to measure business performance as efficiently as needed. This system has significantly improved the analysis environment, so that business users are better able to manage servicing requirements across the Americas.
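
A minimal sketch of a net-change (incremental) extract of the kind described above, assuming a last-updated timestamp on the transactional table and an ETL control table holding a watermark; all object names are illustrative, not from the DMT system:

    -- Pull only the rows changed since the last successful daily load.
    SELECT a.asset_id,
           a.status,
           a.last_update_ts
    FROM   dmt_asset a
    WHERE  a.last_update_ts > (SELECT c.last_load_ts
                               FROM   etl_load_control c
                               WHERE  c.subject_area = 'DMT_ASSET');

    -- After a clean load, advance the watermark for the next run.
    UPDATE etl_load_control
    SET    last_load_ts = SYSTIMESTAMP
    WHERE  subject_area = 'DMT_ASSET';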

Responsibilities:

Extensively involved in the requirements gathering and clarification process.

Coordinated with the offshore technical lead.

Designed and documented the ETL processes, which used transformations such as Source Qualifier, Joiner, Router, Aggregator, Stored Procedure, and Filter for DMT organization data.

Provided the best ETL solutions for the project, following Informatica development best practices.

Reviewed Informatica ETL designs and code, ensuring smooth development.

Documented defects and coordinated fixes with the offshore technical lead.

Coordinated all test-environment data preparation and planning.

Ensured development standards and guidelines were followed.

Worked closely with the data modeler to make sure the right database was in place before development began.

Tools/Technologies:

Informatica 7.1, Oracle 9i, Tomcat web server, HP-UX, Shell Script, Siebel Analytics 7.8.4, Siebel DWH

WIPRO

Pitney Bowes Inc., Stamford, CT    Production Support Tech Lead    Aug '04 to Sep '05

Siebel Analytics / Siebel DWH Production Support

The Siebel Analytics warehouse is an analytical environment that facilitates ad-hoc reporting for Sales & Marketing and Field Service users. The data for Siebel Analytics is sourced from Siebel OLTP, Certegy DM, DB2, external flat files, internal flat files, and SAS model files. Transactional data from these sources is loaded into the Siebel warehouse using the Informatica PowerCenter ETL tool and reported out through Siebel Analytics.

Responsibilities:

Handled production support efficiently and managed a team of three.

Reviewed Siebel Analytics configurations to verify their accuracy and functionality.

Provided suggestions to improve the performance, maintainability, and configuration of Siebel Analytics.

Identified causes of failures in Siebel Delivers, iBots, Siebel Analytics Scheduler, and Siebel Analytics Caching, and recommended solutions to mitigate these problems.

Customized the Siebel Analytics repository and reports to meet users' additional reporting requirements.

Identified and documented production issues and performed root cause analysis to prevent them from recurring.

Designed and developed customized ETL processes for the Siebel DWH to meet additional data-availability requirements.

Fine-tuned Informatica job performance and optimized the data load process.

Tools/Technologies:

Siebel Analytics 7.8.4/7.5, Siebel DWH, Informatica 7.1, Oracle 9i, Tomcat web server, HP-UX, DAC Server, Shell Script

WIPRO

Client: D&B, PA    Senior Developer    Jun '04 to Jul '04

Alternate Linkage Project

The Alternate Linkage Project was designed to read XML messages directly from MQ Series in a real-time environment; the messages require further processing before being loaded into the data warehouse. Because messages are processed in real time, users have refreshed data available in the data warehouse as soon as new or updated data is created in the source system.

Responsibilities:

Primarily responsible for designing and developing the ETL process that migrates large volumes of data messages into the data warehouse for reporting.

Worked in Informatica PowerCenter 6.22 and used transformations such as MQ Source Qualifier, XML Source Qualifier, Dynamic Lookup, Router, and Filter to develop ETL solutions for the project.

Performed project unit testing and system testing to ensure the developed code met requirements.

Tools/Technologies:

Windows 2000, Informatica RT 6.22, PowerConnect, MQ Series, Midstream, XML, PL/SQL, Oracle 8i, XSLT, Repository Manager, Server Manager, Workflow Manager, Mapping Designer, Transformation Designer, and Mapplet Designer.

GE Aircraft Engines, Cincinnati, OH    Designer & Developer    Sep '03 to Apr '04

Airline Data Collection

This system is designed to collect engine data from various airlines located across the world. The engine data is received as flat files, FTPed in a specific format to a designated location on an external server, and processed through Informatica ETL jobs that implement the business transformation logic and populate the data warehouse, making refreshed data available for Cognos reporting. The airlines' reliability reports are produced through Cognos reporting tools and made available from the Customer Web Center (CWC).

The reporting environment for engine data improved significantly, allowing airline customers to track engine performance on a daily basis.

Responsibilities:

Established and defined the basis and principles for estimating project effort and duration, while analyzing customer needs and recommending appropriate technical solutions.

Served as customer liaison and facilitated daily and weekly project review meetings with the client and team members.

Monitored and adjusted project schedules as needed to ensure timely delivery in line with client expectations, even when schedules were tight.

In the system architect role, ensured the right system architecture for the project and set up the system environment.

Actively participated in the project requirements analysis and clarification process.

Designed ETL solutions using various Informatica transformations such as Joiner, Expression, Aggregator, Lookup, and Router, which ensured smooth development of the system.

Provided in-depth technical consultation to ensure development of efficient ETLs by utilizing established methodologies and best practices.

Tools/Technologies:

Appworx, Crontab, Informatica PowerCenter 5.1, Cognos PowerPlay Transformer, Cognos PowerPlay, Cognos Impromptu, Cognos Visualizer, Shell Script.

GE Aircraft Engines, Cincinnati, OH    Systems Analyst    May '03 to Aug '03

Automation of Data loading for Digital Cockpit

This application was developed to automate the data loading process for the CMS Digital Cockpit data mart. The CMS Digital Cockpit is a cross-tab report developed to keep track of project work taking place across the vertical. Data is sourced from the HPSD database and loaded into flat files so that the Cognos reporting cubes can be refreshed to report out project status.

Responsibilities:

Completed requirements analysis and clarification for the project, delivering the baseline requirements document for designing the system.

Designed and documented ETL solutions to ensure smooth development for the project.

Developed ETL mappings in Informatica using various transformations such as Joiner, Expression, Router, Update Strategy, and Lookup.

Performed project unit testing and system testing to ensure defect-free development of the system.

Tools/Technologies:

Cognos Impromptu 6.0, PowerPlay 6.6, PPES 6.6, Informatica PowerCenter v5.1, Erwin 3.5.2, Oracle 8i on Sun Solaris and Windows 2000

GE Aircraft Engines, Cincinnati, OH    Systems Analyst    Oct '02 to Apr '03

Automation of Customer Dashboards

This project directly impacts the Customer Support Operation and provides

increased capability, functionality, and level of automation to the

Dashboard generation process. Customer Dashboards are graphical reports. Secondary effects of the Dashboard implementation will include

standardization of metrics and data presentation modes, as well as

increased accuracy, stability and consistency of metrics derived from

statistical analysis.

Responsibilities:

Ran the requirements clarification process and baselined project requirements.

Analyzed and designed the system to make sure developers understood how to implement the ETL and reporting solutions for the project.

Conducted code and unit test plan reviews.

Provided guidance throughout development and testing.

Prepared integration and user acceptance test plans.

Performed data modeling in Erwin to provide table structures sufficient to hold the processing data.

Tools/Technologies:

Informatica PowerCenter v5.1, Cognos EP Series 7.0/6.x (Impromptu Admin, PowerPlay, PPES, Visualizer), Erwin 3.5.2, Oracle 8i on Sun Solaris and Windows 2000.

GE Aircraft Engines, Cincinnati, OH    Technical Consultant    May '02 to Sep '02

PCUGA

PCUGA stands for Personal Computing Utilities and Gatekeeper Automation. It is a management-level initiative to bring the organization's hardware resource management into a single web portal.

Responsibilities:

Managed requirements using UML, creating use case diagrams.

Clarified requirements and developed an understanding of the overall system requirements.

Designed the system in UML, creating sequence diagrams, class diagrams, and activity diagrams that visually helped developers understand the solutions to be implemented.

Prepared the functional document corresponding to the baseline requirements document.

Developed a technical specification for each of the application components.

Involved in development and unit testing of the system.

Tools/Technologies:

EJB, Java, JavaScript, JSP, JDBC, Oracle 8i, WebLogic Server 5.0, Sun Solaris, JDeveloper 3.2.3, TeamSite 4.2.1, Rational Rose 2000, Toad, Erwin 3.5, Appworx 4 job scheduling, PL/SQL Script.

PRIOR EXPERIENCE

L&T Infotech    Software Engineer    Nov '99 to Apr '02


