Pen
SUMMARY
. 7+ years of experience in client/server business systems
design/analysis, GUI prototyping, data warehousing and application
development.
. Overall 7+ years of extensive experience using the ETL tool Ascential
DataStage 8.0.1/8.0/7.5/7.0/6.0/5.2/5.1/4.2 (DataStage Manager,
DataStage Administrator, DataStage Designer, DataStage Director,
Parallel Extender, and Orchestrate) for data migration from legacy
systems to Enterprise Data Warehouses/Data Marts.
. Extensive experience in creating repository, source and target
databases and in developing strategies for Extraction, Transformation
and Loading (ETL) using Ascential DataStage.
. Extensive experience using ETL tools such as Ascential DataStage and
SQL*Loader for data migration from legacy systems to Enterprise Data
Warehouses/Data Marts.
. Detailed knowledge of Ascential Designer, Director and Repository
Manager
. Experienced in all stages of the software lifecycle and in
architecting data warehouses using star schema and snowflake schema.
. Expertise in Extraction, Transformation and Loading (ETL) processes
using the Ascential DataStage tool.
. Strong experience in Oracle Database design and developing Stored
Procedures/Packages, Triggers, Cursors using SQL and PL/SQL.
. Proficient in Oracle Tools and Utilities such as SQL*Loader,
Import/Export, TOAD, SQL*Navigator, Oracle Designer, Compuware
DevPartner.
. Expertise in Data warehousing and Data migration.
. Extensive experience in design and development of Decision Support
Systems (DSS).
. Experience in writing, testing and implementing triggers, procedures
and functions at the database and form level using PL/SQL.
. Strong working and theoretical knowledge of Oracle 11g.
. Extensive experience in data modeling using Erwin and iGrafx.
. Knowledge of business intelligence software.
. Outstanding engineer capable of improving all aspects of development
lifecycle to include coding, testing, debugging and final rollout.
. Work experience with Teradata, DB2, Informix, Oracle, Ingres, SQL
Server 2005 and MS Access.
. Have excellent design, programming and testing skills with strong
analytical background.
EDUCATION
. Bachelor of Engineering from Osmania University, Hyderabad
. Post Graduate Diploma in Computer Applications, Hyderabad
TECHNICAL SKILLS
ETL:                  Ascential DataStage 8.0.1/8.0/7.5/7.0/6.0/5.2/5.1/4.2
                      (DataStage Manager, DataStage Administrator,
                      DataStage Designer, DataStage Director, Parallel
                      Extender, Orchestrate, Quality Stage/Integrity),
                      Informatica, Data Warehousing, Metadata, Data
                      Marts, OLAP, OLTP
Database Tools:       SQL*Plus, SQL*Loader, TOAD, Test Director
Databases:            Oracle 10g/9i/8i/8.0.3/7.3, SQL Server 2005, DB2,
                      MS Access, Teradata
Data Modeling:        Erwin 4.0, Designer 2000, Star Schema, Snowflake
                      Schema
Cleansing/Analysis:   Quality Stage/Integrity 7.5.2/7.5.1
Programming:          UNIX Shell Scripting, COBOL, HTML, SQL, PL/SQL,
                      C, C++, DOS Scripting, Pro*C, CGI/Perl, PHP
Environment:          Windows 3.x/95/98, Windows NT 4.0, UNIX,
                      MS-DOS 6.22
PROJECT EXPERIENCE
Shell, Houston, TX
Oct'08 to date
Sr. ETL Developer (DataStage)
Shell is a global group of energy and petrochemical companies. Shell's
aim is to meet the energy needs of society in ways that are
economically, socially and environmentally viable, now and in the
future. The project involved extracting data from different sources and
loading it into data marts. The major work involved cleansing the data,
transforming it into the staging area and then loading it into the data
marts. The data mart is an integrated data mine that feeds extensive
reporting, enabling insight into current and future customer needs
based on information received from the data warehouse. Work involved
analyzing the scope of the application, defining relationships within
and between groups of data, the star schema, etc.
Roles & Responsibilities:
. Involved in the development phase for business analysis and
requirements gathering; coordinated with team members to translate
business requirements into the data mart design.
. Designed logical and physical models using Erwin data modeling tool.
. Involved in data warehousing design and data integration with
different enterprise systems.
. Involved in data analysis, data mining, cleansing and mapping
documents, technical documentation, test cases and SDLCs and business
analysis methodologies.
. Created and loaded data warehouse tables such as dimension, fact and
aggregate tables using Ascential DataStage.
. Worked on modifying the star schema dimensions to meet the current
requirements. The granularity of the fact table was set to the most
atomic level according to the requirements.
. Responsible for getting the Data Dictionary, designing and modifying
the DataStage jobs using DataStage designer. Created mappings based on
the data dictionary.
. Worked on data migration from SAP and was responsible for developing
DataStage jobs, creating mappings for data transformation, extracting
data from SAP and loading it into the dimension tables.
. Worked on mapping, transforming and transferring data between a
variety of data formats. Worked on performance tuning, system
integrity and batch scripting. Developed database procedures,
functions, cursors, joins and triggers using PL/SQL.
. Data Quality and validation using Quality Stage, Metadata capture and
analysis using MetaStage.
. Extensively worked with Quality Stage to standardize data by trimming
spaces, removing nonprinting characters and correcting misspellings,
using the Transfer, Parse, Collapse, Select, Unijoin and Build stages.
. Designed the Target Schema definition and Extraction, Transformation
and Loading (ETL) using Data stage.
. Mapped Data Items from Source Systems to the Target System.
. Used the Datastage Designer to develop processes for extracting,
cleansing, transforming, integrating, and loading data into data
warehouse database.
. Designed and developed PL/SQL Procedures, functions, and packages to
create Summary tables.
. Exporting of DataStage Components & Packaging of Projects using
DataStage Manager and to create backups.
. Used AUTOSYS for scheduling and loading of jobs in UNIX.
. Involved in data integration, migration.
. Developed Quality Stage jobs using the Standardize, Investigate,
Match and Survive stages.
. Prepared the master test plan and performed unit, system, integration
and volume testing. Prepared implementation plans and was involved in
implementation.
. Developed test cases and performed unit, system, integration and UAT
testing.
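The standardization steps above (trimming spaces, removing nonprinting
characters) were performed in Quality Stage; purely as an illustrative
sketch, the same cleansing can be expressed in plain shell. The file
names and sample record here are hypothetical examples:

```shell
#!/bin/sh
# Illustrative sketch of the standardization performed in Quality Stage:
# strip nonprinting characters, collapse runs of spaces, trim around
# delimiters. File names and data are hypothetical examples.
printf 'John  \007Doe | 123 Main St |Houston\n' > raw_extract.dat

tr -cd '[:print:]\n' < raw_extract.dat \
  | sed -e 's/  */ /g' -e 's/ *| */|/g' -e 's/^ *//' -e 's/ *$//' \
  > clean_extract.dat

cat clean_extract.dat
```

Against real volumes this logic lived inside Quality Stage jobs rather
than ad hoc scripts.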
Environment: DataStage EE 8.0.1/7.5 (Designer, Director, Manager), SAP
R/3, ERwin 4.1.5, SQL Server 2005, Oracle 10g/9i, PL/SQL, AutoSys,
Hummingbird Connectivity (FTP), TOAD, UNIX, Shell scripting, Windows
NT 4.0
Target Corporation, Minneapolis, MN
Sep'07 to Sep'08
Sr. ETL Developer (DataStage)
Target Corporation is a nationwide chain of retail stores which focuses
on research and planning initiatives for store identification projects,
market research, graphic design and financial analytics. The Repeatable
Infrastructure and Pattern (RIP) tool provides a proactive, well-
orchestrated, enterprise-wide approach to infrastructure planning that
is automated, and provides a horizontal view of all project-related
infrastructure needs and standard communication across business groups
in a timely manner throughout the project life cycle. Work involved
analyzing the scope of the application, defining relationships within
and between groups of data, the star schema, etc.
Roles & Responsibilities:
. Developed Architecture for building a Data warehouse by using data
modeling tool Erwin.
. Involved in creating entity relational and dimensional relational data
models using Data modeling tool Erwin and preparing Design Documents,
Performance Review, coding and Test Case Specifications.
. Designed logical and physical models using Erwin data modeling tool.
. Enforcing data Integrity and Business Rules.
. Coordinated with team members to translate business requirements into
the data mart design. Created and loaded data warehouse tables such as
dimension and fact tables using Ascential DataStage.
. Worked on modifying the star schema dimensions to meet the current
requirements. The granularity of the fact table was set to the most
atomic level according to the requirements.
. Responsible for getting the Data Dictionary, designing and modifying
the DataStage jobs using DataStage designer. Created mappings based on
the data dictionary
. Designed the Target Schema definition and Extraction, Transformation
and Loading (ETL) using Data stage.
. Mapped data items from the source systems (Oracle, mainframe, Sybase)
to the target system.
. Used the DataStage Designer to develop processes for extracting,
cleansing, transforming, integrating and loading data into the data
warehouse database. The source database was Oracle 9i; the targets
were Oracle 10g and Teradata. Involved in creating tables in the
Teradata database and used MultiLoad to load them.
. Designed and developed PL/SQL Procedures, functions, and packages to
create Summary tables.
. Exporting of DataStage Components & Packaging of Projects using
DataStage Manager and to create backups.
. Used Perl scripting to schedule a few DataStage jobs.
. Used AUTOSYS for scheduling and loading of jobs in UNIX.
. Involved in data integration, migration.
. Prepared the master test plan and performed unit, system, integration
and volume testing. Prepared implementation plans and was involved in
implementation.
. Developed test cases and performed unit, system, integration and UAT
testing.
. Involved in implementing the developed components and provided
on-call support.
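The Teradata loads above used MultiLoad; a minimal sketch of how such a
load might be driven from shell follows, generating the MultiLoad
control script and, in a real run, handing it to the mload utility. All
object names (dw_db, customer_dim, cust.dat) and the logon string are
hypothetical, and the mload invocation is only echoed since the utility
is site-specific:

```shell
#!/bin/sh
# Sketch: generate a Teradata MultiLoad script for a dimension-table
# load. All database, table and file names are hypothetical examples.
cat > load_customer.ml <<'EOF'
.LOGTABLE dw_db.customer_log;
.LOGON tdpid/etl_user,password;
.BEGIN IMPORT MLOAD TABLES dw_db.customer_dim;
.LAYOUT cust_layout;
.FIELD cust_id * VARCHAR(10);
.FIELD cust_name * VARCHAR(50);
.DML LABEL ins_cust;
INSERT INTO dw_db.customer_dim (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.IMPORT INFILE cust.dat LAYOUT cust_layout APPLY ins_cust;
.END MLOAD;
.LOGOFF;
EOF

# In the real environment this would be: mload < load_customer.ml
echo "would run: mload < load_customer.ml"
```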
Environment: Ascential DataStage 8.0/7.5, Oracle 10g/9i, TOAD / Oracle
SQL Developer, AutoSys, ERwin 4.1.5, Sybase, UNIX, Shell scripting,
Teradata, Perl scripting, SQL Server 2005, Aqua Data Studio 6.5.
FORD Motor Company, Dearborn, MI
June'06 to Aug'07
Sr. ETL Developer (DataStage)
Ford Motor Company is a global company with two core businesses: automotive
and financial services. F@ST (Financials @ the Speed of Thought) is the #1
priority for the Finance community and a key enabler of the "Spirit of
Ford" transformation. F@ST is designed to be a long term solution to
improve & integrate financial information, analysis and accounting
processes and systems.
Specific goals of F@ST include:
. Providing fast and easy access to financial information that will
enable leaders at all levels to improve.
. Delivering actual and projected financial data by CBG (Consumer
Business Groups), vehicle line and region, driven by Business
Operating Systems and delivered via the Web.
. Building a flexible system adaptable to change.
Roles & Responsibilities:
. Involved in creating jobs using DataStage Designer and in validating,
scheduling, running and monitoring the DataStage jobs.
. Developed ETL jobs to load data coming from various sources such as
Oracle, mainframes and AS/400 into the data warehouse.
. Developed various Transformations based on Business Requirements.
. Involved in designing the procedures for getting the data from all
systems to Data Warehousing system and preparing Design Documents,
Performance Review, coding and Test Case Specifications and Mapping
documents
. Involved in data warehousing designing and data integration with
different enterprise systems.
. Involved in data analysis, data mining, cleansing and mapping
documents, technical documentation, SDLCs and business analysis
methodologies and identified the Facts and Dimensions using Erwin Data
modeling tool to represent the Star Schema Data Marts.
. Upgraded DataStage jobs from version 7.1 to version 8.0.
. Implemented the Surrogate key by using Key Management functionality
for newly inserted rows in Data warehouse.
. Developed Shell Scripts for taking backup and recovery of database.
Performed physical and logical backup.
. Developed Database Procedures, Functions, Cursors, Joins and Triggers
using PL/SQL.
. Data Quality and validation using Quality Stage, Metadata capture and
analysis using MetaStage.
. Extensively worked with Quality Stage to standardize data by trimming
spaces, removing nonprinting characters and correcting misspellings,
using the Transfer, Parse, Collapse, Select, Unijoin and Build stages.
Built Quality Stage jobs using the Standardize, Investigate, Match and
Survive stages. Wrote custom Quality Stage rule sets using Pattern
Action Language and used BTEQ against the Teradata database.
. Extensively used hash files to perform lookups and to act as
intermediate files within jobs, using almost all the stages in the
Designer. Loaded into Teradata and Oracle databases using FastLoad,
MultiLoad and BTEQ.
. Worked on programs for scheduling data loading and transformation
with DataStage from legacy systems to Oracle 9i using SQL*Loader and
PL/SQL, and used MetaStage.
. Developed shell scripts to automate data loading procedures and used
the cron utility to schedule jobs. Prepared the master test plan and
performed unit, system, integration and volume testing. Prepared
implementation plans, and was involved in implementation, data
integration and migration.
. Developed test cases and performed unit, system, integration and UAT
testing.
. Involved in implementing the developed components and provided
on-call support.
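The cron-driven load automation described above can be sketched as a
shell wrapper that fires a DataStage job when an upstream trigger file
appears. The project and job names are hypothetical, and the dsjob
invocation is only echoed here rather than executed:

```shell
#!/bin/sh
# Sketch of a cron-driven DataStage load wrapper. Project/job names are
# hypothetical; the dsjob command is echoed, not run.
TRIGGER=./etl_trigger.dat
LOG=./etl_load.log

touch "$TRIGGER"   # simulate the upstream extract signalling readiness

if [ -f "$TRIGGER" ]; then
    # Real environment: dsjob -run -wait FAST_PROJ LoadFinanceMart
    echo "`date '+%Y-%m-%d %H:%M:%S'` starting LoadFinanceMart" >> "$LOG"
    echo "would run: dsjob -run -wait FAST_PROJ LoadFinanceMart"
    rm -f "$TRIGGER"   # consume the trigger so the load runs once
else
    echo "no trigger file, skipping load" >> "$LOG"
fi
```

A crontab entry would then invoke this wrapper at the scheduled time.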
Environment: Ascential DataStage 8.0/7.1.0.8, Oracle 10g/9i, Teradata,
DB2, DB2-UDB, SQL, PL/SQL, Quality Stage/Integrity 7.5.2/7.5.1, TOAD /
Oracle SQL Developer / Oracle Designer / Teradata SQL Assistant /
Compuware DevPartner, iGrafx FlowCharter, UNIX, AIX, Linux, AutoSys,
ERwin 4.1.5, Ascential MetaStage, Perl scripting, Hummingbird
Connectivity (FTP).
Citizens Bank, Providence, RI
Aug'05 to May'06
DataStage Developer
Citizens Bank is a leading financial services company with
international offices serving clients in many countries. Citizens Bank
helps individuals, small businesses and commercial, corporate and
institutional clients across the United States manage their financial
lives. Citizens Bank is also part of Charter One Bank, which serves
clients in overseas countries. Developed a common module, reusable
across parallel jobs, for lookup-reference logic: it takes input
parameters from each source system and produces output parameters
appropriate to that source system.
Roles & Responsibilities:
. Designed a common module to classify data and update invalid
classification IDs throughout the target database as part of the ETL
cycle.
. Designed and developed jobs for source extraction, transformation and
loading into the target databases. Developed the common module's
classification lookup.
. Involved in designing and developing data cleansing and quality
procedures and mentoring data stewards on them. Developed and
implemented architectural artifacts for data warehouse audit, security
controls and segregation of duties. Performed data quality checks and
validation using Quality Stage.
. Designed the source-system mapping; worked with the mapping team to
design the source-to-target mapping process. Created database
procedures, functions and triggers using PL/SQL, and created and used
Oracle stored procedures in DataStage jobs to improve job performance.
Used the INTEGRITY tool to cleanse and integrate data before loading
it into the staging area. Worked with TOAD to interact with Oracle,
and used SQL Trace and Explain Plan to improve the performance of
PL/SQL programs.
. Interacted with the business analyst team to work out the mapping
details for the source systems.
. Performed initialization of the control structures, which play a key
role in the successful execution of an ETL cycle.
. Involved in exporting the developed jobs from Development Environment
to SIT Environment
. Developed the job initialization sequencer, which acts as the
start-up sequencer for every job and plays a key role in the control
structures for successful completion of the ETL cycle.
. Used Perl scripting and was involved in data integration and
migration.
. Tested the jobs and validated all the transformation logic and
mapping details.
. Documented all development and design updates and prepared
unit-testing documents.
. Exported the jobs to system integration testing and monitored job
performance and database statistics. Developed a UNIX script for
dynamic database updating.
. Developed sequencers for the source systems.
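A minimal sketch of the kind of dynamic database-update script
mentioned above: it builds an UPDATE statement from a runtime parameter
and would hand it to SQL*Plus. The table and column names are
hypothetical, and the sqlplus call is only echoed here:

```shell
#!/bin/sh
# Sketch: build an UPDATE dynamically from a classification id passed
# as $1. Table/column names are hypothetical; sqlplus is echoed, not
# run, since no database is assumed here.
CLASS_ID=${1:-99}   # default classification id for invalid rows

cat > fix_classification.sql <<EOF
UPDATE target_class
SET class_id = ${CLASS_ID}
WHERE class_id IS NULL
   OR class_id NOT IN (SELECT class_id FROM class_ref);
COMMIT;
EOF

echo "would run: sqlplus etl_user/password @fix_classification.sql"
```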
Environment: DataStage 7.5 (Designer, Director, Manager), Quality
Stage/Integrity 7.5.2/7.5.1, DB2, Oracle 8i/9, PL/SQL, Pro*C,
SQL*Loader, MetaStage, Perl scripting, Windows NT 4.0, Mainframes.
Micron Technologies, Boise, ID
June'04 to July'05
ETL Developer (DataStage)
Micron is one of the world's leading providers of advanced
semiconductor solutions. Micron's DRAM and Flash components are used in
today's most advanced computing, networking and communications
products, including computers, workstations, servers, cell phones,
wireless devices, digital cameras and gaming systems. The project
involved extracting data from different sources and loading it into
data marts. The major work involved cleansing the data, transforming it
into the staging area and then loading it into the data marts. The data
mart is an integrated data mine that feeds extensive reporting,
enabling insight into current and future customer needs based on
information received from the data warehouse.
Roles & Responsibilities:
. Involved in the development phase for business analysis and
requirements gathering; coordinated with team members to translate
business requirements into the data mart design.
. Designed logical and physical models using Erwin data modeling tool.
. Created and loaded data warehouse tables such as dimension, fact and
aggregate tables using Ascential DataStage.
. Worked on modifying the star schema dimensions to meet the current
requirements. The granularity of the fact table was set to the most
atomic level according to the requirements.
. Responsible for getting the Data Dictionary, designing and modifying
the DataStage jobs using DataStage designer. Created mappings based on
the data dictionary.
. Worked on data migration from SAP and was responsible for developing
DataStage jobs, creating mappings for data transformation, extracting
data from SAP and loading it into the dimension tables.
. Worked on Low Level Production Orders, Material Consumption
Quantities, Work Centers, and Material Pricing Cross Reference Files,
Sales, Shipment, Inventory Management and other ad hoc requirements.
Used various SAP transactions for extracting, viewing and loading data
. Used DataStage Designer to create DataStage applications (jobs), in
which all the required mappings, transformations, routines and formats
were designed to load the data into Cognos reports (SAP R/3).
. Converted Server jobs to Parallel jobs using Parallel Extender for
processing the large volumes of data. Created DataStage jobs, batches
and job sequences and tuned them for better performance.
. Created local containers/ shared containers, to give a clean look to
the jobs and cut down on the programming efforts. Troubleshoot
problems and rectified errors. Involved in data integration,
migration.
. Used Version Control Management (scheduling tool). Wrote SQL queries
to validate table data, and developed PL/SQL procedures and functions
for data validation.
. Designed the Logical Model & Physical Model using Erwin modeling tool
. Used Rational ClearQuest to track enhancement requests, assign work
activities and assess the real status of projects throughout the life
cycle.
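The post-load validation queries mentioned above compared source and
target record counts; a hedged flat-file sketch of that check follows.
File names are hypothetical stand-ins, since against real tables this
would be a SELECT COUNT(*) run through SQL*Plus:

```shell
#!/bin/sh
# Sketch: validate a load by comparing source and target record counts.
# File names and sample rows are hypothetical examples.
printf 'a\nb\nc\n' > source_extract.dat
printf 'a\nb\nc\n' > target_load.dat

src=$(wc -l < source_extract.dat | tr -d ' ')
tgt=$(wc -l < target_load.dat | tr -d ' ')

if [ "$src" -eq "$tgt" ]; then
    echo "VALIDATION OK: $src rows" > validation.log
else
    echo "VALIDATION FAILED: source=$src target=$tgt" > validation.log
fi
```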
Environment: DataStage EE 7.5/7.x (Designer, Director, Manager), SAP
R/3(Cognos), TeraData, Oracle9i, PL/SQL, SQL Loader, DOS scripting, Clear
Quest, UNIX, Perl scripting, Shell scripting, Windows NT 4.0
Bank of America, Charlotte, NC
May'03 to May'04
DataStage Developer
Bank of America is a leading financial services company with
international offices serving clients in many countries. Bank of
America helps individuals, small businesses and commercial, corporate
and institutional clients across the United States manage their
financial lives. In this project I was involved in analyzing,
designing, testing and deploying data from the source system to the
data warehouse system according to the users' business requirements,
using the DataStage ETL tool.
Roles & Responsibilities:
. Involved in every phase of the system development life cycle,
including research, analysis, and gathering user and business
requirements from the client.
. Worked extensively in logical and physical data modeling and
forward and reverse engineering process using Erwin and conducted
sessions with users and business group.
. Assisted in capacity planning and management of all warehouse
conversion and migration processes. Involved in building the strategy
for star schemas with fact and dimension tables.
. Interacted with users and the IT team to identify key dimensions and
measures for business performance. Created DataStage jobs, batches and
job sequences and tuned them for better performance. Mapped data items
from source systems to target systems.
. Used DataStage Director to schedule and run the server jobs and
events, and to resolve error conditions and job failures caused by
exceptions raised in GCI subroutines.
. Used DataStage Administrator to control purging of the Repository and
to manage the DataStage client applications and the jobs they run.
. Performed cleansing, purging and optimization of the data in the
warehouse.
. Developed and implemented all ETL (Extract, Transform and Load)
components based on the filter rules to obtain the needed data from
different source systems for calculating the required metrics, using
PL/SQL and Pro*C.
. Extensively worked with the Hashed File stage for lookups, and with
the ODBC, Aggregator, Sequential File, Link Partitioner and Link
Collector stages.
. Extensively involved in writing Stored Procedures and calling the
same through Data Stage Stored Procedure Transformation.
. Developed Server Side functionality by using PL/SQL and UNIX shell
programming.
. Constructed SQL Scripts to validate the data after loading process.
. Used Parallel Extender for splitting the data into subsets and
flowing of data concurrently across all available processors to
achieve job performance.
. Created reports with Business Objects using multiple data sources,
drill-down, slicing and charts. Extracted data from various sources,
transformed it according to the requirements and loaded it into the
data warehouse schema using stages such as Sequential File, Aggregator
and Transformer. Loaded into Teradata and Oracle databases using
FastLoad, MultiLoad and BTEQ. Participated in all phases, including
requirement analysis, design, coding, testing, support and
documentation.
. Built transformations, including aggregation and summation
constructs, from the operational data source to the data warehouse;
involved in data integration and migration.
. Worked on programs for scheduling data loading and transformation
with DataStage from DB2 to Oracle using SQL*Loader and PL/SQL.
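The SQL*Loader loads above are driven by a control file; a minimal
sketch of generating one from shell follows. The table, column and file
names are hypothetical, and the sqlldr call is echoed rather than run:

```shell
#!/bin/sh
# Sketch: generate a SQL*Loader control file for a staging-table load.
# Table, column and file names are hypothetical examples.
cat > load_accounts.ctl <<'EOF'
LOAD DATA
INFILE 'accounts_extract.dat'
APPEND INTO TABLE stg_accounts
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
(acct_id, acct_name, open_date DATE 'YYYY-MM-DD', balance)
EOF

# Real environment:
#   sqlldr etl_user/password control=load_accounts.ctl log=load.log
echo "would run: sqlldr control=load_accounts.ctl"
```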
Environment: DataStage 7.x/6.0 (Designer, Director, Manager), Teradata,
Oracle 8i/9, PL/SQL, Pro*C, SQL*Loader, UNIX, Perl scripting, Shell
scripting, Windows NT 4.0.
United Health Care, Edina, MN
June'02 to May'03
DW Developer (DataStage)
United Health Care is a medical insurance company that advocates proven
consumer health. The scope of this project was to build the data marts
for the customer service department and the claims division, and to
integrate all the information into the data warehouse for generating
reports. The project involved extracting data from the different
sources and loading it into the data marts. It enables insight into
current and future customer service and claims business decisions based
on the reports generated from this data warehouse.
Roles & Responsibilities:
. Involved in the analysis of physical data model for ETL mapping and
process flow diagrams.
. Involved in designing the procedures for getting the data from all
systems to data warehousing system. The data was standardized to store
various business units in tables.
. Used DataStage Manager to import, create and edit the metadata.
. Used DataStage Administrator for managing users and privileges.
. Used DataStage to design and develop various jobs and perform data
loads.
. Extensively used most of the transformations of the DataStage in
developing jobs.
. Validated all the applications and ran the jobs using DataStage
Director.
. Performed tuning of the repository and jobs for optimum performance.
. Created UNIX shell scripts (ksh) for data cleansing and for
scheduling jobs.
. Analyzed and explored Business Objects for reporting purposes.
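The cron scheduling of the ksh jobs above can be sketched by composing
a crontab entry; the script path and schedule here are hypothetical
examples, and the entry is only written to a file, not installed:

```shell
#!/bin/sh
# Sketch: compose a crontab line for a nightly 2 AM data-mart load.
# The script path is a hypothetical example; installing the entry
# would be: crontab cron_entry.txt
echo '0 2 * * * /home/etl/bin/load_claims_mart.ksh >> /home/etl/logs/load_claims.log 2>&1' > cron_entry.txt
cat cron_entry.txt
```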
Environment: Ascential DataStage 6.0/5.1 (Designer, Manager, Director),
ETL, Business Objects 5.1, UNIX, Shell Scripting, Oracle 8i, TOAD,
Erwin, Windows NT 4.0, Sun Solaris 2.7, PL/SQL.
Eli Lilly Pharmaceuticals, Indianapolis, IN
Dec'01 to June'02
DW Developer (DataStage)
The data warehouse is designed specifically for the needs of the
corporate secretary, eliminating most tedious and repetitive corporate
management duties. The team implemented an Enterprise Data Warehouse
(EDW) that collects, organizes and stores data from different systems
to provide a single source of integrated, historical data for end-user
reporting, analysis and decision support. The primary objective is to
improve the client's services by preventing errors, providing real-time
data and updating records as transactions are completed.
Roles & Responsibilities:
. Involved in development phase meetings for Business Analysis and
Requirements Gathering.
. Designed DataStage ETL jobs for extracting data from heterogeneous
source systems, transform and finally load into the Data Marts.
. Analyzed the given source dimensions and target fact table structures
to develop surrogate key tables referencing the required dimensions.
. Created hash tables referencing the surrogate key tables for quicker,
more efficient lookups to dimensions. Developed various jobs using the
ODBC, Hashed File, Aggregator, Sequential File and Transformer stages.
Imported and exported repositories across DataStage projects.
. Used shared containers to create reusable components for local and
shared use in the ETL process. Troubleshot and tested the designed
jobs using the DataStage Debugger.
. Worked on performance tuning and enhancement of DataStage job
transformations.
. Summarized key performance indicators using Aggregator stages as an
aid to decision support systems.
. Used DataStage Director and its run-time engine to schedule and
execute the developed jobs and job sequences, and used log events to
monitor job progress and performance.
. Participated in weekly status meetings, conducted internal and
external reviews as well as formal walkthroughs among various teams,
and documented the proceedings.
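The surrogate-key lookup pattern described above (hash tables keyed on
natural keys, with new keys assigned to unseen rows) can be sketched
with awk as the flat-file analogue. The key values and file names are
hypothetical examples:

```shell
#!/bin/sh
# Sketch: assign surrogate keys to incoming natural keys, reusing
# existing assignments -- the flat-file analogue of the hashed-file
# surrogate key lookup. Data and file names are hypothetical.
printf 'CUST001|101\nCUST002|102\n' > surrogate_map.dat  # existing map
printf 'CUST002\nCUST003\n' > incoming.dat               # new extract

awk -F'|' '
  NR==FNR { map[$1]=$2; if ($2+0 > max) max=$2+0; next }  # load map
  {
    if (!($1 in map)) map[$1] = ++max   # unseen natural key: new id
    print $1 "|" map[$1]
  }
' surrogate_map.dat incoming.dat > keyed.dat

cat keyed.dat
```

CUST002 keeps its existing key while CUST003 receives the next id.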
Environment: Ascential DataStage 5.0 (Director, Manager, Parallel
Extender, Debugger), IBM DB2 Universal Database, UNIX, Oracle 8i, SQL
Server, HTML, XML, Windows NT
IntelliGroup (SeraNova Internet Division)
July'01 to Nov'01
Online Recruitment and Deployment (ORAD)
This project was for SeraNova's online recruitment and deployment of
candidates. The system caters to the needs of the HR department with
facilities such as candidate information, interview scheduling, and
resource requests made by the various branches. It also provides
user-friendly reports based on the selection of various options,
catering to the needs of the user.
Roles & Responsibilities:
. Designed and developed the UI, E-R diagrams, and the logical and
physical layout design.
. Designed Servlets for the server side.
. Guided developers working on JSP.
. Developed and designed the data objects and business objects using
JavaBeans and EJB.
. Designed the procedures and functions in Transact-SQL for transaction
processing.
. Wrote database triggers for the complex business-rule validations.
. Involved in testing the packages.
Environment: Java, JavaScript, Servlets, MS-Office Tools, C, C++, SQL,
PL/SQL, Oracle 7.x, Win NT
Meera Sai Informatics India
Aug'00 to June'01
Oracle Developer
This project maintains details of all incoming orders placed by
customers for the various products sold by the company, the
salespersons assigned to the orders, delivery dates, etc. Data entry
screens let the operator enter all this information. The system also
maintains product stock, customer and supplier information. Invoices
for customer orders are printed at the click of a button, and annual
and monthly reports are generated to finalize sales by the following
categories: product-wise sales and employee-wise sales.
Roles & Responsibilities:
. Participated in the requirement specification study and communication
with the different user departments.
. Involved in the design of tables, Forms and Reports.
. Followed the Software Development Life Cycle (SDLC) process in the
development of the application. Developed various triggers, functions
and stored procedures in PL/SQL.
. Designed screens layouts for the report generation modules.
. Wrote control-file scripts for loading data into tables using
SQL*Loader.
. Wrote PL/SQL procedures, functions and packages to perform numerous
calculations.
. Interacted with both Oracle and SQL Server RDBMS's.
. Designed and developed various screens for the user interfaces using
Forms 4.5.
Environment: Oracle 7.x, PL/SQL, SQL*Plus, Developer 2000 (Forms 4.5,
Reports 2.5).