Post Job Free


Sr Business Analyst/ Web Developer

Location:
Panjim, GA, India
Posted:
June 09, 2011

Contact this candidate

Resume:

RAJESH GAONKAR

cp4iq0@r.postjobfree.com

096********-Cell

Model Legacy, 8A/FO-1, Taleigao, Goa.

SUMMARY:

• Expertise in ETL tools – Informatica 8.6/8.1/7.1.3/6.x (PowerMart, PowerCenter, Server Manager and Workflow Manager) and DTS on SQL Server 7.0/2000 (DTS Designer, DTS Import/Export, DTS Object Model).

• Extensive knowledge in the development of stored procedures, functions, Oracle packages and triggers using Oracle PL/SQL (Oracle 10g, 9i/8i/8.0/7.3) and T-SQL (SQL Server 2000), with TOAD 8.1.6 as the front-end tool.

• Extensive knowledge in developing batch programs using SQL*Loader, DTS and UNIX shell scripts.

• Acquainted with the Ab Initio ETL tool (GDE 1.7.0.9, Co-Operating System 2.7.3).

• DTS (creating DTS packages, ActiveX scripts, scheduling and monitoring).

• Hands-on experience in UNIX shell scripting (scripts, crontab, awk, sed, grep) for job scheduling and SQL script execution.

• Developing, executing and troubleshooting Autosys jobs.

• Extensive experience writing UNIX shell scripts for data processing, data loading and scheduling jobs. Worked on HP-UX, AIX, Solaris and SCO UnixWare.

• Documented design documents for ETL mappings and business rules/transformations and conducted design reviews with peers. Documented ETL coding standards and mappings to extract data from different sources to targets; implementation plans to migrate data from the development to test to production environments; ETL mappings for finance/accounting-related source and target tables; and unit and system test cases, test labs and test plans, tracking defects using Mercury Test Director.

• Strong in data warehousing concepts using normalized and multi-dimensional star schema and snowflake methodologies. Worked with both Ralph Kimball's and Bill Inmon's methodologies.

• Data/dimensional modeling using the ERwin tool and Visio.

• Experience in architecture, database modeling, analysis, design, development, and data conversion.

• Extensive experience in implementation of Data Cleanup procedures, transformations, Scripts, Triggers, Stored Procedures and execution of test plans for loading the data successfully into the targets.

• Technically proficient in Identifying and Resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.

• Proven ability to effectively manage time and resources and prioritize tasks in order to consistently complete projects according to specifications and on time.

• Responsible for interacting with business partners to identify information needs and business requirements.

• Strong analytical and conceptual skills in database design and development using Oracle, DB2 V8.0, MS SQL Server and MS Access. Proven expertise in ASP, VB.NET, IIS and JavaScript.

• Creative problem solving capabilities, analytical approach and exceptional technical writing skills.

• Trained on Cognos EP Series 7.0 & Business Objects 5.1.

• Strong working experience in requirements analysis, data analysis, design, implementation and testing of data warehouses, ODSs and business data marts.

• Acquainted with Informatica 8.x features and architecture.

• Worked with TOAD for data analysis, performance tuning of PL/SQL and SQL code, and database link and view creation.

• Table creation scripts, external tables, analyzing tables/indexes, and using the latest Oracle 9i and 10g features.

• Involved in the design, development and documentation of ETL processes, test cases, implementation plans and production support issues, and provided production support in most of my projects.

• Migration of Informatica ETL processes from Dev to QA to PROD.

• Informatica administration: creating repositories, users, groups, versioning of Informatica objects, creating connection objects, assigning privileges to users and permissions to folders, creating shared folders, and troubleshooting issues with connections and with running sessions and workflows in the Dev, UAT and Prod environments.
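The UNIX text-processing mentioned above (awk/sed for data cleanup before loading) can be illustrated with a minimal, self-contained sketch; the file name and field layout are hypothetical, not from any actual project:

```shell
#!/bin/sh
# Build a sample pipe-delimited extract (hypothetical layout).
cat > orders.dat <<'EOF'
1001|ACME |2011-01-05| 250.00
1002|GLOBEX|2011-01-06|bad_amt
1003|INITECH|2011-01-07| 75.50
EOF

# Keep only rows whose 4th field is numeric, then trim spaces around
# delimiters, producing a file ready for SQL*Loader or an external table.
awk -F'|' '$4 ~ /^ *[0-9]+(\.[0-9]+)? *$/' orders.dat |
  sed 's/ *| */|/g; s/^ *//; s/ *$//' > orders.clean

cat orders.clean
```

The rejected rows would normally be diverted to a bad file for review rather than silently dropped.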

TECHNICAL SKILLS

Data Warehousing: Metadata, data marts, data cleansing, Erwin, OLAP, OLTP, SQL*Plus, SQL*Loader, Informatica PowerCenter 8.1/7.1.2/6.1, Informatica PowerMart 6.x (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets, Transformations), SQL Server DTS 7.0/2000. Acquainted with the Ab Initio ETL tool (GDE 1.7.0.9, Co-Operating System 2.7.3).

Business Intelligence: Trained in Business Objects 5.1.x (Designer, Supervisor, Reporter & WEBI), Cognos 7.0/6.0/5.x (Impromptu, PowerPlay, Transformer), Seagate Crystal Reports.

Data Modeling: ERwin 4.x/3.5.2/3.x, dimensional data modeling, star schema modeling, snowflake modeling, fact and dimension tables, physical and logical data modeling.

Databases: Oracle 10g, 9i/8i/8.0/7.3, DB2/AIX64 8.2.0, MS SQL Server 6.5/7.0/2000, MS Access 7.0/97/2000, Sybase; TOAD 7.5/8.1.6, SQL Developer.

Testing Tools: Mercury Test Director 8.0

Job Scheduling: Autosys (for scheduling Informatica workflows), Maestro.

Version Control: PVCS

Languages: SQL, PL/SQL, Transact-SQL, UNIX shell scripting; familiar with Perl, VBScript, JavaScript, HTML, XML.

Operating Systems: HP-UX, SCO Unix, IBM AIX 4.3/4.2, Solaris, Windows NT/2000/98/95, DOS.

Acquainted with Ab Initio: MFS (Multifile System), sandbox for version control; used data parallelism and data partition components and configured .cfg files for database connectivity (Oracle and SQL Server).

PROFESSIONAL EXPERIENCE:

Digiplustech, West Windsor, NJ

Credit Suisse (CSFB), Madison Ave, NYC June 2010 – Feb 2011

Repo IT Datawarehouse:

Project involves the design and development of ETL interfaces to various source systems and loading the data into data marts.

These data marts are used by BO reports as the front end for Treasury, MatchBook, Total Funding and others.

Data marts are built from staging tables sourced from flat files and from Sybase and Oracle source tables.

Type 2 dimensions are created for storing history, and summary tables for reporting use.

Developing Informatica mappings, sessions, workflows and shell scripts for scheduling jobs through Tidal.

Ad hoc SQL in Oracle and Sybase databases; stored procedures for transformation and error handling.

Informatica maps store error information in statistics/exceptions tables.

Environment: Informatica 8.5.x, UNIX, Oracle and Sybase databases.

S&P (Standard & Poor's), 55 Water Street, NYC Oct 09 - June 10

SAS – Surveillance Automation System

LPS Rollup project: design and development of ETL interfaces using Informatica and conversion of existing PL/SQL code to Informatica to roll up loan-related information. Performance-tuned SQL code and Informatica maps for better performance and throughput when loading data into reporting tables. Developed maps, worklets and workflows for data loads; troubleshot Informatica map failures. Automated the data load for various application vendors – Intex, LPS and Zenta. Informatica error-handling tables are used for error logging and creating a log of errors.

The whole ETL process is controlled through a control table where the status of deals and files is maintained through each step of the ETL transformation, from staging to reporting.

Tables are created in the respective schemas to reduce data load and improve performance. Mapping parameters are used in ETL processing to control the number of deals processed per run.

Informatica administration: creation of folders and users for running sessions and workflows, granting read, write and execute permissions on users/Informatica objects, and migrating Informatica ETL objects from the dev to QA to prod environments.

Informatica 8.x, Oracle 10g, UNIX. The Informatica Designer, Workflow Manager, Workflow Monitor and Repository Manager client tools were used to develop maps, sessions and worklets and to automate workflows; maps were moved via export/import across the dev, QA and prod environments. Configured Informatica parameter files and UNIX shell scripts. Agile/Scrum methodology was used for development.

Informatica 8.6 Lead Developer Feb 09 to Oct 09

HP (Hewlett-Packard Company) Financial, 420 Mountain Ave, Murray Hill, NJ

ETL Consultant – (Informatica/Oracle 11g,PL/SQL,Unix Shell Scripting)

Project Name: Portfolio Management Reporting

Description: To position GPMO and IT segments to better meet short-term and long-term reporting requirements by

1) addressing security and data structure deficiencies

2) expanding the breadth of available data and

3) reducing the manual manipulation currently needed to produce management reports.

• Working on the PPMR reporting framework data warehouse.

• Creating mapping documents from BO Universes Objects and Reports for developing ETL interfaces.

• Developing ETL interfaces using Informatica 8.6 on ISS environment.

• PLSQL development for transforming the data.

• Unix shell scripting for running the Informatica ETL and PLSQL Jobs.

• Troubleshooting DEV/ITG/PROD job failures related to Informatica/Oracle and Unix.

Environment: Informatica 8.5 ISS ETL tool, Oracle 11g, UNIX shell scripting, PL/SQL programming for transformation, TOAD 8.6 and Tidal for job scheduling.

Lexstream Biz, Hoboken, NJ. June 08 – Dec 08

Oracle PLSQL Lead Developer/Development DBA

• Creating and modifying Oracle Database tables as per the design requirements and applying Constraints to maintain complete Referential Integrity.

• Extensively used TOAD/SQL Developer to analyze data, fix errors, and develop and unit-test packages, stored procedures and functions.

• Worked on developing custom reports for FAS 123 and FAS 128 related to tax on stock options.

• Developed stored procedures, functions and packages to pass result sets as REF CURSORs to the Java application.

• Calculations were done using stored procedures that were executed dynamically.

• Dynamic SQL strings were used to create SQL on the fly.

• Worked on the MERGE feature of Oracle 11g.

• Database modeling: creation of tables, views, indexes and synonyms.

• Data modeling: converting DOM models to logical and physical models.

• Defining Entities, Attributes and Relationship between Entities.

• Defining Unique, Primary and Foreign Key Constraints.

• Resolving Many to Many Relationships between tables.

• Converting business requirements to ER models; used Visio for ER/dimensional modeling.

• SQL querying – complex joins, correlated sub-queries, in-line views, text functions, aggregate functions, analytic functions, UNION, outer joins, MINUS and INTERSECT for data analysis.

• Experience developing complex packages, procedures and functions

• Experience integrating Shell Scripts with Oracle procedures

• Configuring VPD/ FGAC for securing data.

• Performance tuning of SQL

Environment: Oracle 11g, flat files, UNIX shell scripting, PL/SQL programming for transformation, TOAD 8.1.6/SQL Developer.
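The REF CURSOR pattern described above (returning result sets from PL/SQL to a Java application) can be sketched roughly as follows; the procedure, table and column names are hypothetical, not from the actual project:

```sql
-- Illustrative only: return a result set to the calling application
-- as a REF CURSOR (SYS_REFCURSOR is available from Oracle 9i on).
CREATE OR REPLACE PROCEDURE get_grant_report (
    p_plan_id IN  NUMBER,
    p_rc      OUT SYS_REFCURSOR)
AS
BEGIN
  OPEN p_rc FOR
    SELECT grant_id, grant_dt, shares, strike_price
      FROM stock_grants
     WHERE plan_id = p_plan_id;
END get_grant_report;
/
```

On the Java side such a procedure is typically consumed through a `CallableStatement`, registering the second parameter as a cursor type and iterating it as a normal `ResultSet`.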

OPTIMA GLOBAL SOLUTIONS, HAMILTON, NJ (Nov ’06 to June ’08)

Bruno’s, Birmingham, AL Mar ’08 – May ‘08

• Working on a conversion/decoupling project.

• Analysis/gap analysis and designing new interfaces in UNIX shell scripts.

• Modifying the existing ETL process written in Oracle PL/SQL.

• Writing scripts for encrypting/decrypting data using GPG/PGP encryption software, and configuring files for encryption with single and multiple keys.

• Configuring SFTP, SCP and SSH for passwordless SFTP.

• Creating public and private keys, sharing the keys across servers, and configuring the authorized_keys file for a passwordless SFTP process, which helps in writing automated batch scripts.

• Transferring files using SFTP and FTP, and automating the whole process of encryption, file transfer, and auditing of file moves (push and pull).

• Writing new shell scripts and modifying existing shell scripts.

• Automating the shell scripts using Autosys Job Scheduler.

• Autosys 4.0 is used to develop jobs, jobs are scheduled individually and in Boxes.

• Responsible for writing file watcher jobs, box jobs, send event jobs.

• Scheduling jobs using calendars, time parameters and job dependencies.

• Moving code from DEV and UAT to PROD, and unit testing the scripts and jobs.

Environment: Flat files, Oracle 10G Target, Unix Shell Scripting, PL/SQL Programming for transformation, TOAD 8.1.6 and Autosys 4.0 for scheduling ETL jobs.
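The automated encrypt-and-push step described above can be sketched as follows. The host, GPG key id and paths are hypothetical, and the gpg/sftp commands are echoed rather than executed in this sketch; key-based SSH auth (ssh-keygen, public key appended to the remote ~/.ssh/authorized_keys) is assumed to be in place:

```shell
#!/bin/sh
# Sketch of an unattended encrypt-and-push batch step (names hypothetical).
FILE=daily_orders.dat
REMOTE=vendor@ftp.example.com
KEYID=vendor-key-id

printf 'demo payload\n' > "$FILE"

# A batch file drives the non-interactive sftp session.
cat > sftp.batch <<EOF
cd /inbound
put $FILE.gpg
bye
EOF

# Echoed here as a sketch; in production these would run for real.
echo "gpg --batch --yes -r $KEYID -o $FILE.gpg -e $FILE"
echo "sftp -b sftp.batch $REMOTE"
```

A production wrapper would also log and audit each transfer and archive the pushed file, as the bullets above describe.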

PFIZER INC., MORRIS PLAINS, NJ Nov ’06 to Sep ‘07

ETL/Oracle Lead – J&J Integration (Informatica/Oracle 10g,PL/SQL)

• Working on the Open Orders and Item dimensions. UNIX shell scripting, Informatica and Oracle PL/SQL are used for the ETL process; Maestro is the job-scheduling tool.

• Performance tuning views, SQL queries and stored procedures.

• Creating views, external tables, database links and public/private synonyms, and configuring .ora files for connectivity.

• Using new data warehousing features (MERGE statement).

• Documenting table creation scripts and deploying them in the dev/QA/prod environments.

• Analyzing tables and indexes for performance tuning.

• Documenting gap analysis for Pfizer and JnJ source and target tables.

• Working on an ad hoc request to create an ETL process to load data for the Zantac vendor, providing information on Closed (Sales) Orders and Open Orders. Data resides in an Oracle ODS database; the target is flat files.

• Designed the ETL process to get data from denormalized customer, invoiced order, open order and lot tables.

• UNIX shell scripting to run the Informatica workflows and store header and footer information in flat files. Writing UNIX functions to run the different job steps that load and validate data.

• Writing a wrapper script in UNIX shell to run the whole process, rename/move flat files for archiving, and email the flat files to clients.

• Validating the data with end users.

• Documenting the ETL process, mapping document and file formats.

• Pulling data from archive tables for history reports.

• Creating ad hoc SQL queries and modifying existing PL/SQL packages.

• Debugging production jobs, fixing Oracle PL/SQL procedures and enhancing existing PL/SQL code.

• Supported the following downstream applications: CPFR, DRC, ODS, AWMS, plus other ad hoc requests from business users.

Environment: Informatica 7.1.2 ETL tool, DB2 source, Oracle 10G target, UNIX shell scripting, PL/SQL programming for transformation, TOAD 8.1.6 and Maestro for job scheduling.
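The MERGE-based warehousing load mentioned above can be sketched roughly as follows; the table and column names are hypothetical, not from the actual project:

```sql
-- Illustrative upsert of staged open orders into a warehouse table.
MERGE INTO dw_open_orders t
USING stg_open_orders s
   ON (t.order_id = s.order_id)
WHEN MATCHED THEN
  UPDATE SET t.qty_open   = s.qty_open,
             t.updated_dt = SYSDATE
WHEN NOT MATCHED THEN
  INSERT (order_id, qty_open, updated_dt)
  VALUES (s.order_id, s.qty_open, SYSDATE);
```

A single MERGE replaces the separate "update if exists, else insert" passes, which is why it is attractive in incremental warehouse loads.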

IGATE MASTECH, PITTSBURGH, PA (Mar ‘06 to Oct ‘06)

THE HARTFORD, HARTFORD, CT Mar ‘06 to Oct ‘06

ETL /Oracle Team Lead (Informatica/Oracle PL/SQL).

Development and Enhancement of Personal Lines Enterprise Datawarehouse.

Avaya Phone Load Datawarehouse:

• Interact with customers, users and business analysts to gather requirements and prepare functional and technical documents.

• Developing New and Maintaining Existing Informatica Maps, Sessions and workflows

• Updating ERWIN models with changes.

• Developing PL/SQL code for translates.

• Loading data from COBOL sources (.cbl files) into Oracle 10g tables using PL/SQL and Informatica ETL tools.

• Scheduling Informatica Workflows and PL/SQL packages using Autosys Jobs and unix shell scripts.

• Shell Scripts for load balancing, table alter scripts.

• Data Analysis using Toad 7.5/8.1.6 for Oracle Target and Source Tables.

• Documenting Informatica mappings, impact to applications, Migration of Maps and other associated elements.

• Assisting Integration Testing, Unit Testing, documenting test logs/issues.

• Assisting Migration of Informatica Maps, Source, Targets, Transformation, Sessions, Workflows from Dev, QA To Prod environment.

• Post Production support & performance tuning of Maps and PL/SQL code.

• Performed extensive debugging and performance tuning of mappings.

• Defined and implemented naming standards.

• Implemented batch process for all Informatica mappings. (Audit Control).

• Implemented balancing packages in all stages (Stage, Load, DMstage, Mart).

• Implemented Autosys scripts to automate the Processes.

• Interacted with DBA team regarding the database Process.

• Prepared Workflow process to run Sessions based on Load dependency.

• Performed Bulk and Incremental Load.

• Prepared test plans/test scripts as part of Post/Pre Production testing.

• Involved in Post Production support & Moving entire system from UAT environment to PRODUCTION environment.

• Identify/Analyze all data elements that are required for the Scorecard project

• Build ETL process to load Claim CMS data utilizing EDW best practices.

• Define the data model and understand the source data, with nine identified candidate tables and 300+ elements.

• Define business rules to derive some fields (duration of call, hold call etc.)

• Setup incremental process to retrieve data from twelve CMS servers

• Verification of data (unit testing, user acceptance testing etc.)

• Build linking process to link Claim CMS data to adjuster detail in WCR and APG DM.

• Load historical data from Claim CMS if available

Environment: ERwin, Informatica 7.1.2 ETL tool, COBOL sources, Oracle 10G target, UNIX shell scripting, PL/SQL programming for transformation, TOAD 8.1.6 and Autosys for job scheduling.

TAJSOFTWARE SYSTEM CORP, NY (Jan ‘05 – Mar ’06)

CIRCUIT CITY, RICHMOND, VA May ‘05 - Mar ’06

Datawarehouse Consultant

Enterprise Datawarehouse- Dimensions and Inventory Track (Development & Support)

• Documentation of Technical specification (Maps, Transformations, Mapplets, Sessions, Workflows, Autosys Jobs, Parameter files, Source and Target Tables.) and the ETL process to load data from various sources to different target tables.

• Documentation of Business Rules/Transformation involved when moving data from source to Targets.

• Documentation of the implementation plan to migrate Informatica maps, sessions, workflows, sources, shortcuts, shell scripts and Autosys jobs from the development to test to production environments.

• Involved in logical and physical design of tables.

• Design & Development of various Maps, Unit Testing of Maps. Developing Mapplets, Reusable Transformation.

• Designing Mapping flow with Performance Tuning of Maps. Creating Sessions and Workflows. Using Debugger to Debug Maps. Documenting Unit Test Cases and Unit Testing Maps.

• Creating Sessions and Workflows and Configuring Parameter Files. System Testing Informatica Maps/Sessions/Workflow using Autosys jobs.

• UNIX shell scripting and Autosys for job scheduling, using commands such as insert_job, update_job and delete_job, defining jobs with parameters such as start_times and days_of_week, and troubleshooting Autosys jobs in the production, dev and test environments.

• Developing autosys jobs with dependency using the condition parameter and defining various dependency between the autosys jobs.

• Writing shell script to set the environment for running autosys jobs and SQL shell scripts.

• Defining different job streams to run workflows in parallel for different dimensions tracks (Product, Location, Code tables).

• Monitoring jobs in development, test and production using the jr and autorep commands, checking job status for the week using various options, force-starting jobs, and setting or changing job status with sendevent commands while testing. JILing all the jobs and organizing them in a structured way.

Environment: Informatica 7.1.1, repository on Oracle 8.1.7. Source databases are AS400, flat files and DB2; the target database server is DB2/AIX64 8.2.0. AutoSys 3.0/4.5.
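An Autosys job of the kind described above might be defined in JIL roughly as follows; the job, machine and path names are hypothetical:

```
/* Illustrative JIL: a command job that runs after a file watcher succeeds. */
insert_job: DW_PROD_DIM_LOAD
job_type: c
machine: etlhost01
command: /dw/scripts/run_wf.sh wf_product_dim
condition: s(DW_PROD_DIM_FW)      /* success of the file-watcher job */
days_of_week: mo,tu,we,th,fr
start_times: "02:00"

insert_job: DW_PROD_DIM_FW
job_type: f                        /* file watcher */
machine: etlhost01
watch_file: /dw/inbound/product.dat
```

`autorep -j DW_PROD_DIM_LOAD` would then report the job's status, and `sendevent -E FORCE_STARTJOB -J DW_PROD_DIM_LOAD` would force-start it during testing.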

BERLITZ, PRINCETON, NEW JERSEY Feb ‘05 - May ’05

ETL Developer (Team Lead and Offshore Coordinator)

Data Migration and ETL

• Business analysis: designed the ETL process, interviewed business users and gathered requirements. Documented all the DTS packages implemented to move the data from source to target tables.

• Documented source update process for different countries.

• Writing adhoc queries to update/correct source data in Sybase database.

• Coordinated with offshore team.

• Involved in designing and creating Tables.

• Documented mapping for the following accounting-related tables from source to targets: GL_ACCOUNT, GL_DETAIL, INVOICE, INVOICE_DETAIL, TaxRate, Tax, Tax_Detail.

• Data mapped from the current system (around 70 tables) to the new system (around 550 tables); documented the mapping for the ETL process.

• Created the Mapping document and the ETL process involved.

• Created Test data and load in temporary tables for testing.

• Data Profiling, Data quality check for duplication of data, standard codes.

• Designed and wrote the DTS Package, workflow and scheduled the jobs. Ran test loads.

Environment: Microsoft NT Server 4.0, MS SQL Server 2000/7.0, DTS, Sybase, MS Excel, MS Word, MS PowerPoint.

VMOKSHA TECHNOLOGIES (Aug ‘03 – Mar ’04)

B2BC Web/Database Consultant Aug ‘03 – Feb ’04

Worked as an in-house consultant on database and B2BC web-related projects.

CHETU INC, MIAMI, FLORIDA, USA (Jan 2000 – June 2003)

AMERICAN MEDICAL DEPOT Jan ‘02 – June’03

Oracle/ETL Developer

• Imported various Sources, Targets, and developed Transformations using Informatica Power Center Designer.

• Developed various Mappings with the collection of all Sources, Targets, and Transformations.

• Created Mapplets with the help of Mapplet Designer and used those Mapplets in the Mappings.

• Used Type 2 versioning to update slowly changing dimension tables to keep full history.

• Involved in performance tuning by optimizing sources, targets and sessions.

• Loaded target data warehouse in Oracle 9i

• Worked with Oracle External Tables new feature of Oracle 9i, which allows Oracle to query the data that is stored outside the database in flat files using drivers. They are useful in the ETL process of Data warehouse and can be queried in parallel.

• Used Oracle's multitable insert, a new feature of Oracle 9i, to load data into the new application from source systems; it reduces the table scans and PL/SQL code necessary for performing multiple conditional inserts. Its main use is in the ETL process for data warehouses, where it can be parallelized and/or convert non-relational data into a relational format.

• Extensively used external loader utilities.

• Created Workflows in workflow designer and executed tasks such as sessions, commands using Informatica Workflow manager.

• Monitored transformation processes using Informatica PowerCenter 6.1 Workflow monitor.

• Worked extensively on almost all the transformations, such as Aggregator, Joiner, cached and uncached lookups, connected and unconnected lookups, Filter, Router, Expression, Normalizer and Sequence Generator.

• Implemented complex logic using Expression transformations. Involved in data certification using complex SQL queries on the warehouse. Identified the required dimensions and measures for the reports.

• Prepared prototypes of all the reports; approval of the requirements and prototypes was obtained from the business owners.

Environment: Informatica 6.1 (Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer), Oracle 9i, UNIX, flat files, Access, SQL Server 7.0, ERwin, Windows NT, SQL, PL/SQL, SQL*Loader, Korn shell scripting.
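The external-table technique described above can be sketched as follows; the directory object, table and file names are hypothetical:

```sql
-- Illustrative external table over a pipe-delimited flat file.
-- Assumes a directory object exists: CREATE DIRECTORY etl_dir AS '/dw/inbound';
CREATE TABLE ext_orders (
  order_id  NUMBER,
  cust_name VARCHAR2(30),
  order_dt  DATE
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY etl_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
    (order_id, cust_name,
     order_dt CHAR(10) DATE_FORMAT DATE MASK "YYYY-MM-DD")
  )
  LOCATION ('orders.dat')
)
REJECT LIMIT UNLIMITED;
```

Once created, the file can be queried (and queried in parallel) like any table, e.g. `INSERT INTO orders SELECT * FROM ext_orders`, without a separate SQL*Loader pass.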

SCHEDULEEARTH.COM, MIAMI, FL Feb ‘01 - Jan’02

Database / Web Developer

• As a lead developer, gathered client requirements, scope, design and developed the site.

• The site was developed using ASP (VBScript) and HTML. The backend database was MS SQL Server 7.0.

ALLMARINEPARTS.COM, MIAMI, FL Jul ‘00 – Jan ’01

Database / Web Developer

• The b2b exchange was being designed and built to be a virtual hub or marketplace for cargo and cruise ship owners and part buyers to invite quotes from the ship part manufacturers.

• Developed using ASP (VBScript) and HTML. The backend database was MS SQL Server 7.0.

DUEDILIGENCEDONE.COM, MIAMI, FL Jan ’00 – Jun ’00

Database / Web Developer

• This project required coding in ASP (VBScript) and HTML and interaction with the backend database using ADO. The connection was made using ODBC, and the backend database was Microsoft Access. One of the major responsibilities was online credit card authorization using CyberCash.

ECOMSERVER INC, LAWRENCEVILLE, NJ April 98- Jan 00

Technical Associate

• System Installation: As this was a newly formed startup, the initial phases involved LAN setup, including hardware and software installation. The role involved setting up a network of desktops running a mixture of MS Windows 95 and NT, with Microsoft Windows NT servers running as primary, secondary and backup domain controllers.

• VoiceStream Wireless: The project involved the design, setup, administration and maintenance of the database environment for remotely loading data into the Oracle RDBMS. The data resided on an Oracle 7.3.2.3.0 RDBMS running on the UnixWare operating system and was uploaded daily, via batch jobs, onto an Oracle 7.x database residing on a Windows NT 4.0 system. Remote database connectivity was accomplished using SQL*Net V2 client software, which involved configuring the LISTENER.ORA and TNSNAMES.ORA files, starting the listener process and configuring the respective files required for the remote database connection. As this was a single-member project, I was involved end to end in the design and development of the system. As part of Oracle system administration and maintenance I installed Oracle 7.3.x on Windows NT, created users, roles and tablespaces, granted the roles to users, and configured the SQL*Net V2 files required for remote database connectivity; connections to the remote database were made using ODBC and SQL*Net V2. I was also responsible for generating reports using Crystal Reports 7.0 running on Windows NT 4.0. This also involved the creation of database links and views, elimination of duplicate records, deletion of records after a certain time frame, and backing up data and automating the full process, making use of SQL*Loader and the IMP/EXP utilities.

• Oracle 8.0.x installation (Lucent Technologies): Involved in automating the process of installing Oracle 8.0.5.0.0 onto Intuity Conversant, which requires creating a package that can automate the process. Intuity Conversant (8.0) is the voice response unit developed by Lucent and is anticipated to be released at the beginning of next year; this is an ongoing project. The role involved researching the files that must be executed to install the Oracle software, create the database and create the data dictionary tables.

Environment: UnixWare 2.0, Microsoft Windows 95 and NT, Oracle 7.x, Oracle 8.0.5.0.0 and Oracle 8i, SQL*Net V1/V2, Net8, SQL*Loader, Crystal Reports 7.0.
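The SQL*Net client configuration described above centers on a TNSNAMES.ORA entry of roughly this shape; the alias, host and SID are hypothetical:

```
# Illustrative tnsnames.ora entry for a SQL*Net V2 / Net8 client.
VSTREAM.WORLD =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = ntdb01)(PORT = 1521))
    (CONNECT_DATA = (SID = VSTR))
  )
```

The matching LISTENER.ORA on the server registers the same SID, after which a client can connect with `sqlplus user/password@VSTREAM.WORLD`.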

WEBSCI TECHNOLOGIES, MONMOUTH JUNCTION, NJ (May 97 – Apr. 98)

Technical Associate

WebSci Technologies is an independent service vendor for Lucent. During this period I worked on multiple client projects.

• Design, development, administration and maintenance of applications running over the ORACLE database.

• Designed and developed database tables and the database structure

• Regular database administration and maintenance

• Developed stored procedures using PL/SQL

• Remote database linking, connection and configuration using SQL*Net V2.0

• Report generation using SQL*PLUS

• Execution and generation of data loading scripts using SQL*LOADER

• Automating the process of loading data using UNIX wrapper shell scripts.

• Extensive client interaction with executives from corporations such as Chase Mellon Shareholder Services, Prudential, Samsung, Lucent, ADP and AT&T.

• Worked as an in-house consultant for AT&T and Chase Mellon Shareholder Services.

Environment: Oracle 7.0 and above, SQL*Plus, SQL*Loader, SQL*Net V2.0, UnixWare 2.1, Intuity Conversant.

CENTURY ENKA LIMITED, PUNE, INDIA (May ‘96- Mar ’97)

Developer

• Involved in the system design and implementation: writing programs to load the data into shareholder tables; writing test suites to ensure the integrity of the system was maintained and invalid transactions were discarded; and writing UNIX programs using awk, cut and paste to modify the flat-file data so it could be checked for errors, reformatted and prepared for loading into database tables. The RDBMS was Oracle on an HP G-50 server running HP-UX 9.0.

Environment: Oracle 7.0, SQL*Forms, SQL*Menu and Report Writer, HP-UX 9.0, HP G-50 RISC-based server.

SMART INFORMATION SYSTEMS, GOA, INDIA (Aug ‘95 - Mar ’96)

Sales Support

This was a sales and support position that required a thorough knowledge of the Oracle product suites.

EDUCATION: Bachelor's Degree in Computer Engineering, Goa Engineering College, Ponda, Goa (Goa University, India). First Class with Honors, June 1994.

HSSCE, Dhempe College of Arts and Science, Panjim, Goa (Goa University). First Class, March 1990.

SSCE, Peoples High School, Panjim, Goa (Goa University). First Class, March 1988.


