
Data Project

Location:
Fremont, CA
Posted:
June 16, 2016


Resume:

Kumud Chauhan

(Female)

Location: Fremont, CA ****8 Phone: 510-***-**** (C)

Visa: US Citizen Email: ******@*****.***

IT Experience

7+ Years Data Analyst/Business Analyst/Data Quality Analyst

15+ years Sr. Data Architect/ Sr. Data modeler

10+ Years Data warehousing/BI (Full Project Life Cycle)

Experience Summary:

Solid analytical skills and data integration knowledge; experienced working in team environments; strong problem-solving skills.

Expert in Data Profiling, Data Discovery, Data Usage Analysis, and Data Pattern/Trend findings. Identification of Business Rules, Data Anomalies, Data Inconsistencies, and Data Transformation logic.

Tracking all data-related issues and creating tickets; performing root cause analysis and issue diagnosis. Working with multiple source, developer, design, reporting, and business teams; suggesting resolutions or getting business rules corrected.

Creation of Data Dictionaries, Metadata documents for end-to-end data lineage, Technical Design documents, ETL data mapping documents, Data Dependency Matrices, Data Flow Diagrams, and Data Element Definitions.

Proficient in SQL. Familiar with NoSQL technologies: Hive, MongoDB.

Understanding business functions and requirements: requirement analysis, data analysis, source system identification, existing system analysis, requirement gap analysis, and requirement validation.

Guiding QA teams; performed testing of very complex logic.

Data Analytics: R, RStudio, Tableau, Excel; Data Structure/Schema Creation.

Standardizing Data Exception Handling, Data Rejection, Data Publishing, Data Loading, and Data Arrival processes.

Management of Data Standards, Data Integration, Data Capture, and Data Migration.

Data Quality checking for data Integrity, data consistency, Data completeness, Data validity, Data Anomalies, Data Accuracy

Created data validation rules/error rules for checking the above data quality issues.

Created cases for all data quality issues and provided solutions to resolve them.

Partnered with various teams: Reporting, Business Analyst, ETL, QA, Source, Data Architect, DBA, and other DW teams.

Expert in conceptual, logical, and physical data modeling: ERD, 3NF (OLTP), Dimension Modeling, Star/Snowflake Modeling for data warehouses and data marts; reverse engineering of existing data models.

Familiar with NoSQL: Key-Value, Columnar, and Embedded Data models for big data. Created models based on data patterns.

Familiar with Big Data concepts: HIVE, SQOOP, HADOOP, FLUME, HBASE, Hana Modeling, Cassandra, Couchbase, MongoDB, Python.

Expert in ERWIN, Power Designer, ER Studio.

Played key roles in multiple projects and acted as a knowledge expert (technical and functional) and SME (subject matter expert).

Integrated multiple data sources into data marts.

Developed standards and guidelines for the design of data architectures, data models, ETL architecture, BI architecture, and data mart architecture.

Handled all Oracle database development: PL/SQL, SQL programming, ETL, OLAP reporting, materialized views.

Full project life cycle, SDLC, PLC experience
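
The rule-based data-quality validation described above can be sketched briefly; this is a minimal illustration in Python with hypothetical field names and thresholds, not code from any of the projects listed.

```python
import re

# Hypothetical rules illustrating completeness, validity, and range checks.
RULES = {
    "customer_id": lambda v: v is not None and str(v).strip() != "",  # completeness
    "email": lambda v: v is None
        or re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,  # validity
    "rebate_amt": lambda v: v is None or 0 <= float(v) <= 10_000,     # range/accuracy
}

def validate(record):
    """Return (field, value) pairs that violate a rule, for ticketing."""
    return [(f, record.get(f)) for f, rule in RULES.items() if not rule(record.get(f))]

records = [
    {"customer_id": "C001", "email": "a@b.com", "rebate_amt": 250},
    {"customer_id": "", "email": "bad-addr", "rebate_amt": -5},
]

errors = {r["customer_id"] or "<missing>": validate(r) for r in records}
```

Each violation would then feed the ticketing and root-cause-analysis workflow described above.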

Certified

Skills Used:

Oracle 7.x/8i/9i/10g/11g, MS SQL Server 7.0/2000

Oracle EBS R12, Oracle Warehouse builder (OWB), Data Junction (ETL), Developer 2000, Power Builder 4.0, IWATCH, TOAD, OPS, OEM, OPD, OAS, Cognos 8.0 Analysis/Report studio, Cognos data manager, Informatica 8.1.

HIVE, SQOOP, HADOOP, FLUME, HBASE, Hana Modeling,

NOSQL, Cassandra, Couchbase, MongoDb, R, Rstudio, Tableau

PL/SQL, SQL, Python, Shell script, JDBC, ODBC, Clipper 5.2, Pro*C, COBOL, Linux

Oracle Designer 9i, ERWIN 7.2/8/9.1, System Architecture, Power Designer 15, Toad Data Modeler, ER Studio, Visio

SSADM, JAD, SE, RAD, CASE, Ralph Kimball/Inmon/Big Data Methodology, Agile

MS-Project 4.0/95/98, MS Excel, MS Word, Tableau 9.3, BugZilla, ClearCase, Visual SourceSafe V6.0, DevTrack V6.0

Windows XP/NT/7.0, Unix (Sun Solaris) 2.7, HP-UX 10.x

Domain Knowledge:

Order Management, Enterprise Service Management, B2B Marketplace, Banking, BOM, Shipping, Financial Accounting (GL, Accounts Payable/Receivable, Revenue), Sales and Marketing, Oracle EBS, Certificates of Deposit (Bank), Offender Management System, Child Support Management System, Human Resource Management, Stock Management & Inventory, Production, Export/Import, Energy Rebate Application, Customer Analytics.

Full Project Life Cycle and Project Management experience. Project Planning, Scheduling, Tracking, Issue Resolution, Resource Allocation, Configuration Management, QA Management, Team Management, User Interaction

Professional Experience IN USA:

Project: Training from Intellipaat.com, Udemy.com Feb’16 – Current

Title: Hadoop, Hive, Sqoop, Flume, Hbase, MongoDB, Tableau, R-Studio

Synopsis: Online training from Intellipaat.com, Udemy.com

Role: Analyst/Data Modeler

Big data concept, Hadoop Eco System

Installation of Hadoop, Hbase, Hive

Data ingestion from MySQL to Hadoop HBase using Sqoop

Data streaming from Twitter to Hadoop HBase using Flume

Data ingestion of flat file to Hadoop Hbase using Hive

Json File/Flat file analysis using Hive

NoSQL columnar database and document database concepts.

Key Value Model, Master/Reference Model, Embedded model creation

Inserted/updated/deleted/viewed document data in MongoDB

Data mining/big data analytics/data visualization using Tableau

Data analysis/basic statistical analysis of Excel/JSON/XML/MySQL/CSV data using RStudio, Tableau

Familiarity with Cassandra, Couchbase, Map reduce, Pig, Python
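
As a small illustration of the basic statistical analysis of CSV data practiced in this training (done there with RStudio and Tableau), here is a Python standard-library sketch; the data is made up.

```python
import csv
import io
import statistics

# Toy CSV standing in for the kinds of files analyzed in the exercises.
data = io.StringIO("month,usage_kwh\nJan,420\nFeb,380\nMar,455\n")

rows = list(csv.DictReader(data))
usage = [float(r["usage_kwh"]) for r in rows]

# Summary statistics comparable to a quick RStudio summary().
summary = {
    "n": len(usage),
    "mean": statistics.mean(usage),
    "stdev": statistics.stdev(usage),
}
```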

Project: PG&E, San Ramon May’12 – Nov’15

Title: Rebate Applications system, Customer data warehouse, Channel of Choice

Role: As Sr. Data Modeler/Analyst/Data Architect

Synopsis: PG&E has various programs to save energy. PG&E provides consumers rebates for participating in energy efficiency programs or for replacing appliances or instruments with energy-efficient ones. There are savings associated with each program, and all savings from energy efficiency programs are reported to the CPUC (California Public Utilities Commission).

The Channel of Choice Customer Data Analytic reporting requirements fall into four categories: Performance Reporting, Operational Reporting, Management Reporting (Customer Satisfaction), and Executive Dashboards.

Responsibilities:

Data Profiling, Data Discovery, Data Usage analysis, Data Pattern/Data trends findings. Identification of Business rules, Data anomalies, Data Inconsistency, Data Transformation logic.

Standardizing Data Exception Handling process, Data Rejection process, Data publishing process, Data loading process. Data arrival process

Management of Data integrity, Data standard, Data element definition, Data capture, Data migration, Data Quality, Meta data.

Tracked all data-related issues and created tickets. Worked with multiple source, developer, design, reporting, and business teams; suggested resolutions or got business rules corrected.

Creation of Data Dictionary, Metadata document for end to end process, Technical design document, ETL data mapping document, Data dependency matrix, Data flow diagram, Data Element Definition.

Created conceptual, logical, and physical data models. Dimension modeling, star schema design for the data mart, data flow diagrams (DFD).

Reverse Engineer Existing data model from database

Provide subject matter expertise in Data warehouse / Business intelligence

Partner with various teams like ETL team, QA Team, Source Team, Reporting team, Business team, data architect team, Business Analyst.

Data Quality checking for data Integrity, data consistency, Data completeness, Data validity, Data Anomalies, Data Accuracy

Created data validation rules/error rules for checking the above data quality issues

Created cases for all data quality issues and provided solutions to resolve them

Worked with business analysts, SMEs, and business users to understand and collect business/functional requirements, business rules, and changes to existing systems.

Understood source systems; prepared lists of questions and obtained answers from source teams.

Worked as a bridge between business and technical teams.

Presentation/Review of Data Model with Technical team/ Business user, Presentation of Technical Design Document to ETL/QA/Design team.

Extensive data analysis using SQL queries.

Change management: analyzed the impact of changes on upstream/downstream/existing systems and provided solutions.

Root cause analysis of data issues, ETL issues, and source system issues. Supported the reporting team and escalated issues to the concerned groups.

QA of Developed ETL code

Partner Master Data Management. Integrated multiple data sources into the data mart.
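
The SQL-based data analysis and profiling described above can be illustrated with a small sketch; SQLite stands in for the Oracle environment actually used, and the table and column names are hypothetical.

```python
import sqlite3

# In-memory SQLite stand-in; schema is illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE rebate (customer_id TEXT, program TEXT, amount REAL);
INSERT INTO rebate VALUES
  ('C1','HVAC',100), ('C1','HVAC',100), ('C2','LIGHT',NULL);
""")

# Profiling query: duplicate (customer, program) rows.
dups = con.execute("""
    SELECT customer_id, program, COUNT(*)
    FROM rebate
    GROUP BY customer_id, program
    HAVING COUNT(*) > 1
""").fetchall()

# Completeness check: rows missing an amount.
nulls = con.execute(
    "SELECT COUNT(*) FROM rebate WHERE amount IS NULL"
).fetchone()[0]
```

Queries of this shape (duplicate detection, null counts, distinct-value distributions) are the everyday tools of data profiling and anomaly finding.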

Environment: Oracle 11g, ERWin 9.1, Toad, Salesforce, MS Excel, Agile

Project: HP Cupertino Aug’10 – Mar’12

Title: OPSDW, OVSC, OVSM, ITSM, HP Enterprise Service Management

Role: As Data Modeler/Analyst/Data Architect

Synopsis: HP provides hardware and services to clients in areas such as network/printer/storage/computer/database and the management of incidents/problems/changes. Clients create tickets for problems or hardware failures, or incidents are recorded in an automated system. Each incident is routed to the proper channel and an engineer is assigned. Customer data is loaded from various sources and centralized in the HP Customer Reporting Data Warehouse (CRDW) daily.

Responsibilities:

Created Conceptual, logical, Physical data model. Dimension modeling, Star schema design for data mart, Data flow diagram(DFD)

Creation of logical schema, table(DDL), indexes, sequences, views

Creation of Data element definition and Data dictionaries. Meta data document

Creation of Technical Specification document, ETL mapping documents, Data dictionary, Work flow dependency matrix, Work flow Bus Matrix.

Integrating multiple data sources in single data mart

Data analysis, data discovery, data integrity, Data quality, Data profiling, Data Integration, Data Anomalies, Data Pattern findings

Participate in establishing data policy, data element naming standard, data architecture, ETL architecture, exception handling process, Data load batch process, data extraction process, Data auditing

Collaboration with various teams ETL team, QA Team, Source Team, BI team

Supporting, Guiding ETL developers, QA team

Tracked all data-related issues. Root cause analysis of issues; checked and understood existing code written in Informatica. Worked with developer, design, and QA teams; suggested resolutions or got business rules corrected.

Requirement analysis: understood business functions and business rules.

Post-project, performed extensive analysis of the generated data and created reports to verify that the data was accurate and all logic was implemented correctly.

Understood source systems; prepared lists of questions and obtained answers from source teams. Analyzed source systems and performed gap analysis.

Provide subject matter expertise in Data warehouse / Business intelligence

Wrote many SQL queries.

Involved in full project life cycle.

QA of Developed code, Unit testing

Worked with the OVSD, OVSM, OVSC OV suite of products.

Environment: Oracle 11, SQL Developer, ERWin 8, MS Excel, MS Word

Project: WellsFargo.com (San Leandro) Oct’09 – Jul’10

Title: Wholesale Banking Treasury Management Data Mart

Role: As Sr. Data Modeler /Data Architect

Synopsis: Wells Fargo is a financial bank. The data mart was created for the Wholesale Finance group for treasury management profitability: wholesale customer profitability, product profitability, and accounting unit profitability.

Responsibilities:

Created logical and physical data models. Dimension modeling, star schema design.

Creation of Physical Schema, tables (DDL), indexes, sequences, views, synonyms, db links. Comparison of objects of different schema.

Reverse Engineer Existing data model from database

Created data dictionary with data element definitions.

Data analysis, data discovery, data integrity, data quality, data profiling, meta data, Data dictionary

Deployment of schema to different server. Static data creation.

Four-layer architecture: landing, integration, semantic, and application.

Recommendation about setting up Oracle parameters

Query performance tuning.

Participate in establishing data policy, data architecture, data standard, exception handling process, Data load batch process, data extraction process, Data auditing, data rejection analysis

Change management and impact of changes and change procedures

Writing SQL / Unix scripts

Worked with business analysts and users; tracked issues and recommended resolutions.

Environment: Toad 10.1, Sybase Power Designer 15.0, Oracle 10g, Clear Case 7.0, Clear Quest

Project: ART.com, Emeryville (CA) May’09 – Sep’09

Title: Sales/Marketing Data Mart BI Product

Role: As Data Modeler/Analyst

Synopsis: The company has a B2C web site selling art products. Created an enterprise data warehouse for various applications in Oracle EBS: finance, order management, marketing, BOM.

Responsibilities:

Requirement analysis: understood business functions and business rules.

Understood source systems; prepared lists of questions and obtained answers from source teams. Analyzed source systems and performed gap analysis.

Provide subject matter expertise in Data warehouse / Business intelligence

Collaboration with various teams ETL team, QA Team, Source Team, BI team

Conceptual and logical data models. Dimension modeling, star schema design for the data mart, data flow diagrams (DFD).

Verified the data model against business requirements.

Reverse engineer Existing Source data model

Data analysis, Data discovery, Data integrity, Data quality, Data profiling, Meta data, data dictionary, Work flow dependency matrix, Work flow Bus Matrix. Create diagrams using VISIO, DW Data Flow (DFD), Technical Architecture

Creation of logical schema, table(DDL), indexes, sequences, views

Write Technical Specification (TD), ETL mapping documents

Participate in establishing data policy, ETL architecture, BI architecture, data architecture, data standard, exception handling process, Data load batch process, data extraction process, Data auditing

QA of Developed code, Unit testing, Guiding ETL team.

Business Objects (BO) Rapid Mart analysis

Order management(OM), BOM, Shipping(WSH), Work order (WIP)

Environment: Oracle EBS R12 (Oracle Applications), SQL developer, ERWin 7.2, oracle 10g, Business Object, Visio 2003

Project: OpenTV.com, Mountain View (CA) May’08 – Nov’08

Title: Sales Data Mart BI Product

Role: As Data Modeler/Analyst

Synopsis: The company has a product for TV advertisement and was looking to create a BI product attached to the main product. The Eclipse product takes all orders from advertisers and creates advertisement schedules for each channel and network. Once an advertisement has run, it compares the schedule with the actual run, creates invoices, and handles accounts receivable, adjustments, and payments.

Responsibilities:

Involved in full project life cycle.

Requirement analysis: understood business functions and business rules.

Understood source systems; prepared lists of questions and obtained answers from source teams. Analyzed source systems.

Provide subject matter expertise in Data warehouse / Business intelligence

Collaboration with ETL team, QA Team, Source Team, DBA team, BI team

Conceptual and logical data models. Dimension modeling, star schema design for the data mart, data flow diagrams (DFD); reverse engineered the existing source data model.

Verified requirements against the data model. Data analysis, gap analysis, data integrity, data quality, data profiling, data dictionary, workflow, dependency matrix, logical data map.

Staging persistent layer data model creation.

Creation of physical schema, table(DDL), indexes, sequences, views, MV

Created Data Mapping document from CDC to Staging and Staging to Data Mart

Assigning task to ETL developers, Guiding ETL developers, QA team, DBA

Create data dictionary, data definition, business rules, metadata

Participate in establishing data policy, ETL architecture, BI architecture, data architecture, data standard, exception handling process, Data load batch process, data extraction process, Data auditing, data purging policy.

Help creating project plan, task listing, work estimation

Change management and impact of changes and change procedures

OWB training; developed a few mappings.

Environment: Oracle 10g, SQL developer, ERWin 7.2, Oracle Warehouse Builder (OWB), Visual source safe V6.0, DevTrack V6.0

Project: Applied Biosystems, Foster City(CA) Mar’08 – Apr’08

Title: Salesforce/SAP Integration, Marketing Data Mart

Role: As Data Modeler

Synopsis: The company had installed two software packages, Salesforce and SAP, and wanted to integrate the customer and quotation modules into one unified module, with data coming from and merged across the two sources, plus a sales/marketing data warehouse.

Responsibilities:

Logical and Physical Data Modeling.

Understand business requirement

Customer Master data management (MDM)

Project: PlanetOutInc.com, San Francisco (CA) Oct’06 – Jan’08

Title: Sales/Finance Data Warehouse

Role: As Data Architect/Analyst

Synopsis: The company has different lines of product sales, e.g., magazines, advertisements, cruise sales. The data warehouse was created for billing, accounts receivable, deposits, sales, refunds, adjustments, cash receipts, budget, and forecast. From the data warehouse, data was fed to and posted in the Great Plains ERP.

Responsibilities:

Involved in full project life cycle.

Worked with the business team to collect business requirements; analyzed and understood business functions, created specification documents, and finalized requirements.

Reverse engineering source system and create existing data model

Collaborated with various source groups and finalized the BI team's data requirements. Profiled source data. Designed flat-file requirements for sources to send data.

Create/maintain data model, data dictionary, business rules, metadata

Establishing process for ETL, QA, Error handling

Manage Issue Tracking/resolution and provide solution.

Convert source system understanding and business requirement into Logical/ Physical Data model. Data flow diagram(DFD)

Dimension modeling, Star Schema Design, normalization/De-normalization

Creation of logical schema, table, indexes, materialized views, sequences

Data mapping source to target document, ETL architecture setup

Participate in establishing data policy, BI architecture, data Architecture, data standard, exception handling process, Data load batch process, data extraction process, Data auditing, data purging policy, Data Rejection.

Data Extraction from flat files to staging using Cognos Data manager

Created reports in Analysis studio and Report studio of Cognos

Created ETL stored procedures and functions using PL/SQL

Performance Tuning of SQL statements

Environment: Oracle 9i, ER Studio, TOAD, DbVisualizer 5.1.1, Data Manager (Cognos), ODBC, Toad Data Modeler, Cognos Analysis Studio/Report Studio, Oracle EBS 11i

Project: Cisco Systems, Inc., San Jose (CA) Dec’02 – Sep’06

Title: LCPR (OLTP), ASRT (OLTP), Cisco University Data warehouse

Role: As Data warehouse Architect/Sr. Data Modeler

Synopsis: The E-Learning Division of Cisco provides online training to internal and external learners; data came from multiple sources into the CU data mart. ASRT (Advanced Service Routing Tools): when an incident occurs, it is automatically routed by country/time zone/group to an available service agent; once service is completed, it is recorded.

Responsibilities:

Involved in the full project life cycle (PLC) and SDLC from beginning to end

Reporting requirement analysis, Data requirement analysis

Understood source business processes and data models; performed data analysis and mapped requirements to source systems.

Conducting meeting with source team and finalize data gathering requirements

Built partnerships with various teams (ODS, different source teams, BI)

Acted as liaison between various functional teams, guiding and mentoring them

Effort estimation and project plan

Assigning Tasks, create road map of development and measure WIP

System Flow, Data Flow Diagram(DFD), CRUD Matrix, Dependency Matrix

Guiding BI team, ETL developer team and QA team. Played key role.

Set up guidelines for ETL process for incremental/full Load

ERD,3NF Data model, Dimension modeling, Star Schema Design, normalization/De-normalization

Convert requirement and source system understanding into Logical Data Model, Physical Data Model and presentation of model to team

Writing Technical specification document for ETL, Source to target data mapping document with pseudo-ETL code.

ETL architecture/ Backend architecture, BI architecture setup, Process set up

Created Meta data, Data dictionary, Business Rule, DDL

Creation of logical schema, table, indexes, materialized views, sequences

Data management: data integrity, data standards, data enhancement, data publishing, data auditing, data quality, data exception handling process, data rejection analysis.

Person Master Data Management (MDM)

Manage Issue Tracking/resolution and provide solution.

Risk/Impact Assessment, Gap analysis, Change Management

Wrote stored procedures, packages, and functions in PL/SQL

Data migration scripts in SQL*Plus, PL/SQL

SQL Query Tuning using Explain Plan, SQL Analyzer

Export/Import, User/Role, database privileges, database links, ODBC connectivity

Reverse engineering existing database using Designer 9i.
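
The SQL query tuning with Explain Plan mentioned above can be illustrated as follows; this sketch uses SQLite's EXPLAIN QUERY PLAN as a stand-in for Oracle's EXPLAIN PLAN, with a hypothetical table.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE learner (id INTEGER, course TEXT)")

def plan(sql):
    """Return the plan steps; the 4th column holds the detail text."""
    return [row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql)]

# Without an index, the plan shows a full table scan.
before = plan("SELECT * FROM learner WHERE id = 42")

# After adding an index, the plan switches to an index search.
con.execute("CREATE INDEX idx_learner_id ON learner(id)")
after = plan("SELECT * FROM learner WHERE id = 42")
```

Comparing plans before and after an index or query rewrite is the core of the tuning loop, whatever the database engine.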

Environment: Oracle 9i, Oracle Designer 9i, Oracle Enterprise Manager, TOAD, Visio, BugZilla

Project: Commerceflow.com Sep’01 - Sep’02

Title: B2B Data Warehouse, Click Stream

Role: As Data warehouse architect/ Sr. DBA

Synopsis: The company develops software for online auction sales acceleration. Covered the development database, production database, data warehouse database, and reporting.

Responsibilities:

Business requirement gathering and analysis, source data analysis.

Logical Data Model, physical data model.

Creation of Physical schema, table definitions, indexes, views, sequences

Designing Star Schema, Dimension Modeling, Data dictionary, Data Normalization, De-Normalization

Data mapping, ETL Architecture using Oracle warehouse builder, BI architecture

Data transformation, data loading (flat files, SQL*Loader), data cleansing

Management of Data integrity, Data standard, Data enhancement, Data publishing, Data auditing, Data Quality, Data Exception handling process.

PL/SQL stored procedures to transfer data from staging to the data warehouse

SQL query tuning, explain plan; data migration scripts in SQL*Plus, PL/SQL

Database Monitoring, Performance Tuning/ Partitioning and sizing.

Database Replication/Refreshing/Test/ migration/ upgrade / Creation, Installation.

Database Connectivity, SQL NET, TCP/IP, ODBC, FTP.

Database Administrator, Database Issue Resolution, Troubleshooting,

Database/data transformation from SQL Server to Oracle

Oracle Database Capacity Planning, Standby Database maintenance

Database Backup (Hot /Cold), Recovery Strategy and Planning, Export/Import,

Reverse Engineering /Re-organization of Database

Data Encryption/Decryption, Crystal Report

Job Scheduling, Job automation processing, Batch data loading

Data mart, Data warehouse infrastructure setup, configuration.

Scalability, Availability (24/7), Manageability of Database.

OLAP cube, Slowly changing Dimension creation and OLAP reporting
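
The staging-to-warehouse transfer implemented above with PL/SQL stored procedures can be sketched as follows; SQLite stands in for Oracle here, and the schema names are hypothetical.

```python
import sqlite3

# Toy staging and warehouse tables; names are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE stg_order (order_id TEXT, amount TEXT);
CREATE TABLE dw_order  (order_id TEXT, amount REAL);
INSERT INTO stg_order VALUES ('O1','19.99'), ('O2','5.00');
""")

# Cleanse while moving rows (cast text to numeric), as an ETL
# stored procedure would.
con.execute("""
    INSERT INTO dw_order (order_id, amount)
    SELECT order_id, CAST(amount AS REAL) FROM stg_order
""")

total = con.execute("SELECT SUM(amount) FROM dw_order").fetchone()[0]
```

A real procedure would add the exception handling and rejection logging described in the data management bullet above.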

Environment: Oracle 8i/9i, PL/SQL, Oracle Warehouse Builder (ETL), Linux, Windows NT, ERWIN, T-SQL, MS SQL 2000, MS OLAP, MS DTS

Project: Tensquare.com (San Jose), ISP Channel Apr’00 – Jun’01

Title: Data Warehouse, Database Integration Project

Role: As Sr. Data Modeler/Technical Lead

Synopsis: Tensquare.com provided advertising services at gas pumps (data warehouse). ISP Channel provided Internet services over TV; this project integrated 12 databases residing on different platforms into a data warehouse.

Responsibilities:

Preparing Documents for Functional Design and Technical Design

Dimension Modeling, Star/Snowflake Schema Design for Data Warehouse

Physical Data model, Attributes/Definition, Data Dictionary, Meta Data

Source Target Analysis, Data analysis, Data mapping document

Creation of Logical schema, table definitions, indexes, views, sequences

SQL Tuning, Explain Plan, PL/SQL Store procedure

Reverse Engineering /Re-organization of Database/Data Migration.

Data Loading / Extracting /Transforming /Refreshing / Replication

Environment: Oracle 8i, PL/SQL, BRIO 5.x, Sun UNIX 7.0, Windows NT, ERWIN, IWATCH, TOAD, SQL*Loader, MS-Project

Project: Landmark Products, SFO, CA Nov’99 - Mar’00

Title: B2B, Market Place. HTTPRINT.COM

Role: As Sr. Data Modeler /DBA.

Synopsis: This project was E-commerce market place web site for printing industry.

Responsibilities:

Data Modeling (Logical, Physical), Data Normalization (3rd normal form)

Process Design, Business Rule and integrity Constraint implementation

Physical Data model, Attributes/Definition, Data Dictionary, Meta Data

Database Planning, Installation, Creation, configuration,

Creation of Logical schema, table definitions, indexes, views, sequences

PL/SQL Store Procedure, Triggers, TCP/IP, FTP

UML modeling, Use cases, Activity diagram

Environment: Toad, Oracle 8i, ERWIN 3.0.5, PL/SQL, Windows NT, Sun UNIX

Project: Department of Corrections, (Texas, Delaware). Sep’98 - Oct'99

Title: Offender Information Management (Deloitte & Touche)

Role: As Sr. Data Architect/ DBA

Synopsis: This project entailed automating the Offender Management System. The Department of Corrections takes care of offenders during incarceration, parole and their probation period. The system has 16 modules.

Responsibilities:

Requirement Analysis and Design, Business modeling, Data flow diagram

Reverse engineered the existing system and incorporated new changes into the new system

Enterprise Wide Logical Data Modeling. (1300 Entities)

Data Normalization (OLTP), Data Analysis

Business Operation rule and integrity Constraint Implementation

Backend /Database Architecture, Reconciling complex business structure

Physical Data model, Data structure/Attributes/definition, Data Dictionary

Creation of Physical schema, table definitions, indexes, views, sequences

Data Change Management, Impact Analysis and Process to Change

PL/SQL Store Procedures, Triggers, TCP/IP, SQL*NET

Guiding DBA and Programmers for building database and building application

Environment: Oracle 8.0.5, Designer 2000 2.1.2, System Architecture 4.01, DB2

Project: Washtenaw County of Ann Arbor, Michigan. May’98 - Jul’98

Title: Friend of the Court, Child Support Management System

Role: As Project Leader /Architect

Synopsis: The Friend of the Court system deals with divorced or separated families.

Responsibilities:

Project planning; Project scheduling, Strategy/Policy/Standards Establishment, Issue resolution, Resource allocation, User interaction, Presentation to the client.

Involved in system study, analysis, designing. (JAD), Requirement Specification,

Logical Data Model, Data Normalization, Business Model and technical architecture

Environment: Designer 2000 1.35, Oracle 7.x, UNIX, MS Project 98, Windows

India Experience:

Employer: Bhorucom Software PVT. LTD., Baroda, Gujarat, India. Oct'95 -Apr'98

Project: Kirloskar Electronics, Bangalore, India

Order Processing, Invoice

Employer: PAC Computers Pvt. Ltd. Andheri (E), Bombay, India. Mar’93 - Jun’95

Project: Miltex Pvt. Ltd. (HongKong), Chase International

Export Import Processing

Employer: Bush India Ltd. (Mahalaxmi), Bombay, India. Aug'89 - Jan'93

Project: Dattani & Somaiya. (Bombay), India.

Financial accounting system

Employer: Digisoft (India) PVT. Ltd. (Baroda), India Apr’86 - May’89

Role: As Project Leader/Architect/Programmer Analyst

Project planning, scheduling, tracking, team management; customer communication and negotiation; managing developers, architects, and DBAs. Issue resolution, cost analysis, requirement specifications, user interaction; RS, FS, FDD, DFD, UML. Designed supply chain operations, prepared data models, developed 100 modules.

Environment: Oracle 7.0, Developer 2000, SQL*FORM 4.5, SQL*REP 2.5, SQL*PLUS 3.1, PL/SQL 2.0, MS-Project 4.0, ERWIN, MS-Windows, Clipper 5.01+, dBase III Plus, ORG 2001 Mini, COBOL and BASIC

Education

Bachelor of Science, M.S. University, Baroda, India

Master of Science, M.S. University, Baroda, Gujarat, India

Post Graduate Diploma in Computer Programming (Gujarat Technical Education Board, Baroda, India)

'C' and C++ Programming under UNIX and DOS

ORACLE 7.0 & SSAD, CASE tools, Silicon Chip Technologies

Java Advanced (Java, JDBC, Java Beans, Servlets), EYSOFT


