MONIKA
Fremont, CA *****
Mobile: 510-***-****
**************@*****.***
OBJECTIVE
Seeking a challenging position in software development, maintenance, and support.
SUMMARY
14+ years of experience in software analysis, design, development,
debugging, and implementation, including 2+ years with Big Data
ecosystem technologies.
. Excellent understanding of Hadoop architecture and its components,
such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the
MapReduce programming paradigm.
. Hands-on experience installing, configuring, and using Hadoop
ecosystem components such as Hadoop MapReduce, HDFS, HBase, Oozie,
Hive, Sqoop, Pig, and Flume.
. Experience in managing and reviewing Hadoop log files.
. Experience in analyzing data using Pig Latin, HiveQL, HBase, and
custom MapReduce programs in Java.
. Extending Hive and Pig core functionality by writing custom UDFs;
a minimal sketch follows this summary.
. Experience in designing, developing and implementing connectivity
products that allow efficient exchange of data between our core
database engine and the Hadoop ecosystem.
. Experience in importing and exporting data between HDFS and
relational database systems using Sqoop.
. Machine learning concepts (POC using Mahout for a recommendation engine).
. Experience working with geographically dispersed teams.
. TCP/IP, sockets, multithreaded programming, and IPC-based
client/server programming using C/C++.
. Operating system virtualization and partitioning technologies and
their implementations.
. Oracle technologies and applications: Oracle Application Server and
Database.
. Multiple years of RDBMS porting and development experience in Oracle,
Informix, and Sybase.
. Systems development for high-availability, fault-tolerant cluster
environments.
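As a minimal illustration of the custom UDF work mentioned above, the
sketch below shows a simple Hive UDF in Java. It is a hypothetical
example, not code from any project listed here; the NormalizeCode class
name and its normalization behavior are assumptions for illustration.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical example: trims and upper-cases a product code.
    public final class NormalizeCode extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }

Such a class would be packaged into a JAR, added to the session with
ADD JAR, and registered with CREATE TEMPORARY FUNCTION before being
called from a HiveQL query.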
SKILLS
. Operating Systems: Unix, Solaris, HP-UX, AIX, Linux (Red Hat 5,
SUSE 10.0); Unix internals, system calls, development/debugging tools,
and inter-process communication; TCP/IP client/server programming;
distributed computing; Rational ClearCase.
. Languages: C, C++, HiveQL, Pig Latin, Perl, Java, shell scripting,
MPI programming; database technologies including Oracle, Sybase, and
Informix.
EDUCATION
M.S. Computer Science, Kurukshetra University, India.
B.S. Electronics, Kurukshetra University.
Java certified.
EXPERIENCE
Sears Web Intelligence Hadoop Project
June 2013 - present
Environment: Cloudera Manager 4, Hadoop, Pig, Hive, Sqoop
Role: Hadoop Developer.
Description:
The purpose of the project is to store terabytes of log information
generated by the e-commerce website and to extract meaningful
information from it. The solution is based on the open-source Big Data
software Hadoop: data is stored in the Hadoop file system and processed
with MapReduce jobs, which in turn involves fetching raw HTML data from
the websites, processing the HTML to obtain product and pricing
information, extracting various reports from that pricing information,
and exporting the results for further processing (a minimal Java
MapReduce sketch follows the responsibilities below).
The project is mainly a replatforming of the existing system, which
runs on WebHarvest (a third-party JAR) against a MySQL database, onto
Hadoop, which can process very large data sets (terabytes to petabytes)
and so meet the client's requirements amid increasing competition from
other retailers.
Role / Responsibilities
. Wrote HiveQL queries.
. Wrote Pig Latin scripts.
. Imported and exported data between MySQL/Oracle and Hive using Sqoop.
. Imported and exported data between MySQL/Oracle and HDFS.
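The sketch below illustrates the kind of custom Java MapReduce job
referred to in the description above. It is a hypothetical example, not
project code: the ProductPriceAvg class, the "productId,price" input
format, and the averaging logic are all assumptions made for
illustration.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.DoubleWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Hypothetical job: average price per product from "productId,price" lines.
    public class ProductPriceAvg {

        public static class PriceMapper
                extends Mapper<LongWritable, Text, Text, DoubleWritable> {
            @Override
            protected void map(LongWritable key, Text value, Context ctx)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split(",");
                if (fields.length == 2) {
                    try {
                        ctx.write(new Text(fields[0]),
                                  new DoubleWritable(Double.parseDouble(fields[1])));
                    } catch (NumberFormatException ignored) {
                        // Skip lines with a malformed price field.
                    }
                }
            }
        }

        public static class AvgReducer
                extends Reducer<Text, DoubleWritable, Text, DoubleWritable> {
            @Override
            protected void reduce(Text key, Iterable<DoubleWritable> values,
                                  Context ctx)
                    throws IOException, InterruptedException {
                double sum = 0;
                long count = 0;
                for (DoubleWritable v : values) {
                    sum += v.get();
                    count++;
                }
                ctx.write(key, new DoubleWritable(sum / count));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "product price average");
            job.setJarByClass(ProductPriceAvg.class);
            job.setMapperClass(PriceMapper.class);
            job.setReducerClass(AvgReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(DoubleWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

A job like this would be packaged into a JAR and submitted with the
standard hadoop jar command, reading its input from and writing its
output to HDFS paths given on the command line.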
Krogers
Jan 2012 - May 2013
Environment: Apache Hadoop, Pig, Hive, Sqoop, Flume
Role: Hadoop Developer.
Description:
In this project, log datasets in semi-structured form needed to be
analyzed, and most of them did not conform to any specific schema. The
project's aim was to move all log data from individual servers to HDFS
as the main log storage and management system and then perform analysis
on these HDFS datasets. Flume was used to move the log data
periodically into HDFS; once a dataset was inside HDFS, Pig and Hive
were used to perform various analyses.
Role / Responsibilities
. Wrote HiveQL queries.
. Wrote Pig Latin scripts.
. Imported and exported data between MySQL/Oracle and Hive using Sqoop.
. Analyzed data with Hive and Pig (a minimal Pig UDF sketch follows
this list).
. Defined job flows.
. Responsible for operational support of the production system.
. Loaded log data directly into HDFS using Flume.
. Managed and reviewed Hadoop log files.
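Because the logs were semi-structured, Pig analysis of this kind often
benefits from a small amount of custom parsing. The sketch below is a
hypothetical Java Pig UDF, not project code: the ExtractStatus class
and the assumption that the status code is the second-to-last
space-separated field of an access-log line are illustrative only.

    import java.io.IOException;

    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Hypothetical example: pulls the HTTP status code out of a raw log line.
    public class ExtractStatus extends EvalFunc<Integer> {
        @Override
        public Integer exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null;
            }
            String[] fields = input.get(0).toString().split(" ");
            if (fields.length < 2) {
                return null;
            }
            try {
                // Assumed layout: "... <status> <bytes>", as in common log format.
                return Integer.parseInt(fields[fields.length - 2]);
            } catch (NumberFormatException e) {
                return null; // Malformed line; let the Pig script filter nulls.
            }
        }
    }

A Pig script would REGISTER the JAR containing this class and then call
ExtractStatus(line) inside a FOREACH to group or filter records by
status.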
IBM Technical Consultant
March 2001 - Sept 2011
Oracle Corporation, Redwood Shores, CA.
May 2006 - Sept 2011, IBM Consultant at Oracle Corporation
Job Responsibilities: Enablement of ongoing releases of the Oracle
database on IBM pSeries, AIX versions 5.x, 6.x, and 7.x.
o Porting, QA, debugging, certification
o Resolved customer escalations on AIX and IBM hardware.
o Work involved API programming and object-oriented development using
C++/Java, including multithreading.
Environment: C/C++, AIX, ClearCase, PL/SQL, Unix scripting
Sep 2004 - Sep 2005, IBM Consultant at Oracle Corporation.
Job Responsibilities: Porting for Linux on IBM's PowerPC architecture.
Involved in the port of Oracle client 10.1.0.3 / RDBMS 10.2 to Red Hat
4.0 and SUSE Linux 9.0. The work involved extensive code development
and enhancement, primarily in C/C++/Perl, along with qualification,
testing, and debugging of system-level code related to Enterprise
Manager. Also ported the ioser library from the Sun JDK source to AIX.
Environment: C/C++, Perl, gdb, SUSE and Red Hat Linux, PL/SQL, Unix
scripting
April 2003 - Jan 2004, Sybase, Dublin, CA.
Certified Sybase maintenance releases on AIX 5.2.
Nov 2002 - March 2003, Oracle Corporation, Redwood Shores, CA.
Involved in porting Oracle 9iAS v9.0.2.0.1 to AIX 5L. Porting
responsibilities included all phases of the software lifecycle:
building, testing, failure analysis, bug fixing, and packaging the
product for release to the customer.
April 2002 - Oct 2002, Sybase, Dublin, CA.
Involved in porting new releases of the Sybase database engine
(ASE 12.5) to AIX 5.2. Porting responsibilities included all phases of
the software lifecycle: building, testing, failure analysis, bug
fixing, and packaging the database product for release to the customer.
Nov 2001 - March 2002, Informix, Menlo Park, CA.
Involved in porting existing and new releases of IBM's Informix
database engine, including the Informix server products (IDS 9.x), to
AIX and DYNIX Unix. Porting responsibilities included all phases of the
software lifecycle: building, testing, failure analysis, bug fixing,
and packaging the database product for release to the customer. The
work required knowledge of TCP/IP-based client/server technology with
extensive use of the Unix system call interface, ANSI C, and C++. These
projects also required the use of debuggers and debugging techniques,
with special emphasis on code optimization.
Report Generation: this module provides standard reports based on the
information entered against the various elements of the model. Advanced
Specification: this module automatically generates the framework
selected by the user (EJB, RMI, etc.) according to the specification in
a customizable XML template. Code Generation Module: provides wizards
to generate Java/C++ code from the UML model.
March 2001 - Oct 2001, Oracle Corporation, Redwood Shores, CA.
Enhancement work involved coding an interface to replace the AIX system
call oracle_kstat, which Oracle uses on AIX to gather system statistics
but which is not supported for 64-bit applications. The call was
replaced with the new perfstat APIs introduced in AIX 5, which are
supported for both 32-bit and 64-bit applications.
October 2000 - March 2001, Second Foundation Inc., Menlo Park, CA
Build/release engineer responsible for products maintained by Second
Foundation under maintenance contracts for its customers, as well as
its own in-house products, in a mixed environment of CVS, SCCS, and GNU
make. Build/release engineer for Fujitsu Software's iFlow workflow
engine on HP-UX 10.x and 11.x, which SecF maintains for Fujitsu
Software. Responsibilities included incorporating patches and
enhancements for the product that had been ported from Solaris to
HP-UX; merging code bases, including enhancements provided by Fujitsu
as well as local changes made by SecF; and performing verification
builds and resolving problems with makefiles.
May 2000 - October 2000, Cadence Design Systems, San Jose, CA
Developed a web application that allowed account managers to see the
market-share metrics of their clients vis-a-vis their competitors,
broken down by high-level design flows. Responsible for design and
implementation on Solaris 2.6 and Windows NT over the Oracle RDBMS.
Made extensive use of JDBC and Servlets with Oracle 7.x on Solaris 2.6.
Environment: Java, JIntegra, OrgPublisher, Servlets
Aug 1999 - May 2000, Brahma Forces, India
Environment: web server APIs (iPlanet, IIS, Apache), C/C++, HTTP,
Rational Rose.
Responsibilities: Coding and testing of web server plug-ins in C using
the Microsoft VC++ IDE and the Win32 API. Developed a web server
plug-in to intercept HTTP requests expected to come from wireless
devices; the plug-in was written for the iPlanet web server using NSAPI
and later ported to the IIS 4.0 web server using ISAPI filters.
Reverse-engineered and documented the C++ version of the web server
plug-in in UML notation using Rational Rose 98. Learned and used
Rational Purify to detect memory leaks. Participated in team code
reviews and design reviews.
June 1997 - July 1999, PCL-Mindware, India
PMD Log Data Convert Command
Developed software for the PMD Log Data Convert Command in C on HP-UX
10.x. This was needed because the PMD Agent cannot view log data
directly: the log data is data previously gathered for a particular
request, and it is normally viewed on the NT side. If the customer
wants to see the log data on the agent machine itself, the log convert
command first takes the object map of the requested object and then
converts the raw-formatted data into a CSV-formatted text file. Ported
the PM&D Agent and the Log Data Convert Command from HP-UX 10.0 to
Solaris 2.x.
PMD Manager Data Collection
Developed support for PMD Manager data collection. The PMD Manager is
essentially an interface between the PMD SVC and the PMD Agent: it
services requests coming from the PMD SVC side and forwards them to the
appropriate PMD Agent. Developed software, in C on HP-UX 10.x, for
collecting performance data for various objects on the machine on which
it runs, and ported this software from HP-UX to Solaris 2.x.
PMD Agent Log Transfer Process
The PMD Agent Log Transfer Process is an independent module that
transfers statistical performance log data from the PMD Agent to the
PMD SVCs that request it. An SVC requests log data for specific objects
collected on PMD Agent machines through a third-party network interface
called NW-Base, and the Log Transfer Process running on the PMD Agent
side sends the required log data to the requesting SVC. The Log
Transfer Process is a multi-threaded application using POSIX threads
and the NW-Base APIs. It was developed in C on HP-UX 10.x and ported
from HP-UX to Solaris 2.x.
PMD Manager SGM Module & Maintenance of PMD module
Performed analysis, design, coding, and testing on the UNIX side. The
main role was to incorporate various modifications and fix bugs on the
PM&D Manager side as reported by the client at the customer site. This
was done in C on HP-UX 10.x.
Visa Status: Green Card holder