
Data Manager

Location:
Hoffman Estates, IL
Posted:
February 22, 2017


Resume:

Apeksha Mahendra Khatik

+1-630-***-****

acyyn5@r.postjobfree.com

Professional Summary

5.9 years of total IT experience in the development of various applications using Hadoop and Java technologies.

Strong web application development experience using Core and Advanced Java.

Extensive experience across the complete project life cycle (design, development, testing, and implementation) of web applications.

Strong knowledge of Hadoop cluster architecture and components such as HDFS, JobTracker, TaskTracker, NameNode, and DataNode, and of the MapReduce programming paradigm.

Experienced with Hadoop shell commands and Bash shell scripting.

Experienced in writing MapReduce logic using Java, Ruby, and Pig.

Involved in migrating application data from various databases into Hadoop.

Good understanding and knowledge of NoSQL databases such as MongoDB.

Experience importing and exporting data between HDFS and relational database systems using Sqoop.

Experience working with DB2, MySQL, and Teradata databases.

Scripted monitors, checks, and notification alerts for critical systems.

Hands-on experience in application development using Java, RDBMS, and Linux shell scripting.

Experienced in developing and implementing MapReduce jobs using Java and Pig to process and analyze large datasets.

Experience with performance tuning of Pig and Hive jobs.

Successfully managed multiple priorities, and performed under pressure in a fast-paced, rapidly changing environment.

Capable of quickly acquiring new skills and learning the systems that support the business.

Strong communication skills and effective personal interactions combined with technical expertise.

Excellent ability to understand others' code and modify it to suit new requirements.

Highly analytical, motivated and result oriented.
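The MapReduce paradigm referenced above can be illustrated, independently of any Hadoop cluster, by a Unix pipeline that mirrors the map, shuffle, and reduce stages (a sketch of the pattern only, not one of the actual jobs):

```shell
# map:     emit one word per line
# shuffle: sort brings identical keys together
# reduce:  uniq -c counts each key group
printf 'price item price\nitem price\n' \
  | tr -s ' ' '\n' \
  | sort \
  | uniq -c \
  | awk '{print $2, $1}'
# prints:
#   item 2
#   price 3
```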

Employment History:

Sears Global Technology Services LLC, USA

20 March 2014 – Present

Technical Associate

Sears Holding Pvt Ltd, India

31 March 2011 – 19 March 2014

Technical Associate

Technical Profile:

Languages

Pig, Hive, Ruby, Java, SQL, HTML, Unix Shell, jQuery, JavaScript, JSP, Ajax, XML, JSON, UML, PL/SQL

RDBMS

MySQL, Oracle, Teradata, DB2, MS SQL Server

Technology / Service

Hadoop, Java, Struts 2, MongoDB

Tools

Eclipse, Control-M, PuTTY, WinSCP, Teradata SQL Assistant, Apache Tomcat 6.0, SVN, MS Office, SQLyog, JBoss

Operating Systems

Windows Family, Linux, UNIX

Academic Profile:

Bachelor of Engineering in Information Technology

S.R.E.S College of Engineering – Kopargaon, Maharashtra.

Certification:

Oracle Certified Professional, Java SE 6 Programmer

Seed Certified Java Professional

Project Profile:

Project # 1: Dynamic Pricing Application

Hadoop, Apache Pig, Teradata FastExport utility, Bash shell scripting, Sqoop, MySQL Server, Hive, Ruby, Control-M

Client: Sears USA.

Role: Hadoop Developer

Responsibilities:

Implemented business logic in Pig scripts with Pig UDFs.

Implemented Sqoop import and export jobs to extract data from and load data into MySQL.

Implemented FastExport scripts to extract data from Teradata tables.

Developed multiple Pig jobs for data cleansing and processing.

Tuned jobs for optimal performance when processing large datasets.

Built reusable Pig UDF libraries for business requirements, enabling users to invoke these UDFs from Pig commands.

Imported data processed by Pig scripts into the Hive warehouse, enabling business analysts to write Hive queries.

Configured big data workflows to run on top of Hadoop using Control-M; these workflows comprise heterogeneous jobs such as Pig, Hive, Sqoop, and MapReduce.

Developed a suite of unit test cases for Pig and Hive jobs.

Developed Control-M workflows to automate loading data into HDFS and preprocessing it with Pig.

Used Hive to analyze and validate the output.

Prepared the technical documentation for this project.

Performed bug fixing and 24x7 production support.
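The Pig-with-UDF pattern described in the responsibilities above can be sketched as follows; the jar name, class name, and paths are illustrative, not taken from the actual project:

```pig
-- Register the team's reusable UDF jar and alias one of its functions
REGISTER pricing-udfs.jar;
DEFINE NormalizePrice com.example.udf.NormalizePrice();

-- Load raw data, cleanse it, and apply the UDF
raw     = LOAD '/data/prices' USING PigStorage('\t') AS (item:chararray, price:chararray);
cleaned = FILTER raw BY price IS NOT NULL;
priced  = FOREACH cleaned GENERATE item, NormalizePrice(price) AS price;
STORE priced INTO '/data/prices_clean' USING PigStorage('\t');
```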

Description:

The Dynamic Pricing application generates dynamic price recommendations for items and sends the prices to online and physical stores. These recommendations are based on rules created by business users and on data drawn from several other databases.
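The Sqoop import/export work mentioned above might look like the following; the connect strings, table names, and HDFS paths are hypothetical, and a configured Hadoop/Sqoop installation is assumed:

```shell
# Import a MySQL table into HDFS
sqoop import \
  --connect jdbc:mysql://dbhost/pricing \
  --username etl_user -P \
  --table item_prices \
  --target-dir /data/item_prices \
  --num-mappers 4

# Export processed results from HDFS back into MySQL
sqoop export \
  --connect jdbc:mysql://dbhost/pricing \
  --username etl_user -P \
  --table price_recommendations \
  --export-dir /data/price_recs
```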

Project # 2: Pricing Hub Application

J2EE, Struts 2.0, Pig, Bash shell, MySQL import/export, Sqoop import, Teradata, Control-M

Client: Sears USA.

Role: Hadoop Developer

Responsibilities:

Responsible for the design, development, maintenance, and documentation of business logic as well as the UI.

Performed unit and system testing of application code and executed implementation activities.

Implemented business logic in Pig scripts with Pig UDFs.

Implemented the MySQL import and export utilities to extract data from and load data into MySQL.

Implemented FastExport scripts to extract data from Teradata tables.

Imported data processed by Pig scripts into the Hive warehouse, enabling business analysts to write Hive queries.

Configured big data workflows to run on top of Hadoop using Control-M; these workflows comprise heterogeneous jobs such as Pig, Hive, Sqoop, and MapReduce.

Developed a suite of unit test cases for Pig and Hive jobs.

Used Hive to analyze and validate the output.

Prepared technical document for this project.

Performed bug fixing and 24x7 production support.

Automated workflows using shell scripts.

Created Hive tables as required, including external tables defined with appropriate static and dynamic partitions for efficiency.

Description:

The Pricing Hub application is used to manually push prices to sears.com and Kmart.com. Once a business user uploads prices for sears.com and Kmart.com, a validation process running on Hadoop performs cost, price, and inventory validation and changes the item status to valid, so that the price for an item is sent to sears.com or Kmart.com. This gives users the ability to adjust prices based on competitor prices.
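The partitioned Hive tables described above can be sketched as follows; the table name, columns, and locations are illustrative only:

```sql
-- External table with a static/dynamic partition column for efficient pruning
CREATE EXTERNAL TABLE IF NOT EXISTS price_uploads (
  item_id STRING,
  price   DOUBLE,
  status  STRING
)
PARTITIONED BY (upload_dt STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/price_uploads';

-- Static partition: register one day's data explicitly
ALTER TABLE price_uploads ADD PARTITION (upload_dt='2017-02-22');

-- Dynamic partitioning: let Hive derive the partition from the data
SET hive.exec.dynamic.partition.mode=nonstrict;
INSERT OVERWRITE TABLE price_uploads PARTITION (upload_dt)
SELECT item_id, price, status, upload_dt FROM staging_price_uploads;
```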

Project # 3: Offer Manager Support UI

J2EE, Struts 2.0, JavaScript, Ajax, HTML, JSP, Bash shell, Sqoop import, Control-M

Client: Sears USA.

Role: Java and Hadoop Developer

Responsibilities:

Responsible for database design, coding, development, maintenance, and support of the application.

Performed unit and system testing of application code and executed implementation activities.

Led and mentored other developers in writing clean, high-quality solutions.

Analyzed and identified technical areas for improvement within applications.

Developed Hadoop jobs to refresh data on the cluster for batch processing.

Communicated gaps in business and technical functionality.

Description:

The Offer Manager Support application lets users create queries and generate reports to meet business needs. It provides the modules below, which help business users perform multiple activities to take corrective action on item prices.

1. Add/update/Search offers

2. Flag items to Resend

3. Generate back traffic files

4. Activity update


