
Project Data

Location:
Hyderabad, Telangana, India
Posted:
February 15, 2016

Contact this candidate

Resume:

RADHIKA JANGAMPETA

+91-990******* E-Mail: actja5@r.postjobfree.com

Professional Summary:

* ***** ** ******* ** experience in Application Development in Java and Big Data Hadoop.

* ***** ** *xclusive experience in Hadoop and its components like HDFS, Map Reduce, Apache Pig, Hive, Sqoop.

Good knowledge of setting up a Hadoop cluster

Good working knowledge of MapReduce and Apache Pig

Involved in writing Pig scripts to reduce job execution time

Well experienced in designing and developing both server-side and client-side applications.

Excellent communication, interpersonal, and analytical skills, with a strong ability to perform as part of a team.

Exceptional ability to learn new concepts.

Hard working and enthusiastic.

Knowledge of HBase and Oozie.

Professional Experience:

Currently working as a Systems Analyst at AtoS India Pvt Ltd, Bangalore, since April 2013.

Technical Skills:

Languages

Java, HDFS, MapReduce, Pig, Sqoop, Hive, Flume.

Java IDEs

Eclipse.

Databases

MySQL (DDL, DML, DCL).

Design Skills

Object Oriented Analysis and Design (OOAD), UML.

Operating Systems

Windows 7, Windows XP, 2000, 2003, UNIX, and Linux

Project Details:

Project #1:

Project Name : Target – Web Intelligence

Client : Target Minneapolis, Minnesota, USA.

Environment : Hadoop, Apache Pig, Hive, Sqoop, Java, UNIX, MySQL

Duration : April 2013 to March 2015

Role : Hadoop Developer

Description:

This project involved rehosting Target's existing system on the Hadoop platform. Previously, Target used a MySQL database to store its competitor retailers' information (the crawled web data). Early on, Target had only four competitor retailers, such as Amazon.com and Walmart.com.

As the number of competitor retailers grew, the data generated by web crawling increased massively and could no longer be accommodated in a MySQL database. For this reason, Target wanted to move to Hadoop, which can handle massive amounts of data across its cluster nodes and satisfy the scaling needs of Target's business operations.

Roles and Responsibilities:

Moved all crawl data flat files generated to HDFS for further processing.

Wrote Apache Pig scripts to process the HDFS data.

Created Hive tables to store the processed results in a tabular format.

Developed Sqoop scripts to enable interaction between Pig and the MySQL database.

Involved in gathering requirements, design, development, and testing

Wrote script files for processing data and loading it to HDFS

Wrote CLI commands for managing files in HDFS.

Completely involved in the requirement analysis phase.

Analyzed the requirements to set up a cluster

Ensured NFS was configured for the NameNode

Moved all log/text files generated into HDFS location

Knowledge of MapReduce code that takes log files as input, parses the logs, and structures them in tabular format to facilitate effective querying of the log data

Created an external Hive table on top of the parsed data.
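The log-parsing flow described above (map raw log lines to key/value pairs, shuffle by key, then aggregate) can be simulated locally in a shell pipeline. This is a hypothetical miniature, not the project's actual code: the log format and field positions are assumed.

```shell
# Hypothetical log format assumed: "<date> <time> <method> <path> <status>"
printf '%s\n' \
  '2015-03-01 10:00:01 GET /product/123 200' \
  '2015-03-01 10:00:02 GET /product/456 404' \
  '2015-03-01 10:00:03 GET /product/123 200' > sample.log

# Map: emit (path, 1) pairs; shuffle: sort by key; reduce: sum counts per key
awk '{ print $4 "\t" 1 }' sample.log \
  | sort \
  | awk -F'\t' '{ c[$1] += $2 } END { for (k in c) print k "\t" c[k] }' \
  | sort
# prints one tab-separated line per path: /product/123 with 2, /product/456 with 1
```

On a real cluster the same shape would run as a Hadoop Streaming job or a Pig GROUP/COUNT script; here `sort` stands in for the shuffle phase.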

Project #2:

Project Name : Peripheral Quotation Generator System

Environment : Eclipse, Java, JavaScript, Oracle 9i

Duration : 4 months

Role : Java Programmer

Description :

A group of hardware vendors formed a network to withstand the prevailing fierce competition. These syndicate members are spread across the city to provide better accessibility and service to their customers. By becoming a syndicate, they can pool all the orders from customers in different parts of the city and make bulk purchases to gain larger margins/discounts from the main distributors. In turn, these profits can be shared with customers through more competitive prices and better service.

Responsibilities:

Developed the system to meet the SRS (Software Requirement Specification) and satisfy all the requirements of the system.

Demonstrated the system and installed it at the client's location after successful acceptance testing.

Submitted the required user manual describing the system interfaces, along with the system documentation.

Conducted any user training needed for using the system.

Maintained the system for a period of one year after installation.

Qualifications:

M.Tech in Software Engineering from JNTU in 2012 with 72%.
