
Data Project

Location: Bengaluru, KA, 560001, India
Posted: September 18, 2016


Resume:

Name: Radhakrishnan R
Contact No: 998*******
E-mail: acwnrk@r.postjobfree.com
Total Exp: 1.9 Years
Communication Skills: 3.5/5
Current Organization: MAVA Technologies
Current Location: Hyderabad
Highest Degree: B.E.
Notice Period: 20 Days

Professional Summary

Working as a Software Engineer at MAVA Technologies, Bangalore.

10 months of experience with Hadoop and its components such as HDFS, Hive, and Pig.

A total of 1.9 years of work experience in design, development, and maintenance on Hadoop and mainframe technologies.

Proficient in building Hive and Pig Latin scripts.

Involved in converting raw documents to text documents for processing in the Hadoop environment.

Analyzed XML and JSON data by writing Pig scripts (an illustrative sketch follows this summary).

Experience with large-scale mainframe applications and their maintenance and enhancement using VSAM and DB2.

Excellent problem-solving and analytical skills, with good communication and interpersonal skills.
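
As an illustration of the Pig-based XML/JSON analysis mentioned above, a minimal Pig Latin sketch is given below. The path, loader, field names, and schema are assumptions made for the example, not artifacts of any actual project.

-- Hypothetical sketch: summarise JSON records with Pig
-- (path and schema are illustrative; a JSON loader such as Pig's built-in JsonLoader is assumed)
records  = LOAD '/data/raw/events.json'
           USING JsonLoader('vehicle:chararray, rating:int, comment:chararray');
rated    = FILTER records BY rating IS NOT NULL;   -- drop records without a rating
by_model = GROUP rated BY vehicle;                 -- one group per vehicle model
summary  = FOREACH by_model GENERATE
               group AS vehicle,
               AVG(rated.rating) AS avg_rating,
               COUNT(rated) AS num_records;
STORE summary INTO '/data/out/rating_summary' USING PigStorage(',');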

Skills

Operating systems: Linux, Windows, MVS, z/OS

Environment: Linux, HDFS

Languages: COBOL

Databases: MySQL, HBase (NoSQL), DB2

Hadoop: HDFS, Hive, Pig, MapReduce, Sqoop, Oozie

Frameworks: Hadoop

Project Profiles

Project Name: Chrysler Group LLC

Client: TCS

Environment: MAINFRAME

Role: Developer

Duration: 1 year

Description:

The project is fully automated for Chrysler, one of the leading vehicle manufacturers in the automobile industry. Chrysler releases multiple product lines every year and has manufacturing units and assembly plants in countries such as the US, Canada, and Mexico, and across the globe, so manufactured parts are transferred between assembly plants in different countries by ocean, rail, and air, with border crossings between countries handled by the business teams. The applications also cover after-sales services for individual customers, including warranty and replacements, as well as commercial, mortgage, and consumer services. After Chrysler merged with Fiat, the existing system had to be upgraded: applications needed to be enhanced, and customer and parts data had to be converted so that customer data could be identified consistently across all applications. The enhanced application was compared with the old one and tested for successful execution.

Responsibilities:

Requirement Analysis.

Coding, code review and documentation.

Handling change requests.

Preparing unit tests.

Coordinated communication between QA and development team.

Analyzing and understanding the client requirements and problems.

Modifying existing code as per the client's requirements.

Impact Analysis.

Project Name: Chrysler Distributed Network System

Client: TCS

Environment: HADOOP

Role: Developer

Duration: Sep 2015 to date

Description:

The purpose of this project is to store gigabytes of transactional log information generated by the system. These log files are produced on a monthly basis and have to be parsed according to a set of rules defined for the various formats. Initially the logs were loaded into a database, and retrieving them by these rules was a time-consuming activity; with a solution based on open-source big data technology (Hadoop), we reduced the time taken for the whole process. Data is stored in the Hadoop Distributed File System and processed using MapReduce, Hive, and Pig scripts. This in turn includes getting the data from MySQL and DB2, processing the files to obtain analysed information from all the datasets, extracting various reports from this information, and exporting the information for further processing, along with sentiment analysis on released vehicles. This helps meet the client's requirement of increasing revenue by delivering comprehensive data and advanced analytics.
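
As an illustration of the flow described above, a minimal Pig Latin sketch of one such parsing rule follows. The HDFS paths, the log layout, and the "device=" pattern are assumptions made for the example, not the project's actual rules.

-- Hypothetical sketch of one log-parsing rule (paths and log layout are assumed)
raw     = LOAD '/logs/monthly/*.log' USING TextLoader() AS (line:chararray);
errors  = FILTER raw BY line MATCHES '.*ERROR.*';   -- keep only error lines
tagged  = FOREACH errors GENERATE
              REGEX_EXTRACT(line, 'device=(\\S+)', 1) AS device;
by_dev  = GROUP tagged BY device;
report  = FOREACH by_dev GENERATE group AS device, COUNT(tagged) AS error_count;
STORE report INTO '/reports/device_errors' USING PigStorage('\t');   -- consumed by downstream reporting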

Responsibilities:

Analyzed the requirements to set up a cluster.

Prepared Design document specifications.

Moved all log files generated by various network devices into HDFS location.

Developed Sqoop scripts to enable interaction between Hive, Pig, and the MySQL database.

Performed sentiment analysis and generated reports (a small Pig Latin sketch of one common approach follows this list).

Involved in developing the Pig scripts.

Involved in creating external Hive tables on top of the parsed data.

Prepared Unit Test documents and code review documents.
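
The sentiment analysis mentioned above can, for example, be done by joining comment words against a scored word list; the Pig Latin sketch below shows that common approach. All paths, field names, and the word-list file are assumptions for illustration, not the project's actual assets.

-- Hypothetical word-list sentiment sketch (inputs and fields are assumed)
comments  = LOAD '/data/vehicle_comments' USING PigStorage('\t')
            AS (vehicle:chararray, comment:chararray);
wordlist  = LOAD '/data/sentiment_words' USING PigStorage('\t')
            AS (word:chararray, score:int);
words     = FOREACH comments GENERATE vehicle,
                FLATTEN(TOKENIZE(LOWER(comment))) AS word;   -- one row per word
scored    = JOIN words BY word, wordlist BY word;            -- keep only words in the list
by_model  = GROUP scored BY words::vehicle;
sentiment = FOREACH by_model GENERATE group AS vehicle,
                SUM(scored.wordlist::score) AS sentiment_score;
STORE sentiment INTO '/reports/vehicle_sentiment' USING PigStorage('\t');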


