
Hadoop Developer

Location:
Wilmington, DE
Posted:
March 03, 2015


DEVA REDDY

Summary

Around *+ years of professional IT experience, including 2+ years in Hadoop/Big Data ecosystem related technologies.

Excellent understanding/knowledge of Hadoop architecture and various components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.

Hands-on experience in installing, configuring, and using Hadoop ecosystem components like Hadoop MapReduce, HDFS, NoSQL, HBase, Oozie, Hive, Tableau, Sqoop, Pig, and Flume.

Worked extensively with the data visualization tool Tableau.

Experience in managing and reviewing Hadoop log files.

Expert in importing and exporting data into HDFS and Hive using Sqoop.

Experience in analyzing data using HiveQL, Pig Latin, HBase, and custom MapReduce programs in Java.

A deep and thorough understanding of ETL tools and how they can be applied in a Big Data environment.

Responsible for smooth, error-free configuration of the DWH-ETL solution and its integration with Hadoop.

Good understanding of cloud configuration in Amazon Web Services (AWS).

Extending Hive and Pig core functionality by writing custom UDFs (a brief UDF sketch follows this summary).

Experience in installing, configuring, supporting, and managing Cloudera's Hadoop platform, along with CDH3 and CDH4 clusters.

Hands-on experience in application development using Java, RDBMS, and Linux shell scripting.

Worked on multiple stages of the Software Development Life Cycle, including Development, Component Integration, Performance Testing, Deployment, and Support/Maintenance.

Knowledge of UNIX and shell scripting.

Flair for adapting to new software applications and products; a self-starter with excellent communication skills and a good understanding of business workflow.

Working knowledge of SQL, PL/SQL, MS SQL Server 2012, stored procedures, functions, packages, DB triggers, and indexes.
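
The custom UDF work mentioned above can be pictured with a minimal sketch. This is not code from the projects described here; the class name and behavior are hypothetical, and it assumes the classic org.apache.hadoop.hive.ql.exec.UDF base class available in a CDH3/4-era Hive deployment.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical Hive UDF: normalizes a free-text code column before aggregation.
    public final class NormalizeCode extends UDF {
        public Text evaluate(final Text input) {
            if (input == null) {
                return null; // pass Hive NULLs through unchanged
            }
            // Trim whitespace and upper-case the value, e.g. " de " -> "DE".
            return new Text(input.toString().trim().toUpperCase());
        }
    }

Once packaged into a jar, a function like this would typically be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being used in HiveQL.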

Technical Skills:

Big Data Technologies: Hadoop, HDFS, Hive, MapReduce, Pig, Sqoop, Flume, ZooKeeper, Impala, NoSQL, HBase, Cloudera Distribution, and Hue

DB Languages: SQL, PL/SQL, Oracle, MS SQL Server 2012; working knowledge of MongoDB

Operating Systems: UNIX, Linux, Windows.

Programming Languages and Tools: Java, JavaScript, RESTful web services, HTML, Python, and Eclipse.

Data Visualization Tool: Tableau

Professional Experience

Client: Barclays Mar 2014 – Present

Wilmington, DE

Hadoop Developer/ Visualization Analyst

Project Description: The aim of the project is to build risk profiles for customers of a bank that has multiple consumer lines of business, by analyzing customer activity across multiple products to predict credit risk with greater accuracy.

Responsibilities:

Involved in the design and development phases of the Software Development Life Cycle (SDLC) using the Scrum methodology.

Developed a data pipeline using Flume, Sqoop, Pig, and Java MapReduce to ingest customer behavioral data and purchase histories into HDFS for analysis (a minimal MapReduce sketch follows this list).

Gathered functional and non-functional requirements.

Exported the analyzed and processed data to relational databases using Sqoop, for visualization and report generation by the BI team.

Analyzed large data sets to determine the optimal way to aggregate and report on them.

Imported and exported data into HDFS and Hive using Sqoop.

Used Pig as an ETL tool to perform transformations, event joins, filters, and some pre-aggregations before storing the data in HDFS.

Optimized MapReduce code and Pig scripts; performed user interface analysis, performance tuning, and analysis.

Performed analysis with the data visualization tool Tableau.

Wrote Pig scripts for data processing.

Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting on the dashboard.

Loaded the aggregated data onto DB2 for reporting on the dashboard.
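
As a rough illustration of the Java MapReduce work referenced above, the sketch below counts purchase events per customer ID from delimited records already landed in HDFS. It is not the project's actual code: the class names, field layout, and pipe delimiter are assumptions, and it targets the standard org.apache.hadoop.mapreduce API.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Hypothetical job: counts purchase events per customer ID from delimited records in HDFS.
    public class PurchaseCountJob {

        public static class PurchaseMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
            private static final LongWritable ONE = new LongWritable(1);
            private final Text customerId = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                // Assumes pipe-delimited records with the customer ID in the first field.
                String[] fields = value.toString().split("\\|");
                if (fields.length > 0 && !fields[0].isEmpty()) {
                    customerId.set(fields[0]);
                    context.write(customerId, ONE);
                }
            }
        }

        public static class SumReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
            @Override
            protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                    throws IOException, InterruptedException {
                long total = 0;
                for (LongWritable v : values) {
                    total += v.get();
                }
                context.write(key, new LongWritable(total));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "purchase-count");
            job.setJarByClass(PurchaseCountJob.class);
            job.setMapperClass(PurchaseMapper.class);
            job.setCombinerClass(SumReducer.class); // partial sums on the map side
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(LongWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

A job of this shape would typically be bundled into a jar, submitted with the hadoop jar command, and its output directory then exposed to the BI team through an external Hive table or exported with Sqoop.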

Environment: JDK 1.6, Linux, Python, Java, Agile, RESTful Web Services, HDFS, MapReduce, Hive, Pig, Sqoop, Flume, ZooKeeper, Oozie, DB2, NoSQL, HBase, and Tableau.

Project: Hartford Insurance Dec 2012 – Feb 2014

Hartford, CT

Hadoop Developer

Project Description: This project is intended to replace existing mainframe legacy applications by storing and processing the data of the Billing, Payments, and Disbursements application databases entirely in HDFS. All processing in HDFS is done through Pig, Hive, and MapReduce programs. The work includes enhancing performance using Hadoop sub-projects such as Pig, Hive, and Flume; performing data migration from legacy systems using Sqoop; handling performance tuning; conducting regular backups; and ensuring technical and functional designs meet business requirements. We work with various data sources such as RDBMS, flat files, fixed-length files, delimited files, and legacy file formats.

Responsibilities:

Worked on evaluation and analysis of the Hadoop cluster and different big data analytics tools, including Pig, the HBase database, and Sqoop.

Responsible for building scalable distributed data solutions using Hadoop.

Involved in loading data from the Linux file system to the Hadoop Distributed File System.

Created HBase tables to store various formats of PII data coming from different portfolios (a client API sketch follows this list).

Experience in managing and reviewing Hadoop log files.

Exported the analyzed and processed data to relational databases using Sqoop, for visualization and report generation by the BI team.

Installed the Oozie workflow engine to run multiple Hive and Pig jobs.

Analyzed large data sets to determine the optimal way to aggregate and report on them.

Worked with the Data Science team to gather requirements for various data mining projects.

Analyzed large data sets by running Hive queries and Pig scripts.

Created dashboards using Tableau to analyze data for reporting.

Supported setting up the QA environment and updating configurations for implementation scripts with Pig and Sqoop.
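
The HBase table work mentioned above (the PII tables bullet) might look roughly like the sketch below. This is an illustration only: the table name, column family, row-key scheme, and values are hypothetical, and it uses the current HBase Java client API (with Java 7+ try-with-resources) rather than whatever client version the cluster actually ran.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    // Hypothetical example: writing one masked PII record into an HBase table
    // keyed by portfolio and account ID. Table and column names are illustrative.
    public class PiiRecordWriter {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            try (Connection connection = ConnectionFactory.createConnection(conf);
                 Table table = connection.getTable(TableName.valueOf("billing_pii"))) {
                // Row key combines portfolio and account ID for even key distribution.
                Put put = new Put(Bytes.toBytes("portfolioA|0001234567"));
                put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("name_hash"), Bytes.toBytes("hashed-value"));
                put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("source"), Bytes.toBytes("payments"));
                table.put(put);
            }
        }
    }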

Environment: Hadoop, HDFS, Pig, Sqoop, HBase, Shell Scripting, Red Hat Linux.

Client: GE, MA May 2011 – Nov 2012

Project: Money Home Lending

Project Description: GE Money Home Lending UK deals with home loans and secured loans. Involved in the development of solutions to various requirements through the complete SDLC, and worked with the QA team using HP Quality Center during integration and UAT testing.

Responsibilities:

Collected, understood, and transmitted the business requirements for the project.

Used agile development methodology.

Involved in each phase of the SDLC to ensure smooth and proper functioning of the project.

Retrieved source data using SQL for data analysis.

Performed User Acceptance Testing.

Developed business flow diagrams, data flow diagrams, activity diagrams, and use case diagrams using MS Visio.

Environment: MS Office 2007 (Word, PowerPoint, Excel), MS Visio, Agile, Core Java (1.4), Oracle, and Linux.

Project: Salam International, India Feb 2010 – Apr 2011

Role: Java Developer

Project Description: Salam International Investment Limited is a listed public Qatari shareholding company. The company has operations in diverse business sectors: technology and communications, construction and development, luxury and consumer products, energy and industry, and investments and real estate. As one of Qatar's largest and most established conglomerates, the company owns and manages over thirty-three business units.

Responsibilities:

Developed the system by following the agile methodology.

Involved in the implementation of the design through the vital phases of the Software Development Life Cycle (SDLC), including development, testing, implementation, and maintenance support.

Applied OOAD principles for the analysis and design of the system.

Used WebSphere Application Server to deploy the build.

Developed front-end screens using JSP, HTML, jQuery, JavaScript, and CSS.

Used Spring Framework for developing business objects.

Performed data validation in Struts Form beans and Action Classes.

Used Eclipse for the Development, Testing and Debugging of the application.

SQL Developer was used as a database client.

Used WinSCP to transfer files from the local system to other systems.

Used Rational ClearQuest for defect logging and issue tracking.

Environment: jQuery, JSP, Servlets, JSF, JDBC, HTML, JUnit, JavaScript, XML, SQL, Maven, RESTful Web Services, UML.

Project: Differential Pricing

Client: Hong Leong Bank, Hong Kong, India Oct 2008 – Jan 2010

Role: SQL Developer

Project Description: This project deals with maintaining a price from a particular date for a given amount and tenure. It ensures that customers investing that amount or above, and for that tenure or above, are eligible for an extra rate of interest on top of their original interest for the available tenure.

It is used to maintain the differential price for value-added customers based on amount and tenure, effective from the date the maintenance is set up.

A record is logged in the maintenance table automatically whenever the amount range and tenure range are entered. This saves the user from maintaining individual amounts and tenures when a large number of maintenance entries fall within a range.

Responsibilities:

Involved in full life cycle development in a distributed environment using UNIX, Oracle 9i, and shell scripting.

Involved in database design.

Created tables and stored procedures in SQL for data manipulation and retrieval; performed database modifications using SQL, PL/SQL, stored procedures, triggers, and views in Oracle 9i.

Responsible for mentoring and working with team members to ensure standards and guidelines are followed and tasks are delivered on time.

Involved in health checks of UNIX servers.

Performed database health checks.

Diagnosed and monitored alert log files.

Dealt with space management issues.

Worked on both L1 and L2 support.

Environment: SQL, PL/SQL, UNIX, Shell Script.

Project: Bank of India, India Nov 2007 – Sep 2008

Role: Java Developer

Project Description: Bank of India (BoI) is an Indian state-owned commercial bank with headquarters in Mumbai, Maharashtra, India. BoI is a founding member of the Society for Worldwide Interbank Financial Telecommunication, which facilitates the provision of cost-effective financial processing and communication services.

Responsibilities:

Gathered and analyzed user/business requirements and developed system test plans.

Managed the project using Test Director, added test categories and test details.

Involved in using various PeopleSoft Modules.

Manually executed test cases to verify the expected results.

Created recovery scenarios for application exception handling using the recovery manager.

Involved in gap analysis of the use cases and requirements.

Developed test scenarios for test automation.

Environment: Windows 98, Java 1.4, C, C++, JSP, Servlets, J2EE, PHP, multithreading, OO design, JDBC, HTML, RAD, WSAD.

Education

Master of Business Administration (MBA), Andhra University, India.


