
Data Engineer

Location: Nashua, NH 03062

Naga Kolli

acgdx9@r.postjobfree.com

603-***-****

PROFESSIONAL SUMMARY:

o 10+ years of professional experience in the IT industry as a Hadoop, Java & Mainframe developer.

o Experienced and specialized in handling various projects in the Banking and Retail domains.

o Proficient in the areas of design, implementation, integration, tuning, project documentation and providing IT solutions as per requirements.

o Proficient in handling big data using the Hadoop architecture: HDFS, MapReduce, HBase and ecosystem tools like Hive, Pig, Flume, Oozie & Sqoop.

o Working experience with the Cloudera Hadoop distribution (CDH4).

o Expertise in writing MapReduce jobs in Java (see the sketch after this list).

o Expertise in using Hadoop and its ecosystem command-line tools.

o Skilled in migrating data from Oracle to Hadoop HDFS using Sqoop.

o Proficient in processing data with Apache Pig by registering User Defined Functions (UDFs) written in Java.

o Skilled in scheduling recurring Hadoop jobs using Apache Oozie workflows.

o Proficient in designing and querying NoSQL databases like HBase.

o Skilled in RDBMS concepts, with very good hands-on experience on DB systems like Oracle & MySQL.

o Proficient in advanced UNIX concepts, with working experience in advanced scripting/programming.

o Development and working exposure to software configuration management tools like Concurrent Versions System (CVS).

o Ability to design and develop Extract Transform Load (ETL) packages in Oracle for manipulating data as per project requirements.

o Worked on different flavors of UNIX systems like Solaris, Red Hat and Ubuntu.

o Excellent customer management/resolution, problem-solving and debugging skills, with good verbal/written communication and presentation skills.
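
The MapReduce work mentioned above can be illustrated with a minimal, hedged sketch of a Java job that counts records per key; the class names and comma-separated field layout are hypothetical examples, not code from the projects below.

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    // Hypothetical example: count input records per id found in the first
    // comma-separated field of each line.
    public class RecordCount {

        public static class IdMapper
                extends Mapper<LongWritable, Text, Text, LongWritable> {
            private static final LongWritable ONE = new LongWritable(1);
            private final Text id = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split(",");
                if (fields.length > 0 && !fields[0].isEmpty()) {
                    id.set(fields[0]);
                    context.write(id, ONE); // emit (id, 1) for each record
                }
            }
        }

        public static class SumReducer
                extends Reducer<Text, LongWritable, Text, LongWritable> {
            @Override
            protected void reduce(Text key, Iterable<LongWritable> values,
                    Context context) throws IOException, InterruptedException {
                long sum = 0;
                for (LongWritable v : values) {
                    sum += v.get(); // add up the 1s emitted by the mapper
                }
                context.write(key, new LongWritable(sum));
            }
        }
    }

A driver class would configure the Job, set these classes and the input/output paths, and submit it to the cluster.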

EDUCATION:

• M.C.A. from the University of Madras, with a grade of First Class.

TECHNICAL SKILLS:

Operating Systems : RHEL, Ubuntu, Mac OS X

Big Data Tools : Cloudera CDH3, CDH4

Hadoop Ecosystem : HBase, Hive, Pig, Flume, Oozie, Sqoop

Programming Languages : Java, COBOL, JCL

Databases : MySQL, Oracle, DB2, VSAM, IMS-DB

Documentation Tools : MS Word, MS Excel, MS PowerPoint

Hadoop Development Engineer July 2013 – Sep 2014

Project: Audience Measurement (AM)

Client: Union Leader, Manchester, NH

Roles & Responsibilities:

• Analyzed the functional specs provided by the client and developed a detailed solution design document with the architect and the team.

• Worked with the client business teams to confirm the solution design and change the requirements where needed.

• Used the Hadoop architecture with MapReduce functionality and its ecosystem to solve the customer requirements on the Cloudera Distribution for Hadoop (CDH4).

• Streamed the data from the client's switches using Flume.

• Parsed the data using MapReduce functions written in Java as per the solution design, and processed the information per subscriber.

• Used Pig Latin scripts to further process the data and stored it in HBase via the Java API (see the HBase sketch after this section).

• Wrote custom User Defined Functions (UDFs) in Java and Perl to ease the processing in Pig (see the UDF sketch after this list).

• Loaded the data from HBase into Hive for querying.

• Migrated the data from Hive to Oracle using Sqoop.

• Created partitions and buckets (clusters) in Hive.

• Wrote recurring workflows using Oozie to automate the scheduling flow.

• Addressed issues arising from the huge volume of data and transactions.

• Migrated database objects from previous versions to the latest releases using the latest Data Pump methodologies when the solution was upgraded.

• Addressed production issues and fixed priority JIRA tickets.

• Wrote the design and technical documentation of the solution for the client.

• Developed the solution on the Fusion Works platform for change requests and for fixing production issues.

• Worked with the DBA & production support teams to implement production changes.
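
To illustrate the Pig UDF work above, here is a minimal sketch of an EvalFunc-style UDF in Java; the class name and the normalization it performs are hypothetical, not the project's actual functions.

    import java.io.IOException;
    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    // Hypothetical UDF: trim and upper-case a raw identifier so that
    // downstream Pig joins and groupings are case-insensitive.
    public class NormalizeId extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null; // Pig treats a null return as "no value"
            }
            return input.get(0).toString().trim().toUpperCase();
        }
    }

In a Pig script, the jar containing this class would be brought in with REGISTER and the function invoked as NormalizeId(field) inside a FOREACH ... GENERATE.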

Environment: MapReduce, HDFS, Sqoop, Linux, Oozie, Hadoop, Pig, Hive, HBase, Hadoop cluster.
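
As a hedged illustration of storing processed records in HBase through the Java API, here is a minimal sketch using the classic HTable client of the CDH4 era; the table, column family, qualifier and row key are hypothetical placeholders.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class StoreUsage {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            HTable table = new HTable(conf, "subscriber_usage"); // placeholder table
            try {
                Put put = new Put(Bytes.toBytes("subscriber-42")); // placeholder row key
                // Column family "d" and qualifier "bytes_used" are illustrative names.
                put.add(Bytes.toBytes("d"), Bytes.toBytes("bytes_used"),
                        Bytes.toBytes("1048576"));
                table.put(put); // write one cell to the table
            } finally {
                table.close();
            }
        }
    }

Later HBase client releases replace HTable with Connection/Table and Put.add with Put.addColumn, so the exact calls depend on the version in use.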

Hadoop Development Engineer July 2012 – Jun 2013

Client: Eaton & Berube Insurance – Nashua, NH

Description: The Retail Insurance Risk Analysis team deals with retail data drawn from multiple sources (home loans, student loans, auto loans) and determines credit risk using applications built on collateral logic, guarantor logic, etc.

Roles & Responsibilities:

• Developed a data pipeline using Sqoop, Pig and Java MapReduce to ingest customer behavioral data and financial histories into HDFS for analysis.

• Moved all crawl-data flat files generated from various retailers into HDFS for further processing.

• Wrote Apache Pig scripts to process the HDFS data.

• Created Hive tables to store the processed results in a tabular format.

• Developed Sqoop scripts to enable the interaction between Pig and the MySQL database (see the Sqoop sketch after this section).

• Involved in resolving Hadoop-related JIRAs.

• Developed UNIX shell scripts for creating reports from the Hive data.

• Fully involved in the requirement analysis phase.

Environment: Hadoop, MapReduce, Hive, HDFS, Pig, Sqoop, Oozie, Cloudera CDH3, HBase, ZooKeeper, MongoDB, Oracle, NoSQL and Unix/Linux.
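
The Sqoop interaction mentioned above can be sketched, under stated assumptions, by invoking Sqoop 1's tool programmatically from Java; the JDBC URL, username, table and HDFS directory below are hypothetical placeholders, and in practice the equivalent command line is more common.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.sqoop.Sqoop;

    public class ExportToMySql {
        public static void main(String[] args) {
            // Hypothetical export: push Pig output sitting in HDFS into MySQL.
            String[] sqoopArgs = {
                "export",
                "--connect", "jdbc:mysql://dbhost/retail",  // placeholder URL
                "--username", "etl_user",                   // placeholder user
                "--table", "credit_risk_scores",            // placeholder table
                "--export-dir", "/user/etl/pig_output",     // placeholder HDFS dir
                "--input-fields-terminated-by", "\t"
            };
            int ret = Sqoop.runTool(sqoopArgs, new Configuration());
            System.exit(ret);
        }
    }

Credentials would normally be supplied with -P or --password-file rather than hard-coded in the argument list.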

Java Developer

Tech Mahindra Private Limited, Hyderabad, India Nov 2007 – May 2012

Project: Non Ledger Application, National Australia Bank, Australia

Account Services is a non-ledger-based system for National Australia Bank, composed of both batch and online components. The batch component mainly performs value posting, service processing and reporting. The online component works with various front ends such as eBOBS, Siebel, ERS, IVR, Internet Banking and ACAPS. This is the core system of NAB, handling the daily transactions of customer and business banking.

Roles & Responsibilities:

• Responsible for gathering business and functional requirements for the development and support of in-house and vendor-developed applications.

• Gathered and analyzed information for developing, supporting and modifying existing web applications based on prioritized business needs.

• Played a key role in the design and development of a new application using J2EE, Servlets and Spring, following a Service Oriented Architecture (SOA).

• Wrote Action classes, Request Processors, Business Delegates, Business Objects, Service classes and JSP pages.

• Played a key role in designing the presentation-tier components by customizing Spring framework components, including configuring web modules, request processors, error-handling components, etc.

• Implemented Web Services functionality in the application to allow external applications to access data.

• Worked on Spring to develop different modules that help the product handle different requirements.

• Developed validation using Spring's Validator interface, and used Spring Core and Spring MVC to develop the application and access data (see the Validator sketch after this section).

• Implemented Spring beans using IoC and transaction-management features to handle the transactions and business logic.

• Designed and developed different PL/SQL blocks and stored procedures in the DB2 database.

• Involved in writing the DAO layer using Hibernate to access the database.

• Involved in deploying and testing the application on WebSphere Application Server.

• Developed and implemented several test cases using the JUnit framework.

• Involved in troubleshooting technical issues, conducting code reviews and enforcing best practices.

Environment: Oracle 11g, J2EE, Java, JDBC, Servlets, JSP, XML, Design Patterns, CSS, HTML, JavaScript 1.2, JUnit, MySQL Server 2008.
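
The validation work mentioned above can be illustrated with a minimal, hedged sketch of Spring's Validator interface; the Account class and its field are hypothetical stand-ins, not the bank's actual domain model.

    import org.springframework.validation.Errors;
    import org.springframework.validation.ValidationUtils;
    import org.springframework.validation.Validator;

    // Hypothetical domain object used only for this sketch.
    class Account {
        private String accountNumber;
        public String getAccountNumber() { return accountNumber; }
        public void setAccountNumber(String n) { this.accountNumber = n; }
    }

    public class AccountValidator implements Validator {
        @Override
        public boolean supports(Class<?> clazz) {
            return Account.class.isAssignableFrom(clazz);
        }

        @Override
        public void validate(Object target, Errors errors) {
            // Reject empty account numbers with a message-resource code.
            ValidationUtils.rejectIfEmptyOrWhitespace(
                    errors, "accountNumber", "account.number.required");
            Account account = (Account) target;
            if (account.getAccountNumber() != null
                    && !account.getAccountNumber().matches("\\d{6,12}")) {
                errors.rejectValue("accountNumber", "account.number.invalid");
            }
        }
    }

A controller would typically invoke this validator through Spring's data-binding machinery and surface the error codes via message resources.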

Mainframe Programmer June 2003 – Oct 2007

EDZE Software Private Limited, Hyderabad, India

Project: Lockbox Application for M&T Bank, New York, USA

A lockbox is a post-office box set up for customers to receive their payments for processing. Different lockboxes exist for different customers, each with its own unique identification number.

Lockbox is a group of three sub-applications:

Wholesale Lockbox

Retail Lockbox

Tax Lockbox

Lockbox processing involves processing payments made to commercial entities, creating detail and summary reports, and making deposits of the payments. In some cases customer files of payments are also created.

Commercial customers allow M&T to access the post-office boxes where they have received payments. These payments are delivered to the wholesale or retail lockbox operations areas, where the envelopes are opened and reviewed according to individual customer specs.

Roles & Responsibilities:

• Created requirement specifications and estimations.

• Created high-level and low-level designs after thorough analysis.

• Handled relevant technical communication with OSCs and the client through telecons.

• Constructed programs based on program specifications.

• Designed, implemented and tested fixes (for bug reports) and changes (for change requests) to programs.

• Created and executed unit test plans.

• Documented all program construction and release documents, and captured any defects.

• Tested new enhancements and bug fixes before deployment into the production environment.

• Involved in preparing functional specifications, release notes and documentation.

Environment: COBOL, JCL, VSAM, CICS, DB2, IMS-DB, Easytrieve, ENDEVOR, SPUFI, FILE-AID, CA7, PANVALET.


