Java Developer Project

Location: India
Posted: December 27, 2013

Resume:

NAGA VENKAT

Phone: 609-***-****  Email: ************@*****.***

Professional Summary

Around 12 years of IT experience, including 2 years in the Hadoop ecosystem, covering solution design, implementation, and development.

Worked extensively on the design, development, testing, and integration of applications using Java/J2EE technologies such as Struts, Hibernate, JSF, ExtJS, Spring MVC, JSP, Servlets, JDBC, EJB, Web Services, Oracle, and XML.

Key Highlights

- Cloudera Certified Developer for Apache Hadoop (CDH4).
- Experience in Hadoop cluster installation, capacity planning, and development on the Cloudera and Hortonworks Hadoop distributions.
- Experience in designing and setting up Hadoop cluster environments for new projects.
- Provided business solutions for Hadoop implementations.
- Expert in importing and exporting data into HDFS and Hive using Sqoop.
- Experienced in writing MapReduce programs and using the Apache Hadoop API for analyzing logs (a small HDFS-loading sketch follows this list).
- Expert in writing HiveQL queries and Pig Latin scripts.
- Experience in working with Flume to load log data from multiple sources directly into HDFS.
- Experience in data migration to Hadoop from existing data stores and mainframe NDM (Network Data Mover).
- Experience in upgrading existing Hadoop clusters to the latest releases.
- Experienced in using NFS (network file system) mounts for NameNode metadata backup.
- Experience in using Cloudera Manager 4.0 for installation and management of Hadoop clusters.
- Experience in working with BI teams to transform big-data requirements into Hadoop-centric technologies.
- Experience in performance-tuning Hadoop clusters by gathering and analyzing data on the existing infrastructure.
- Experience in automating Hadoop installation and configuration and in maintaining clusters using tools like Puppet.
- Good experience as part of a due-diligence team.
- Good experience in developing applications using Core Java, multithreading, and the J2EE technology stack.
- Experience in using code-review tools like Sonar and FindBugs and integration tools like Jenkins.
- Experience in writing build scripts (Ant, Debian packaging).
- Experience in test-driven development (JUnit, Mockito, PowerMock).
- Experience in client interactions in onsite-offshore delivery models; have worked with globally diversified teams.
- Define, communicate, and develop technical solutions end to end.
- Define the development tools and environment.
- Coach the development team in understanding and developing the technical architecture.
- Recommend development methodologies and frameworks for the project.
- Assist in removing roadblocks on the technical front (PoCs, code blocks, etc.).
- Ensure that all components of the technical architecture are properly integrated and implemented.
- Coordinate vendor services related to technology selection and implementation.
- Establish and enforce compliance with coding guidelines through code reviews and similar practices.
- Good experience in Agile development methodology.
- Certified Java Programmer, Six Sigma Green Belt, and Agile Scrum Master.
- Good experience in implementing Six Sigma processes.
- Create and proliferate templates for deliverables.
- Guide the team in doing PoCs and early risk assessments.
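
The following is a minimal, illustrative sketch of loading a local log file into HDFS with the Hadoop FileSystem API referenced above; the NameNode URI and both paths are assumptions for illustration, not values from any project below.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsLoader {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Hadoop 1.x property naming the NameNode; the URI is illustrative.
            conf.set("fs.default.name", "hdfs://namenode:8020");
            FileSystem fs = FileSystem.get(conf);
            // Copy a local log file into an HDFS landing directory.
            fs.copyFromLocalFile(new Path("/var/log/app/transactions.log"),
                                 new Path("/data/raw/logs/transactions.log"));
            fs.close();
        }
    }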

Technical Summary

Big Data Technologies: Hadoop, HDFS, MapReduce, Hive, Pig, HBase, Sqoop, Cassandra
Hadoop Distributions: Cloudera CDH3 & CDH4, Hortonworks
Languages: Java (J2SE & J2EE), Python, C, C++, SQL, PL/SQL
Technologies: EJB, JSP, JDBC, Servlets, JNDI, JavaScript, Ajax, Spring, ExtJS, JSF
Methodologies: UML, Agile (RUP, SCRUM)
ORM Technology: Hibernate
Frameworks: Jakarta Struts 1.x, Spring MVC, Spring AOP
Databases: Oracle 8i/9i, MySQL, HBase, MongoDB, Cassandra
Operating Systems: Windows 7/XP/9x/NT/2000, UNIX, Linux, Solaris
Tools: Ant, JUnit, log4j
IDEs: Eclipse, PL/SQL Developer, TOAD
Content Management Systems: Percussion, Drupal, FAST Content Management
Scripting/Markup Languages: HTML, DHTML, JavaScript
Web Services: SOAP, WSDL, Axis, JAX-RPC, JAXB, DOM, SAX, REST, JAX-WS

Certifications

- Sun Certified Java Programmer
- Cloudera Certified Developer for Apache Hadoop (CDH4)

Projects

Client : Sprint Nextel Corporation, Overland Park, KS    Nov 2012 to Present
Project : Point of Service (POS) Logging Analysis
Role : Technology Lead

Sprint POS is a system featuring the Sprint Point-of-Sale application, which runs in all Sprint retail stores. The software automates sales by enabling comprehensive functions vital to the Sprint retail marketplace, such as reporting, processing of returns, sales, discounts/exceptions, and receipt printing and configuration. There was a huge flow of log data from different servers recording the transactions at different stores nationwide. This POS analysis primarily works out the types of exceptions that occur frequently at the sites, reduces the overhead of those exceptions, and recommends the best solutions to the system.

Responsibilities:

- Involved in gathering the POS transactional logs and analyzing them to develop the MapReduce program.
- Installed and configured Hadoop and HDFS on Linux machines.
- Analyzed requirements from the customers and participated in Agile development.
- Experienced in writing MapReduce programs using various Apache APIs for reading and analyzing the transaction logs (a sketch follows this project).
- Involved in writing Hive queries for analysis of the structured data in the HDFS output folder.
- Experienced in integrating Hive and HBase for better performance of the MapReduce algorithm.
- Involved in clustering Hadoop across a network of 20 nodes, making one node the master and the rest slaves.
- Tested a sample of raw data, executed performance scripts, and turned the work over to production.
- Collected the past 2 years' data from an RDBMS (Oracle) and pushed it into Hadoop using Sqoop.
- T-logs are pulled from the log server and stored on the FTP server hourly; this data is pushed into Hadoop and then deleted from the FTP server.
- Pre-processed the data using MapReduce and stored it in the Hive warehouse.
- Conducted KT sessions on the Hadoop framework and the MapReduce algorithm.

Environment: Core Java 1.6, Hadoop 1.0.4, Hive, HBase, HDFS, MapReduce programming, Flume, Sqoop.
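
As context for the log-analysis work above, here is a minimal MapReduce sketch in the style described: it counts exception types in raw log lines. The log format, token rule, class names, and paths are assumptions for illustration, not the project's actual code.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class ExceptionCount {

        public static class LogMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text exceptionType = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                // Emit (exceptionType, 1) for each log line naming an exception.
                for (String token : value.toString().split("\\s+")) {
                    if (token.contains("Exception")) {
                        exceptionType.set(token);
                        context.write(exceptionType, ONE);
                        break;
                    }
                }
            }
        }

        public static class SumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values,
                                  Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = new Job(new Configuration(), "pos-exception-count");
            job.setJarByClass(ExceptionCount.class);
            job.setMapperClass(LogMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }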

Client : Sprint Nextel Corporation, Overland Park, KS    Oct 2011 to Oct 2012
Project : RFISS
Role : Technology Lead

RFISS, "Radio Frequency Interference Surveillance System" is the firm wide

standard for reporting CDMA 1X, EVDO, LTE data. The reporting system data

allows RF Engineers to understand network cell sites behavior and enrich

network performance. The RFISS reporting system gets the logs from

Mediation.

Responsibilities:

- Launched and set up the Hadoop/HBase cluster, including configuring the different components of the Hadoop and HBase cluster using the CDH distribution on Linux.
- Experienced in loading data from the UNIX local file system into HDFS.
- Experienced in loading and transforming large sets of structured, semi-structured, and unstructured data.
- Wrote scripts to deploy monitors, checks, and automation of critical sysadmin functions.
- Installed and configured Flume, Hive, Pig, Sqoop, and Oozie on the Hadoop cluster.
- Managed and scheduled jobs on the Hadoop cluster.
- Migrated the existing RFISS system to Hadoop by extracting files from the RDBMS through Sqoop, placing them in HDFS, and processing them.
- Performance-tuned and troubleshot MapReduce jobs by analyzing and reviewing Hadoop log files.
- Developed MapReduce programs in Java for parsing the raw data and populating staging tables.
- Created Hive queries to compare the raw data with EDW reference tables and perform aggregates.
- Experienced in defining job flows.
- Installed and configured Hive and wrote Hive UDFs (a sketch follows this project).
- Involved in creating Hive tables, loading them with data, and writing Hive queries.
- Developed Hive queries for the analysts.
- Provided cluster coordination services through ZooKeeper.
- Collected log data from web servers and integrated it into HDFS using Flume.
- Served as part of the due-diligence team.
- Acted as Scrum Master.

Environment: Core Java 1.6, Hadoop 0.20.2, Hive, HBase, HDFS, Flume, Sqoop, MapReduce programming.
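
A minimal sketch of a Hive UDF of the kind mentioned above, using the classic org.apache.hadoop.hive.ql.exec.UDF style of that Hive generation; the class name and normalization logic are illustrative assumptions, not the project's actual function.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    public final class NormalizeCellId extends UDF {
        // Hive resolves evaluate() by reflection for old-style UDFs.
        public Text evaluate(Text raw) {
            if (raw == null) {
                return null;
            }
            // Trim and upper-case the identifier so raw log values can be
            // joined consistently against EDW reference tables.
            return new Text(raw.toString().trim().toUpperCase());
        }
    }

Such a function would be registered per session with ADD JAR and then CREATE TEMPORARY FUNCTION normalize_cell AS 'NormalizeCellId'; the function name here is hypothetical.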

Client : DataCash, UK (Offshore, India)    April 2011 to July 2011
Project : Translation System
Role : Java Team Lead

DataCash, in addition to being acquired by MasterCard, has over the last few years acquired many other payment-processing companies. This has resulted in a situation where DataCash now has multiple active processing gateways, with merchants actively processing via all of them. This fragmented architecture does not allow MasterCard and DataCash to fully leverage the assets and merchants inherited through the acquisitions. Cross-selling of services to merchants is limited by the fact that not all merchants are integrated with and processing through the primary processing platform, i.e., the DataCash Payment Gateway. Migrating merchants to the central platform is not easy, as the interface and messaging of each platform are different. As a result, the investment required from a merchant to reintegrate with the DataCash Payment Gateway is very high, and we have seen resistance. The long-term objective of this project is to migrate all processing centrally to the DataCash Payment Gateway with minimal impact or effort required from the merchant. The extent of the impact to the merchant should be a configuration change of a URL and possibly IP addresses.

Responsibilities:

- Actively participated in design and code reviews.
- Involved in developing the Spring MVC controller, service, and DAO layers.
- Involved in preparing UI, JUnit, and integration test cases (a sketch follows this project).
- Responsible for writing build scripts.
- Responsible for writing shell scripts where necessary.
- Handled VM creation and environment setup for team members.
- Responsible for monitoring coding standards using Sonar.
- Responsible for implementing continuous integration using Jenkins.
- Acted as Scrum Master.

Environment: Java 1.6, JSP, Servlets, Spring MVC, XML, Mockito, PowerMock, MySQL, Tomcat, Debian, Windows 7.
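
A minimal sketch of the JUnit/Mockito testing style listed above; PaymentService and GatewayClient are hypothetical stand-ins for the project's real classes.

    import static org.junit.Assert.assertEquals;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.verify;
    import static org.mockito.Mockito.when;

    import org.junit.Test;

    public class PaymentServiceTest {

        // Hypothetical collaborator that talks to a legacy gateway.
        interface GatewayClient {
            String submit(String merchantId, String payload);
        }

        // Hypothetical service under test: routes a merchant request to
        // the central gateway and returns its response code.
        static class PaymentService {
            private final GatewayClient client;
            PaymentService(GatewayClient client) { this.client = client; }
            String process(String merchantId, String payload) {
                return client.submit(merchantId, payload);
            }
        }

        @Test
        public void routesRequestToGatewayAndReturnsResponse() {
            GatewayClient client = mock(GatewayClient.class);
            when(client.submit("M-1001", "<txn/>")).thenReturn("ACCEPTED");

            PaymentService service = new PaymentService(client);
            assertEquals("ACCEPTED", service.process("M-1001", "<txn/>"));
            verify(client).submit("M-1001", "<txn/>");
        }
    }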

Client : CITI Group, TX (Offshore, India)    April 2010 to April 2011
Project : Relationship Pricing
Role : Sr. Java Developer

The relationship pricing model is the user interface where relationship regional managers can calculate prospective net income and respective returns over the full contractual life. Term Loans and Lines of Credit are the core products of the relationship pricing model; all other products are viewed as cross-sell contributions to these core products. For Term Loan and Line of Credit products, the relationship pricing model calculates prospective net income (spread + fees) and respective returns for the full contractual life. For all other products, prospective revenue is limited to 12 months. For the pricing of existing (historical) products, the model uses the prior 12 months of net income to project the next 12 months.
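
To make the projection rule above concrete, here is a simplified sketch under the stated rule (net income = spread + fees, full contractual life for core products, capped at 12 months otherwise); the method name, inputs, and numbers are illustrative assumptions, not the model's actual implementation.

    public class PricingProjection {

        /** Projects net income over the horizon allowed for the product. */
        static double projectNetIncome(double monthlySpread, double monthlyFees,
                                       int contractualMonths, boolean coreProduct) {
            // Core products (Term Loan, Lines of Credit) use the full
            // contractual life; all other products are limited to 12 months.
            int horizon = coreProduct ? contractualMonths
                                      : Math.min(contractualMonths, 12);
            return (monthlySpread + monthlyFees) * horizon;
        }

        public static void main(String[] args) {
            // A 60-month term loan vs. a cross-sell product on the same terms.
            System.out.println(projectNetIncome(1200.0, 150.0, 60, true));   // 81000.0
            System.out.println(projectNetIncome(1200.0, 150.0, 60, false));  // 16200.0
        }
    }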

Responsibilities:

- Led the offshore team and served as SPOC.
- Involved in development of the code.
- Responsible for developing code for various user stories and tasks.
- Provided solutions/fixes to issues identified as technical bugs.
- Performed unit and integration testing.
- Wrote unit test cases for all sprints.
- Provided technical support and technical quality control throughout all sprints of the project.
- Acted as Scrum Master.

Environment: JSF 1.2, RichFaces, Java 1.6, JSP, HTML, JavaScript, Ajax, XML, RAD 7, WebSphere Application Server 7.0.

Client : CITI Group, TX (Offshore, India)    March 2008 to March 2010
Project : Product Management Repository
Role : Sr. Java Developer

The Product Management Repository (PMR) is a Master Data Management system for managing the Product and Product Bundle life cycle. PMR contains two subsystems: an Authoring Tool and a Central Catalog. The Authoring Tool provides a user interface (UI) to maintain Product and Product Bundle data, along with automated workflows for the data to be reviewed by various stakeholders. Approved Products and Product Bundles are published to the Central Catalog, and client applications access Product and Product Bundle information from the Central Catalog either in real time or in batch.

Responsibilities:

- Led the offshore team and served as SPOC.
- Involved in development of the code.
- Responsible for developing code for various user stories and tasks.
- Provided solutions/fixes to issues identified as technical bugs during development phases.
- Wrote and validated unit test cases for all sprints.
- Performed unit testing.
- Prepared project documentation.
- Provided technical support and technical quality control throughout all sprints of the project.

Environment: ExtJS, AJAX, Spring MVC, Spring Security, XML, RAD, WebSphere Application Server 7.0, Sonar.
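
A minimal sketch of a Spring MVC endpoint of the kind the Authoring Tool's ExtJS UI could call over AJAX; Product, ProductService, and the URL mapping are hypothetical stand-ins for the project's real catalog types.

    import java.util.List;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Controller;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.ResponseBody;

    @Controller
    public class ProductController {

        // Hypothetical domain type and service layer, stubbed for illustration.
        public static class Product {
            public String id;
            public String name;
        }

        public interface ProductService {
            List<Product> findByBundle(String bundleId);
        }

        @Autowired
        private ProductService productService;

        // Returns the products in a bundle; Spring's message converters
        // serialize the list to JSON for the ExtJS grid.
        @RequestMapping("/bundles/{bundleId}/products")
        @ResponseBody
        public List<Product> productsInBundle(@PathVariable String bundleId) {
            return productService.findByBundle(bundleId);
        }
    }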

Other Projects Executed

Client : GE Money Home Lending, UK (Offshore, India)    Jan 2007 to Feb 2008
Project : Home Lending
Role : Java Developer

Client : GE Money Bank, Austria (Offshore, India)    Aug 2005 to Dec 2006
Project : Lending
Role : Java Developer

Client : GE Operations (Offshore, India)    Jan 2004 to July 2005
Project : GDC Tools Automation
Role : Java Developer

Client : Eagle Global Logistics, USA (Offshore, India)    Jan 2003 to Dec 2003
Project : 4S eSupply (Supply Chain Management)
Role : Java Developer

Client : Apollo Health Street, India    Jan 2002 to Dec 2002
Project : Demand IR Gen
Role : Java Developer
