
Project Developer

Location:
Gandhinagar, GJ, India
Posted:
January 27, 2015

Contact this candidate

Resume:


Venkata Krishna Reddy Sanepalli

Email: acn1d6@r.postjobfree.com

Phone: 210-***-****

Education Qualifications:

. Completed Bachelor of Engineering from Osmania University in 2005 with first class.

Experience Summary:

. Cloudera Certified Developer for Apache Hadoop (CCDH 410)

. Sun Certified Java Programmer (SCJP 6.0)

. IBM Certified Web Services Developer

. 10 years' experience in application development using Java and J2EE

. Hands-on experience with HDFS, Pig, Hive, Sqoop, and MapReduce

. Exposure to HBase, Oozie, MongoDB, ORM tools, Spark, YARN, Scala, Flume, Storm, and Kafka

. Hands-on development experience with Bootstrap, Hibernate, and Spring

. Hands-on experience building web services with JAX-RPC and JAX-WS

. Experience working with XML, XML binding, and XML parsing

. Hands-on experience developing Eclipse plug-ins

. Creating high-level and low-level project designs using UML

. Conducting design reviews and code reviews

. Leading and mentoring teams; developing, debugging, and optimizing performance during project implementation

. Testing code with JUnit and SoapUI

. Strong analytical and problem-solving skills

. Good understanding of object-oriented principles

Technical Skills:

Below is a list of the important hardware, software products, tools, and methods that I have worked with.

Line of Business: Insurance, Telecom, Banking

Operating Systems: Windows, UNIX, mainframes

Technologies: Java, J2EE, Hadoop, MapReduce, Pig, Hive, Sqoop, HBase, MongoDB, Oozie, Flume, JSP, Servlets, Perl, Shell, XML, SOAP, JDBC, UML, Web Services, Hibernate, SQL, PL/SQL, Spring, Spring MVC, Selenium, jQuery, JavaScript

Databases/Database Tools: DB2, IMS-DB, Oracle 8i & 9i, NoSQL

Special Software/Tools: Jazz Team Server, Rational Team Concert, Rational Software Architect (RSA), StarTeam, Billing Broker, JUnit, Eclipse, iCharge, TOAD, PS Framework, VersionOne, Mingle, SmartBear, Confluence, Jenkins, SonarQube, Maven, SVN, PigUnit, MRUnit

Web/Application Servers: WebSphere Application Server, Tomcat, WebLogic, JBoss

Assignments:

The details of the various assignments that I have handled are listed here.

Project: Web Intelligence Re-hosting                         Dec 2013 - Present

Client: Bank of America, Charlotte, NC, USA

Role: Hadoop Developer

Environment: MapReduce, Hadoop, Apache Pig, Java, PHP, Sqoop, Apache Oozie, MySQL, Spring MVC, JCharts

Hardware: Virtual Machines, UNIX

Description:

Analyse prices of items on e-commerce websites such as Amazon, Walmart, and Target using web crawling. Product and model data is pulled from the e-commerce websites to the local file system and then shipped to HDFS for high-level processing using Pig. Invalid data is dumped to the local file system and purged through cron jobs. The system finds the best price offered for a given product and model across all e-commerce websites for the given feed. Valid and processed data is exported to a MySQL database.

Responsibilities:

. Set up the Hadoop cluster on 50 VMs

. Configured passwordless SSH across the Hadoop cluster

. Set up Pig

. Copied web-crawled data into the HDFS location

. Wrote Pig scripts that take log files as input, parse the logs, and structure them in tabular format to facilitate effective querying of the log data

. Debugged and performance-tuned the code

. Exported data to MySQL using Sqoop
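The best-price step in this project (finding the lowest offer for a given product and model across sites) can be sketched in plain Java. This is a minimal sketch, not the project's actual code; the record layout (product, model, price) is a hypothetical simplification of the crawled data:

```java
import java.util.*;

// Minimal sketch of the best-price aggregation described above.
// The {product, model, price} record layout is hypothetical.
public class BestPrice {
    // Returns the lowest price seen for each "product|model" key.
    static Map<String, Double> bestPrices(List<String[]> offers) {
        Map<String, Double> best = new HashMap<>();
        for (String[] o : offers) {            // o = {product, model, price}
            String key = o[0] + "|" + o[1];
            double price = Double.parseDouble(o[2]);
            best.merge(key, price, Math::min); // keep the minimum price seen
        }
        return best;
    }

    public static void main(String[] args) {
        List<String[]> offers = Arrays.asList(
            new String[]{"tv", "X100", "499.99"},
            new String[]{"tv", "X100", "479.00"},
            new String[]{"phone", "P5", "199.00"});
        System.out.println(bestPrices(offers)); // lowest offer per product|model
    }
}
```

In the real pipeline this grouping-and-minimum would run inside Pig over HDFS data rather than in a single JVM, but the per-key reduction is the same idea.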

Project: Device Fault Prediction                             Nov 2012 - Nov 2013

Client: CISCO, San Jose, CA, USA

Role: Hadoop Developer

Environment: JDK 1.6, HDFS, MapReduce, Hive, Mahout, RTC, SmartBear

Hardware: Virtual Machines, CentOS

Description:

Cisco's support team deals with huge volumes of issues on a day-to-day basis related to their network products, such as routers and switches. The support teams have been operating on a reactive model, i.e. responding to customer tickets and queries as they are raised. To improve customer satisfaction, Cisco would like the system to predict network faults from the logs generated by the various network devices, by loading the logs into a Hadoop cluster and analysing them with machine learning algorithms, either implemented in Apache Mahout or custom built.

Responsibilities:

. Set up cron jobs to delete Hadoop logs, old local job files, and cluster temp files

. Set up Hive with MySQL as a remote metastore

. Moved all log files generated by the various network devices into the HDFS location

. Wrote MapReduce code that takes log files as input, parses the logs, and structures them in tabular format to facilitate effective querying of the log data

. Unit testing

. Debugging and performance optimization

. Created external Hive tables on top of the parsed data
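The log-parsing step above, stripped of its Hadoop Mapper wrapper, can be sketched as plain Java. The syslog-like input format used here is an assumption for illustration; the real device log formats are not given in this resume:

```java
// Sketch of the log-parsing step described above, as a plain method.
// In the real job this logic would live inside a Hadoop Mapper and emit
// key/value pairs; the "timestamp host severity message" input format
// is an assumption, not the actual device log layout.
public class LogParser {
    // Parses one log line into a tab-separated row suitable for an
    // external Hive table; returns null for malformed lines.
    static String toRow(String line) {
        String[] parts = line.trim().split("\\s+", 4);
        if (parts.length < 4) return null;   // skip malformed records
        return String.join("\t", parts);
    }

    public static void main(String[] args) {
        // Example syslog-like line from a hypothetical router
        System.out.println(toRow(
            "2013-01-15T10:02:11 router-7 ERROR link down on Gi0/1"));
    }
}
```

An external Hive table created over the parsed output (tab-delimited columns for timestamp, host, severity, and message) then makes the logs queryable with plain SQL.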

Project: USAA Auto Event                                     Nov 2009 - Nov 2012

Client: USAA, San Antonio, TX, USA

Role: Java developer, Java Lead

Environment: Java, J2EE, mainframes, DB2 on mainframes, SQL Server, XML, Web Services, EJB, Perl, shell, REXX, Eclipse plug-in development, RTC plug-in development, Hibernate, WebSphere Application Server, Ant, RAD 7.0, Client Lookup Table (CTL), StarTeam, PuTTY, Presentation Services Framework, RTC, Jazz Team Server, RTC SCM

Hardware: Virtual Machines, Windows

Description:

The USAA Auto Event project provides an end-to-end solution so that the complete automobile life-cycle needs, such as vehicle research, decision making, vehicle purchase, vehicle insurance, vehicle financing, vehicle maintenance, and vehicle selling, can be met through USAA itself, so that the customer need not depend on external sources for any of these needs. Currently USAA supports only vehicle insurance and vehicle financing; this project aims to add capabilities such as vehicle research, decision making, vehicle purchase, interaction with automobile-related social networking sites, and a quick quote that displays average member pricing. The project achieves these functionalities by integrating the existing USAA policy administration system with several in-house applications, such as the USAA banking application, and external applications such as ZAG, HANK, KBB, and Bazaarvoice.

Responsibilities:

. Coordinated with the client to understand requirements

. Created UML diagrams for the high-level and low-level design of the project

. Trained and mentored subordinates

. Developed web services and Eclipse plug-in code

. Reviewed code

. Wrote JUnit test cases

. Debugged and performance-tuned the code

. Ensured smooth project releases by coordinating with the testing, release, and business teams

Project: Prepaid Billing for Data Services                   Jun 2007 - Oct 2009

Client: Tata Tele Services, Hyderabad, India

Role: Java Developer

Environment: Java, J2EE, XML, Oracle 10, Unix shell and Perl scripting, SOAP, Oracle PL/SQL

Hardware: Windows, UNIX

Description:

The solution was developed for charging prepaid customers for data services. Because the content platforms were unable to communicate with the IN, the solution provides an interface between the data platforms and the IN. It also covers barring the data services of prepaid customers when the customer cannot be debited, and restoring the services after the outstanding amount is recovered. The work included incorporating change requests and writing scripts for various user requirements, such as alert notifications (mails and alarms).

Responsibilities:

. Understood the RTEC protocol and developed an application that debits the customer on the IN using the RTEC protocol and handles all exceptions

. Understood the CDR formats of the different data services and converted them into a common format

. Verified and validated all CDRs

. Generated usage and charging reports

. Deactivated the data services of subscribers on the HLR

. Maintained the usage history of un-billable CDRs and recovered the amount when the customer recharged
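The CDR normalisation described above (converting the CDR formats of different data services into one common format) might look roughly like this Java sketch. The service names, field positions, and the common output format are all hypothetical; real CDR layouts are operator- and service-specific:

```java
// Sketch of normalising CDRs from different data services into a common
// format, as described above. Service names ("WAP", "GPRS"), field
// positions, and the "msisdn|service|bytes" output format are hypothetical.
public class CdrNormaliser {
    // Converts one raw comma-separated CDR into the common format.
    static String normalise(String service, String rawCdr) {
        String[] f = rawCdr.split(",");
        switch (service) {
            case "WAP":                       // assumed: msisdn,sessionId,bytes
                return f[0] + "|WAP|" + f[2];
            case "GPRS":                      // assumed: seq,msisdn,cellId,bytes
                return f[1] + "|GPRS|" + f[3];
            default:
                throw new IllegalArgumentException("unknown service " + service);
        }
    }

    public static void main(String[] args) {
        System.out.println(normalise("WAP", "9849012345,S77,20480"));
        System.out.println(normalise("GPRS", "001,9849067890,C12,10240"));
    }
}
```

Once every service's CDRs are in one format, the downstream steps (validation, charging via the IN, and report generation) can share a single code path.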

Project: Household Card Management                           Apr 2005 - May 2007

Client: Civil Supplies Department, Andhra Pradesh govt, India

Role: Java Developer

Environment: Java, XML, JDBC, JSP, Servlets, HTML, Tomcat 5.0, Oracle 10

Hardware: Windows, UNIX

Description:

This project receives information from household card holders, such as name, father's name, address, caste, religion, land in acres, annual income, a photo, and iris scans of the entire family. According to the Civil Supplies Commissioner's instructions, BPL card holders are divided into categories depending on their economic, physical, and marital status. The project uses 54 database tables in total. Card holders are primarily distinguished by identity using iris-recognition technology. Each district is divided into mandals, each mandal is divided into clusters, and each cluster is assigned fair price shops. Fair price shops are unique within each mandal and are numbered from 1 to 999.

Responsibilities:

Scripted, in servlets on the DPL server, the constraints given by the CCS, for example: if a card holder owns a four-wheeler or a telephone, or has an income exceeding 24,000, he is eligible only for a pink card. Also designed the declaration form in the DPL client and wrote the JDBC code to store all the information in the database tables.
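The CCS constraint above (owning a four-wheeler or a telephone, or income exceeding 24,000, restricts the holder to a pink card) can be sketched as a single eligibility check. The method name and the non-pink card type ("WHITE") are illustrative assumptions, not taken from the project:

```java
// Sketch of the CCS eligibility rule described above. The "WHITE" card
// type for remaining (BPL) holders is an assumption for illustration;
// only the pink-card condition comes from the project description.
public class CardEligibility {
    static String cardType(boolean ownsFourWheeler, boolean ownsTelephone,
                           int annualIncome) {
        if (ownsFourWheeler || ownsTelephone || annualIncome > 24000) {
            return "PINK";   // above-poverty-line card only
        }
        return "WHITE";      // assumed card type for everyone else
    }

    public static void main(String[] args) {
        System.out.println(cardType(false, true, 12000));  // telephone owner
        System.out.println(cardType(false, false, 20000)); // no restriction hit
    }
}
```

In the project this check ran server-side in a servlet against the applicant's declaration data before a card category was assigned.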


