
Data Developer

Location:
Ambavaram, AP, 523112, India
Posted:
March 23, 2015


Resume:

Lekhaj

Email: ******.******@*****.***

Phone: 858-***-****

Professional Summary

* Over 7 years of experience in IT, spanning analysis, design, development, testing, and user training of Big Data applications on Hadoop; design and development of web applications using Java and J2EE; and database and data warehousing development using MySQL, Oracle, and Informatica

* Hands-on experience installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, ZooKeeper, Hive, Sqoop, Pig, Flume, and Cassandra on the Cloudera and Hortonworks distributions

* Experience in building and maintaining multiple Hadoop clusters of different sizes and configurations, and in setting up rack topology for large clusters

* Good understanding of Hadoop architecture and hands-on experience with components such as JobTracker, TaskTracker, NameNode, and DataNode, as well as MapReduce concepts and the HDFS framework

* Experience in developing and implementing MapReduce programs in Hadoop to work with Big Data

* Experience in using Sqoop to import data into HDFS from RDBMS and vice versa

* Experience in using different Hive SerDes, such as the Regex SerDe and the HBase SerDe

* Developed Hive queries for analysts

* Provided cluster coordination services through ZooKeeper

* Experience in analyzing data using Hive, Pig Latin, and custom MapReduce programs in Java

* Experience in scheduling and monitoring jobs using Oozie and ZooKeeper

* Hands-on experience writing custom UDFs to extend Hive and Pig core functionality (a minimal sketch follows this summary)

* Experience in extracting data from log files and copying it into HDFS using Flume

* Developed Hadoop test classes for checking input and output

* Experience in integrating Hive and HBase for effective operations

* Developed Pig UDFs to pre-process data for analysis

* Experience in Impala, Solr, MongoDB, HBase, and Cassandra

* Experience in database design, entity relationships, database analysis, and programming SQL, PL/SQL stored procedures, packages, and triggers in Oracle and SQL Server on Windows and Linux

* Strong understanding of data warehouse concepts and ETL, with data modeling experience in normalization, business process analysis, reengineering, dimensional data modeling, and physical and logical data modeling

* Experience in Java, J2EE, web services, SOAP, HTML, and XML-related technologies, with strong analytical and problem-solving skills and the ability to follow projects through from inception to completion

* Good interpersonal and communication skills, strong problem-solving skills, quick to explore and adopt new technologies, and a good team member
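For illustration, here is a minimal sketch of a custom Hive UDF of the kind described above, using the classic org.apache.hadoop.hive.ql.exec.UDF API; the package, class name, and masking logic are hypothetical and not drawn from any project listed here.

```java
package com.example.hive.udf; // hypothetical package

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

/**
 * Minimal illustrative Hive UDF: masks all but the last four
 * characters of a string column. After building a jar, it would be
 * registered in Hive with:
 *   ADD JAR mask-udf.jar;
 *   CREATE TEMPORARY FUNCTION mask_tail AS 'com.example.hive.udf.MaskTail';
 */
public final class MaskTail extends UDF {
    public Text evaluate(final Text input) {
        if (input == null) {
            return null; // pass NULLs through unchanged
        }
        String s = input.toString();
        if (s.length() <= 4) {
            return new Text(s); // too short to mask
        }
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < s.length() - 4; i++) {
            masked.append('*');
        }
        masked.append(s.substring(s.length() - 4));
        return new Text(masked.toString());
    }
}
```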

Hadoop/Big Data Technologies

Hadoop (Hortonworks, Cloudera): HDFS, MapReduce, Pig, HBase, ZooKeeper, Hive, Oozie, Sqoop, Flume and Solr

Programming Languages

Java (JDK 1.4/1.5/1.6), C/C++, HTML, SQL, PL/SQL, AVS & JVS

Frameworks

Hibernate 2.x/3.x, Spring 2.x/3.x, Struts 1.x/2.x

Web Services

WSDL, SOAP, Apache CXF/XFire, Apache Axis, REST, Jersey

Client Technologies

jQuery, JavaScript, AJAX, CSS, HTML5, XHTML

Operating Systems

UNIX, Linux, Windows

Application Server

IBM WebSphere, Apache Tomcat, WebLogic

Web Technologies

JSP, Servlets, JNDI, JDBC, JavaBeans, JavaScript

Databases

Oracle 8i/9i/10g & MySQL 4.x/5.x

Java IDE

Eclipse 3.x, IBM WebSphere Application Developer, IBM RAD 7.0

Development Tools

TOAD, SQL Developer, SOAP UI, ANT, Maven, Visio, Rational Rose, Endur 8.x/10.x/11.x, Informatica 9.1

USAA, San Antonio, TX Aug 2014 - Present

Hadoop Developer

Responsibilities:

* Gathered Business requirements from the Business partners and Subject Matter Experts

* Developed simple and complex MapReduce programs in Java for data analysis on different data formats

* Performed Hadoop administrative tasks as needed

* Developed workflows using Oozie to automate the loading of data into HDFS and its pre-processing with Pig

* Worked on partitioning Hive tables and running scripts in parallel to reduce their run time

* Optimized MapReduce jobs to use HDFS efficiently through various compression mechanisms

* Analyzed data by performing Hive queries and running Pig scripts

* Implemented business logic by writing Pig UDFs in Java and used various UDFs from PiggyBank and other sources (see the sketch at the end of this section)

* Worked with the application team to install operating system and Hadoop updates, patches, and version upgrades as required

* Exported the analyzed data to a relational database using Sqoop for visualization and to generate reports for the BI team

* Experience in the ETL process, covering data transformation, conversion, loading, and analysis, as well as design and development of mappings

* Supported setting up the QA environment and updating configurations for implementation scripts with Pig and Sqoop

* Implemented testing scripts to support test-driven development and continuous integration

Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Java, SQL, Sqoop, Flume, Oozie, Maven, Eclipse, Informatica
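As referenced in the Pig UDF bullet above, the following is a minimal sketch of a Pig EvalFunc in Java; the package, class name, and the normalization it performs are illustrative assumptions, not code from this project.

```java
package com.example.pig.udf; // hypothetical package

import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

/**
 * Illustrative Pig UDF: trims and upper-cases a chararray field.
 * A Pig script would register and call it with:
 *   REGISTER normalize-udf.jar;
 *   DEFINE NORMALIZE com.example.pig.udf.Normalize();
 *   cleaned = FOREACH raw GENERATE NORMALIZE(name);
 */
public class Normalize extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null; // propagate missing values
        }
        return input.get(0).toString().trim().toUpperCase();
    }
}
```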

Merck & Co, Boston, MA May 2013 - July 2014

Big Data/ Hadoop Developer

Responsibilities:

* Developed MapReduce programs in Java for data analysis

* Handled importing of data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS

* Experienced working with real-time analytical operations using HBase

* Experience in implementing CRUD operations on HBase data using the HBase client Java API and the REST API (see the sketch at the end of this section)

* Experience in integrating HBase with MapReduce to work on bulk data

* Monitored and managed the Hadoop cluster through Cloudera Manager

* Experience in setting up standards and processes for Hadoop-based application design and implementation

* Developed Hive queries for data analysis to meet the business requirements

* Used Pig as an ETL tool to perform transformations, joins, and pre-aggregations before storing data in HDFS

* Experience in creating Hive tables and working on them using HiveQL

* Experience in extending Hive and Pig core functionality using UDFs

* Experienced with performance tuning of Hive operations

* Experienced in designing, developing, and implementing connectivity products that allow efficient exchange of data between our core database engine and the Hadoop ecosystem

* Experience in importing and exporting data using Sqoop between HDFS and relational database systems/mainframes

* Experience in using Informatica PowerCenter for ETL planning and delivery of distinct reporting requirements

Environment: Java, Eclipse, Linux, Apache Hadoop 1.0.3, MapReduce, HBase, Sqoop, Pig, Hive, Flume, Oracle 10g, Informatica
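As referenced in the HBase CRUD bullet above, here is a minimal sketch using the classic HBase client Java API of the Hadoop 1.x era; the table name, column family, row key, and values are hypothetical.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Delete;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

/** Illustrative create/read/delete calls against a hypothetical "customers" table. */
public class HBaseCrudSketch {
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "customers"); // table name is an assumption
        try {
            // Create/update: write one cell under column family "info"
            Put put = new Put(Bytes.toBytes("row1"));
            put.add(Bytes.toBytes("info"), Bytes.toBytes("name"), Bytes.toBytes("Alice"));
            table.put(put);

            // Read: fetch the row back and extract the cell value
            Result result = table.get(new Get(Bytes.toBytes("row1")));
            byte[] name = result.getValue(Bytes.toBytes("info"), Bytes.toBytes("name"));
            System.out.println("name = " + Bytes.toString(name));

            // Delete: remove the whole row
            table.delete(new Delete(Bytes.toBytes("row1")));
        } finally {
            table.close();
        }
    }
}
```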

MetLife, NY Feb 2012 - Apr 2013

Hadoop Developer

Responsibilities:

* Gathered Business requirements from the Business partners and Subject Matter Experts

* Analyzed and developed simple to complex MapReduce jobs using Hive and Pig

* Streamlined MapReduce jobs to utilize HDFS effectively by applying various compression mechanisms (see the sketch at the end of this section)

* Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop

* Analyzed the data by performing Hive queries and running Pig scripts to study customer behavior

* Used UDFs to implement business logic in Hadoop

* Implemented business logic by writing UDFs in Java and used various UDFs from PiggyBank and other sources

* Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team

* Analyzed large data sets to determine the optimal approach to aggregating and analyzing them

* Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop

* Worked on data loading strategies using Sqoop and Hive, and on storage and compression formats such as Avro

* Developed Java APIs to be invoked from Pig scripts to solve complex problems

* Developed scripts and batch jobs to schedule various Hadoop programs using Oozie

Environment: Linux, Java, Hadoop, MapReduce, Pig Latin, Hive, HBase, ZooKeeper, Sqoop, Oozie, HDFS, Cloudera, XML, MySQL, Eclipse, Oracle, PL/SQL
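As referenced in the compression bullet above, here is a minimal sketch of a MapReduce job with map-output and job-output compression switched on. The word-count logic is a generic stand-in, and the property name follows the older MR1 naming convention; both are assumptions rather than project code.

```java
import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.compress.GzipCodec;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

/** Word-count-style job illustrating where compression is enabled. */
public class CompressedWordCount {

    public static class TokenMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            StringTokenizer it = new StringTokenizer(value.toString());
            while (it.hasMoreTokens()) {
                word.set(it.nextToken());
                ctx.write(word, ONE);
            }
        }
    }

    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            ctx.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Compress intermediate map output to cut shuffle I/O (MR1 property name)
        conf.setBoolean("mapred.compress.map.output", true);

        Job job = new Job(conf, "compressed-wordcount");
        job.setJarByClass(CompressedWordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Compress the final output files with gzip
        FileOutputFormat.setCompressOutput(job, true);
        FileOutputFormat.setOutputCompressorClass(job, GzipCodec.class);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```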

CAN Insurance, Chicago, IL June 2010 - Dec 2011

Java/ J2EE Developer

Responsibilities:

* Gathered requirements and prepared the requirement documents

* Produced high-level and low-level designs as per the requirements and solution

* Designed various class diagrams and sequence diagrams using Rational Rose Enterprise Edition

* Developed the application using JSP, JSF, Servlets, Struts, Spring, JavaBeans, and Hibernate

* Involved in Java/J2EE-based application and portal projects

* Enhanced the existing application, covering bug fixes, new feature requests, and refactoring, using PHP, AJAX, JavaScript, MySQL, CSS, and DHTML

* Designed and developed the servlets, JSPs, and Java classes for the presentation layer

* Implemented software changes and enhancements and used JUnit testing for all enhancements

* Used AngularJS with the MVC pattern to make both development and testing easier

* Experience in installing the Tomcat application server

* Developed the web tier using the Struts framework

* Created custom JSP tags for maximum reusability of user interface components (see the sketch at the end of this section)

* Tested and deployed the application on Tomcat

* Maintained the database required for report generation

Environment: Java, J2EE, Hibernate, Tomcat, JUnit, JSF, JSP, Servlets, batch processing, UNIX, Struts, JavaScript, UML, AngularJS, asynchronous concepts
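As referenced in the custom JSP tag bullet above, here is a minimal sketch of a tag handler using the standard javax.servlet.jsp.tagext API; the tag name, attribute, and markup it emits are hypothetical.

```java
package com.example.web.tags; // hypothetical package

import java.io.IOException;
import javax.servlet.jsp.JspException;
import javax.servlet.jsp.tagext.TagSupport;

/**
 * Illustrative custom JSP tag that prints a formatted page heading.
 * Declared in a TLD and used in a JSP as:
 *   <ui:heading text="Policy Summary"/>
 */
public class HeadingTag extends TagSupport {
    private String text;

    public void setText(String text) { // maps the "text" attribute
        this.text = text;
    }

    @Override
    public int doStartTag() throws JspException {
        try {
            pageContext.getOut().print("<h2 class=\"page-heading\">" + text + "</h2>");
        } catch (IOException e) {
            throw new JspException("HeadingTag failed to write output", e);
        }
        return SKIP_BODY; // tag has no body content
    }
}
```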

ECIL, Hyderabad, India Jan 2008 - Apr 2010

Java Developer

Responsibilities:

* Worked as a software developer for ECIL on a supply chain management system

* The application involved tracking invoices, raw materials, and finished products

* Gathered user requirements and specifications

* Developed the entire application in the Eclipse IDE

* Developed and programmed the Java classes required to support the user account module

* Used HTML, JSP, and JavaScript to design the front-end user interface

* Implemented error checking and validation on the Java Server Pages using JavaScript

* Developed servlets to handle requests, perform server-side validation, and generate results for the user (see the sketch at the end of this section)

* Used the JDBC interface to connect to the database

* Used SQL to access data in a Microsoft SQL Server database

* Performed user acceptance testing

* Deployed and tested the web application on the WebLogic application server

Environment: JDK 1.4, Servlet 2.3, JSP 1.2, JavaScript, HTML, JDBC 2.1, SQL, Microsoft SQL Server, UNIX, and BEA WebLogic Application Server
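As referenced in the servlet bullet above, here is a minimal sketch of a servlet performing server-side validation and a JDBC lookup; the JNDI data source name, table, and column names are hypothetical.

```java
import java.io.IOException;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.sql.DataSource;

/**
 * Illustrative servlet: validates a request parameter server-side,
 * then looks up an invoice via JDBC through a pooled data source.
 */
public class InvoiceLookupServlet extends HttpServlet {

    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String invoiceId = req.getParameter("invoiceId");
        // Server-side validation: reject missing or non-numeric IDs
        if (invoiceId == null || !invoiceId.matches("\\d+")) {
            resp.sendError(HttpServletResponse.SC_BAD_REQUEST, "Invalid invoice id");
            return;
        }
        try {
            // JNDI name is an assumption for this sketch
            DataSource ds = (DataSource)
                    new InitialContext().lookup("java:comp/env/jdbc/scmDS");
            Connection con = ds.getConnection();
            try {
                PreparedStatement ps = con.prepareStatement(
                        "SELECT status FROM invoices WHERE invoice_id = ?");
                ps.setInt(1, Integer.parseInt(invoiceId));
                ResultSet rs = ps.executeQuery();
                if (rs.next()) {
                    resp.getWriter().println("Invoice status: " + rs.getString("status"));
                } else {
                    resp.getWriter().println("Invoice not found");
                }
            } finally {
                con.close(); // return the connection to the pool
            }
        } catch (NamingException e) {
            throw new ServletException("DataSource lookup failed", e);
        } catch (SQLException e) {
            throw new ServletException("Database query failed", e);
        }
    }
}
```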
