
Data Developer

Location:
United States
Posted:
February 25, 2016


Dhanpati Rana

If you have any queries, please contact my employer:

Sunny

Phone: 510-***-****

Email: *****@*************.***

Summary:

An accomplished, results-oriented professional with over eight (8+) years of experience in the IT industry, including hands-on experience with the Big Data/Hadoop ecosystem.

Proficient with Apache Hadoop ecosystem components such as MapReduce, HDFS, Hive (HiveQL), HBase, Pig, and Sqoop, and with big data analytics.

Hands-on experience writing MapReduce jobs in Java.

Experience with the Oozie scheduler, setting up workflows composed of MapReduce and Pig jobs.

Skilled in collecting log data from various sources and ingesting it into HDFS using Flume (see the Flume sketch following this summary).

Hands-on experience importing and exporting data between HDFS and relational database systems using Sqoop (see the Sqoop sketch following this summary).

Experience working with Oracle, DB2, SQL Server, and MySQL databases.

Used JavaScript for client-side validation and jQuery to reduce data transfer between client and server.

Experience implementing Java/J2EE technologies across the various layers of application development.

Skilled in data management, data extraction, manipulation, validation, and analysis of large volumes of data.

Proficient in OOP concepts such as polymorphism, inheritance, and encapsulation.

Expertise in interacting with business users, understanding their requirements, and providing solutions that match them.

Analytical, organized, and enthusiastic about working in fast-paced, team-oriented environments.

Excellent communication and interpersonal skills; flexible and adaptable to new environments; a self-motivated, positive team player who enjoys working in multicultural environments.
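
To make the Flume bullet above concrete, here is a minimal Java sketch using the Flume NG client SDK to push one log event to an agent, whose sink would then write it into HDFS. The agent host, port, and log line are hypothetical placeholders, not details of an actual deployment.

import java.nio.charset.StandardCharsets;

import org.apache.flume.Event;
import org.apache.flume.EventDeliveryException;
import org.apache.flume.api.RpcClient;
import org.apache.flume.api.RpcClientFactory;
import org.apache.flume.event.EventBuilder;

public class LogShipper {
    public static void main(String[] args) throws EventDeliveryException {
        // Connect to a Flume agent's Avro source (hypothetical host and port).
        RpcClient client = RpcClientFactory.getDefaultInstance("flume-agent.example.com", 41414);
        try {
            // Wrap one log line in a Flume event and send it; the agent's
            // HDFS sink takes it from there.
            Event event = EventBuilder.withBody("2016-02-25 12:00:01 app started",
                    StandardCharsets.UTF_8);
            client.append(event);
        } finally {
            client.close();
        }
    }
}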
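
Similarly, a sketch of the Sqoop import described above. Sqoop is normally driven from the command line, so this example simply shells out to the sqoop binary; the JDBC URL, credentials file, table, and target directory are hypothetical. An export runs the same way with "sqoop export --export-dir ..." in place of the import arguments.

import java.io.IOException;

public class SqoopImportJob {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Import a MySQL table into HDFS (hypothetical host, database, and paths).
        ProcessBuilder pb = new ProcessBuilder(
                "sqoop", "import",
                "--connect", "jdbc:mysql://db.example.com/sales",
                "--username", "etl_user",
                "--password-file", "/user/etl/.sqoop_pass",
                "--table", "orders",
                "--target-dir", "/data/raw/orders",
                "--num-mappers", "4");
        pb.inheritIO();                      // stream Sqoop's console output
        int exitCode = pb.start().waitFor(); // block until the transfer finishes
        if (exitCode != 0) {
            throw new IOException("Sqoop import failed with exit code " + exitCode);
        }
    }
}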

Technical Skills

Big Data Ecosystem: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, Flume, Oozie
Development Tools: Visual Studio, Eclipse
Languages: C++, C#, Java, Fortran
Databases: Oracle, MS SQL Server, SQL, PL/SQL
Scripting Languages: JavaScript, R
Modeling Tools: R, SAS, MATLAB
Environments: UNIX, Linux, Windows 2000, Windows XP, Windows 7

PROFESSIONAL EXPERIENCE:

Pacific Gas and Electric Company, San Francisco, CA

Hadoop Developer (Nov 2014 – Present)

Description: Pacific Gas and Electric Company, commonly known as PG&E, is an investor-owned utility that provides natural gas and electricity to most of the northern two-thirds of California, from Bakersfield almost to the Oregon border. It is the primary subsidiary of PG&E Corporation.

Responsibilities:

Gathered business requirements from business partners and subject matter experts.

Involved in installing Hadoop ecosystem components.

Responsible for managing data coming from different sources.

Supported MapReduce programs running on the cluster.

Loaded and transformed large sets of structured, semi-structured, and unstructured data.

Wrote MapReduce jobs using the Java API (see the MapReduce sketch after this list).

Installed and configured Pig and wrote Pig Latin scripts.

Involved in managing and reviewing Hadoop log files.

Used Sqoop to import data from MySQL and Oracle into HDFS on a regular basis.

Developed scripts and batch jobs to schedule various Hadoop programs.

Wrote Hive queries for data analysis to meet business requirements.

Created Hive tables and worked with them using HiveQL (see the Hive sketch after this list).

Attended weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.
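
As an illustration of the MapReduce work listed above, here is the classic word-count job against the org.apache.hadoop.mapreduce API; it is a generic sketch, not one of the actual PG&E jobs, and the input and output paths come from the command line.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            // Emit (word, 1) for every whitespace-separated token in the line.
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    ctx.write(word, ONE);
                }
            }
        }
    }

    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();  // total count per word
            ctx.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);  // pre-aggregate on the map side
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}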
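
And a sketch of creating and querying a Hive table, here over JDBC against HiveServer2; the server address, schema, and HDFS location are hypothetical, and the same statements could equally be run from the Hive CLI.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveReport {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");  // HiveServer2 JDBC driver
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://hive.example.com:10000/default", "etl_user", "");
             Statement stmt = conn.createStatement()) {
            // Define an external table over files already in HDFS
            // (hypothetical schema and path).
            stmt.execute("CREATE EXTERNAL TABLE IF NOT EXISTS meter_readings ("
                    + " meter_id STRING, reading_ts STRING, kwh DOUBLE)"
                    + " ROW FORMAT DELIMITED FIELDS TERMINATED BY ','"
                    + " LOCATION '/data/raw/meter_readings'");
            // A typical analysis query: total consumption per meter.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT meter_id, SUM(kwh) AS total_kwh"
                    + " FROM meter_readings GROUP BY meter_id")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + "\t" + rs.getDouble(2));
                }
            }
        }
    }
}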

Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, Linux, XML, MySQL

Capital One, NY

Hadoop Developer (June 2013 – Oct 2014)

Description: The retail businesses are responsible for delivering a range of banking products and services to retail customers. The division also has a dedicated merchant-analytics business designed to meet the needs of high-net-worth individuals. This solution concerns the development of a cost-effective data warehouse, built with Hadoop and Hive, for storing large amounts of historical and log data. The raw data comes from various sources and is loaded directly into the Hadoop file system through Sqoop; the data is then processed using Hive queries.

Responsibilities:

Predicted consumer behavior, such as which products a particular user has bought, and made recommendations by recognizing patterns using Hadoop, Hive, and Pig queries.

Installed and configured Hadoop, MapReduce, and HDFS.

Developed MapReduce jobs for data cleaning and preprocessing.

Imported and exported data into HDFS and Hive from an Oracle database using Sqoop.

Responsible for managing data coming from different sources.

Monitored the running MapReduce programs on the cluster.

Responsible for loading data from UNIX file systems into HDFS.

Installed and configured Hive and wrote Hive UDFs (see the UDF sketch after this list).

Involved in creating Hive tables, loading data, and writing Hive queries, which run as MapReduce jobs behind the scenes.

Wrote Pig scripts to process unstructured data and produce structured data for use with Hive (see the Pig sketch after this list).

Provided cluster coordination services through ZooKeeper.
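
As a concrete example of the Hive UDF work noted above, here is a minimal UDF built on the classic org.apache.hadoop.hive.ql.exec.UDF base class; the masking rule is invented for illustration, not the actual business logic.

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Masks all but the last four characters of a value, e.g. an account number.
public class MaskUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) return null;        // let Hive pass NULLs through
        String s = input.toString();
        if (s.length() <= 4) return new Text(s);
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < s.length() - 4; i++) masked.append('*');
        masked.append(s.substring(s.length() - 4));
        return new Text(masked.toString());
    }
}

Once packaged into a JAR, a function like this would be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION mask AS 'MaskUDF', then used as SELECT mask(account_no) FROM accounts.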
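
And a sketch of driving Pig from Java with PigServer to turn raw log lines into structured records for Hive; the paths and field layout are hypothetical.

import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

public class LogStructurer {
    public static void main(String[] args) throws Exception {
        PigServer pig = new PigServer(ExecType.MAPREDUCE);  // run on the cluster
        // Parse tab-separated raw logs into typed fields (hypothetical layout).
        pig.registerQuery("raw = LOAD '/data/raw/clicks' USING PigStorage('\\t')"
                + " AS (ts:chararray, user_id:chararray, url:chararray);");
        // Drop malformed records before handing the data to Hive.
        pig.registerQuery("clean = FILTER raw BY user_id IS NOT NULL AND url IS NOT NULL;");
        // Write the structured output where a Hive external table can read it.
        pig.store("clean", "/data/structured/clicks");
    }
}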

Environment: Hadoop ecosystem (Pig, Hive, Sqoop, HBase, Oozie, ZooKeeper, Flume) and SQL

Cigna, Windsor, CT

Java Developer (April 2012 – May 2013)

Description: The Greater Health Care project maintains a quote website for people looking for health insurance; it presents the various offers available and suggests the plan that suits them best, based on the inputs they provide.

Roles and Responsibilities:

Analyzed all test cases based on the gathered requirements and documented them for unit testing as well as integration testing.

Designed the user interface required for the portal, with all the components for plan selection.

Provided the design using RESTful web services to populate the individual details of the plans available for customers to pick from.

Programmed the functionality for all the user-interface components interacting with the database using MySQL Server.

Developed various controller classes and business logic using the Spring libraries, which interact with the middle tier to perform business operations (see the controller sketch after this list).

Responsible for developing custom tools per client needs.

Tested the application by programming test cases in JUnit for both unit testing and integration testing, and tracked bugs across the entire application.
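
A minimal sketch of one such controller class using Spring MVC annotations; PlanController, PlanService, and the view name are hypothetical stand-ins for the real portal code.

import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RequestParam;

// Hypothetical middle-tier service supplying plan data.
interface PlanService {
    List<String> findPlansByZip(String zip);
}

@Controller
public class PlanController {

    @Autowired
    private PlanService planService;

    // Returns the view listing the plans that match the customer's inputs.
    @RequestMapping(value = "/plans", method = RequestMethod.GET)
    public String listPlans(@RequestParam("zip") String zip, Model model) {
        model.addAttribute("plans", planService.findPlansByZip(zip));
        return "planList";  // resolved to a JSP by the view resolver
    }
}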

Environment: Core Java, JSP, MySQL, SOAP, JUnit, Eclipse, HTML, JavaScript, XML

MetLife Inc., Somerset, NJ

Software Developer (Jan 2011 – March 2012)

Description: Metropolitan Life Insurance Company (MetLife) is a leading innovator and a recognized leader in protection planning and retirement and savings solutions around the world.

Responsibilities:

Actively participated in the requirements gathering, analysis, design, and testing phases.

Designed use case diagrams, class diagrams, and sequence diagrams as part of the design phase.

Developed the entire application implementing an MVC architecture, integrating JSF with the Hibernate and Spring frameworks.

Implemented a Service-Oriented Architecture (SOA) using JMS for sending and receiving messages when creating web services (see the JMS sketch after this list).

Developed XML documents and generated XSL files for the Payment Transaction and Reserve Transaction systems.

Developed web services for data transfer from client to server and vice versa using SOAP and WSDL.

Used the JUnit framework for unit testing of all the Java classes.

Implemented various J2EE design patterns, including Singleton, Service Locator, and DAO, within the SOA.

Worked on AJAX to develop an interactive web application and JavaScript for data validations.
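
A sketch of the JMS messaging described above, sending one payment-transaction XML document to a queue; the JNDI names and payload are hypothetical.

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

public class PaymentSender {
    public static void main(String[] args) throws Exception {
        // JNDI configuration comes from the application server; names are hypothetical.
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
        Queue queue = (Queue) ctx.lookup("jms/PaymentTransactions");
        Connection conn = factory.createConnection();
        try {
            Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);
            // The XML payload would come from the Payment Transaction system.
            TextMessage message = session.createTextMessage(
                    "<payment><id>42</id><amount>100.00</amount></payment>");
            producer.send(message);
        } finally {
            conn.close();  // closes the session and producer as well
        }
    }
}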

Environment: J2EE, JDBC, JSP, web services, SOAP, design patterns, MVC, HTML, JavaScript, XML

HSBC, NY

Java/J2EE Developer (Aug 2009 – Dec 2010)

Contribution:

Involved in understanding the requirements by interacting with business users and mapping them to the design.

Analyzed all business functionality related to back-end database interfaces.

Developed technical specifications for various back-end modules from the business requirements.

Created use case diagrams, sequence diagrams, functional specifications, and user interface diagrams using StarUML.

Developed user interfaces using JSP, HTML, XML, and JavaScript.

Developed code that creates XML files and flat files from data retrieved from databases and other XML files (see the XML sketch after this list).

Created data sources and helper classes utilized by all the interfaces to access and manipulate the data.
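
A sketch of the XML-generation code described above, using the standard JAXP DOM and Transformer APIs; the JDBC URL, query, and element names are hypothetical.

import java.io.File;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class AccountExporter {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
        Element root = doc.createElement("accounts");
        doc.appendChild(root);
        // Pull rows from the database (hypothetical connection and query).
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@db.example.com:1521:orcl", "app", "secret");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT id, balance FROM accounts")) {
            while (rs.next()) {
                // One <account> element per row.
                Element acct = doc.createElement("account");
                acct.setAttribute("id", rs.getString("id"));
                acct.setTextContent(rs.getString("balance"));
                root.appendChild(acct);
            }
        }
        // Serialize the DOM tree to an XML file on disk.
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.transform(new DOMSource(doc), new StreamResult(new File("accounts.xml")));
    }
}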

CSIR, India

Research Associate (Nov 2007 – July 2009)

Responsibilities:

Defined the application or problem and developed the mathematical model.

Decided on solution pathways and parametric/physical models.

Created algorithms, coded them in C++, and tested them.

Used object-oriented techniques to obtain solutions.

Ran simulations to optimize parameters.

Improved the convergence speed and accuracy of numerical results.

Analyzed results, provided physical interpretations, and created detailed reports.

Education

Ph.D. in Applied Mathematics from the Indian Institute of Technology Roorkee, India

M.Sc. in Applied Mathematics from Guru Jambheshwar University, Hisar, India

B.Sc. in Computer Science from Kurukshetra University (KUK), Kurukshetra, India


