
Data Developer

Location:
Fort Lauderdale, FL
Posted:
March 26, 2015

Contact this candidate

Resume:

Sneha Abraham

****, ** **** *******

Plantation, FL 33322

Mobile No: 1-786-***-****

Alternate No: 1-786-***-****

Email: **************@*******.***

LinkedIn Profile: www.linkedin.com/in/sneha13abraham

Experience Summary

Overall 7+ years of experience in analysis, design, coding, testing and support in the IT industry, including 3 years of experience in Big Data Hadoop technologies and solutions

Cloudera certified Hadoop Developer with hands-on experience installing, configuring and using Apache Hadoop ecosystem components such as HDFS, Hadoop MapReduce, HBase, ZooKeeper, Oozie, Hive, Sqoop, Pig and Flume

Expertise in writing Hadoop jobs for analyzing data using Hive and Pig

Experience in writing MapReduce programs using Apache Hadoop to work with Big Data

Experience in importing and exporting data between HDFS and Relational Database Management Systems (RDBMS) using Sqoop

In-depth understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode and MapReduce concepts

Extended Hive and Pig core functionality by writing custom UDFs

Extensive experience with SQL, PL/SQL and database concepts

Knowledge of job workflow scheduling (Oozie) and cluster coordination (ZooKeeper) tools

Experience in optimizing MapReduce jobs using combiners and partitioners to deliver the best results

Proficient in using Cloudera Manager, an end-to-end tool for managing Hadoop operations

Worked on different operating systems like UNIX, Linux and Windows


Good knowledge of treasury operations covering front-, middle- and back-office functionalities in a bank
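The combiner optimization mentioned above can be illustrated outside Hadoop. The following is a minimal sketch in plain Python (not the Hadoop Java API) of a word-count job in which each "mapper's" output is pre-aggregated by a combiner before reaching the reducer; all function names are illustrative, not Hadoop APIs.

```python
from collections import Counter
from itertools import chain

def map_phase(line):
    # Emit a (word, 1) pair for every word in one input line.
    return [(word.lower(), 1) for word in line.split()]

def combine(pairs):
    # Combiner: pre-aggregate counts locally, mimicking what a Hadoop
    # combiner does on each mapper's output before the shuffle.
    local = Counter()
    for word, count in pairs:
        local[word] += count
    return local.items()

def reduce_phase(all_pairs):
    # Reducer: merge the partial counts from every mapper/combiner.
    totals = Counter()
    for word, count in all_pairs:
        totals[word] += count
    return dict(totals)

lines = ["big data big wins", "data beats big talk"]
combined = [combine(map_phase(line)) for line in lines]  # one combiner per "mapper"
counts = reduce_phase(chain.from_iterable(combined))
print(counts["big"])  # "big" appears three times across both lines
```

The benefit of the combiner is that duplicate keys within one mapper's output ("big" twice in the first line) travel over the network as a single pre-summed pair instead of two.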

Certifications: CCD-410 Cloudera Certified Developer for Apache Hadoop (CCDH); License num: 100 009 517; Jan 2015

Technical Expertise:

Big Data Ecosystem: Hadoop, MapReduce, HDFS, HBase, ZooKeeper, Hive, Pig, Sqoop, Oozie and Flume

Programming Languages: C, C++, Java, SQL, Oracle PL/SQL, UNIX/Linux Shell Scripts, Python

Frameworks: JUnit, log4j

Databases: MS SQL Server, Oracle 8i/9i, TOAD, HBase

Operating Systems: Linux, UNIX, Windows

Methodologies: Agile

Tools: Datameer (5.4.0), Microsoft Visual Basic (VB)

Project Experiences

Client : Citi Bank, Irving, Texas (Mar 2014 – Till Date)

Project : Hadoop Applications Development

Role : Hadoop Applications Developer

Responsibilities

As a Hadoop Applications Developer, my responsibilities include creating detailed technical specifications, developing application and system code, participating in code reviews and module testing, and supporting ongoing maintenance

Involved in designing and developing components of Big Data processing using HDFS, MapReduce, Pig and Hive.


Worked on data extraction strategies using Sqoop and assisted in data transfers between Hadoop and non-Hadoop systems such as DB2 databases.

Developing data pipelines using Oozie.

Used JSON and XML SerDes in Hive

Good experience with the Datameer tool to integrate, prepare, analyze and visualize datasets, providing insights into the data faster.

Experience with the YARN architecture and MapReduce using MRv2

Participated in installation and maintenance of Hadoop software applications.

Automated processes for application of new technologies as per latest changes.

Worked on a multi-clustered environment and set up the Cloudera Hadoop ecosystem.

Ingested, loaded and transformed large sets of structured, semi-structured and unstructured data using Flume and Sqoop

Responsible for managing and reviewing Hadoop log files

Worked with support team to resolve performance issues

Able to query data, perform analyses and present findings in clear, understandable language.

Produced accurate analyses of the datasets using efficient algorithms and the latest big data technologies.
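In Hive, nested JSON records are normally exposed as columns through a JSON SerDe. As a language-neutral sketch of the same idea in plain Python (the field names and sample events are invented for illustration, not from any actual project data), nested events can be flattened into the tab-delimited rows a Hive table expects:

```python
import json

def flatten_record(raw):
    """Flatten one nested JSON event into the flat row a Hive JSON SerDe
    would expose as columns (field names invented for illustration)."""
    rec = json.loads(raw)
    return (
        rec["id"],
        rec["user"]["name"],
        rec["user"].get("city", "unknown"),  # tolerate a missing nested field
    )

raw_events = [
    '{"id": 1, "user": {"name": "ann", "city": "Plantation"}}',
    '{"id": 2, "user": {"name": "bob"}}',
]
rows = [flatten_record(e) for e in raw_events]
for row in rows:
    print("\t".join(str(col) for col in row))  # tab-delimited, Hive-style
```

The `.get(..., "unknown")` default mirrors how a SerDe maps an absent JSON key to a NULL-like value rather than failing the whole load.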

Environment: Hadoop 0.23, Hadoop 2.x, Pig 0.12, Hive 0.13.1, HBase 0.96.0, Flume 1.4.0, Java, JDK 1.6 and 1.7, Eclipse, Oracle, JSP, jQuery, JUnit 4, Log4J, Visio, TOAD, UNIX, Linux, Sqoop 1.4.x

Client : AllState, Northbrook, Illinois (Oct 2013 – Feb 2014)

Project : Capital Purchase Program Analysis

Role : Hadoop Developer

Responsibilities:

Extracted the data from raw files and separated for processing

Productized and operationalized their big data infrastructure


Software installation and configuration

Built automation and internal tools to manage deployment and configuration

Presented knowledge sharing sessions on general architecture to other team members.

Prepared Technical Design Documents (TDD) and Detailed Analysis Documents (DAD), and provided walkthroughs of the DADs to stakeholders

Coded custom MapReduce jobs according to the specifications.

Implemented unit testing using JUnit during the projects.

Developing HIVE and PIG UDFs.

Log analysis and ETL processing.

Gathered performance metrics on monthly project accomplishments and reported them to upper management
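Hive and Pig UDFs of the kind mentioned above are normally written in Java against the Hive/Pig UDF interfaces. As a minimal sketch of the per-row logic such a UDF encapsulates, here is a plain-Python analogue (the masking rule and function name are invented for illustration, not taken from any project):

```python
def mask_account(account: str) -> str:
    """Per-row transform, as a Hive UDF would apply it to a column:
    keep the last four characters and mask the rest.
    (Masking rule invented for illustration.)"""
    if len(account) <= 4:
        return account  # too short to mask meaningfully: leave as-is
    return "*" * (len(account) - 4) + account[-4:]

print(mask_account("4111222233334444"))  # ************4444
print(mask_account("123"))               # 123
```

In Hive the same function would be registered with CREATE TEMPORARY FUNCTION and then called once per row inside a SELECT, which is why UDF bodies stay small and stateless like this one.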

Environment: Hadoop 0.23, Hadoop 2.2, Cloudera CDH3, Cloudera CDH4, Java 1.6, Eclipse, jQuery, JUnit, Log4J, Visio, Oracle, SQL Developer, JSON, XML, Cygwin

Employer : Edureka (Dec 2012 – Sept 2013)

Project : Hadoop Applications Development and Support

Role : Hadoop Developer

Responsibilities:

Involved in reviewing functional and non-functional requirements.

Designed and implemented MapReduce jobs for analyzing the data collected by the Flume server.

Support customers for deployment of Hadoop clusters

Created Hive tables and worked on them using HiveQL

Used Apache Log4J for logging.

Involved in fixing bugs raised by the testing teams in various application modules during the integration testing phase.

Facilitated knowledge transfer sessions.
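The analysis jobs described above reduce to a group-and-count over the event stream Flume delivers. A minimal sketch of that aggregation in plain Python, rather than the actual MapReduce code (the log format and sample lines are invented for illustration):

```python
from collections import Counter

def bucket_counts(log_lines):
    """Count events per hour bucket: the kind of aggregation a
    MapReduce job performs on Flume-collected logs.
    (Log line format invented for illustration.)"""
    counts = Counter()
    for line in log_lines:
        timestamp, _, _ = line.partition(" ")  # e.g. "2013-05-01T14:03:11 GET /x"
        hour = timestamp[:13]                  # keep "YYYY-MM-DDTHH"
        counts[hour] += 1
    return counts

logs = [
    "2013-05-01T14:03:11 GET /index",
    "2013-05-01T14:41:02 GET /about",
    "2013-05-01T15:00:09 POST /login",
]
print(bucket_counts(logs))
```

In the MapReduce version the hour prefix would be the map output key, so the framework's shuffle does the grouping and each reducer simply sums its key's values.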


Environment: Hadoop 0.23, Cloudera CDH3, Java 1.6, Eclipse, Log4J, Pig 0.12, Hive 0.13.1, Ubuntu 12.04.x, shell scripting

Employer : Infosys Limited, Chennai, India

Project : Finacle Treasury Development and Support

Roles : Systems Engineer (Oct 2007 – April 2009)

Senior Systems Engineer (May 2009 – April 2010)

Technology Analyst (June 2010 – Aug 2012)

Responsibilities:

Analysis of the specifications provided by the client.

Prepared design documents as well as high-level documents for the flow of changes based on the analysis, as per the quality procedures.

Mentored my team in coding and testing the design enhancements/developments.

Review of code written by developers

Tested various scenarios, found latent bugs and mismatches between previous and current behavior, and corrected all of them

Developing the entire enhancement

Prepared unit test plans and was involved in unit testing and system integration testing

Tracked, verified and closed defects

Supported testing and implementation teams during the migration of Finacle Treasury to higher releases

Trained and mentored freshers joining the team

Environment: C, C++, Java, SQL, Oracle PL/SQL, UNIX/Linux Shell Scripts, Visual Basic front-end development

Honors and Awards


Won the Young Achiever Award in Jan 2009 for porting the entire code base from the Sun flavor of UNIX to AIX. The code was highly incompatible with the AIX environment and produced numerous errors during compilation. All the errors were fixed well ahead of the schedule provided, and the release was delivered on time to IDBI Bank, Mumbai.

Won the Team Excellence Award in July 2011 for the Structured Products enhancement implemented for Raiffeisen Bank International, Prague.

These awards are highly prestigious, as the winners are selected from more than 5,000 employees by the Head of the Finacle business unit (which includes Treasury, Core, E-Banking and CRM)

University Education

Bachelor of Technology Degree from Cochin University of Science and Technology (CUSAT) 2003–2007

