
Hadoop Developer

Location: United States
Salary: $80/hr
Posted: July 25, 2016


SIVA MAMILLAPALLI

(Sr. Hadoop Developer)

acvuvb@r.postjobfree.com

972-***-****

SUMMARY:

Hadoop developer with over 10 years of experience in IT, including 2+ years of experience in Hadoop and Big Data technologies.

Certified Scrum Master and Oracle Certified Java Programmer, with extensive experience in SDLC activities.

Strong development skills in Hadoop, Hive, MapReduce, Pig, and HBase, with a solid understanding of Hadoop internals.

Familiar with the various components of the Hadoop ecosystem: HDFS, Pig Latin, Hive, Sqoop, Oozie, Flume, and ZooKeeper, along with JSON and compression formats.

Experience in ETL development using Kafka, Flume, and Sqoop.

Knowledge of NoSQL databases like Cassandra, MongoDB, and HBase.

Familiar with file formats such as Parquet, SequenceFile, and Avro, and with writing custom SerDes in Hive.

Excellent programming skills, with experience in Java, C, SQL, and Python.

Experience in performance tuning of Hadoop MapReduce, HBase, and Hive.

Excellent working experience with SQL & PL/SQL.

Hands-on experience with Hadoop clusters such as the Ericsson R&D cluster, AWS Elastic MapReduce, and Cloudera.

Comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing, and data manipulation in Hive.

Good knowledge of Kafka and messaging systems.

Experience in UNIX shell scripting and a good understanding of OOP, OOAD, data structures, and design patterns.

Good experience with Java and JDBC-related technologies.

Experience in XML-related technologies such as XML, DTD, XSL, and XSD, and parsers such as SAX/DOM and JAXB.

Familiar with Java build tools like Maven and Ant.

Good knowledge of Core Java concepts such as Collections, Generics, multi-threading, serialization, file I/O, and socket programming.

Experience with performance/scalability tuning, algorithms, and computational complexity.

Strong commitment to assigned work and excellent analytical skills.

Highly motivated team player with the ability to work in a team or independently.

Ability to learn and adapt quickly to emerging technologies, adding value to the business.

Proven ability to work cross-functionally to deliver appropriate resolution of technical, procedural, and operational issues.

Experience in SCM activities using Git and ClearCase.

Expert in build & release management using Jenkins, BuildForge, and CruiseControl.

Strong experience in system integration using Continuous Integration tools.

Good at writing Python, Shell, and Perl scripts to automate activities via Jenkins and to monitor the build system.

Experience supporting developers with merge, commit, and Gerrit issues in the PLF teams.

Provided extensive support to the RBS, RNC, and MGW applications for various PRAs.

Extensive working knowledge of the telecom 3G domain and testing on various networks.

Hands-on experience with Agilent N2X, HP ProCurve switches, Wireshark, and Juniper and Cisco routers.

Good understanding of RNC and RBS network elements.

Good work experience in Solaris, Linux, and Windows environments and with IT tools.

CERTIFICATIONS:

1. Oracle Certified Java Programmer – July 2015

2. Scrum Master (Scrum Alliance) – March 2015

3. IBM ClearCase SCM – March 2009

TRAININGS UNDERGONE:

1. IBM ClearCase SCM training – Feb 2009

2. GIT training at Ericsson, Sweden – Jan 2010

3. Core Java training – May 2010

4. Scrum Master training at Ericsson, Sweden – Oct 2011

5. Hadoop Developer – Dec 2012

TECHNICAL SKILLS:

Programming Languages : Core Java, JDBC, C, PL/SQL, Perl, Shell, and Python

Integration Tools : ClearCase, Git, SVN, CVS, Gerrit, Jenkins, BuildForge, Ant, Maven, CruiseControl

Tools : JIRA, JUnit Framework, ClearDDTS, MHWeb, Eclipse, Hansoft

Ericsson R&D Tools : JIRA, SWDI Tools for Onetrack, TPT2/TL2000 for test reporting, MIA, Infobank

RDBMS : MySQL, Oracle

Frameworks : Hadoop, AWS Elastic MapReduce, Cloudera

NoSQL : HBase

Open Source : Hadoop, MapReduce, Pig, Hive, Sqoop, Oozie, Flume, ZooKeeper

IDE : Eclipse

Operating Systems : Linux, UNIX, and Windows

Methodologies : Agile (Scrum, Kanban)

EDUCATION:

M.Sc. in Information Systems from Andhra University, Visakhapatnam, A.P., India – May 2006.

BCA (Bachelor of Computer Applications) from Nagarjuna University, Guntur, A.P., India – May 2003.

ACHIEVEMENTS:

Received the Outstanding Innovation Award for 2014 for the MIA & Infobank project.

Appreciated by the client on a number of occasions for proactive handling of Scrum and for finding critical stopping faults during system development.

Appreciated for migrating data sources from ClearCase to Git.

Received an outstanding performance award for handling multiple tasks.

Received the Star of the Quarter award during my tenure at TCS.

Recognized by project management as a good team player with a positive attitude.

Won 3rd prize in the Micromouse robotics competition at IIT Mumbai (TECH FEST).

PROFESSIONAL EXPERIENCE:

Client: Daimler Trucks North America Feb 2016 to present

Role: Sr. Hadoop Developer

Location: Portland, Oregon

Project Description: Daimler Trucks North America LLC, a Daimler company, is the largest heavy-duty truck manufacturer in North America and a leading producer of medium-duty trucks and specialized commercial vehicles.

Module: Manufacturing Data Hub & Vehicle Build Status

Responsibilities:

Involved in all phases of development activities, from requirements collection to production support.

Developed a detailed understanding of the current system and identified the different sources of data.

Involved in cluster troubleshooting and upgrades.

Implemented MapReduce jobs for the Vehicle Build Summary to determine which parts are short and, in turn, which trucks cannot be built in the stipulated time (a minimal sketch of this pattern follows this list).

Analyzed the availability of the various parts used in manufacturing and the various tests run during the manufacturing phase.

Designed and implemented MapReduce jobs to support distributed processing using Java and HBase.

Created Hive external tables on the MapReduce output, with partitioning and bucketing applied on top.

Provided pivot graphs to show trends.

Wrote Oozie workflows and scheduling for job automation.

Developed and maintained several batch jobs that run automatically, depending on business requirements.

Imported and exported data between environments such as DB2 and HDFS/HBase using Sqoop.

Worked on a Storm topology integrated with Kafka to achieve near-real-time processing of the various tests involved in truck manufacturing (a sketch closes this entry, below the environment line).

Performed unit testing and internal deployments for all deliverables to the business.
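
For illustration, a minimal sketch of the shortage-count pattern the MapReduce bullet above describes. The CSV input layout (truckId,partId,required,onHand), class names, and field positions are hypothetical, not the actual Daimler schema:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class PartShortageJob {

      // Emits (partId, shortage) for every line where on-hand stock
      // is below the required quantity.
      public static class ShortageMapper
          extends Mapper<LongWritable, Text, Text, IntWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
            throws IOException, InterruptedException {
          String[] f = value.toString().split(",");
          int required = Integer.parseInt(f[2]);
          int onHand = Integer.parseInt(f[3]);
          if (onHand < required) {
            ctx.write(new Text(f[1]), new IntWritable(required - onHand));
          }
        }
      }

      // Sums shortages per part across all input splits.
      public static class SumReducer
          extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text part, Iterable<IntWritable> vals, Context ctx)
            throws IOException, InterruptedException {
          int total = 0;
          for (IntWritable v : vals) total += v.get();
          ctx.write(part, new IntWritable(total));
        }
      }

      public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "part-shortage");
        job.setJarByClass(PartShortageJob.class);
        job.setMapperClass(ShortageMapper.class);
        job.setCombinerClass(SumReducer.class);   // addition is associative
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }

Reusing the reducer as the combiner cuts shuffle traffic on large part catalogs, since per-part shortages can be partially summed on the map side.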

Environment: HDP 2.3.4, HBase, Phoenix, Hive, HDFS, Java MapReduce, Core Java, Maven, SVN, Jenkins, UNIX, MySQL, Eclipse, Hue, Oozie, Sqoop, Hortonworks Distribution, Ambari, Storm, Kafka, and DB2
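
A hedged sketch of the Storm-plus-Kafka arrangement mentioned above, written against the older storm-kafka spout API of HDP-era Storm; the topic name, ZooKeeper address, parallelism, and bolt logic are placeholders, not the production topology:

    import java.util.UUID;
    import backtype.storm.Config;
    import backtype.storm.StormSubmitter;
    import backtype.storm.spout.SchemeAsMultiScheme;
    import backtype.storm.topology.BasicOutputCollector;
    import backtype.storm.topology.OutputFieldsDeclarer;
    import backtype.storm.topology.TopologyBuilder;
    import backtype.storm.topology.base.BaseBasicBolt;
    import backtype.storm.tuple.Tuple;
    import storm.kafka.KafkaSpout;
    import storm.kafka.SpoutConfig;
    import storm.kafka.StringScheme;
    import storm.kafka.ZkHosts;

    public class TestEventTopology {

      // Hypothetical bolt: parse one manufacturing-test event and hand
      // it to a sink (e.g. HBase/Phoenix) for near-real-time queries.
      public static class TestEventBolt extends BaseBasicBolt {
        @Override
        public void execute(Tuple tuple, BasicOutputCollector collector) {
          String event = tuple.getString(0);
          // ... parse the test result and write it to the store here ...
        }
        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
          // terminal bolt: no downstream stream declared
        }
      }

      public static void main(String[] args) throws Exception {
        // Read raw test events from a Kafka topic via ZooKeeper metadata.
        SpoutConfig spoutConf = new SpoutConfig(
            new ZkHosts("zk1:2181"), "truck-tests", "/truck-tests",
            UUID.randomUUID().toString());
        spoutConf.scheme = new SchemeAsMultiScheme(new StringScheme());

        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("kafka-spout", new KafkaSpout(spoutConf), 2);
        builder.setBolt("test-event-bolt", new TestEventBolt(), 4)
               .shuffleGrouping("kafka-spout");

        StormSubmitter.submitTopology("test-events", new Config(),
            builder.createTopology());
      }
    }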

Client: Target Aug 2015 to Jan 2016

Role: Sr. Hadoop Developer

Location: Minneapolis, MN.

Project Description: Target is a retail and financial services company offering online and in-store shopping for all household products.

Module: Data analytics required on daily, weekly, monthly, and yearly data.

Responsibilities:

Involved in all phases of development activities, from requirements collection to production support.

Developed a detailed understanding of the current system and identified the different sources of data.

Involved in cluster setup.

Performed batch processing of logs from various data sources using MapReduce.

Built predictive analytics that monitor inventory levels and ensure product availability.

Analyzed customers' purchasing behaviors.

Delivered value-added services based on clients' profiles and purchasing habits.

Defined UDFs in Pig and Hive to capture customer behavior (a minimal Hive UDF sketch follows this list).

Designed and implemented MapReduce jobs to support distributed processing using Java, Hive, and Apache Pig.

Created Hive external tables on the MapReduce output, with partitioning and bucketing applied on top.

Provided pivot graphs to show trends.

Maintained data-import scripts using Hive and MapReduce jobs.

Developed and maintained several batch jobs that run automatically, depending on business requirements.

Imported and exported data between environments such as MySQL and HDFS.

Performed unit testing and deployment for internal use, monitoring the performance of the solution.
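
As one illustration of the UDF bullet above, a minimal old-style Hive UDF in Java; the spend-band thresholds and names are hypothetical, not Target's actual behavioral logic:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical behavior helper: buckets an order total into a spend
    // band so purchases can be grouped in downstream Hive queries.
    // Registered in Hive with, e.g.:
    //   ADD JAR spend-band-udf.jar;
    //   CREATE TEMPORARY FUNCTION spend_band AS 'SpendBandUDF';
    public final class SpendBandUDF extends UDF {
      public Text evaluate(Double amount) {
        if (amount == null) return null;       // preserve SQL NULL semantics
        if (amount < 25)  return new Text("LOW");
        if (amount < 100) return new Text("MEDIUM");
        return new Text("HIGH");
      }
    }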

Environment: Apache Hadoop, Hive, Pig, HDFS, Java MapReduce, Core Java, Python, Maven, Git, Jenkins, UNIX, MySQL, Eclipse, Oozie, Sqoop, Flume, Cloudera Distribution, Oracle, and Teradata

Client: Ericsson, Stockholm, Sweden Oct 2013 to July 2015

Role: Hadoop Developer

Project Description: Ericsson provides a platform product for high-availability applications, used when developing IP/ATM-based telecommunication network infrastructure such as LTE, RBS, RNC, and M-MGW.

Module: MIA & Infobank, an online tool that stores and provides all build and product information to employees and to several applications such as the LMR, WMR, RNC, and MGW clients.

Responsibilities:

Developed a detailed understanding of the existing build system, the tools that hold information on various products and releases, and the test-result information.

Designed and implemented MapReduce jobs to support distributed processing using Java, Hive, and Apache Pig.

Developed UDFs to provide custom Hive and Pig capabilities.

Built a mechanism for automatically moving the existing proprietary binary-format data files to HDFS, using a service called the Ingestion service.

Applied comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing, and data manipulation.

Performed data transformations in Hive and used partitions and buckets for performance improvements.

Wrote custom InputFormat and RecordReader classes for reading and processing the binary format in MapReduce (a sketch closes this entry, below the environment line).

Wrote custom Writable classes for Hadoop serialization and deserialization of time-series tuples (a minimal sketch follows this list).

Implemented a custom file loader for Pig so that we could query directly on large data files such as build logs.

Used Python for pattern matching in build logs to format errors and warnings.

Developed Pig Latin scripts for validating the different query modes in Historian.

Created Hive external tables on the MapReduce output, with partitioning and bucketing applied on top.

Improved performance by tuning Hive and MapReduce.

Developed a daily test engine using Python for continuous tests.

Used shell scripting for Jenkins job automation.

Built a custom calculation engine that can be programmed according to user needs.

Ingested data into Hadoop using Sqoop and applied data transformations using Pig and Hive.

Handled performance-improvement changes to the Pre-Ingestion service, which is responsible for generating the Big Data-format binary files from older versions of Historian.

Automated job submission via Jenkins scripts.

Worked with support teams and resolved operational & performance issues.

Researched, evaluated, and utilized new technologies, tools, and frameworks around the Hadoop ecosystem.

Prepared graphs from test results posted to MIA.
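
A minimal sketch of the custom-Writable bullet above. The tuple here is a generic (timestamp, value) pair, since the real Historian record layout is proprietary:

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.io.Writable;

    // Hypothetical time-series tuple: a timestamp plus a sampled value.
    public class TimeSeriesTupleWritable implements Writable {
      private long timestamp;
      private double value;

      public TimeSeriesTupleWritable() { }     // required by Hadoop reflection

      public TimeSeriesTupleWritable(long timestamp, double value) {
        this.timestamp = timestamp;
        this.value = value;
      }

      @Override
      public void write(DataOutput out) throws IOException {
        out.writeLong(timestamp);              // serialize fields in a fixed order
        out.writeDouble(value);
      }

      @Override
      public void readFields(DataInput in) throws IOException {
        timestamp = in.readLong();             // deserialize in the same order
        value = in.readDouble();
      }

      public long getTimestamp() { return timestamp; }
      public double getValue()   { return value; }
    }

If the tuple were used as a map output key, it would implement WritableComparable instead, adding a compareTo for the shuffle's sort.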

Environment: Apache Hadoop, Hive, Pig, HDFS, Java MapReduce, Core Java, Python, Maven, Git, Jenkins, UNIX, MySQL, Eclipse, Oozie, Sqoop, Flume, Oracle, and CDH4.x
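
And a hedged sketch of the custom InputFormat/RecordReader pair referenced above, reusing the TimeSeriesTupleWritable sketched earlier and assuming a hypothetical fixed-width binary record of an 8-byte timestamp plus an 8-byte value; the real proprietary format differs:

    import java.io.IOException;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.mapreduce.InputSplit;
    import org.apache.hadoop.mapreduce.JobContext;
    import org.apache.hadoop.mapreduce.RecordReader;
    import org.apache.hadoop.mapreduce.TaskAttemptContext;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.input.FileSplit;

    public class TimeSeriesInputFormat
        extends FileInputFormat<LongWritable, TimeSeriesTupleWritable> {

      private static final int RECORD_BYTES = 16;   // assumed fixed width

      @Override
      protected boolean isSplitable(JobContext context, Path file) {
        return false;  // keep each proprietary file in a single split for simplicity
      }

      @Override
      public RecordReader<LongWritable, TimeSeriesTupleWritable> createRecordReader(
          InputSplit split, TaskAttemptContext context) {
        return new TimeSeriesRecordReader();
      }

      public static class TimeSeriesRecordReader
          extends RecordReader<LongWritable, TimeSeriesTupleWritable> {

        private FSDataInputStream in;
        private long length, pos;
        private final LongWritable key = new LongWritable();
        private final TimeSeriesTupleWritable value = new TimeSeriesTupleWritable();

        @Override
        public void initialize(InputSplit split, TaskAttemptContext context)
            throws IOException {
          FileSplit fileSplit = (FileSplit) split;
          length = fileSplit.getLength();
          in = fileSplit.getPath().getFileSystem(context.getConfiguration())
                        .open(fileSplit.getPath());
        }

        @Override
        public boolean nextKeyValue() throws IOException {
          if (pos + RECORD_BYTES > length) return false;
          key.set(pos / RECORD_BYTES);   // record index as the key
          value.readFields(in);          // tuple reads its own fixed-width fields
          pos += RECORD_BYTES;
          return true;
        }

        @Override public LongWritable getCurrentKey() { return key; }
        @Override public TimeSeriesTupleWritable getCurrentValue() { return value; }
        @Override public float getProgress() {
          return length == 0 ? 1f : (float) pos / length;
        }
        @Override public void close() throws IOException { in.close(); }
      }
    }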

Client: Ericsson, Stockholm, Sweden Jan 2011 to Oct 2013

Role: CMA/MOM Developer, SCRUM Master

Project Description: Ericsson Connectivity Packet Platform (CPP) is a platform product for high-availability applications, used when developing IP/ATM-based telecommunication network infrastructure such as LTE, RBS, RNC, and M-MGW.

Module: IpTransport MOM Split & Cluster Member Redundancy

Responsibilities:

Designed, developed, and deployed the MOM models on the RBS platform, and removed the dependency between the available platforms, thus reducing execution time.

Prepared Functional Specifications for MOM.

Prepared resource plan for the activities of CRs and New Objects in pipeline

Involved in the SS7 cluster member redundancy and IP transport split features of RNC & RBS networks.

Worked on .emx files, UML diagrams, XML, and Adobe Dreamweaver.

Worked on improving the existing code and reducing redundancy.

Performed Low Level Design Document reviews.

Organized Knowledge Sharing Sessions within the team.

Provided timely estimates for the project for various releases.

Evaluated the performance of team members.

Maintained the weekly and monthly status of work done in team.

Developed MOM models per the FS and executed them without errors or delays.

Performed Analysis, Design, data modeling and development.

Involved in bug analysis, estimates and fixing.

Environment: Core Java, Ant, Git, Jenkins, UNIX, MySQL, JIRA, JUnit Framework, ClearDDTS, MHWeb, Eclipse, Hansoft

Client: Ericsson, Hyderabad, India Feb 2010 to Dec 2010

Role: JCAT Developer

Project Description: Ericsson Connectivity Packet Platform (CPP) is a platform product for high-availability applications, used when developing IP/ATM-based telecommunication network infrastructure such as LTE, RBS, RNC, and M-MGW.

Module: Automated test framework using JCAT, a framework for testing telecom nodes such as RNC & RBS networks with the Core platform on the node; it verifies the functional and system behavior of the node and the network.

Responsibilities:

Responsible for automation of test activities for team at each baseline.

Responsible for generating test scripts used for both system and function tests in JCAT, an extension of JUnit (a plain JUnit sketch follows this list).

Pushed code to Gerrit and validated it through the Gerrit and Maven build systems.

Coordinated activities between Offshore and Onsite teams.

Provided knowledge transfer to offshore team on JCAT scripting.

Updated Offshore Automation activities every week in Line Meetings.

Organized and participated in code reviews with Onsite team.

Performed HLD, LLD Document reviews.

Provided timely estimations for the project for various releases.

Evaluated the performance of teammates.
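
JCAT itself is Ericsson-internal, so as a stand-in, a plain JUnit 4 test showing the shape such scripts build on; the helper under test is a hypothetical placeholder, not JCAT's API:

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class LinkStateParserTest {

      // Hypothetical stand-in for a production helper that maps raw
      // node status codes to readable link states.
      static String linkState(int code) {
        return code == 1 ? "UP" : "DOWN";
      }

      @Test
      public void reportsUpForStatusCodeOne() {
        assertEquals("UP", linkState(1));
      }

      @Test
      public void reportsDownForAnyOtherCode() {
        assertEquals("DOWN", linkState(0));
      }
    }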

Environment: Ant, Core Java, Git, Jenkins, Maven, UNIX, MySQL, JIRA, JUnit Framework, ClearDDTS, MHWeb, Eclipse, Hansoft

Client: Ericsson, Hyderabad, India & Stockholm, Sweden – Jan 2006 to Feb 2010

Role: Build Release Engineer

Project Description: Ericsson Connectivity Packet Platform (CPP) is a platform product for high-availability applications, used when developing IP/ATM-based telecommunication network infrastructure such as LTE, RBS, RNC, and M-MGW.

Module: CPP Build System

Responsibilities:

System integration of the Control System parts of the CPP5, CPP6, CPP7, CPP8 & CPP9 project tracks.

Expert in integration activities with Git and ClearCase up to the C13 tracks.

Provided new baselines and config specs in ClearCase for design & verification teams.

Monitored in-deliveries of bug fixes per the delivery plan.

Worked with different work-package teams on new functional changes, creating new baseline config specs and rebasing with newly released baselines.

Merged code to the delivery branch and resolved the merge conflicts in ClearCase.

Merged different work-package teams' corrections into the delivery branch and resolved the conflicts well in advance.

Analyzed the build dependencies based on the in-delivered code changes.

Built the interface libraries and communicated with other subsystems about the dependencies.

Built the required binaries and prepared the config specs.

Performed basic smoke tests in the simulated environment with the newly built software, then released it to verification teams for further tests.

Supported design teams to facilitate problem-free deliveries for TR fixes and the build environment.

Involved in the One Track way of working for complete CPP platform integration in ClearCase.

Educated teams on the latest developments in the way of working when a new product track starts, and on the One Track methodology.

Involved in uplifts to various branches in ClearCase.

Attended Git trainings at Ericsson.

Performed Git migration activities such as modifying build specs and makefiles.

Configured LM builds and various job configurations in Jenkins.

Fixed infrastructure issues and design issues behind various build failures.

Involved in build activities using BuildForge; escalated configuration issues to the SWDI team.

Provided technical support for developers and testers on merge issues.

Involved in releasing LSV and EP baselines, and in staging baselines for Applications and PRA activities.

Educated offshore team members and design team members on Git essentials.

System tester for RNC & RBS reference test networks.

Planned and performed various test activities such as system regression, performance, and function tests.

Prepared test plan, test cases and test scripts.

Performed activities such as defect tracking, bug reporting, and design support for bugs (Trouble Reports).

Coordinated with the delivery manager on delivery report preparation.

Performed quality reviews such as IQA/EQA/FI activities.

Environment: Ant, Core Java, ClearCase, Git, SVN, CVS, Gerrit, Jenkins, BuildForge, Maven, CruiseControl, UNIX, MySQL, JIRA, JUnit Framework, ClearDDTS, MHWeb, Eclipse, Hansoft
