Kunal Anil Gaikwad Mumbai, Maharashtra
Hadoop Developer
Email: *****.*****@*****.*** Contact No: 902-***-****.
Summary:
• Over 2 years of experience in Hadoop technologies, Java, AngularJS, JavaScript, HTML, CSS, and Bootstrap
• Certified in Big Data and Hadoop by edureka
• Hands-on experience with the Cloudera, Hortonworks, and Apache Hadoop distributions
• Expertise in the Hadoop ecosystem (HDFS, MapReduce, Pig, Hive, Flume, and Sqoop) for scalable, distributed, high-performance computing
• Well versed in Core Java
• Able to set up a pseudo-distributed cluster using Apache Hadoop; have also set up a multi-node cluster on a single machine for testing purposes using a VM tool
• Very good knowledge of HBase
• Worked on a sample project using Pig and Hive
• Used Sqoop to load data from RDBMSs (Oracle, MySQL, and SQL Server) into HDFS
• Used Flume to ingest tweets from a Twitter account
• Good knowledge of single-node and multi-node cluster configurations
• Extensive experience in Software Development Life Cycle (SDLC)
• Experienced in the design, development, and testing of applications in Java and J2EE
• Experienced in creating web pages using HTML, CSS, Bootstrap
• Experienced in technical editing and code testing of technical books
• Excellent verbal and written communication skills
• Self-motivated and eager to learn new technologies
Work Experience:
CIO Matrix - Associate Developer (Oct 2014 – Feb 2015)
• Creating interactive web pages using HTML, CSS, and Bootstrap
• Creating functions using JavaScript and jQuery
• Learning different technologies for the website (MongoDB, AngularJS)
• Designing the site map, technology map and plan for the project
• Creating RESTful services
• Understanding the commercial implications of the project
• Identifying which projects would need a Big Data implementation
Packt Publishing PVT LTD – Technical Editor/Subject Matter Expert (Sept 2013 – Oct
2014)
• Editing the technical and grammatical aspects of the drafts provided by authors
• Code testing across different technologies (Hadoop single-node and multi-node clusters, SketchUp, and so on)
• Responsible for quality checks and auditing the technical editors
• Project and time management
• Author relationship management
• Coming up with smart and innovative ideas for the books
• Giving presentations on Big Data and programming languages
• Conducting Scrums for the team
Reliance Communications – Diploma Engineer Trainee (Jun 2009 – Jun 2009)
• Managing network based on Leased Lines, Virtual Private Networks (VPN), and
Direct Internet Access (DIA)
• Communicating with Site engineers and resolving the faults in the network
• Solving traffic and network related issues
Technical Skills:
Core Java, Hadoop, MapReduce, Pig, Hive, Flume, Sqoop, HBase, Eclipse, YARN, Shell
Scripting, JBoss, HTML, CSS, Bootstrap, JavaScript, AngularJS, Amazon Web Services
(AWS), SketchUp, QlikView
Certifications:
• Big Data and Hadoop – edureka, 2015
• Cisco Certified Network Engineer – NIIT, 2012
• Workshop on Ethical Hacking – AIESEC, IIT Kharagpur, 2011
Education:
Finolex Academy of Management and Technology
Information Technology, 57%
2009 – 2012
Pravin Patil College of Diploma Engineering and Technology
Information Technology, 77%
2006-2009
N.H English Academy
High School, 68.40%
Project 1:
Project Name: Sentiment analysis
Project Role: Hadoop Developer
Environment: Hadoop, Apache Pig, Hive, Flume, Sqoop, and Linux
Brief Description:
The purpose of this project is to perform sentiment analysis on Reddit: store users' posts
and extract meaningful information from them. The solution is based on the open source
Big Data framework Hadoop. The data is stored in the Hadoop file system and processed
using MapReduce jobs; the pipeline involves ingesting the raw data, processing it to obtain
a sentiment for each post, and exporting the results to SQL using Sqoop for further
processing.
Roles and Responsibilities
• Imported data into HDFS from the website
• Wrote Apache Pig scripts to process the data in HDFS
• Created Hive tables to store the processed results in a tabular format
• Monitored Hadoop scripts that take input from HDFS and load the data into Hive
• Created external tables in Hive
• Exported data from HDFS to SQL using Sqoop
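The pipeline above can be sketched as a sequence of standard Hadoop CLI invocations. This is an illustrative sketch only: the paths, table names, script name, and JDBC connection string are hypothetical placeholders, not taken from the actual project.

```shell
# Ingest raw posts into HDFS (plain copy shown here; Flume handled live streams)
hdfs dfs -mkdir -p /data/raw
hdfs dfs -put posts.json /data/raw/

# Process the raw data with a Pig script that scores each post's sentiment
pig -param input=/data/raw -param output=/data/scored sentiment.pig

# Expose the scored output as an external Hive table for tabular queries
hive -e "CREATE EXTERNAL TABLE IF NOT EXISTS sentiments (
           post_id STRING, score DOUBLE)
         ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
         LOCATION '/data/scored';"

# Export the results from HDFS to an RDBMS with Sqoop
sqoop export \
  --connect jdbc:mysql://dbhost/analytics \
  --username etl -P \
  --table sentiments \
  --export-dir /data/scored
```

Using an external Hive table keeps the data in place under /data/scored, so the same files feed both Hive queries and the Sqoop export.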
Project 2:
Project Name: www.ciomatrix.in
Environment: Java, HTML,CSS, AngularJS, Bootstrap, JavaScript
Brief Description:
The purpose of this project is to create a vertical networking platform for CIOs and other
C-level managers throughout the world, providing them with an easy, collaborative, and
secure platform.
Roles and Responsibilities:
• Creating RESTful services for the services provided by the website
• Creating web pages
• Creating functions for the services
• Using MongoDB for storing users' data
Project 3:
Project Name: Asian arthouse
Environment: HTML, CSS, Bootstrap, JavaScript, Php, SQL
Brief Description:
This is a website for buying and selling paintings. An artist uploads a painting under the
appropriate category and sets a price for it. The painting is then verified by the arthouse
and listed on the website for sale. The site offers additional features such as an Auction
house and an Exchange house.
Roles and Responsibilities:
• Gathering requirements from the clients, analysing them, and creating a prototype
• Finalizing the design of the site
• Creating web pages using HTML, CSS, and Bootstrap
• Creating functions for the arthouse
Project 4:
Project Name: SpiceMix
Environment: HTML, CSS, Bootstrap, JavaScript, Php, SQL
Brief Description:
This is a website for buying and selling spices, herbs, and tea. A seller uploads stock under
the appropriate category and sets a price. The site's USP is that spice mixes are prepared
only after an order is placed, which preserves the quality and freshness of the products.
Buyers can also propose their own quantities of spices for a specific mix (the seller controls
how much each quantity can be varied).
Roles and Responsibilities:
• Gathering requirements from the clients, analysing them, and creating a prototype
• Finalizing the design of the site
• Creating web pages using HTML, CSS, and Bootstrap
• Creating functions for SpiceMix
Projects in Packt Publishing:
• Cloudera Administration Handbook
• Programming MapReduce with Scalding
• Hadoop Beginner's Guide
• Learning Magento Theme Development
• Microsoft Exchange Server 2013 High Availability
• iOS 7 Game Development
• Visual Studio 2013 Cookbook
• vSphere Virtual Machine Management
• Scala for Java Developers
• Building Scalable Apps with Redis and Node.js
• Arduino Networking
• OpenGL 4 Shading Language Cookbook
• Getting Started with BizTalk Service
• Mastering Magento Theme Design
• SketchUp 2014 for Architectural Visualization Second Edition
• Mapping and Visualization with SuperCollider
• WebGL Game Development
Roles and Responsibilities:
• Technical editing of the books
• Code testing
• Communicating with authors and other stakeholders to see the books through to
completion
Awards and other Activities
• Interhouse Football Competition – First place, 2005–2006
• NIITAT Merit Holder – Top 25% in India, 2009
• Star Trainee at Packt Publishing – Best Trainee, 2013
• Star Editor of the Week – Best Editor, 2013
• Utopia 2K12 – Fashion CC, 2012
• Best Personality – Final year B.E., 2012
Organizations:
www.mumbaiitpro.org – Volunteer/Offline Assistant, 2006–2009
• Microsoft sponsored event
• Presenter of Microsoft SDLC
• Conducting seminars and workshops on technology
• Providing Hands-on experience to the members
• Managing different presenters for the event
Personal Details:
Date of Birth: 10th October 1990
Address: 102, Avani Park near Bharti Park, Mira Road (E). Thane: 401107
Twitter: @kunalgaikwad
LinkedIn: http://in.linkedin.com/in/kunalgaikwad
Websites: www.divulgetech.wordpress.com, www.kunalgaikwad.blogspot.com