Geetha
************@*****.***
Career objective:
To obtain a full-time position in the Information Technology field that enables me to use my technical skills, academic background, and ability to work with people.
Professional summary:
Good knowledge of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
Experience in installing, configuring, and using Hadoop ecosystem components like Hadoop MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, Zookeeper and Flume.
Good exposure to Apache Hadoop MapReduce programming, Pig scripting, distributed applications, and HDFS.
Good knowledge of Apache Spark and Scala.
Good knowledge of managing and reviewing Hadoop log files.
Hands on experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice-versa.
Good knowledge of Teradata utilities (BTEQ, FastLoad, MultiLoad, FastExport, and TPump) for development.
Hands on experience in developing applications using Core Java and multi-threading.
Detailed understanding of Software Development Life Cycle (SDLC) and sound knowledge of project implementation methodologies including Waterfall and Agile.
Ability to adapt to evolving technology, strong sense of responsibility and accomplishment.
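Illustrative only (not code from any listed project): a minimal Core Java multi-threading sketch of the kind referenced above, splitting a summation across a fixed thread pool; all class and variable names here are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelSum {
    public static void main(String[] args) throws Exception {
        int[] data = new int[1000];
        for (int i = 0; i < data.length; i++) data[i] = i + 1; // values 1..1000

        int threads = 4;
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        int chunk = data.length / threads;
        List<Future<Long>> parts = new ArrayList<>();

        // Each task sums one chunk of the array concurrently.
        for (int t = 0; t < threads; t++) {
            final int start = t * chunk;
            final int end = (t == threads - 1) ? data.length : start + chunk;
            parts.add(pool.submit(() -> {
                long sum = 0;
                for (int i = start; i < end; i++) sum += data[i];
                return sum;
            }));
        }

        long total = 0;
        for (Future<Long> f : parts) total += f.get(); // blocks until each chunk finishes
        pool.shutdown();

        System.out.println(total); // 1..1000 sums to 500500
    }
}
```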
Technical Skills:
Languages:
C, Core Java, SQL, Scala
Big Data Eco System:
HDFS, MapReduce, YARN, Hive, HBase, Ambari, Pig, Storm, Sqoop, ZooKeeper, Oozie, Flume, and Apache Spark.
Operating system:
Windows, Linux and Unix
DBMS / RDBMS:
Oracle, MySQL, Teradata
IDE:
Eclipse IDE
Tools:
PuTTY, Oracle SQL Developer
Certifications:
Certified in Apache Spark and Scala from Edureka.
Educational Qualifications:
Bachelor's in Information Technology, JNTU, Kakinada, India, 2014.
Master's in Computer Science, University of Central Missouri, MO, USA, 2016.
Professional Experience:
InfoLabs Inc. Jan 2016 – Jun 2016
Role: Hadoop Developer
Project: Twitter data analytics using Hadoop
Description: The objective of the project was to ingest Twitter data from the Twitter firehose into a multi-node Hadoop cluster using Apache Flume, organize the data in Hive so that data visualization tools could perform analytics and visualization, and implement visualization capabilities using Microsoft Power View.
Technologies Used: Hadoop data platform, JDK 1.7, Apache Flume, Hive, Microsoft Power View, Twitter developer API.
Roles & Responsibilities:
Involved in Hadoop cluster implementation, Hadoop Software installation and configuration.
Involved in ingesting input data from a live-streaming source (Twitter) into HDFS
Loaded data files into Hive tables
Created a table using the RCFile format
Created Hive managed tables and external tables
Created partitioned tables and loaded data into HDFS
Analyzed complex data with Hive
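Illustrative only: a library-free Core Java sketch of the MapReduce paradigm the cluster work above relied on, counting words across sample tweet lines in a single JVM (the sample strings and class name are hypothetical; the Hadoop shuffle phase is simulated by collecting all pairs in one list).

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MiniMapReduce {
    // Map phase: emit a (word, 1) pair for each word in an input line.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String word : line.toLowerCase().split("\\s+")) {
            if (!word.isEmpty()) pairs.add(Map.entry(word, 1));
        }
        return pairs;
    }

    // Reduce phase: sum the emitted counts for each distinct key.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new HashMap<>();
        for (Map.Entry<String, Integer> p : pairs) {
            counts.merge(p.getKey(), p.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        String[] tweets = {              // stand-in for the Flume-ingested tweet stream
            "hadoop makes big data simple",
            "big data big insights"
        };
        List<Map.Entry<String, Integer>> allPairs = new ArrayList<>();
        for (String t : tweets) allPairs.addAll(map(t)); // "shuffle": gather all pairs

        Map<String, Integer> counts = reduce(allPairs);
        System.out.println(counts.get("big"));  // "big" appears 3 times
        System.out.println(counts.get("data")); // "data" appears 2 times
    }
}
```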
Academic project: Online DVD rental system
Description: An online DVD rental system that allows users to rent or buy particular CDs and DVDs online.
Technologies Used: JSP, CSS, JavaScript, jQuery, JDK 1.7, Spring MVC, Hibernate, MySQL 5.6, and Eclipse IDE.
Roles & Responsibilities:
Analyzed the project requirements and performed a feasibility study
Led a team of three in developing the web application
Involved in analysis, design, coding and testing of the project
Designed use case diagrams, class diagrams, sequence diagrams and object diagrams using UML to model the detail design of the application.
Designed a responsive user interface for the online DVD rental application.
Used Git for code sharing.
Responsible for writing test cases that covered the required functionality.
Project: Winds of Change: From Vendor Lock-In to the Meta Cloud
Description: The meta cloud concept incorporates design-time and runtime components that abstract away the technical incompatibilities of existing cloud offerings, thus mitigating vendor lock-in. It helps users find the right set of cloud services for a particular use case and supports an application's initial deployment and runtime migration.
Technologies Used: HTML, Java, JSP, JavaScript, MySQL, JDBC, Tomcat 5.0/6.x.
Roles & Responsibilities:
Analyzed the project requirements and performed a feasibility study.
Worked with the team of four in developing the application.
Involved in analysis, design, coding and testing of the project.
Designed use case diagrams, class diagrams, sequence diagrams, activity diagrams using UML to model the detail design of the application.
Designed responsive user interface using HTML, JSP and JavaScript.
Performed various testing methods (Unit, Integration and Acceptance) to obtain the best results.