Vishnu
****.*******@*****.***
SUMMARY
* ***** ** ******* ** experience in various industries, with 2 years of hands-on experience in Big Data technologies and 4+ years of extensive experience in Java.
In-depth knowledge of Hadoop architecture and its various components such as HDFS, Resource Manager, Node Manager, Application Master, Name Node, Data Node and MapReduce concepts.
In-depth understanding of the functionality of Hadoop components, the interactions between them, resource utilization, and dynamic tuning to make the cluster efficient.
Hands-on experience working with ecosystem components such as Hive, Pig, Sqoop, MapReduce, Flume, Oozie, Spark, Hue and ZooKeeper.
Strong knowledge of Pig and Hive functions; wrote custom UDFs to extend their functionality and achieve the required transformations.
Expertise in migrating terabytes of data between HDFS and relational database systems using Sqoop, and in troubleshooting problems that arise in the process.
Implemented a POC solution for data streaming with Kafka and Storm.
In-depth understanding of job workflow scheduling and monitoring tools such as Oozie, and of the ZooKeeper coordination service.
Expertise in working with NoSQL databases such as HBase and Cassandra, and in distinguishing which use cases each is ideal for.
Hands-on experience with administrative tasks such as installing Hadoop, commissioning and decommissioning nodes, and installing and configuring ecosystem components such as Flume, Oozie, Hive and Pig.
Experienced in reviewing Hadoop log files to troubleshoot and optimize job execution.
In-depth experience developing and executing shell scripts.
Hands-on experience with Amazon Web Services (EC2), VMware and Oracle VirtualBox.
Expertise in Object-Oriented Analysis and Design (OOAD) and software development using UML methodology; good knowledge of Core Java design patterns.
Expertise as a Java Developer in Web/Intranet, Client/Server technologies using J2SE, J2EE, Servlets, JSP and JDBC.
Hands-on experience working with Java, the Eclipse IDE and JavaBeans.
Extensive experience working with relational databases such as MySQL, SQL Server and MS Access, and writing stored procedures, functions, joins and triggers for different data models.
An excellent leader, team player and self-starter with good communication skills and a proven ability to finish tasks ahead of deadlines.
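As an illustration of the Sqoop-based migration work mentioned above, an incremental import from a relational table into HDFS might look like the following sketch; the connection string, table, column and directory names are hypothetical, not taken from any actual engagement.

```shell
# Hypothetical incremental import: pull only rows added since the last run.
# All identifiers (host, database, table, columns, paths) are illustrative.
sqoop import \
  --connect jdbc:mysql://db-host:3306/sales \
  --username etl_user -P \
  --table transactions \
  --incremental append \
  --check-column txn_id \
  --last-value 1000000 \
  --target-dir /data/raw/transactions
```

A matching `sqoop export` with the direction reversed covers the HDFS-to-RDBMS path described in the summary.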
Core Knowledge & Skills
Hadoop Ecosystem
Hive
Flume
Spark
JSP
HDFS
Pig
Oozie
Kafka
JavaScript
MapReduce
HBase
Cassandra
SQL
Spring
YARN
Sqoop
Storm
Core Java
Tableau
Professional Experience
Client: PNC Bank, Cleveland, OH
Role: Big Data Developer
Duration: Sep 2014 - Present
Responsibilities:
As part of the Data Science team, gathered business requirements and mapped them to various sources of data.
Analyzed large data sets by running Hive queries and Pig scripts.
Created complex Pig and Hive UDFs for specific business calculations.
Created Hive tables and loaded them with data for business to access and query.
Implemented and optimized MapReduce jobs for processing millions of records, data cleaning and pre-processing.
Set up automated jobs that load data from the UNIX file system to HDFS.
Assisted in exporting analyzed data to relational databases using Sqoop, as input to legacy reporting systems.
Worked with HBase for large volume transaction sales data, created daily snapshot of HBase table for downstream analytics.
Built a proof of concept using Kafka and Storm for streaming data from one of the sources.
Built a proof of concept using Spark for data analysis.
Responsible for implementing one of the data-source transformations in Spark.
Environment: CDH5.0, Hadoop, HDFS, Pig, Hive, MapReduce, Sqoop, HBase, Shell, Spark, Storm and Hue.
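The data cleaning and pre-processing described above can be sketched in plain Java; the real jobs ran as MapReduce, and the field layout (id, amount, region) and validation rules here are hypothetical.

```java
// Illustrative record cleanser for CSV input: trims fields, normalizes
// case, and drops malformed rows. Field layout is hypothetical.
public class RecordCleanser {
    // Returns the cleaned record, or null if the row is malformed.
    public static String clean(String line) {
        if (line == null) return null;
        String[] fields = line.split(",", -1);
        if (fields.length != 3) return null;      // wrong number of fields
        String id = fields[0].trim();
        String amount = fields[1].trim();
        String region = fields[2].trim().toUpperCase();
        if (id.isEmpty()) return null;            // missing key
        try {
            Double.parseDouble(amount);           // numeric sanity check
        } catch (NumberFormatException e) {
            return null;
        }
        return id + "," + amount + "," + region;
    }

    public static void main(String[] args) {
        System.out.println(clean(" 42 , 9.99 , west "));  // 42,9.99,WEST
        System.out.println(clean("bad,row"));             // null
    }
}
```

In a MapReduce job this logic would sit in the mapper, with malformed rows counted via a counter rather than silently dropped.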
Client: Play Phone, San Mateo, CA
Role: Hadoop Developer
Duration: Jul 2013 – Aug 2014
Responsibilities:
Assisted the admin team in building the cluster and optimizing its configuration.
Installed and configured Flume, Hive, Pig, Sqoop and Oozie on the Hadoop cluster.
Worked on developing MapReduce programs for data cleansing.
Designed and developed MRUnit tests for unit testing MapReduce jobs.
Implemented an incremental import job using Sqoop to import data from an OLTP RDBMS.
Installed and configured Hive, and wrote Hive UDFs.
Created complex Pig UDFs for specific business calculations.
Hands-on experience loading and transforming large sets of structured, semi-structured and unstructured data.
Involved in defining and designing job flows.
Extracted data using Sqoop, stored it in the Hadoop distributed file system, and processed it.
Implemented a non-conflicting job setup in Oozie.
Involved in loading data into Cassandra using sstableloader and accessing data using CQL and the Java API.
Environment: Hadoop, MapReduce, MRUnit, HDFS, Hive, Pig, Java, SQL, Cassandra, CQL.
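The CQL access mentioned above might look like the following sketch; the keyspace, table and column names are hypothetical, since the actual schema is not described here.

```sql
-- Hypothetical keyspace/table modeling events per user, clustered by time.
CREATE TABLE IF NOT EXISTS sales.user_events (
    user_id   text,
    event_ts  timestamp,
    event     text,
    PRIMARY KEY (user_id, event_ts)
);

-- Fetch a user's ten most recent events.
SELECT event, event_ts
FROM sales.user_events
WHERE user_id = 'u123'
ORDER BY event_ts DESC
LIMIT 10;
```

Choosing `user_id` as the partition key keeps each user's events on one node, which is what makes the single-partition query above efficient.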
Client: Express Scripts, St. Louis, MO
Role: Sr. Java Developer
Duration: Sep 2012 - Jun 2013
Responsibilities:
Created design documents and reviewed them with the team, in addition to assisting the business analyst/project manager with explanations to the line of business.
Worked as part of the Agile Application Architecture (A3) development team responsible for setting up the architectural components for different layers of the application.
Responsible for understanding the scope of the project and requirement gathering.
Involved in analysis, design, construction, and testing of the application.
Developed the web tier using JSP to show account details and summary.
Used web service clients to make calls for data.
Generated Client classes using WSDL2Java and used the generated Java API.
Designed and developed the UI using JSP, HTML, CSS and JavaScript.
Utilized JPA for Object/Relational Mapping purposes for transparent persistence onto the SQL Server database.
Used Tomcat web server for development purpose.
Involved in creation of Test Cases for JUnit Testing.
Used Oracle as the database and Toad for query execution; wrote SQL scripts and SQL code for procedures and functions.
Used CVS for version controlling.
Developed the application using Eclipse and used Maven as the build and deployment tool.
Environment: Java, J2EE [Servlet, JSP], Web Services, Tomcat, JavaScript, CVS, Maven, Eclipse and SQL Server.
Client: ITC INFOTECH, Bangalore, India
Role: Java Developer
Duration: Sep 2010 – Jun 2012
Responsibilities:
Involved in requirements gathering and creation of the technical specification document.
Developed the application using Struts MVC and Hibernate as the ORM.
Designed and developed Hibernate mapping files.
Involved in Database design and developing SQL Queries, stored procedures on SQL Server.
Designed and developed user interfaces using JSP, JavaScript and HTML.
Used CVS for maintaining the Source Code.
Created Stored Procedures to manipulate the database and to apply the business logic according to the user’s specifications.
Developed generic classes encapsulating frequently used functionality so it could be reused.
Developed an exception-management mechanism using an exception-handling framework to handle exceptions.
Designed and developed JUnit test classes.
Implemented the project according to the Software Development Life Cycle (SDLC).
Environment: Java, JavaScript, HTML, shell scripting, Struts, Hibernate and SQL Server.
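A minimal sketch of the kind of reusable generic class mentioned above; the class and method names are hypothetical, and it is written with modern Java for brevity.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Illustrative reusable helper: generic filtering shared across modules.
public class CollectionUtils {
    // Returns a new list containing only the elements matching the predicate.
    public static <T> List<T> filter(List<T> items, Predicate<T> keep) {
        List<T> out = new ArrayList<>();
        for (T item : items) {
            if (keep.test(item)) {
                out.add(item);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<Integer> nums = List.of(1, 2, 3, 4, 5);
        System.out.println(filter(nums, n -> n % 2 == 0)); // [2, 4]
    }
}
```

Because the method is generic, the same helper serves any element type, which is the point of factoring shared functionality into one class.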
Client: InfoTech Inc., Hyderabad, India
Role: Software Engineer
Duration: Oct 2009 - Sep 2010
Responsibilities:
Utilized the base UML methodologies and Use cases modeled by architects to develop the front-end interface. The class, sequence and state diagrams were developed using Rational Rose and Microsoft Visio.
Designed application using MVC design pattern.
Developed front-end user interface modules using HTML, XML, Java AWT and Swing.
Front-end validation of user requests was carried out using JavaScript.
Designed and developed the interacting JSPs and Servlets for modules like User Authentication and Summary Display.
Designed and developed Entity/Session EJB components for the primary modules.
JavaMail was used to notify the user of the status and completion of the request.
Developed Stored Procedures on Oracle 8i.
Implemented Queries using SQL (database triggers and functions).
JDBC was used to interface the web-tier components on the J2EE server with the relational database.
Environment: Java, JavaScript, HTML, XML, Microsoft Visio, JSP, Servlets, JDBC and Oracle.