
Software Engineer Data

Location:
Santa Clara, CA
Posted:
February 22, 2020


Resume:

Shreyas Sanjiv Kothavade

adbw3m@r.postjobfree.com | www.linkedin.com/in/skothavade | 408-821-4207 | www.shreyaskothavade.com | 1559 Lewis St, Santa Clara, CA 95050

EDUCATION

Master’s in Computer Science Santa Clara, CA

Santa Clara University 09/18/2017 – 03/22/2019

Relevant Courses: Algorithms, Pattern Recognition and Data Mining, Object Oriented Analysis and Design, Network Technology

Bachelor’s in Computer Engineering Pune, India

University of Pune 08/07/2012 – 06/01/2016

SKILLS

Languages: Java, C, C++, Python, R, SQL
Databases: MongoDB
Scripting: Unix/Shell/Bash
Web Technologies: HTML, CSS, JavaScript

Big Data Ecosystems: Hadoop, HDFS, Apache Hive, Apache Spark, MapReduce, Talend Big Data Integration, Pentaho

EXPERIENCE

Company: Hewlett Packard Enterprise Milpitas, CA

System/Software Engineer 2 (Java, REST API) 04/15/2019 – Present

• Work in an Agile environment to develop new backend features and fix bugs in Java for the HPE OneView web application.

• Triage and debug issues quickly to roll out new software patches for the web application.

• Awards and Achievements: Second prize in the HPE hackathon; Certified SAFe Practitioner.

Company: Hewlett Packard Enterprise Palo Alto, CA

System/Software Intern (Python, MongoDB, Flask, REST API) 06/18/2018 – 03/22/2019

• Performed data mining on regression test logs to segregate them and store them in a MongoDB database

• Designed an interactive log analysis web application, reducing the time and manual effort required to analyze test logs by 80%

Company: PRGX India Pvt. Ltd. Pune, India

Developer (Talend Big Data Integration, Hadoop, Java, UNIX, MS SQL Server Studio) 07/25/2016 – 07/31/2017

• Designed and implemented various Talend ETL models to load CSV, EBCDIC, and positional data into HDFS

• Conducted professional training on the Talend ETL tool for newly joined employees

• Reduced data-load failures caused by server overload by 90% by designing a Disk Usage Automation System with the Talend ETL tool

• Achievements: Winner of the PRGX India Hackathon

Company: PRGX India Pvt. Ltd. Pune, India

Project Intern (Java, Hadoop, Hive, HDFS) 06/15/2015 - 06/01/2016

• Developed a SerDe (Serializer-Deserializer) to eliminate the conversion of EBCDIC data to ASCII

• Reduced the time required to store EBCDIC data in HDFS by 50% by loading the data directly
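The conversion step this SerDe eliminates can be sketched with Python's built-in EBCDIC codecs; decoding happens lazily at query time instead of rewriting every record before it reaches HDFS. The specific code page (cp037, common US EBCDIC) is an assumption here, since mainframe exports vary:

```python
# Illustration only: what an EBCDIC-to-ASCII conversion looks like.
# A custom Hive SerDe can instead deserialize EBCDIC records at read time,
# so raw bytes are loaded into HDFS unchanged.
record = b"\xc8\xc5\xd3\xd3\xd6"   # the bytes for "HELLO" in EBCDIC code page 037
print(record.decode("cp037"))      # -> HELLO
```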

• Award: Best Project Award from Vishwakarma University in the Big Data/Hadoop domain

• Journal Publication: ‘Avoid Data Conversion in Hadoop Using Custom SerDe’ in IJAET (ISSN: 2231-1963)

PROJECTS

File Security in Cloud Using Division and Encryption (Python, OpenStack)

• Introduced the concept of jigsaw division to split the file into fragments using its binary encoding

• Mapped the fragments onto an undirected graph and applied a vertex coloring algorithm to distribute them across the cloud

• Encrypted the mapped fragments to increase security, and replicated all fragments to handle system failures

Course Project - Yelp Dataset Challenge (SQL, Java, Java Swing, Eclipse, Oracle 11g)

• The objective was to query the academic dataset in an innovative way to extract useful information for local businesses

Course Project - Performance Evaluation of Hadoop MapReduce and Spark Engine using HiveQL (AWS, Hive, Spark, Hadoop)

• Created a 4-node Hadoop cluster on AWS EC2 to analyze the performance of different Hive queries
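A minimal harness for this kind of engine comparison might look like the following sketch. `run_query` is a hypothetical caller-supplied function (it could submit statements via beeline or PyHive); `hive.execution.engine` is the standard Hive setting that switches execution between MapReduce (`mr`) and Spark (`spark`):

```python
import time

def compare_engines(run_query, queries, engines=("mr", "spark")):
    """Time each HiveQL query under each execution engine.

    run_query(*statements) is assumed to execute its statements against
    the cluster and block until the last one finishes.
    """
    timings = {}
    for engine in engines:
        for q in queries:
            start = time.perf_counter()
            # hive.execution.engine selects the engine for this session
            run_query(f"SET hive.execution.engine={engine}", q)
            timings[(engine, q)] = time.perf_counter() - start
    return timings
```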


