
Software Developer Data

Location:
Atlanta, GA
Posted:
May 30, 2020


Resume:

SriHarsha Devisetty

Phone: 210-***-****

Email: addig6@r.postjobfree.com

Certifications:

Microsoft Certified: Azure Data Engineer Associate (DP-200 & DP-201)

Professional Summary:

Resourceful and competent Software Engineer with 4 years of experience in the IT industry.

Experienced in Big Data/Hadoop. Steered several new end-to-end implementations and configured and customized components in existing functionality.

In-depth experience with Spark, Scala, Hive, Python, and Pig, as well as traditional technologies including Core Java, C#, HTML, CSS, JavaScript, and jQuery.

Extensive experience building Spark applications; strong coding skills.

Good working knowledge of cloud integration with Amazon Elastic Compute Cloud (EC2) and Amazon Simple Storage Service (S3).

Experienced with Azure data storage and processing services such as Data Factory, Azure Stream Analytics (ASA), Event Hubs, and Azure Data Lake Storage Gen2.

Working knowledge of Salesforce and comprehensive knowledge of Apex coding techniques; proficient in integrations, deployments, data migrations, and continuous integration.

Strong communication, interpersonal, and problem-solving skills.

Thrives in deadline-driven environments and adapts quickly. Adept at understanding business processes.

Client-facing role using Agile methodology; prompt sprint deployments.

Technical Skills:

Hadoop Ecosystems: HDFS, MapReduce, Hive, Sqoop, Flume, YARN, Oozie, Zookeeper, Presto, Spark, Spark SQL, Spark Streaming, NiFi, AWS, ELK, Azure Stream Analytics, Azure Data Factory, Azure Databricks

Languages: C, C++, Java, Scala, Swift, SQL, PL/SQL

Frameworks: J2EE, Spring

Web Technologies: HTML, CSS, JavaScript, jQuery, Ajax, XML, WSDL, SOAP, REST API

NoSQL: HBase, MongoDB, Azure Cosmos DB

Security: Kerberos, OAuth

Relational Databases: Oracle 11g, MySQL, SQL Server

Development Tools: Eclipse, NetBeans, Visual Studio, IntelliJ IDEA, Xcode

Build Tools: ANT, Maven, sbt, Jenkins

Application Servers: Tomcat 6.0, WebSphere 7.0

Version Control: GitHub, Bitbucket, SVN

Miscellaneous: Salesforce, Slack, Box

Education:

Master of Science in Applied Computer Science

Northwest Missouri State University April 2017

Professional Experience

AT&T, Hadoop Developer, Consultant, Jan 2018 – Present

Responsibilities:

Developing enterprise applications using multithreading, relational databases and frameworks.

Re-architect existing high priority applications to avoid data loss and to improve query efficiency.

Developed a custom framework using Spark to perform a strict schema check during ingestion.
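As an illustration of the fail-fast schema check described above: the actual framework used Spark, but the core idea can be sketched in plain Python. The field names and types here are hypothetical, not the project's real schema.

```python
# Hypothetical strict schema for incoming records; the real schema
# and field names are not part of this resume.
EXPECTED_SCHEMA = {"id": int, "name": str, "amount": float}

def validate_record(record: dict, schema: dict = EXPECTED_SCHEMA) -> dict:
    """Reject a record unless it matches the schema exactly (fail-fast)."""
    if set(record) != set(schema):
        raise ValueError(f"unexpected fields: {set(record) ^ set(schema)}")
    for field, expected_type in schema.items():
        if not isinstance(record[field], expected_type):
            raise TypeError(f"{field}: expected {expected_type.__name__}")
    return record
```

In Spark itself the analogous move is reading with an explicit schema and a fail-fast parse mode rather than letting types be inferred.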

Worked with tools such as Message router/Kafka for collecting real time feeds.

Implemented framework to unpack contents/elements from the NoSQL databases to Hadoop.

Experience using Apache NiFi to build custom processors.

Fine-tune complex Hive queries and Pig scripts with the goal of providing a faster, more efficient platform for business users.

Work on solutions to process terabytes of data using advanced Hadoop technologies such as Kafka, Logstash, Elasticsearch, and Spark.

Working on implementing a framework with NiFi that can be reused for all application ingestion into the data lake.

Architect Big Data Social Intelligence Platform.

Good understanding of machine learning algorithms such as regression, decision trees, and clustering.

Involved in designing and developing HBase tables and mapping hashed and salted data, EFPE-encrypted, from Hive tables.
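The hashing and salting of row keys mentioned above can be sketched as follows. This is a generic illustration of salted HBase row keys, which spread sequential keys across regions to avoid hotspotting; the bucket count and key format are assumptions, not the project's actual scheme.

```python
import hashlib

# Illustrative bucket count; real deployments size this to the region count.
NUM_BUCKETS = 16

def salted_row_key(natural_key: str, buckets: int = NUM_BUCKETS) -> str:
    """Prefix the key with a stable hash-derived salt so that otherwise
    sequential keys distribute across HBase regions."""
    digest = hashlib.md5(natural_key.encode("utf-8")).hexdigest()
    salt = int(digest, 16) % buckets  # deterministic: same key, same salt
    return f"{salt:02d}|{natural_key}"
```

Because the salt is derived from the key itself, point lookups can recompute it, while scans fan out over the fixed set of salt prefixes.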

Worked on migrating existing scripts/jobs to be compatible with Hadoop 3.0.

Drive POC initiatives to assess the feasibility of different traditional and Big Data tools.

Implement solutions to meet client requirements, serving data within agreed-upon SLAs.

Working experience migrating an on-prem data lake to Azure Data Lake Storage Gen2 and using Azure services.

Environment: Data Lake, HWX, AWS EC2, S3, Spark, Hive, Scala, Eclipse, YARN, TWS, Java, Kerberos, HDF, Kafka, Sqoop, AWS, Spark Streaming, NiFi, ELK, Redshift, Azure Data Lake

Anthem Inc, Hadoop Developer, Consultant, June 2017 – Dec 2017

Responsibilities:

Experienced in transferring data from different data sources into HDFS systems using Kafka producers, consumers, and Kafka brokers.

Involved in performance tuning of Spark jobs using caching and taking full advantage of the cluster environment.

Involved in the development of RESTful web services using JAX-RS in a Spring-based project.

Designed and developed external and managed Hive tables with data formats such as Avro, ORC, and Parquet.

Environment: Spark, Hive, Sqoop, Scala, Eclipse, YARN, Oozie, Java, Kerberos, CDH, Kafka, Web services

Exadata Inc, Software Developer, Intern, June 2016 – Dec 2016

Responsibilities:

Involved in collecting and aggregating large amounts of log data using Apache Flume and staging data in HDFS for further analysis.
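For context, a minimal Flume agent configuration of the kind used for such log collection into HDFS might look like the following sketch. All agent, source, sink, and path names here are illustrative, not taken from the project.

```properties
# Illustrative Flume agent: tail a log file into HDFS via a memory channel.
agent1.sources = logsrc
agent1.channels = memch
agent1.sinks = hdfssink

agent1.sources.logsrc.type = exec
agent1.sources.logsrc.command = tail -F /var/log/app/app.log
agent1.sources.logsrc.channels = memch

agent1.channels.memch.type = memory
agent1.channels.memch.capacity = 10000

agent1.sinks.hdfssink.type = hdfs
agent1.sinks.hdfssink.hdfs.path = /data/staging/logs/%Y-%m-%d
agent1.sinks.hdfssink.channel = memch
```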

Created files and tuned SQL queries in Hive using Hue.

Involved in Agile methodologies, daily Scrum meetings, Sprint planning.

Worked with a team analyzing system failures, identifying root causes, and taking necessary action.

Environment: HDFS, Hive, Hue, Flume, Sqoop

HSBC, Software Developer, Intern, Jan 2015 – Dec 2015

Responsibilities:

Responsible for creating Java Servlet pages, Spring Controllers, DAO, Service classes

Experienced in retrieval and manipulation of data from the Oracle database by writing queries using SQL and PL/SQL.

Identified issues in the production system and provided the findings to the application support team.

Developed complex MapReduce jobs in Java to perform data extraction, aggregation, and transformation, and performed rule checks on multiple file formats such as XML, JSON, and CSV.

Involved in bug triage meetings with QA and UAT teams.

Environment: Java 1.6, Spring MVC, Spring IoC, JUnit, JavaScript, jQuery, Oracle, Maven, MapReduce

Special Projects:

Student Evaluation Jan 2017 – April 2017

Responsibilities:

Collected the requirements from the client and end users

Collaborated with a team of seven by following agile methodology

Implemented core functionality such as students posting their responses, which are reflected back to faculty.

Worked with a team analyzing system failures, identifying root causes, and taking necessary action.

Environment: NetBeans, Bootstrap, JavaScript, PHP, MySQL, GoDaddy

DealDash – iOS App, Feb 2016 – March 2016

Responsibilities:

Responsible for collecting the requirements from end users and translating into functional tasks

Designed the user interfaces using UIKit framework

Used MapKit to find location of the store

Maintained and retrieved the records from database using back4App

Responsible for documenting the user manual, test cases, readme, and client acceptance testing.

GitHub Link: https://github.com/devisetty123/Mixed-bag-of-nuts--DealDash

Environment: Xcode, Swift 2.1, back4App


