AWS Data Engineer

Location:
Milton, GA, 30004
Salary:
110
Posted:
January 28, 2022

Kiran Atmakuri

Email: adp17z@r.postjobfree.com

Ph: 469-***-****

Summary

17+ years of experience in Java/J2EE technologies and Big Data/Hadoop development, including analysis, design and development. Hands-on experience in designing and developing analytics solutions involving AWS data lakes and large-scale enterprise data warehouses (DWH).

Specific Expertise:

7+ years of experience with Hadoop technologies and good knowledge of the Hadoop Distributed File System and ecosystem components such as MapReduce, Pig, Hive, HBase, Flume, Sqoop, Kafka, Spark, Kinesis and Oozie.

Experienced in using AWS cloud services such as S3, EC2, EMR, Glue, Athena, AppFlow and Lambda.

Strong experience in implementing data warehouse solutions in Amazon Redshift.

Hands-on experience with Cloudera and Hortonworks Hadoop environments.

Perform architecture design and implementation of cost-effective and dynamically scalable software solutions using the AWS/EMR/Hadoop ecosystem to suit requirements.

Good experience working with different Hadoop file formats like SequenceFile, RCFile, ORC, AVRO and Parquet.

Involved in the design, development, tuning, deployment and maintenance of NoSQL databases such as HBase and MongoDB.

Extensive expertise in creating and automating workflows using the Oozie workflow engine.

Experience in all phases of SDLC including Analysis, Design, Implementation, Integration, Testing and Maintenance of applications using technologies like Java 1.6, Flex, Spring, Struts, EJB 2.0, AJAX, Servlets, JSP, JSTL, Java Beans, JDBC, HTML, DHTML, CSS, UML, XML, XSL, XSLT, SAX, DOM, JavaScript & DOJO.

Proven ability to design and implement quality applications that meet or exceed customer expectations.

Strong working experience in distributed object-oriented component analysis and design following industry-leading J2EE frameworks, on WebLogic, WebSphere, JBoss, Tomcat and Apache servers.

Experienced in using waterfall, Agile and Scrum models of software development.

Identify and analyze user requirements and recommend appropriate applications or modifications.

Experience on system performance monitoring and tuning.

Excellent analytical, problem-solving, communication and interpersonal skills, with the ability to interact with individuals at all levels and to work independently.

Led business requirements workshops for operational and analytical reporting and data feed needs for multiple business teams.

Strong hands-on experience in creating and modifying SQL stored procedures, functions, views, indexes, and triggers.

Technical skills

BigData Ecosystem

Hadoop MapReduce, Hive, Pig, HBase, Sqoop, Flume, Oozie, Zookeeper, Spark

Languages/APIs

J2EE (EJB, JMS, RMI, JDBC, JSP, JSTL, Servlets), XML, JDK, PL/SQL

AWS

EC2, EMR, RDS, Redshift, Data Pipeline, Kinesis, DynamoDB, S3, CloudWatch, IAM

RDBMS

Oracle, SQL Server, DB2, Postgres

Web Technologies

DOJO, Flex, XML, JQuery, XSL, XSLT, HTML, DHTML, CSS, JavaScript and VB Script

Frameworks

Apache Struts, Hibernate, Spring, iBatis, Drools

Web & App servers

WebSphere, WebLogic, JBoss, Apache & Tomcat

Source Control

GitHub, Visual SourceSafe, TFS, CVS, SVN and PVCS

Tools

JBuilder, Apache Ant, Eclipse, NetBeans, Quality Center, Rational ClearCase, Hue, Ambari, Cloudera Manager, Jenkins, Nexus

Testing Tools

JUnit, TestDirector, WinRunner, QTP

OS

Unix, Windows, Sun Solaris, CentOS, Linux

Project Name: TotalSource Data Platform Client: ADP Inc, GA

Position: Lead Hadoop Developer Duration: Sep’16 – Present

ADP TotalSource provides small and medium-sized businesses with outsourcing solutions that reduce the complexities and costs related to employment and human resources. The company’s integrated solution incorporates employee benefits management, human resources administration, regulatory and compliance management, safety and risk management, payroll administration and retirement plan services. To achieve all of this, it needs to integrate with multiple systems.

The TotalSource data platform project serves as a consolidation and transformation engine for the data from these systems. This is done by building a data lake and a data warehouse using cloud technologies.

Roles and Responsibilities:

Built a data lake, using S3 as the data substrate and EMR Hadoop components, to store and analyze massive volumes of heterogeneous data.

Responsible for building a framework to ingest, transform and dispatch data to and from the data lake.

Ingested data in incremental and full loads from different data sources (Oracle, Salesforce, flat files, MySQL) into the data lake using Sqoop and Hadoop connectors.

Built a platform for real-time ingestion of streaming data using Kafka, MongoDB and Spark Streaming (a streaming ingestion sketch follows this list).

Extensively used Pig and Hive for data cleansing, transformation and aggregation, and dispatched the data to Redshift.

Developed a Java utility that uses the Redshift COPY command to dispatch the transformed data to the data warehouse (Redshift); a COPY utility sketch follows this list.

Extensively used AWS Glue to organize, cleanse and format data for storage in the Redshift data warehouse.

Used AWS AppFlow to ingest data from the Salesforce application into S3 storage.

Used the Oozie workflow engine to manage interdependent Hadoop jobs and to automate several types of Hadoop jobs such as Java MapReduce, Hive, Pig and Sqoop, as well as system-specific jobs.

Integrated S3 with Athena (a serverless query service) to enable the analytics team to access and query the data interactively (an Athena sketch follows this list).

Involved in directing all application logs into Splunk and using Splunk for debugging and monitoring across all environments.

Lead designer for various generic components such as SCD (slowly changing dimensions) and data validation.

Supported DevOps team during deployment, data loads and troubleshooting as well as during regular production fix activities.

Ensured that the quality of the entire data platform meets the needs of the business.

Worked on QA support activities, test data creation and Unit testing activities.
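
For illustration, below is a minimal Java sketch of the kind of real-time ingestion described above (Kafka into the S3 data lake via Spark Structured Streaming). The broker address, topic name and S3 paths are hypothetical placeholders, and the MongoDB sink is omitted.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class EmployeeEventStream {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder().appName("EmployeeEventStream").getOrCreate();

        // Read the hypothetical "employee-events" topic and keep key/value as strings.
        Dataset<Row> events = spark.readStream()
            .format("kafka")
            .option("kafka.bootstrap.servers", "broker1:9092")
            .option("subscribe", "employee-events")
            .option("startingOffsets", "latest")
            .load()
            .selectExpr("CAST(key AS STRING) AS event_key", "CAST(value AS STRING) AS payload");

        // Land the raw stream in the S3 data lake; a MongoDB sink could be added via foreachBatch.
        StreamingQuery query = events.writeStream()
            .format("parquet")
            .option("path", "s3://example-datalake/raw/employee-events/")
            .option("checkpointLocation", "s3://example-datalake/checkpoints/employee-events/")
            .start();
        query.awaitTermination();
    }
}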
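
A minimal sketch of a Java utility issuing the Redshift COPY command over JDBC, as referenced above. The cluster endpoint, schema, table, S3 prefix and IAM role are placeholders, and the Amazon Redshift JDBC driver is assumed to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.util.Properties;

public class RedshiftCopyUtility {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("user", "etl_user");
        props.setProperty("password", System.getenv("REDSHIFT_PASSWORD"));

        // Hypothetical cluster endpoint and warehouse database.
        String url = "jdbc:redshift://example-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/warehouse";

        try (Connection conn = DriverManager.getConnection(url, props);
             Statement stmt = conn.createStatement()) {
            // Load curated Parquet files from the data lake into a staging table.
            stmt.execute("COPY staging.employee_events "
                + "FROM 's3://example-datalake/curated/employee_events/' "
                + "IAM_ROLE 'arn:aws:iam::111111111111:role/RedshiftCopyRole' "
                + "FORMAT AS PARQUET");
        }
    }
}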
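
A sketch of starting an interactive Athena query from Java with the AWS SDK v2, in the spirit of the S3/Athena integration above. The database, table and result-bucket names are hypothetical; in practice the analytics team would typically run such queries from the Athena console or a BI tool.

import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.athena.AthenaClient;
import software.amazon.awssdk.services.athena.model.QueryExecutionContext;
import software.amazon.awssdk.services.athena.model.ResultConfiguration;
import software.amazon.awssdk.services.athena.model.StartQueryExecutionRequest;
import software.amazon.awssdk.services.athena.model.StartQueryExecutionResponse;

public class AthenaAdHocQuery {
    public static void main(String[] args) {
        try (AthenaClient athena = AthenaClient.builder().region(Region.US_EAST_1).build()) {
            StartQueryExecutionRequest request = StartQueryExecutionRequest.builder()
                .queryString("SELECT status, COUNT(*) FROM employee_events GROUP BY status")
                .queryExecutionContext(QueryExecutionContext.builder().database("datalake_curated").build())
                .resultConfiguration(ResultConfiguration.builder()
                    .outputLocation("s3://example-athena-results/adhoc/")
                    .build())
                .build();
            StartQueryExecutionResponse response = athena.startQueryExecution(request);
            System.out.println("Started query execution: " + response.queryExecutionId());
        }
    }
}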

Technical Environment:

AWS EMR, Pig, Hive, Sqoop, MongoDB, Redshift, Kafka, Spark, S3, RDS (MySQL), Oozie, Salesforce, Kinesis, Alteryx, GitHub, Splunk, Athena, Rally, Jenkins, Java, Eclipse IDE, SQL*Plus, Hue, Linux, Oracle, Spring Reactive.

Project Name: IDW (Integrated Data Warehouse) Client: T-Mobile, WA

Position: Lead Hadoop Developer Duration: Aug'15 – Sep'16

The purpose of the IDW (Integrated Data Warehouse) project is to modernize T-Mobile’s Business Intelligence solutions. The IDW on a Hadoop Data Lake is a scalable BI platform that can adapt to the speed of the business by providing relevant, accessible, timely, connected, and accurate data. This new data warehouse is an element of the larger BI ecosystem that spans reporting and analytical technologies. The intent of this project is to address specific pain points in the current data warehouse, e.g., latency, integration, cost, security and multiple versions of the truth. The IDW project is complementary to the technical solutions regarding Big Data and discovery analytics.

Roles and Responsibilities:

Involved in ingesting data into IDW staging directly from BEAM (a built-in component for ingesting real-time data into Hadoop), which uses Apache Storm to push data into HDFS.

Used the Oozie workflow engine to manage interdependent Hadoop jobs and to automate several types of Hadoop jobs such as Java MapReduce, Hive, Pig and Sqoop, as well as system-specific jobs.

Part of the design team for various generic components such as SCD and data validation.

Implemented data profiling rules based on the S2TM using the Drools rules framework.

Developed solutions for several data ingestion channels and patterns, and was also involved in resolving production issues.

Mentored application teams on the development of Hive UDFs, MapReduce programs and Oozie workflows (a Hive UDF sketch follows this list).

Extensively worked on creating end-to-end data pipeline orchestration using Oozie.

Worked on QA support activities, test data creation and Unit testing activities.

Used HBase in conjunction with Hive/Pig as per requirements.

Worked on Pig joins and join optimization, processing incremental data using Hadoop.

Used Sqoop to efficiently export data from Hadoop to the Teradata database.

Designed and developed custom scripts to automate routine tasks around Hadoop ecosystem.

Involved in developing a customized in-house tool, the Data Movement Framework (DMF), for ingesting data from external and internal sources into Hadoop using Sqoop and shell scripts.

Used Control-M workload automation tool as a scheduler to execute all the jobs sequentially along with their dependencies.

Worked in an Agile development approach and managed the Hadoop teams across various sprints.

Participated in defect triage meetings and resolved open issues in HP Application Lifecycle Management (ALM) Quality Center.
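
As an illustration of the Hive UDF development mentioned above, here is a minimal Java UDF sketch; the class name and masking logic are hypothetical examples rather than code from the project. It would be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before use.

package com.example.hive;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Masks an SSN-like string, keeping only the last four characters visible.
public class MaskSsn extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;
        }
        String value = input.toString();
        if (value.length() < 4) {
            return new Text("***");
        }
        return new Text("***-**-" + value.substring(value.length() - 4));
    }
}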

Technical Environment:

DevOps, Rally, GitHub, Jenkins, Nexus, Control-M, Hadoop MapReduce, Pig, HBase, Hive, Sqoop, Linux, Java 1.7.x, Eclipse IDE, Hue, Oozie, Ambari, Teradata, Solr, Drools, MySQL, Quality Center.

Project Name: CIP (Consumer Insights Platform) Client: AT&T, GA

Position: Senior Hadoop Developer Duration: July'14 – Aug'15

The Consumer Insights Platform (CIP) uses large AT&T data sets (both internal and external), big data analytics, and web services to deliver strategic insights for AT&T enterprise customers. The platform provides the location and audience patterns that help businesses rethink marketing decision-making: specifically, learning who, what, when, and where, relative to locations of interest, and evaluating customer segments to learn about the locations they visit.

Roles and Responsibilities:

Data Ingestion into HDFS using tools like Sqoop and HDFS client APIs.

Data Transformation using tools like Pig and MapReduce; also developed UDFs in Java for Pig (a Pig UDF sketch follows this list).

Data Analysis using Hive and working on them using HiveQL.

Created Hive external tables, added partitions and worked to improve Hive performance (a Hive DDL sketch follows this list).

Automated all the jobs, starting from exporting the data from the different data sources, using Oozie.

Built the batch jobs and validated the data for the given business requirements.

Involved in HBase data modeling and row key design (an HBase sketch follows this list).

Used HBase as the NoSQL database for the web application, developed using AngularJS, to visualize the statistics.

Prepared complex Hive queries, was involved in data loading, and produced various ad-hoc reports.

Deep understanding of schedulers, workload management, availability, scalability and distributed data platforms

Involved in various performance tuning activities.

Prepare test plans with different inputs.
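
A minimal sketch of a Pig EvalFunc UDF in Java, illustrating the kind of UDF development noted above; the normalization logic and names are hypothetical. In a Pig script it would be registered with REGISTER and aliased with DEFINE before being applied in a FOREACH ... GENERATE.

package com.example.pig;

import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

// Trims a free-form ZIP code field down to its five-digit prefix.
public class NormalizeZip extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null;
        }
        String raw = input.get(0).toString().trim();
        return raw.length() >= 5 ? raw.substring(0, 5) : null;
    }
}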
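
A sketch of creating a partitioned Hive external table and adding a partition over JDBC from Java, reflecting the Hive table work above. The HiveServer2 host, database, table and HDFS locations are assumptions.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class HivePartitionSetup {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://hive-host:10000/analytics", "etl_user", "");
             Statement stmt = conn.createStatement()) {
            // External table keeps the data in place on HDFS; partitioning by load date
            // lets queries prune to the dates they actually need.
            stmt.execute("CREATE EXTERNAL TABLE IF NOT EXISTS visits ("
                + "visitor_id STRING, page STRING) "
                + "PARTITIONED BY (load_date STRING) "
                + "STORED AS ORC LOCATION '/data/analytics/visits'");
            stmt.execute("ALTER TABLE visits ADD IF NOT EXISTS "
                + "PARTITION (load_date='2015-06-01') "
                + "LOCATION '/data/analytics/visits/load_date=2015-06-01'");
        }
    }
}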
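
A sketch of an HBase write using a composite row key (salt prefix + entity id + reversed timestamp), illustrating the row key design considerations above; the table, column family and values are hypothetical. The salt spreads writes across regions, and the reversed timestamp keeps the most recent rows first in a scan.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class VisitStatsWriter {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("web_stats"))) {
            String visitorId = "visitor-123";
            long reversedTs = Long.MAX_VALUE - System.currentTimeMillis();
            // Composite key: two-digit salt | visitor id | reversed timestamp.
            String rowKey = String.format("%02d|%s|%019d",
                Math.abs(visitorId.hashCode()) % 10, visitorId, reversedTs);

            Put put = new Put(Bytes.toBytes(rowKey));
            put.addColumn(Bytes.toBytes("m"), Bytes.toBytes("page_views"), Bytes.toBytes(42L));
            table.put(put);
        }
    }
}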

Technical Environment:

Hadoop MapReduce, Pig, HBase, Hive, Sqoop, Linux, Flume, Java 1.7.x, Eclipse IDE, Hue, Oozie, CentOS, Ambari, Oracle, SVN, JIRA, Quality Center.

Project Name: PDC Retail Credit Client: AT&T, GA

Position: Lead Java Designer Duration: Nov'11 – Jun'14

PDC Suite provides a retail POS application used by indirect partners for activation and customer maintenance of AT&T products. PDC supports wireless, U-verse IPTV, High Speed Internet Access, VoIP, and home phone products and services for consumers and small businesses, as well as a back-office application used by Company Owned Retail employees and indirect partners to support the capture and retrieval of customer provisioning information for offline sales and manual orders. PDC interfaces with Telegence and is used in markets that were formerly Southwestern Bell Wireless markets.

Roles and Responsibilities:

Define functional and technical requirements relating to the Direct TV and U-verse modules of the PDC project.

Design Use Case Diagrams, Class Diagrams, Activity Diagrams and Sequence Diagrams for Direct TV Module using UML and Rational Unified Process (RUP).

Involve in finalizing the project scope discussion for the application.

Develop enhancements and fix bugs in the project, which is built using Java, Struts, Spring, EJB, and web services.

Involve in code reviews and mentoring the team members.

Assist QA and end-users with verifying application functionality and that application functionality adheres to requirements.

Ensure that coding standards are maintained throughout the development process by all developers.

Perform Unit, Integration and System testing.

Technical Environment:

Java, J2EE, Struts 2, Spring, EJB 2, JDBC, Oracle 11g, jQuery, JavaScript, WebLogic 10.3, SVN, Ant

Project Name: Order Track Client: AT&T, GA

Position: Sr Programmer Analyst Duration: Mar'11 – Nov'11

AT&T needs an effective tool for tracking returns of customer devices and the associated refunds. This solution will drive down cost by providing AT&T customers and customer support professionals with enough information about return and refund processing statuses. Proactive notifications to the customer and web-based information availability to AT&T support professionals will further enhance the customer experience.

The purpose of this project is to enhance the Ordertrack DF application to store, manage and display return and refund related information. This project allows customers to receive email notifications about the status of their returns and associated refunds.

Roles and Responsibilities:

Developed JSP pages for the view layer of the application using Struts action tags, CSS, Dojo, JSTL and JavaScript.

Coded the request processing layer (controllers) using Java and Struts 2.

Involved in discussions for improving the performance of the functionalities.

Developed SQL queries, stored procedures and triggers based on the requirements and fixed the bugs.

Integrated the Bing Maps AJAX control in the JSP to display scan event locations based on geo-coordinates.

Created the necessary JUnit test cases to perform unit testing of classes.

Developed enhancements and fixed bugs in various projects developed using Java, Struts, EJB and iBatis.

Used iBatis for database access and table mapping.

Used Ant to build and package the application.

Developed Velocity templates for generation of email notifications based on specified configuration in the application.
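
As a sketch of the Velocity-based notification approach above, the following renders a hypothetical return-status template into an email body; the template path and variables are illustrative only.

import java.io.StringWriter;
import org.apache.velocity.Template;
import org.apache.velocity.VelocityContext;
import org.apache.velocity.app.VelocityEngine;

public class ReturnStatusNotifier {
    public static void main(String[] args) {
        VelocityEngine engine = new VelocityEngine();
        // Load templates from the classpath (Velocity 1.x style configuration).
        engine.setProperty("resource.loader", "class");
        engine.setProperty("class.resource.loader.class",
            "org.apache.velocity.runtime.resource.loader.ClasspathResourceLoader");
        engine.init();

        Template template = engine.getTemplate("templates/return-status.vm"); // hypothetical template
        VelocityContext context = new VelocityContext();
        context.put("customerName", "Jane Doe");
        context.put("returnStatus", "Refund issued");

        StringWriter body = new StringWriter();
        template.merge(context, body);
        // In the real flow this rendered body would be handed to the mail sender.
        System.out.println(body);
    }
}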

Technical Environment:

Java, J2EE, Struts 2, EJB 2, iBatis 2.3, Oracle 11g, JavaScript, WebLogic 10.3, SVN, Ant, Velocity, Bing Maps 7.0

Project Name: Galley Planning Client: Gate Gourmet, VA

Position: Sr Programmer Analyst Duration: Oct'09 – Mar’11

Gate Gourmet (www.gategourmet.com) is the world's largest independent provider of airline catering and provisioning services, which they deliver daily on a global basis to more than 250 airline customers.

The Galley Planning software will enable airlines to optimize galley design, loading, and balancing with automatic monitoring of weight loaded and available space. Functionality includes creation of tailored galley manuals with customized aircraft, galley, and tray layouts. Using a drag-and-drop user interface, the software will make it possible to design, load, and publish galley loading instructions and to check and balance weights accordingly.

Roles and Responsibilities:

Primarily worked on the Container Catalog, Parts Catalog and Fuel Burn sub-modules of the GP4 module of the IF4 Portal.

Involved in finalizing the project scope discussion for the application.

Involved in discussions with the customer to design the click-through flow of the application.

Implemented Cairngorm MVC framework in the Flex UI.

Took an active part in designing database schema for Course and Core.

Customized and extended certain Adobe Flex Components like Data Grid, Tree, Panels, Windows and other components for the need of the application.

Involved in code reviews and mentoring the team members.

Responsible for writing controller and business logic using Java, J2EE.

Responsible for developing the business logic using core java and supporting business with Unit Testing.

Used Hibernate for database access and table mapping.

Involved in discussions for improving the performance of the functionalities.

Wrote test cases using JUnit, strictly following test-driven development.

Used Ant to build and package the application.

Involved in Unit, Integration and System testing.

Technical Environment:

Java, J2EE, Hibernate, Spring 2.5, Flex 3.4, Oracle 11g, JavaScript, ActionScript, JasperReports, Cairngorm MVC 2.2.1, BlazeDS 3.2, JBoss 5.1.0GA, SVN, Ant

Project Name: Electronic Service Scheduling (IFX4) Client: Gate Gourmet, VA

Position: Programmer Analyst Duration: Oct'08 – Oct’09

IFX4 4.0 is a web-based In-Flight Exchange™ portal application that helps airline carriers to do Service Scheduling (ESS), Service Design (Meal Manager), Service Ordering (ESO), Galley Planning (GP) and Service Verification (ESV). To accommodate an increasingly international customer base, eGate Solutions shall ensure that the next generation of In-Flight Exchange software supports multiple carriers as well as multiple languages and locales simultaneously. This development effort is intended to establish the IFX4 4.0 portal application as a leading-edge solution that addresses the specific needs of the carrier and simplifies incorporating complex business rules into their process.

Roles and Responsibilities:

Primarily worked on the IFX4 and ESS4 modules of the IFX4 Portal.

Extensively worked on the presentation layer using Dojo, JSP, JSTL and Flex.

Wrote front-end validation scripts using JavaScript and was involved in integrating the presentation and persistence layers.

Wrote HQL queries in DAOs that access data from the database through POJOs mapped with Hibernate mapping XML files (a Hibernate sketch follows this list).

Extensively worked on the persistence layer using Hibernate.

Responsible for developing the business logic using core java

Monitored the error logs using Log4J and fixed the problems.
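
A minimal sketch of the DAO/HQL pattern described above, assuming a hypothetical Hibernate-mapped POJO named ServiceSchedule configured through hibernate.cfg.xml and its mapping file.

import java.util.List;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class ServiceScheduleDao {
    // Builds the session factory from hibernate.cfg.xml and the registered mapping files.
    private final SessionFactory sessionFactory = new Configuration().configure().buildSessionFactory();

    // Returns schedules for one carrier; ServiceSchedule is a hypothetical mapped entity.
    public List<?> findByCarrier(String carrierCode) {
        Session session = sessionFactory.openSession();
        try {
            return session
                .createQuery("from ServiceSchedule s where s.carrierCode = :carrier")
                .setParameter("carrier", carrierCode)
                .list();
        } finally {
            session.close();
        }
    }
}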

Technical Environment:

Core Java/J2EE, Spring 2.5, Hibernate, Flex, DOJO, JSP, Servlets, JasperReports, CVS, Eclipse, JBoss, JUnit, Windows XP, Red Hat 9.x, Mantis, PuTTY, WinSCP, Maven

Project Name: Colonial Life Client: Colonial Life, SC

Position: Sr. Software Engineer Duration: Mar'08- Oct’08

Colonial Life & Accident Insurance Company is an insurance company based in Columbia, South Carolina (USA). Colonial Life is committed to helping working Americans and their families minimize personal financial risk with a comprehensive offering of voluntary benefits through the workplace.

For employees whose insurance plans leave them feeling vulnerable, Colonial Life can help restore peace of mind through personal insurance products that complete their coverage. Colonial Life offers a broad line of personal insurance products including disability, accident, life, cancer, critical illness, hospital confinement and limited benefit medical coverage.

Roles and Responsibilities:

Responsible for defining and designing the layers and components of the project using OOAD methodologies and standard J2EE patterns and guidelines.

Developed the modules using Hibernate, Spring, HTML, DHTML, Servlets, JSP, CSS, Java Mail

Used J2EE design patterns like Value Object Pattern, MVC and Singleton Pattern.

Worked with CSS for look and feel of the webpage

Ensured configuration management using Concurrent Versions System (CVS).

Extensively involved in designing the database using Oracle9i.

Involved in writing PL/SQL Stored procedures and Triggers.

Responsible for exporting and deploying the enterprise application on Development and Testing Environments.

Used the JavaMail API to send notifications on success or failure once the backend process completes, and to notify the administrator of any system-related problems (a JavaMail sketch follows this list).

Integrated and deployed the application on WebLogic 8.1 Application Server.

Developed automatic build scripts using Ant to deploy and test the application.

Used UNIX commands to debug the application logs on the WebLogic application server.

Used Bugzilla for bug tracking.
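
An illustrative JavaMail sketch of the kind of success/failure notification described above; the SMTP host and addresses are placeholders.

import java.util.Properties;
import javax.mail.Message;
import javax.mail.Session;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeMessage;

public class BatchStatusMailer {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("mail.smtp.host", "smtp.example.com"); // hypothetical relay

        Session session = Session.getInstance(props);
        MimeMessage message = new MimeMessage(session);
        message.setFrom(new InternetAddress("batch@example.com"));
        message.addRecipient(Message.RecipientType.TO, new InternetAddress("admin@example.com"));
        message.setSubject("Nightly batch completed");
        message.setText("The backend batch process finished successfully.");

        Transport.send(message);
    }
}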

Technical Environment:

Java 1.4, Spring 2.5, Hibernate 3.1, Oracle 9i, WebLogic 8.1, AJAX, JavaMail API, Design Patterns, Log4J, Ant 1.4, JUnit 4.0, CSS, HTML, DHTML, UML, Toad, WinCVS, Bugzilla, Eclipse, Windows XP, DOM Parser, SAX Parser, JavaScript.

Project Name: Unicru Hourly Solutions Client: Kronos Inc, OR

Position: Sr. Software Engineer Duration: Oct'05- Dec’07

Kronos Inc offers the first and only fully integrated workforce selection platform able to simultaneously streamline the hiring process and cut recruitment costs, increase the quality of the incoming workforce, enforce corporate policy while facilitating local operations decisions, and enable organizations to use their workforce as a competitive weapon. It is the only solution with closed-loop reporting to prove benefits attainment and the only technology scientifically proven to predict employee tenure and performance, learning from each hiring experience.

Roles and Responsibilities:

Rewrote the existing application from pure servlets to a Struts/Hibernate architecture.

Designed and developed screens using JSP and Struts framework

Developed business logic using Java beans

Used Hibernate framework for persistence

Used WebLogic as the application server.

Created Junit test cases and test suites for unit testing.

Work with designers, developers, architects, and QA to ensure comprehension and feasibility of requirements.

Used ClearCase as the source control system and ClearQuest for issues and defects.

Assisted QA and end users with verifying application functionality and ensuring that it adheres to requirements.

Technical Environment:

Java, J2EE, Struts Framework 1.2.8, Hibernate 3.0, WebLogic, Eclipse 3.2, Rational Clear Case.

Project Name: Online Traffic School Client: Metro General, OR

Position: Software Engineer Duration: Jan'05 - Sep'05

This project is to create an online traffic school where users can take mandatory traffic school courses after being charged with a traffic violation. At the end of the course, a user has to pass an online exam conducted by NTSA to get the violation removed from their traffic records.

This project is mainly classified into 2 sections:

User modules: include the home page, course registration form, account details page, personal information form, courses, chapters, etc. These cover the complete set of end-user features.

Admin modules: include the forms to add, modify and delete courses, chapters and pages, along with provisions for exam management and user management.

Roles and Responsibilities:

As a Java developer, I was involved in all stages of the project.

Involved in the initial design of components.

Developed code for EJBs (session and entity beans), JSPs and servlets using the Struts framework.

Wrote Filter Servlets for performance Monitoring.

Developed code for user access profiles in Security module using JSP, Servlets.

Wrote reusable utility classes for various components to get EJB handles using JNDI lookups (a JNDI lookup sketch follows this list).

Used multi threaded Servlets for optimum performance.

Involved in the performance tuning of the stored procedures, SQL queries and PL/SQL to improve the enrollment module performance.

Used VisualAge for Java 3.5.3 as the development tool.

Created session beans and Entity beans and deployed in Web Sphere 3.5.3.

Tested, debugged and implemented the application using JUnit for unit testing.
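
A sketch of the reusable JNDI lookup utility pattern mentioned above (EJB 2.x style); the JNDI name shown in the comment is hypothetical.

import javax.naming.InitialContext;
import javax.rmi.PortableRemoteObject;

public final class ServiceLocator {
    private ServiceLocator() {
    }

    // Looks up an EJB home by its JNDI name, e.g. "java:comp/env/ejb/EnrollmentHome",
    // and narrows it to the expected home interface class.
    public static Object getEjbHome(String jndiName, Class homeClass) throws Exception {
        InitialContext ctx = new InitialContext();
        Object ref = ctx.lookup(jndiName);
        return PortableRemoteObject.narrow(ref, homeClass);
    }
}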

Technical Environment:

JDK 2.0, EJB, Eclipse, Servlets, JSP, Java Beans, RMI, JDBC, HTML, JNDI, JavaScript, WebLogic Server 8.1, Oracle, SQL, PL/SQL, Microsoft Visual SourceSafe, Windows NT.


