
Data Java Developer

Location:
Dublin, OH
Posted:
August 17, 2017


Sekhar P 614-***-**** ac1vu4@r.postjobfree.com

SUMMARY

Hadoop professional certified by Cloudera and Hortonworks.

Experienced Hadoop Architect/Data Scientist with the ability to train and lead Big Data Hadoop teams.

Architected a highly scalable, high-performance stream-based data processing and predictive analytics system.

Extensive experience and knowledge in Cloud, Hadoop, and Spark architecture and design.

Specialized in the design and development of Big Data technologies on highly scalable, end-to-end Hadoop infrastructure.

Designed and implemented a real-time analytics and ingestion platform using the Lambda architecture.

Extensive knowledge of the AWS Cloud platform and data modeling.

Passionate about data and focused on building next-generation Big Data applications.

Extensive knowledge in providing business solutions using Hadoop technologies.

Experience in Apache Spark, the Hadoop 2.0 ecosystem, Scala, Java, Python, and R.

Over 15 years of total experience architecting, designing, and developing enterprise applications using Java/J2EE technologies and NoSQL databases.

Hands-on experience installing, configuring, and using Hadoop ecosystem components such as MapReduce, YARN, HDFS, HBase, Cassandra, Oozie, Hive, Sqoop, Pig, and Flume.

Experience building and supporting large-scale Hadoop environments, including design, configuration, installation, performance tuning, and monitoring.

Experience importing and exporting terabytes of data with Sqoop between HDFS and relational database systems.

Experience architecting Hadoop clusters using major Hadoop distributions: Cloudera CDH4/CDH5 and Hortonworks.

Experience in managing and troubleshooting Hadoop related issues.

Extensive knowledge of ETL tools.

Experience in installation, configuration, management and deployment of Big Data solutions and the underlying infrastructure of Hadoop Cluster.

Knowledge in job/workflow scheduling and monitoring tools like Oozie & Zookeeper.

Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java; extended Hive and Pig core functionality with custom user-defined functions (UDFs).
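
For illustration, a minimal Hive UDF sketch in Java; the class name and the normalization logic are hypothetical, not drawn from any specific project here:

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical UDF: trims and upper-cases a string column.
public final class NormalizeCode extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;  // Hive passes NULLs through unchanged
        }
        return new Text(input.toString().trim().toUpperCase());
    }
}
```

After adding the JAR to the Hive session, the function is registered with CREATE TEMPORARY FUNCTION and used like any built-in.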

Worked with application teams to install operating system, Hadoop updates, patches and version upgrades as required.

Hands-on experience in virtualization; worked on VMware Virtual Center.

Experience in designing, developing, and implementing connectivity products that allow efficient exchange of data between the core database engine and the Hadoop ecosystem.

Extensive experience in Requirements gathering, Analysis, Design, Reviews, Coding and Code Reviews, Unit and Integration Testing.

Experience in using application development frameworks such as Hibernate, Struts, and Spring for developing integrated applications and lightweight business components.

Experience in developing service components.

Experience in developing and designing web services (SOAP and RESTful).

Experience in developing Web Interface using Servlets, JSP and Custom Tag Libraries.

Good knowledge and working experience in XML related technologies.

Experience applying Java and J2EE design patterns to reuse the most effective and efficient strategies.

Expertise in using IDEs such as WebSphere Studio (WSAD), Eclipse, NetBeans, and WebLogic Workshop.

Extensive experience in writing SQL queries for Oracle, Hadoop, and DB2 databases using SQL*Plus. Hands-on experience working with Oracle (9i/10g/11g), DB2, NoSQL, and MySQL, plus knowledge of SQL Server.

Extensive experience in using SQL and PL/SQL to write Stored Procedures, Functions and Triggers.

Excellent technical, logical, debugging, and problem-solving capabilities, with the attentiveness to track the evolving environment and the probable activities of competitors and customers.

Proven ability to work effectively both independently and in teams, with positive results. Inclined toward building a strong team and work environment, with the ability to adapt to the latest technologies and situations with ease.

TECHNICAL SKILLS

Hadoop: Spark, HDFS, MapReduce, Tez, Pig, Hive, HBase, Flume, Sqoop, ZooKeeper, Oozie, Storm, Kafka, ELK, GraphX

Hadoop Cluster: Cloudera CDH4/5, Hortonworks

Deployment Frameworks: Puppet

BI Tools: Tableau

IDEs: Eclipse, NetBeans

NoSQL Databases: HBase, MongoDB, Cassandra

Frameworks: MVC, Struts, Hibernate, Spring

JEE Technologies: JSP, Servlets, Spring, Struts, JDBC, EJB, JMS, SOAP, RESTful web services, iBatis, Hibernate

Programming Languages: C, Java, Scala, Python, Linux shell scripts

SQL Databases: Oracle 9i/10g/11g, MySQL

Web Servers: WebSphere, WebLogic, JBoss, Apache Tomcat

Web Technologies: HTML, CSS3, JavaScript, jQuery

PROFESSIONAL EXPERIENCE

NOKIA September 2015 - Present

Nokia’s Motive Analytics Solution (MAS) platform is designed to build home analytics products such as Home Access Analytics (HAL). HAL collects data from home networks and connected devices, monitors subscribers’ experience with home devices, proactively detects issues across the digital home ecosystem before they impact the customer experience, and provides automatic and interactive recommendations to resolve them. By combining analytics-based insights with a closed-loop optimization process, HAL delivers continuous improvement and provides all the capabilities needed to optimize customer support, remediate problems, and fix home issues before they affect customers.

Technical Architect/Data Scientist for a global team spread across the US, UK, and India.

Understand the business requirements and draw the road map for Big Data initiatives for Nokia's customers.

Responsible for building scalable distributed data solutions using Hadoop and Spark on Nokia’s Cloud with MAS platform.

Play a key role in designing, developing, and implementing Nokia's Home Analytics (HAL) products.

Responsible for Cluster maintenance on Nokia’s Cloud, commissioning and de-commissioning cluster nodes, Cluster Monitoring and Troubleshooting.

Orchestrate hundreds of Hive jobs using Oozie workflows, as sketched below.
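
As an illustration of driving Oozie programmatically, a minimal submission sketch using the Oozie Java client; the server URL, workflow path, and host names are hypothetical:

```java
import java.util.Properties;
import org.apache.oozie.client.OozieClient;

public class SubmitHiveWorkflow {
    public static void main(String[] args) throws Exception {
        // Hypothetical Oozie server URL.
        OozieClient oozie = new OozieClient("http://oozie-host:11000/oozie");

        // Point at a workflow definition deployed in HDFS (hypothetical path).
        Properties conf = oozie.createConfiguration();
        conf.setProperty(OozieClient.APP_PATH, "hdfs:///apps/workflows/daily-hive-rollup");
        conf.setProperty("nameNode", "hdfs://nn-host:8020");
        conf.setProperty("jobTracker", "rm-host:8032");

        String jobId = oozie.run(conf);  // submit and start the workflow
        System.out.println("Submitted workflow: " + jobId);
    }
}
```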

Implementation of distributed stream processing and predictive analytics ecosystems.

Assisted with automated deployments using Puppet on Nokia’s Cloud platform.

Designed and implemented the real-time analytics and ingestion platform using the Lambda architecture, based on Kafka, Sqoop, Flume, Hive, Oozie, Cassandra, HBase, Spark SQL, Spark ML, Python, and Java.
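
A minimal Spark SQL batch-layer sketch in Java, assuming Spark 2.x; the input path, schema, and query are hypothetical stand-ins for the platform's actual datasets:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class DeviceHealthJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("device-health")  // hypothetical job name
                .getOrCreate();

        // In a Lambda batch layer, this would be the master dataset
        // landed in HDFS by the ingestion pipeline (hypothetical path).
        Dataset<Row> events = spark.read().json("hdfs:///data/device_events");
        events.createOrReplaceTempView("device_events");

        Dataset<Row> errorsPerDevice = spark.sql(
                "SELECT device_id, COUNT(*) AS error_count "
              + "FROM device_events WHERE status = 'ERROR' "
              + "GROUP BY device_id");

        // Publish the batch view for the serving layer to pick up.
        errorsPerDevice.write().mode("overwrite").parquet("hdfs:///out/device_errors");
        spark.stop();
    }
}
```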

ICC OHIO March 2014 - September 2015

Role: Hadoop Lead

Description: Macy’s is one of the largest department store, catalog, and e-commerce retailers in Atlanta. The purpose of the project is to store terabytes of log information generated by the e-commerce website and extract meaningful information from it. The solution is based on the open-source Big Data framework Hadoop. The data is stored in the Hadoop file system and processed using Hive to obtain product and pricing information, extract various reports from it, and export the information for further processing.

Responsibilities:

Evaluated the suitability of Hadoop and its ecosystem for the project and implemented various proof-of-concept (POC) applications to support adoption under the Big Data Hadoop initiative.

Estimated software and hardware requirements for the NameNode and DataNodes and planned the cluster.

Extracted the needed data from the server into HDFS and bulk-loaded the cleaned data into HBase.

Wrote MapReduce programs and Hive UDFs in Java where the required functionality was too complex; a mapper sketch follows.
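
A minimal sketch of such a MapReduce program in Java; the log layout and field positions are assumptions for illustration, with a companion reducer simply summing the counts:

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper: emits (productId, 1) for each page-view log line.
public class ProductViewMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text productId = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] fields = value.toString().split("\t");
        if (fields.length > 2) {  // assumed layout: date, user, productId
            productId.set(fields[2]);
            context.write(productId, ONE);
        }
    }
}
```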

Involved in loading data from the Linux file system into HDFS.

Developed Hive queries for the analysis, to categorize different items.

Designed and created Hive external tables using a shared metastore instead of Derby, with partitioning, dynamic partitioning, and buckets.

Delivered a POC of Flume to handle the real-time log processing for attribution reports.

Performed sentiment analysis on reviews of the products on the client's website.

Exported the resulting sentiment analysis data to Tableau for creating dashboards.

Used JUnit to unit-test MapReduce code, as sketched below.
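
A minimal unit-test sketch for the mapper above, assuming the MRUnit library (which wraps JUnit for MapReduce code):

```java
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Before;
import org.junit.Test;

public class ProductViewMapperTest {

    private MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;

    @Before
    public void setUp() {
        mapDriver = MapDriver.newMapDriver(new ProductViewMapper());
    }

    @Test
    public void emitsProductIdWithCountOne() throws Exception {
        mapDriver.withInput(new LongWritable(0),
                            new Text("2014-03-01\tuser42\tSKU-123"))
                 .withOutput(new Text("SKU-123"), new IntWritable(1))
                 .runTest();  // fails if the actual output differs
    }
}
```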

Maintained system integrity of all sub-components (primarily HDFS, MapReduce, HBase, and Hive).

Reviewed peers' table creation in Hive, data loading, and queries.

Monitored system health and logs and responded accordingly to any warning or failure conditions.

Responsible for managing the test data coming from different sources.

Involved in scheduling the Oozie workflow engine to run multiple Hive and Pig jobs.

Weekly meetings with technical collaborators and active participation in code review sessions with senior and junior developers.

Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.

Involved in unit testing, interface testing, system testing, and user acceptance testing of the workflow tool.

Environment: Apache Hadoop, HDFS, Hive, MapReduce, Core Java, Flume, Cloudera, Oozie, MySQL, UNIX.

Capital One Feb 2013 – March 2014

Role: Hadoop Developer

Description: Capital One is implementing a data quality project. The purpose of the project is data cleansing: the raw data is split into chunks, keyed, and validated for data quality. Once the data is validated, it is used for analysis.

Responsibilities:

Worked with the Hadoop FileSystem Java API to compute disk usage statistics, as sketched below.
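
A minimal sketch of computing disk usage through the Hadoop FileSystem API; the default directory path is a hypothetical example:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.ContentSummary;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUsageReport {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());

        // Directory to inspect; "/data" is a placeholder default.
        Path dir = new Path(args.length > 0 ? args[0] : "/data");

        ContentSummary summary = fs.getContentSummary(dir);
        System.out.printf("%s: %d files, %d bytes (pre-replication)%n",
                dir, summary.getFileCount(), summary.getLength());
    }
}
```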

Developed Hive queries for transformations, aggregations, and mappings on the customer data.

Worked on importing and exporting data into HDFS and Hive using Sqoop.

Worked on analyzing/transforming the data with Hive and Pig.

Developed MapReduce programs for applying business rules to the data.

Developed and executed Hive queries for denormalizing the data.

Automated workflow using Shell Scripts.

Performance-tuned Hive queries.

Involved in migration of data from one Hadoop cluster to another.

Configured multiple MapReduce pipelines for the new Hadoop cluster.

Performance-tuned and optimized Hadoop clusters to achieve high performance.

Implemented schedulers on the JobTracker to share cluster resources among users' MapReduce jobs.

Worked on integration of HiveServer2 with Tableau.

Environment: Hadoop, MapReduce, HDFS, Hive, Java 6, Cloudera's Hadoop distribution, Pig, HBase, Linux, XML, Eclipse, Oracle 10g, PL/SQL.

Senior Java Consultant – Nationwide Insurance Jan 2011 - Feb 2013

Worked as a senior Java consultant on the Class Download application, which provides auto-redaction and creation of legal claim documents for Nationwide. The application gives users the ability to handle the auto-redaction process, saving the manual effort of redaction, and to print the documents per Nationwide's legal print protocol.

Project Responsibilities:

Interacted with Nationwide Legal business analysts and analyzed the requirements.

Involved in architectural decisions.

Interacted with team members and participated in technical meetings.

Involved in high-level and low-level technical design.

Provided estimates of development hours.

Attended daily stand-up meetings with team members and reported work progress.

Involved in coding and mentored junior developers on complex issues.

Extensively used J2EE design patterns.

Wrote test cases for unit testing using JUnit and was involved in integration testing.

Used the Struts framework for application front-end development.

Integrated the application with the Spring framework to implement dependency injection.

Deployed the application on WebSphere server.

Involved in code builds using SVN and Maven.

Senior Java Consultant – Abercrombie and Fitch

Worked as a senior Java consultant on the A&F Job Application Management System. The basic scope of this project is to provide a job application system that allows international customers to apply for jobs electronically, both in stores and on the internet. It also gives management the ability to view applications, schedule interviews, and process completed, submitted applications.

Project Responsibilities:

Interacted with business analysts and analyzed the requirements.

Interacted with team members and participated in technical meetings.

Involved in high-level and low-level technical design.

Provided estimates of development hours.

Attended daily stand-up meetings with team members and reported work progress.

Involved in coding and mentored junior developers on complex issues.

Extensively used J2EE design patterns.

Wrote test cases for unit testing using JUnit and was involved in integration testing.

Used the Struts framework for application front-end development.

Used JavaScript and jQuery for client-side validation and interactive web pages.

Integrated the application with the Spring framework to implement dependency injection.

Deployed the application on Tomcat server.

Involved in code builds using SVN and Maven.

Nationwide Insurance 2009 – 2010

Java Technical Lead

Provided Java Technical leadership at Nationwide on three different project initiatives:

The ICP project allowed My Nationwide customers with a branded Standard Auto policy to make changes to their policies online and obtain a premium change without requiring manual intervention from Service Centers.

The NSS lowers Nationwide's costs by selling products and services at a lower cost and by using internet capabilities to reduce transaction and service costs from service centers and producers.

The NIAX project is a new initiative to redesign current internet applications onto a new technical platform that enables a rich user experience and a resilient, responsive IT environment to support changing business needs. This gives the ability to incorporate new functionality quickly and efficiently.

Project Responsibilities:

Interacted with clients to collect business requirements, analyze and design the system.

Designed various UML Diagrams like Class diagrams and Sequence Diagrams using Rational Rose.

Provided an estimate of hours for development.

Utilized the Agile methodology for several projects.

Utilized the SOA architecture.

Developed prototypes of the application in coordination with the offshore team for business approval.

Developed JUnit tests and code builds using Maven.

Extensively used Struts with Tiles to build the presentation layer.

Utilized Struts Validation Framework for client side validations.

Extensively used J2EE Design patterns.

Utilized Hibernate and Spring to build persistent and reliable application modules.

Used iBatis as an ORM tool for object-relational mappings and configured the hibernate.cfg.xml and .hbm.xml mapping files.

Integrated the application with the Spring framework to implement dependency injection and provide abstraction between the presentation layer and the persistence layer, as sketched below.
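
A minimal sketch of the setter-injection style typical of Spring's XML configuration in that era; every bean and type name here is hypothetical:

```java
// Hypothetical domain types, kept minimal for the sketch.
class Policy { String number; }

interface PolicyDao {
    Policy load(String policyNumber);
}

// The presentation layer depends only on this service; Spring wires in
// the DAO-backed implementation at runtime, e.g. via:
//   <bean id="policyService" class="PolicyServiceImpl">
//     <property name="policyDao" ref="policyDao"/>
//   </bean>
public class PolicyServiceImpl {
    private PolicyDao policyDao;  // injected by the Spring container

    public void setPolicyDao(PolicyDao policyDao) {
        this.policyDao = policyDao;
    }

    public Policy findPolicy(String policyNumber) {
        return policyDao.load(policyNumber);
    }
}
```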

Implemented Web Services using XML, WSDL, and SOAP over HTTP.
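
A minimal JAX-WS sketch of a SOAP-over-HTTP endpoint; the service name, operation, and URL are hypothetical, and the WSDL is generated from the annotated class:

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

@WebService
public class QuoteService {

    @WebMethod
    public double getQuote(String policyType) {
        // Placeholder logic standing in for the real rating call.
        return "AUTO".equals(policyType) ? 99.0 : 149.0;
    }

    public static void main(String[] args) {
        // Publishes the service; the WSDL is served at the URL + "?wsdl".
        Endpoint.publish("http://localhost:8080/quote", new QuoteService());
    }
}
```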

Used JavaScript and AJAX for client-side validations and creating interactive web pages.

Wrote test cases for Unit Testing using JUnit, and involved in Integration Testing.

Developed the application on RAD and deployed it on IBM WebSphere.

Created a process in Eclipse for more efficient coding for all developers.

Implemented a logging system for the project using Log4j.

Used Subversion as the version control system on Windows.

Helped team members to debug issues with the application.

Participated actively in the application production launch.

Prepared the test case documents for enhancements.

Used JUnit for unit testing and prepared the JUnit test case document.

Participated in code reviews and was involved in integration, unit, functional, and peer testing.

Environment: JDK 1.5/1.4, J2EE, Servlets, Struts, Spring, Hibernate 3/3.5/4.0, HQL, Maven 3.0, JAX-WS, JAXB, XML, XSD, SoapUI, jQuery, CSS, JUnit, Oracle 9i/10g, SQL, PL/SQL, Quality Center, SSH shell, SSH client, PuTTY, VSS, WebSphere (WAS), Visual Studio, Microsoft Visio, Microsoft Project, UML, SharePoint, Windows XP, and UNIX.

Java Technical Lead Cardinal Health 2005-2009

Was part of four major initiatives as the Technical Lead, which included:

The DMS application involved business process modeling of orders and their states across different EIS.

Credit Approval Automation, which provided a means of approving credit approval requests raised by Cardinal Health sales representatives. Originally intended only for workflow applications, after the second release it also supports non-workflow applications.

The CAR Automation that provided a means of approving Cardinal Projects that required funding.

The CORE implemented business logic that involved managing transmissions and transactions, translating a customer request to a common format, controlling the workflow of the business process and collaborating with other shared services (such as the Mapping and Validation) to complete its business process.

Project Responsibilities:

Gathered and analyzed requirements from the client.

Utilized the SOA architecture.

Coordinated design and development with the offshore team.

Used complex workflow features of Vitria BusinessWare.

Used Jakarta Struts as the MVC framework to design the complete Web tier.

Involved in end-to-end application development using J2EE and Struts, and in deployment on JBoss and WebLogic application servers.

Developed several ActionServlet, Action, FormBean, and JavaBean classes to implement business logic using the Struts framework.

Developed Command objects and Business Entity objects.

Developed JAXB objects and consumed Web Services.

Used JBoss application server as JMS provider to manage sessions and queues.

Developed a Data Access Object (DAO) pattern to abstract and encapsulate the data access mechanism.
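
A minimal sketch of the DAO pattern with a JDBC-backed implementation; the entity, table, and column names are hypothetical:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.sql.DataSource;

// Callers program against the interface; JDBC details stay encapsulated.
interface OrderDao {
    String findStatus(long orderId) throws SQLException;
}

class JdbcOrderDao implements OrderDao {
    private final DataSource dataSource;  // typically provided by the app server

    JdbcOrderDao(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public String findStatus(long orderId) throws SQLException {
        String sql = "SELECT status FROM orders WHERE order_id = ?";
        try (Connection con = dataSource.getConnection();
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setLong(1, orderId);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString("status") : null;
            }
        }
    }
}
```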

Utilized Oracle as the database for Data persistence.

Wrote ANT scripts to build the web application.

Deployed the Java war file on the Development/Test Servers.

Used VSS for version control of the code and configuration files.

Marsh Insurance 2004 – 2005

Sr. Java Developer

Involved in development of different J2EE components like EJBs, Client jars, Web Modules, and Application EAR modules.

Involved in deployment of application in Jboss Server.

Used Apache’s Jakarta Struts as the MVC framework to design the Web tier.

Developed several ActionServlet, Action, FormBean, and JavaBean classes to implement the business logic using the Struts framework.

Integrated a third-party vendor package for molecular structure storage and retrieval.

Developed Data Access Object (DAO) patterns to abstract and encapsulate the data access mechanism. Utilized Oracle as the database for Data persistence.

Wrote ANT scripts to build the web application.

Used CVS for version control of the code and configuration files.

Post Finance 2004

Sr. Java Developer

The base architecture was designed as per the MVC architecture using the Front Controller Design pattern based on the application requirements.

Used Eclipse as the Java IDE supporting the development and management of Java applications. Developed business objects utilizing Session and Entity beans.

Involved in end-to-end coding across layers.

Converted well-designed HTML pages to JSP pages by adding dynamic content to it.

Developed the Web Interface using JSP, Servlets, HTML, and CSS.

Consulting Company 2002 – 2004

Sr. Java Developer

Utilized Eclipse extensively on this project.

Developed business objects (DAO). Coded Session Beans.

Converted well-designed HTML pages to JSP pages by adding dynamic content to it.

Developed the Web interface using JSPs, HTML, CSS, and JavaScript.

Auto Serve Information System 2002 – 2003

Java Developer

Implemented Servlets and JSPs for the presentation layer.

Performed development using EJB in WebLogic. Used Apache Struts Framework for the presentation layer. Responsible for writing action classes.

Extensively used JSP and Struts Tag libraries. Used ANT for the build.

Strictly followed Iterative development best practices.

Used the MVC pattern for the presentation layer.

EKMS 2001

Java Developer

Implemented Servlets and JSPs for the presentation layer.

Extensively used JSP and Tag libraries.

Used ANT for the build. Utilized the Iterative development method.


