Data Java Developer

Location:
Detroit, MI
Posted:
April 03, 2020

Resume:

Harish

adclyb@r.postjobfree.com

248-***-****

PROFESSIONAL SUMMARY:

7 years of versatile experience in analysis, design, development, and implementation of software applications, and in developing n-tier architecture based solutions with distributed components and internet/intranet applications.

Extensively worked on core Java concepts like multithreading, inheritance, abstraction, encapsulation, polymorphism, exception handling, and the Collections Framework.

Thorough working knowledge of application development using Java SE, with strong experience in MVC (Model View Controller) architecture using Spring, Servlets, JSP, Struts, JDBC, Java Beans, Hibernate, REST/SOAP web services, EJB, jQuery, JavaScript, JSF, AngularJS, AJAX, JSON, XML, and HTML5.

Experienced in MEAN/MERN stack development using AngularJS 1.5, Angular 2, ReactJS, ExpressJS, and NodeJS, with minimal hands-on exposure to Firebase as a database.

Experience in building APIs as microservices using Spring 4 and Spring Boot, ORM frameworks like Hibernate and JPA, and Apache Camel.

Experience identifying Java 1.6/1.7 garbage collection issues and tuning JVM parameters for high throughput or low latency.

Experience with rules engines (Camel, Drools, JRules) & modern tooling (Gradle, Maven, Git, SVN).

Design, develop, test, and debug large-scale, complex data platforms using cloud and big data technologies; analyze, engineer, and improve the stability, efficiency, and scalability of the platform.

Experience in developing Highly Scalable and large distributed systems using core Java multi-threading.

Experience in writing application level code to interact with APIs, Web Services using AJAX, JSON, XML, AngularJS and Node JS.

Proficient in NoSQL transactional databases like Cassandra, Mongo DB and big data technologies like Spark/Storm, Kafka and Hadoop Map Reduce.

Experience with data caching services (Redis, Elastic Search and Memcached), database schema design and data access technologies.

Basic knowledge of Apache NiFi; worked on a few NiFi tickets in my last project.

Experience in middleware design and development on Oracle SOA Suite, which includes BPEL, ESB (Mule), WSM, Oracle Rules Engine, BPM, and the Worklist Application.

Developed various task flows and screens for DAS and IDA applications using ADF and JDeveloper.

Provisioned and deployed applications into the AWS cloud environment using services like EC2, Elastic Beanstalk, Lambda, Docker, CloudFormation, RDS, DynamoDB, S3, SQS, CloudWatch, SES, and SNS.

Experience in implementing security models OAUTH2 and SAML for authentication/authorization using Spring Security and IAM Cloud Security.

Knowledge and experience working with various data sources like web services (REST, SOAP), unstructured data files, flat files, message queues, xml based events, databases.

Expertise in developing applications using the Spring Framework's Spring MVC, Spring DAO, Inversion of Control, Spring Boot, and Dependency Injection.

Experience with Unix/Linux and shell scripting and Python.

Extensive experience and actively involved in requirements gathering, analysis, design, coding and Code Reviews, Unit and Integration Testing.

Experience in implementing software best practices, including Design patterns, Use Cases, Object Oriented analysis and design, Agile methodologies, and Software/System Modeling (UML).

Experience working in Agile (Scrum) methodologies and test-driven development, continuous integration (Jenkins/Hudson) and version control (SVN, GIT, etc.).

EDUCATION:

Bachelor's Degree in Engineering from Jawaharlal Nehru Technological University (JNTU), Hyderabad, India.

TECHNICAL SKILLS:

Languages

C/C++, Java, SQL, PL/SQL, UML, J2EE, HTML, DHTML, XHTML, UML2.0

Java Technologies

Core Java, J2EE, JSP, Servlets, JSF, EJB 3.0, SOAP/REST Web Services, NoSQL, Camel, DOJO, Spring, Hibernate, JPA, Python, Oracle SOA, AWS.

Web Technologies

JSP, Servlets, Struts 2.x, Spring MVC, RESTful, AJAX, jQuery, HTML, CSS, Liferay 6.0, Alfresco 3, JAXB, JSON, XML, Jersey API, Angular-JS, JSF, React-JS.

IDEs

Eclipse, RAD 7.5/8.5.

Web/App. Servers

WebLogic, IBM WebSphere 7.x/8.x, JBoss, Apache Tomcat 7.x, Apache Kafka, RabbitMQ.

Tools

Maven, Git, Putty, SVN, JIRA, Jenkins, Hudson, JUNIT.

Operating systems

Windows, Mac OS, UNIX, Linux.

Databases

Oracle 9i/10g, SQL Server, PostgreSQL 9.0, MySQL, DB2, MongoDB, Cassandra.

Design Patterns

MVC, Singleton, Business Delegate, Service Locator, Session Facade, DTO, DAO, Factory Pattern

Frameworks

Spring Dependency Injection, Spring MVC, Kafka, Spring Core, Spring Context, Spring AOP, Spring DAO, Spring IOC, Spring JDBC, Hibernate, DWR, Log4j.

Version Control

Rational Clear Case, CVS, VSS, SVN, GitHub.

Cloud Technologies

AWS services: EC2, EBS, S3; Elasticsearch, Solr, Spring Cloud.

PROFESSIONAL EXPERIENCE:

FORD MOTOR COMPANY, Dearborn, MI May 2019 – Present
Role: Software Engineer

Project: SCA-V

Description: Supports Ford's connected vehicle strategy by enabling data from products to flow through the SCA-V Data Platform and integrate with other internal Ford systems utilizing the Global Data Insight and Analytics Data Supply Chain and Discovery Zone. Creates a Single, Complete, and Actionable view of vehicle data by supporting the landing, transformation, discovery, and archiving of various data streams from our products. Enables advanced new technologies in the big data and analytics space that will lead to improved quality, design, marketing, and development of our current and future products. Maintains leadership around enabling connectivity in the next generation of vehicles, allowing Ford to remain a technical leader in the automotive space.

Responsibilities:

Review and understand the business requirements and functional specifications of the client modules using the Mainframe technologies;

Develop technical specifications using the Microsoft Office suite and version-control them; prepared documents are reviewed using the four-eye principle for accuracy.

Provide technical assistance from design and development standpoint using the debugging and analysis tools like Xpeditor.

Used Atlassian products like JIRA for task tracking and implemented Splunk logging.

Design and develop REST-based microservices using Spring Boot so that the application integrates seamlessly with supporting subsystems.
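
A minimal sketch of the Spring Boot REST style described in this bullet, assuming a hypothetical vehicle-data endpoint; the class names, the /vehicles path, and the payload fields are illustrative, not the actual SCA-V code.

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class VehicleDataApplication {

    public static void main(String[] args) {
        SpringApplication.run(VehicleDataApplication.class, args);
    }

    // Returns one vehicle record by VIN; a real service would delegate to a repository layer.
    @GetMapping("/vehicles/{vin}")
    public VehicleRecord getVehicle(@PathVariable String vin) {
        return new VehicleRecord(vin, "INGESTED");
    }

    // Minimal response payload; Spring Boot serializes it to JSON via Jackson.
    public static class VehicleRecord {
        private final String vin;
        private final String status;

        public VehicleRecord(String vin, String status) {
            this.vin = vin;
            this.status = status;
        }

        public String getVin() { return vin; }

        public String getStatus() { return status; }
    }
}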

Work with JIRA to resolve issues and with Bitbucket for source control and version management.

Plan test activities with customer team and prepare test scenarios per customer requirements;

Review test cases / test procedures using customer approved test management tools;

Involved in functional analysis of specifications; track tickets arising from customers as part of integration/system testing and resolve problems using the ticketing tools.

Identify customer’s business flow and gap analysis; provide recommendations in key strategic areas involving competitor performance.

Created auto deploy jobs using Jenkins and Maven.

Design, track, and coordinate project activities to meet project deliverables.

Develop enhancements and system documentation, provide production support, and implement procedures for quality improvement and development using Mainframe technologies.

Integrate the new modules within the existing system and perform the end to end testing in the Mainframe environment;

Customize various applications/reports and implement selected business model;

Knowledge transfer to the new team members;

Hold status checks with the onsite and offshore (if any) teams and prepare the weekly status report (WSR) for client management.

As part of agile methodology, participate actively in daily customer meetings including stand-ups and scrum meetings.

Involved in creating Hive tables, loading them with data, and writing Hive queries to do analytics on the data.

Analyzed the SQL scripts and designed the solution using PySpark.

Developed Sqoop scripts to handle the interaction between HDFS and the MySQL database.

Designed and developed MapReduce jobs using Java.
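
A small, self-contained example of a Java MapReduce job of the kind referred to above; the input layout (comma-separated lines with the model in the second field) and the class names are assumptions for illustration only.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Counts records per vehicle model from comma-separated input lines ("vin,model,...").
public class ModelCountJob {

    public static class ModelMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text model = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");
            if (fields.length > 1) {
                model.set(fields[1]);
                context.write(model, ONE);
            }
        }
    }

    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "model-count");
        job.setJarByClass(ModelCountJob.class);
        job.setMapperClass(ModelMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}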

Designed RESTful APIs to expose the vehicle data to the client-facing side according to the business case.

Completed end-to-end testing of the developed modules and response-time testing to achieve efficiency.

Developed POCs (proofs of concept) for new technological implementations to help improve architectural efficiency.

Developed PCF (Pivotal Cloud Foundry) jobs to expose the data to business teams for internal applications use case.

Worked on IBM Sterling OMS, whose DOM (Distributed Order Management) module acts as a central hub that connects all other systems.

Contributed to the OMS using DOM features to locate inventory and availability, and to support pricing for cross-channel consistency and multi-tiered pricing for enhanced loyalty and discount program support.

Integrated DOM with other applications to achieve cross-channel capability, helping customers get a seamless, in-sync experience.

The IBM Sterling Distributed Order Management (DOM) module within IBM Sterling OMS increases fill rates, lowers buffer inventory, and reduces manual processes.

Streamed data to business clients using Kafka (producer and consumer model).
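
An illustrative producer-side sketch of the Kafka streaming described above, assuming a placeholder topic ("vehicle-events"), broker address, and JSON payload; the real topics and schemas are project-specific.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class VehicleEventProducer {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Key by VIN so all events for one vehicle land on the same partition, preserving order.
            producer.send(new ProducerRecord<>("vehicle-events", "VIN123", "{\"odometer\":42000}"));
            producer.flush();
        }
    }
}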

Created Kafka topics to streamline the data required by multi-functional teams to work effectively.

Used Jenkins CI/CD Pipeline to build and deploy the applications to different environments (DEV/QA/PROD).

Data migration, landing, and transformation are the main goals of the team.

Used Rally tool for planning, estimating and projecting the scope of requirements and meeting the timelines.

Environment: Java 1.8, Spring Framework, Spring boot, MapReduce, Yarn, Sqoop, Alteryx, Apache Hadoop, Python, HDFS, Kafka, Accurev, Git, Apache Spark, PCF, REST API’s, Rally, Jenkins, Hive, HQL, IntelliJ, Eclipse, gradle, Putty, API Connect, Ranger, Ambari, Postman.

Target Corporation, Minneapolis, Minnesota Nov 2018 – May 2019

Project: Retail Co-existence in Finance (FIT Integrations)

Sr. Java Developer

Description: The Retail Co-existence team in the finance tier deals with modernizing the mainframe application onto the Java platform, where we take the decommissioned application's data from Kafka topics and apply the integration flow that contains the business logic for the respective shims. All shims leverage the FIT integration pattern, allowing more time to focus on business logic.

Responsibilities:

Followed Agile Scrum methodology, participating in two-week sprints to deliver the user story points assigned at the start of each sprint.

Used FIT pattern to accelerate the development process of the integration flow.

Used Kafka topics to leverage the producer and consumer features.

Designed, developed, and deployed a multitude of applications utilizing the AWS stack (EC2, RDS, Elastic Beanstalk, S3, SNS, SES, and SQS).

Built APIs as microservices using Spring 4 and Spring Boot, with ORM frameworks like Hibernate and JPA, and Apache Camel.

Developed microservices using Spring Boot and exposed REST web services; packaged the APIs as Docker images and deployed them on the Kubernetes platform on AWS.

Migrated the existing application to a microservices architecture using Java, REST APIs, and Oracle SQL.

Extensively used Akka HTTP, which helps lay the groundwork for the team to use different libraries to code the required functionality.

Used Camel with Spring Boot and Kafka extensively to develop integration flows in each shim and achieve data flow across the modernized tech stack.

Developed a balancing program in which a Camel route is triggered every 5 hours, pulls the last 5 days of data for all the flags, and updates the balancing table in the Cassandra database.
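
A hedged sketch of what such a scheduled Camel route can look like in a Spring Boot application; the bean name "balancingService" and its methods are hypothetical stand-ins for the actual balancing logic and Cassandra update.

import org.apache.camel.builder.RouteBuilder;
import org.springframework.stereotype.Component;

@Component
public class BalancingRoute extends RouteBuilder {

    @Override
    public void configure() {
        // Timer fires every 5 hours (period is in milliseconds).
        from("timer:balancing?period=18000000")
            // "balancingService" is a placeholder Spring bean that would pull the last
            // 5 days of flag data and update the balancing table in Cassandra.
            .to("bean:balancingService?method=loadLastFiveDays")
            .to("bean:balancingService?method=updateBalancingTable")
            .log("Balancing run completed");
    }
}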

Developed an error-reprocessing program in which all records in an error or unprocessed state are sent through a producer to the raw topic (a Kafka topic) and submitted for reprocessing.

Used JUnit and Mockito for unit and functional testing.
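
A representative JUnit 4 + Mockito unit test; the DAO and service classes below are illustrative placeholders, not the project's real collaborators.

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.Test;

public class BalancingServiceTest {

    // Hypothetical collaborator the service depends on.
    interface BalancingDao {
        long countProcessedRecords();
    }

    // Hypothetical service under test.
    static class BalancingService {
        private final BalancingDao dao;
        BalancingService(BalancingDao dao) { this.dao = dao; }
        long refreshBalance() { return dao.countProcessedRecords(); }
    }

    @Test
    public void refreshBalanceReturnsDaoCount() {
        BalancingDao dao = mock(BalancingDao.class);
        when(dao.countProcessedRecords()).thenReturn(42L);

        BalancingService service = new BalancingService(dao);

        assertEquals(42L, service.refreshBalance());
        verify(dao).countProcessedRecords();
    }
}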

Used TAP (internal platform) for deploying the applications in Dev environment for testing the flow.

Environment: Java 8, Spring Boot, Camel, Kafka, Cassandra, PostgreSQL, JUnit, Mockito, TAP, FIT Pattern, Camel DSL, CQL, SQL, Web Services, SLF4J, Spring AOP.

PricewaterhouseCoopers, Tampa, Florida Aug 2018 – Nov 2018

Project: PwC Assurance Innovation (Micro Services)

Full-Stack Developer

Description: The Microservices team mainly focuses on breaking the monolithic/legacy application down to its most granular functional state with an advanced tech stack of Scala with Akka libraries, which helps achieve a multi-module application through a functional approach. The team uses a gateway to proxy services to the legacy or microservices application based on the use case. Major functionality revolves around OCR (Optical Character Recognition), an engine used to extract documents into tags/terms for the user.

Responsibilities:

Extensively used Akka HTTP, which helps lay the groundwork for the team to use different libraries to code the required functionality.

Developed fully functional responsive modules based on Business Requirements using Scala with Akka, ArangoDB.

Experience with cloud computing service environments like Amazon Web Services (AWS); managed AWS services like EC2, S3 buckets, and Virtual Private Cloud (VPC) through the AWS Console and API integration.

Experience in identifying Java garbage collection issues and fine-tuning JVM parameters for high throughput or low latency.

Experience in migrating monolith applications to a microservices platform on AWS or PCF.

Exposed data with RESTful endpoints in addition to other application-specific functionality, and implemented microservices using Scala and Actor System libraries to deploy and enable discovery, using OAuth 2 for authentication.

Used various Data stores like ArangoDB, HDFS, Couchbase, and Cassandra.

Used RabbitMQ to put messages on the queue to be consumed by Cassandra to journal various events while migrating data from ArangoDB.

Used CQRS in Akka to build a distributed system with a read side and a write side, which lets a user check for events on the write side (journal) and query the read side.

Used Docker containers to automate the deployment of applications as images that can be shipped easily and used anywhere required.

Used Scala for concurrency and synchronization, leveraging its functional yet object-oriented features.

Maintained various DevOps related tools for the team such as deployment tools and development and staging environments.

Environment: Scala, Akka HTTP, Play Framework, ArangoDB, PostgreSQL, SLF4J, Log4j, Docker, Couchbase, Cassandra, JVM, Event Bus, JSON, Microservices, RabbitMQ, IntelliJ, ScalaTest, Jenkins.

GEICO, Chevy Chase, Maryland Sep 2016 – Aug 2018

Project: MDM Probabilistic Matching Engine

Sr. Java /J2EE Developer

Description: The MDM Probabilistic Matching Engine scoring algorithm compares a preconfigured set of attributes between two or more parties and derives a final score. PME (Probabilistic Matching Engine) Sync Service has been developed to provide accurate matching results for PME client search. It consumes Client data and updates PME database incrementally based on various data change events (Insert, Update, and Delete) at client side. The Probabilistic Matching Engine matching score is a numerical value, and the magnitude of this value indicates the degree of the match and signifies the likelihood that any two parties are the same. The higher the score is, the higher the likelihood that the two parties are the same.

Responsibilities:

Experienced in Agile methodology; participated in sprints and daily scrums to deliver software tasks on time and with good quality, working with onsite and offshore teams.

Used Angular 2 components to render view logic on the page and handle on-click execution in the core controller class.

Worked on Angular 2 Metadata, data binding, dependency injection to attach functionality at run time.

Extensively used AWS for integration with enterprise and web applications.

Developed an end-to-end application on the Spring Boot framework (REST application with Spring Data JPA using CrudRepository).
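
A compact sketch of the REST + CrudRepository pattern mentioned here, assuming the Spring Data 2.x API (findById returning Optional); the Party entity, its fields, and the URL are hypothetical.

import javax.persistence.Entity;
import javax.persistence.Id;
import org.springframework.data.repository.CrudRepository;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@Entity
class Party {
    @Id
    private Long id;
    private String fullName;

    protected Party() {
        // no-arg constructor required by JPA
    }

    public Long getId() { return id; }
    public String getFullName() { return fullName; }
}

interface PartyRepository extends CrudRepository<Party, Long> {}

@RestController
class PartyController {
    private final PartyRepository repository;

    PartyController(PartyRepository repository) {
        this.repository = repository;
    }

    // CrudRepository supplies findById/save/delete without any hand-written SQL.
    @GetMapping("/parties/{id}")
    public Party findParty(@PathVariable Long id) {
        return repository.findById(id).orElse(null);
    }
}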

Developed fully functional responsive modules based on Business Requirements using HTML5, CSS3, Bootstrap, AngularJS, jQuery.

Developed Angular JS views, controllers, directives, and http services. Made AJAX calls to the backend services.

Experience in writing application level code to interact with APIs, Web Services using AJAX, JSON, XML, AngularJS and Node JS.

Used AngularJS in the web application to bind the data/model retrieved from the database through services provided in a controller to the view using scope.

Developed the DAO layer using HibernateTemplate and JdbcTemplate operations; integrated Ehcache as a second-level cache for Hibernate in the DAO layer.
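
An example of how an entity is typically marked for Hibernate's second-level cache when Ehcache is the provider; the entity name and fields are placeholders, and the cache configuration mentioned in the comment is an assumption about the usual setup.

import javax.persistence.Cacheable;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.annotations.Cache;
import org.hibernate.annotations.CacheConcurrencyStrategy;

// Entity marked for Hibernate's second-level cache; with Ehcache configured as the
// cache provider (hibernate.cache.use_second_level_cache=true plus an Ehcache region
// factory), repeated lookups by id are served from the cache instead of the database.
@Entity
@Cacheable
@Cache(usage = CacheConcurrencyStrategy.READ_WRITE)
public class MatchAttribute {

    @Id
    private Long id;

    private String name;

    protected MatchAttribute() {
        // no-arg constructor required by JPA/Hibernate
    }

    public Long getId() { return id; }

    public String getName() { return name; }
}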

Developed Application using Spring DAO, Spring AOP and Hibernate, Spring annotations and published SOAP and Restful Web services.

Experienced in RDBMS Design, DB normalization, ER Modelling, SQL, PL/SQL, procedures, cursors, functions, triggers, and good Understanding in creating SQL and HQL Queries in Oracle, MySQL, DB2.

Used Spring data JPA with Hibernate 5 in data access layer to build the persistence layer.

Secured web applications using user authentication, role-based access control, n-tier architecture, DB/file encryption, and input validation techniques; implemented Spring Security with LDAP integration and OAuth2/SAML authentication and authorization schemes.

Responsible for developing systems using a messaging bus such as Kafka or RabbitMQ, and implementing both SOA and microservice architectures.

Implemented SOLID principles, transaction management, and layering for the n-tier application using patterns like Repository, Facade, and Unit of Work.

Development and implementation of Camel Routes for Notices

Implemented caching for services using Redis.
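
A minimal sketch of service-level caching using Spring's cache abstraction, which can be backed by Redis when spring-data-redis and @EnableCaching are configured; the service class, cache name, and method are illustrative assumptions.

import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

@Service
public class NoticeService {

    // Repeated calls with the same id skip the expensive lookup; the result is stored
    // in the cache named "notices" (kept in Redis when Redis backs the cache manager).
    @Cacheable(value = "notices", key = "#noticeId")
    public String findNoticeBody(String noticeId) {
        return loadFromDatabase(noticeId);
    }

    // Placeholder for the expensive backend/database call.
    private String loadFromDatabase(String noticeId) {
        return "notice-" + noticeId;
    }
}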

Developed individual reusable microservices as small functional units; responsible for keeping the microservices network in sync and ensuring their smooth functionality.

Created auto deploy jobs using Jenkins and Maven.

Created and maintained various DevOps related tools for the team such as deployment tools and development and staging environments on AWS and Cloud.

Developed JUnit test cases for all the different layer implementations.

Used the Jenkins tool and made customizations according to the local build to track the build status.

Environment: Java 1.7/1.8, AWS, Microservices, MongoDB, Agile, JDBC, Hibernate, Spring Core, Spring AOP, Spring Boot, AngularJS, XML, SQL, Oracle, JUnit, jQuery, JSON, Log4j, Web Services, Maven, Jenkins, Unix, JAX-WS, JAX-RS.

WELLS FARGO, Raleigh, North Carolina Aug 2014 - Aug 2016

Project: Branch Foundation

Sr. Java Developer

Description: Branch Foundation primarily conducts data integration for international payments processing. Work includes partnering with the Oracle FLEXCUBE application to batch-process urgent and non-urgent payments.

Responsibilities:

Worked on managing and configuring build/deploy jobs on a continuous integration tool like Jenkins.

Worked on writing reusable/common shell scripts and managing packages on the Linux operating system.

Worked on writing internal wrapper APIs on top of cloud APIs like Amazon S3/SNS/SQS/SES using Spring Boot.

Used the ADF framework in developing and providing the services.

Knowledge of application clustering/load balancing concepts and technologies.

Worked on developing a highly performant distributed system that can process the millions of events generated by the upstream payment system clients.

Used ADF extension for personalization on cloud services.

Wrote Kafka client programs, including use of Kafka Streams and Kafka Connect.
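
A small Kafka Streams client sketch in the spirit of this bullet; the topic names, the "URGENT" marker, and the filtering rule are assumptions for illustration, not the actual payment flow.

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

// Reads payment events, keeps only the urgent ones, and writes them to a downstream topic.
public class UrgentPaymentsStream {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "urgent-payments-filter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> payments = builder.stream("payments-in");
        payments.filter((key, value) -> value.contains("URGENT"))
                .to("payments-urgent");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}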

Used Hibernate in data access layer to access and update information in the database.

Migrated the application from the WebSphere application server to Tomcat; used the Eclipse IDE and deployed the application on the Tomcat server.

Responsible for supporting and managing Apache Kafka in the production environment, with a deep understanding of Kafka operations including scaling services, managing partitions, and optimizing cluster rebalancing times.

Responsible for releases and packaging of the application to pre-production environments. Experience with containers and virtual environments (Vagrant, Docker, Kubernetes).

Conduct systems analysis and code reviews; no piece of code stays untested, and unit tests are written to cover all scenarios.

Worked on SQL(Oracle) and/or NoSQL databases (Redis, Cassandra) and developed common persistence services to ingest and vend data from the data stores. Also integrated with Spring.

Responsible for hands-on Java development using Java 7/8/9 on large-scale, full stack web application.

Developed highly performant RESTful web services using Spring MVC, Spring Core, and Spring Boot.

Responsible for developing systems using a messaging bus such as Kafka or RabbitMQ, and implementing both SOA and microservice architectures.

Environment: Java 1.7/1.8, AWS, Microservices, Agile, JDBC, JSP, Servlet, Hibernate, Spring Core, Spring AOP, Spring Boot, XML, SQL, Oracle, JUnit, AngularJS, JSON, Log4j, Web Services, Maven, Jenkins, Unix, JAX-WS, JAX-RS, Kafka, Cassandra.

Reliance Energy Ltd., Hyderabad, India July 2013 – July 2014

Java Developer

Responsibilities:

Implemented Struts framework based on the Model View Controller (MVC) design paradigm
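
A minimal Struts 2 action illustrating the MVC flow referred to above; the action class, its field, and the result mapping implied in the comments are hypothetical and would be wired up in struts.xml.

import com.opensymphony.xwork2.ActionSupport;

public class CustomerLookupAction extends ActionSupport {

    private String customerId;   // populated by Struts from the matching request parameter

    @Override
    public String execute() {
        // Business/service calls would go here; the returned result name ("success")
        // selects the JSP view configured for this action in struts.xml.
        return SUCCESS;
    }

    public String getCustomerId() { return customerId; }

    public void setCustomerId(String customerId) { this.customerId = customerId; }
}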

JUnit was used to implement test cases for beans in a test driven development (TDD) environment

Used J2EE Design Patterns (Session Façade)

Used AJAX components in developing UI

Designed and developed business components using Session and Entity Beans in EJB
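
A short sketch of the session bean plus entity pattern described here, using EJB 3.0-style annotations; CustomerAccount and AccountServiceBean are illustrative names, not the project's actual classes.

import javax.ejb.Stateless;
import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.Id;
import javax.persistence.PersistenceContext;

@Entity
class CustomerAccount {
    @Id
    private Long id;
    private String holderName;

    protected CustomerAccount() {
        // no-arg constructor required by JPA
    }

    public Long getId() { return id; }
    public String getHolderName() { return holderName; }
}

@Stateless
public class AccountServiceBean {

    @PersistenceContext
    private EntityManager entityManager;

    // Business methods on a stateless session bean run in container-managed transactions by default.
    public CustomerAccount findAccount(Long accountId) {
        return entityManager.find(CustomerAccount.class, accountId);
    }
}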

Worked on triggers and stored procedures on Oracle database

Worked on Eclipse IDE to write the code and integrate the application

Used Log4j as a debugging tool

Used CVS repository for version control

Environment: Java 1.5/1.6, JDBC, Java EE(EJB, Entity beans), JSP, Servlet, JPA, Struts, XML, SQL, Oracle, JUnit, Log4j, Web Services, Maven, Unix, JAX-WS, JAX-RS, Weblogic.

KCP Technologies Ltd, India July 2012 – June 2013

Software developer

Responsibilities:

Developed integrated end-to-end solutions for both corporate and consumers with a wide range of products and services delivered over a common Internet backbone Infrastructure.

Developed user interfaces for different task types.

Migrated the application from the WebSphere application server to Tomcat; used the Eclipse IDE and deployed the application on the Tomcat server.

Redesigned task management framework to support custom task types.

Migration and backward compatibility issues handled to support previous two releases.

Developed authorization panels that assist the administrator in assigning roles to users and actions to roles.

Developed translation panels to update language and translation specific information.

Developed a framework to support the creation of new reports and dashboards, and the activation and deactivation of reports.

Validated the application in order to maintain multi browser support (IE10, Chrome and Firefox).

Environment: Java 1.5/1.6, JDBC, JSP, Servlet, JPA, XML, SQL, Oracle


