
Java Developer Web Services

Location:
Prosper, TX
Posted:
March 01, 2024


Arun Kumar

Full Stack Java Developer

LinkedIn: linkedin.com/in/arun-kumar-438190218

Phone: 940-***-**** Email: ad31ia@r.postjobfree.com

PROFESSIONAL SUMMARY

IT professional with 10 years of experience as a full stack developer, with expertise in the design, development, analysis, testing, and implementation of distributed web-based and client/server applications using Java/J2EE technologies.

Have a good understanding of Agile and Waterfall methodologies.

Have worked on various service-oriented, client/server, GUI, and web-based applications over the years.

Expertise in the design and development of various web and enterprise applications using JPA, Hibernate, JSP, JavaScript, Servlets, JDBC, Web Services, JAX-WS, Axis, and RMI.

Expertise in AngularJS controllers, directives, factory and service resources, routings, and events.

Experience in using JDBC to connect to a database and perform operations.

Very good implementation knowledge and hands-on SOAP (JAX-WS) and RESTful (JAX-RS) web services.

Knowledge of using SoapUI and Postman to test RESTful web services.

Good knowledge of Spring Core and Spring Boot.

Experience in developing web service applications using SOAP and WSDL.

Experienced in GUI/IDE tools like IBM Rational Application Developer (RAD), Eclipse, and IntelliJ.

Experience in using Spring ORM module and integration with Hibernate ORM framework.

Strong Expertise in Core Java, data structures, algorithms, Object Oriented Design (OOD) and Java concepts such as OOP Concepts, Collections Framework, Exception Handling, I/O System, Multi-Threading, Reflection, Generics, Interfaces, Synchronization, and other new features in Java 7 and 8.
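
As a minimal, illustrative sketch of the Core Java skills listed above (not taken from any specific project), the following combines generics, the Collections Framework, and Java 8 streams; the class and method names are hypothetical:

```java
import java.util.*;
import java.util.stream.*;

// Hedged sketch: a small generic utility combining the Collections
// Framework, generics, and Java 8 streams. Names are illustrative.
public class FrequencyUtil {

    // Generic method: count occurrences of each element in any collection.
    public static <T> Map<T, Long> frequencies(Collection<T> items) {
        return items.stream()
                .collect(Collectors.groupingBy(x -> x, Collectors.counting()));
    }

    public static void main(String[] args) {
        List<String> words = Arrays.asList("java", "sql", "java", "spring");
        Map<String, Long> counts = frequencies(words);
        System.out.println(counts.get("java")); // 2
    }
}
```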

Developed GUI interfaces using HTML5, CSS3, JavaScript, jQuery, TypeScript, AngularJS, Backbone.js, JSP, and Servlets.

Good experience in writing SQL queries, stored procedures, functions, packages, tables, views, and triggers, and in data connectivity and data integration with Oracle.

Hands-on experience in relational databases like Oracle, MySQL, MS SQL Server, PostgreSQL using SQL, PL/SQL programming and NoSQL databases like MongoDB, Cassandra.

Proficient in designing, implementing, and managing data warehousing solutions using Snowflake, leveraging its unique architecture for scalable and high-performance analytics.

Strong command of SQL within Snowflake environment, adept at writing complex queries, optimizing query performance, and creating views, stored procedures, and user-defined functions (UDFs).

Test-driven programmer with thorough knowledge of unit testing with JUnit, web service testing with SoapUI and Postman, and performance testing with JMeter.

Experience in automated testing with Test Driven Development (TDD) in the Extreme Programming model.

Developed a robust data access layer abstraction in Angular applications, decoupling frontend components from backend data sources and providing a unified interface for CRUD operations.

Implemented security measures such as authentication, authorization, and data encryption in NoSQL databases, ensuring data confidentiality, integrity, and compliance with regulatory requirements.

Engineered algorithms and data structures tailored for real-time systems, ensuring low-latency processing and high throughput for critical applications such as financial trading platforms and online gaming servers.

Led the successful implementation of Camunda BPM (Business Process Management) in enterprise environments, including process analysis, modeling, design, configuration, and deployment.

Utilized Camunda BPM to automate complex business processes, resulting in increased efficiency, reduced operational costs, and improved process visibility.

Managed the deployment of Camunda BPM applications in various environments, including on-premises and cloud-based infrastructures, using modern DevOps practices such as CI/CD pipelines and containerization.

Implemented multi-threaded applications in Java, utilizing thread pooling, synchronization, and coordination mechanisms to improve performance and concurrency.
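
A minimal sketch of the thread pooling and coordination pattern described above (illustrative only, not code from a specific project): a fixed pool splits a summation across workers and joins the partial results via Futures.

```java
import java.util.List;
import java.util.concurrent.*;
import java.util.stream.LongStream;

// Hedged sketch: a fixed thread pool summing partial ranges concurrently,
// coordinating workers through Futures. Names are illustrative.
public class ParallelSum {

    public static long sum(long n, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            long chunk = n / threads;
            List<Future<Long>> parts = new java.util.ArrayList<>();
            for (int i = 0; i < threads; i++) {
                long lo = i * chunk + 1;
                long hi = (i == threads - 1) ? n : (i + 1) * chunk;
                // Each worker sums its own sub-range independently.
                parts.add(pool.submit(() -> LongStream.rangeClosed(lo, hi).sum()));
            }
            long total = 0;
            for (Future<Long> f : parts) total += f.get(); // waits for each worker
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(sum(1_000, 4)); // 500500
    }
}
```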

Developed real-time systems incorporating multi-threading concepts to handle concurrent tasks such as data processing, event handling, and communication.

Used SonarQube, a self-managed automatic code review tool, to systematically help deliver clean code.

Proficient in Java and Python programming languages for developing robust and scalable applications.

Experienced in Unix/Linux environment for system administration and development tasks.

Skilled in database management, including Oracle, with expertise in designing schemas, writing complex queries, and optimizing database performance.

Familiar with AWS (Amazon Web Services) cloud computing platform, including EC2, S3, and RDS, for deploying and managing applications.

Knowledgeable in big data technologies such as Hadoop and Spark for processing and analyzing large datasets efficiently. Worked on optimizing Python code for performance and scalability in real-time environments.

Collaborated with cross-functional teams to deliver real-time solutions that meet business requirements.

Proficient in Hive for querying and managing structured data in Hadoop ecosystem.

Experienced in developing applications using Spring framework, leveraging its features for building enterprise-grade solutions.

Led the end-to-end implementation of multiple AEM projects, including requirements gathering, architecture design, development, testing, and deployment phases.

Successfully integrated AEM with third-party systems such as CRM platforms, e-commerce solutions, and marketing automation tools to ensure seamless data flow and enhanced customer experiences.

Ensured high availability of applications by deploying them across multiple Kubernetes nodes and configuring load balancing with tools like Kubernetes Ingress or a service mesh.

Expert in coding using core java concepts like multithreading, collections, serialization, synchronization, exception handling and database connections.

Configured SOAP endpoints and operations, ensuring precise parameterization of SOAP messages through meticulous XML schema definitions, thereby enhancing interoperability and data consistency.

Spearheaded the implementation of SOAP-based communication protocols, establishing reliable and standardized interfaces for efficient data exchange across diverse systems and platforms.

Collaborated closely with cross-functional teams to design and deploy SOAP web services, leveraging industry best practices to ensure seamless integration and interoperability with existing systems.

Conducted unit testing, integration testing, and system testing for MuleSoft applications.

Collaborated with QA teams to validate the correctness and reliability of integration solutions.

Utilized Anypoint Studio to create robust integration flows.

Hands-on experience coding and building enterprise-level applications, ensuring high performance, scalability, and maintainability.

Implemented Apache Camel routes to facilitate seamless integration between legacy systems and modern applications, enabling data exchange and business process automation without disrupting existing infrastructure.

Developed reactive microservices using Spring Web Flux, enabling non-blocking, asynchronous communication between services and achieving high throughput and responsiveness in distributed systems.

Utilized Selenium Grid with Sauce Labs to scale test automation efforts and execute tests concurrently on multiple virtual machines.

Integrated Cypress for modern web application testing, leveraging its built-in features for fast, reliable, and deterministic test execution.

Implemented Cypress test suites to perform UI testing, API testing, and end-to-end testing of web applications with minimal setup and configuration.

Integrated Spring Reactive with reactive streams libraries like Reactor or RxJava, enabling seamless interoperability with external reactive APIs, frameworks, and data sources, and facilitating stream processing and composition.

Implemented Redux for managing application state in large-scale React applications, defining actions, reducers, and selectors to centralize and synchronize data across components, enabling predictable and scalable state management.

Configured Apache Camel routes to transform messages between different formats like XML to JSON and route them to appropriate endpoints based on content, headers, or routing rules, ensuring interoperability and flexibility in message processing.

Leveraged advanced SOAP features such as attachments, headers, and intermediaries to enhance the functionality and performance of SOAP-based communication channels, addressing complex business requirements and use cases.

Optimized SOAP message processing and transmission, employing techniques like compression, caching, and asynchronous messaging to improve scalability, latency, and throughput of SOAP-based interactions.

Designed Apache Camel routes to monitor directories, FTP servers, or cloud storage services for file transfers, processing incoming files, and routing them to downstream systems for further processing or archival.

Developed and implemented algorithms for various problem domains, optimizing for time and space complexity to achieve high performance and scalability.

Proficient in setting up Splunk Universal Forwarders to collect data from various sources.

Knowledgeable in designing Splunk architectures for high availability and scalability.

Led Splunk version upgrades and migrations, including planning, testing, and execution phases.

Implemented best practices for minimizing downtime and ensuring a smooth transition to new Splunk versions.

Utilized AI techniques to enhance data processing and analysis tasks, improving efficiency and accuracy.

Executed complex SQL queries for data retrieval, manipulation, and analysis across diverse datasets.

Developed and optimized Hive queries for efficient data transformation and analytics on large-scale datasets.

Implemented PySpark scripts to perform data processing and analytics tasks, leveraging distributed computing capabilities.

Applied regression and performance testing methodologies to validate the accuracy and efficiency of data models and algorithms.

Designed and implemented relational and dimensional data models for efficient data storage and retrieval.

Utilized Hadoop/HDFS and Spark ecosystems for scalable and distributed data processing and analytics.

Implemented Unix scripting solutions for automation and orchestration of data processing workflows.

Leveraged GCP and AWS cloud platforms for scalable and cost-effective data storage and processing solutions.

Utilized Kafka for real-time data streaming and event-driven architectures.

Implemented CI/CD pipelines for automated testing, deployment, and continuous integration of data applications.

Managed and optimized Couchbase NoSQL databases for high-performance data storage and retrieval.

Implemented HBase for distributed, scalable, and fault-tolerant storage of structured and semi-structured data.

Utilized Hadoop ecosystem tools like Hive and Spark for real-time data processing and analysis.

Optimized Hadoop cluster performance by fine-tuning configurations and resource allocation.

Documented upgrade procedures and provided training to team members on new features and changes.

Demonstrated expertise in a wide range of data structures such as arrays, linked lists, stacks, queues, trees, graphs, and hash tables, applying them appropriately to solve complex problems.
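
As one illustrative sketch of applying the data structures above (a stack plus a hash table), the classic balanced-brackets check can be written with pure JDK types; the class name is hypothetical:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Map;

// Hedged sketch: balanced-brackets check using a stack (ArrayDeque)
// and a hash table (Map). Names are illustrative.
public class Brackets {

    private static final Map<Character, Character> PAIRS =
            Map.of(')', '(', ']', '[', '}', '{');

    public static boolean isBalanced(String s) {
        Deque<Character> stack = new ArrayDeque<>();
        for (char c : s.toCharArray()) {
            if (c == '(' || c == '[' || c == '{') {
                stack.push(c);
            } else if (PAIRS.containsKey(c)) {
                // A closer must match the most recent unclosed opener.
                if (stack.isEmpty() || stack.pop() != PAIRS.get(c)) return false;
            }
        }
        return stack.isEmpty();
    }

    public static void main(String[] args) {
        System.out.println(isBalanced("{[()]}")); // true
        System.out.println(isBalanced("{[(])}")); // false
    }
}
```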

Successfully solved algorithmic challenges on platforms like LeetCode, HackerRank, and Codeforces, showcasing proficiency in algorithm design, analysis, and optimization.

Utilized Anypoint MQ for asynchronous messaging and event-driven architectures.

Developed RESTful and SOAP APIs using MuleSoft to expose and consume services.

Proficient with the Amazon Web Services (AWS) cloud platform and its features, including EC2, SNS, EBS, S3, Lambda, CloudWatch, CloudTrail, and IAM security groups, utilized across dev, test, and production environments.

Experience in using ClearQuest, ClearCase, CVS, and Subversion (SVN) for source control and release management.

Experience in working with FIX Protocol to create both admin and application messages that are interchanged between client and brokers.

Used Log Insight for monitoring the error logs and fixing the problems.

Strong experience working with version control software like Bitbucket and Git, and with build and CI tools Ant, Maven, Gradle, and Jenkins.

Developed complex Hive queries for real-time data processing and analytics.

Optimized Hive queries for performance improvement and efficiency.

Implemented Hive partitioning and bucketing strategies to enhance query performance.

Worked with large-scale datasets in Hive, managing data ingestion and transformation processes.

Collaborated with data engineers and analysts to design and implement Hive-based solutions for business requirements.

Experience in setting up build and deployment automation for Terraform scripts using Jenkins.

Experience in designing application architecture based on MVC for design, data validation, and database access.

Practiced Behavior-Driven Development (BDD) methodologies with Jasmine, collaborating with stakeholders to define test scenarios and specifications, and translating them into executable test cases.

Experience in Requirement Analysis, Design, Development, Testing and Documentation of Software Development Life Cycle (SDLC).

Eagerness and willingness to learn new technologies and adapt to evolving industry trends.

Effective communication skills, enabling seamless collaboration with cross-functional teams and stakeholders.

A team player at all levels with the ability to perform under pressure. Excellent communication and interpersonal skills. Experienced in Agile/Scrum methodologies.

TECHNICAL SKILLS

Programming Languages

Java, C, SQL.

Software Methodologies

Agile, Scrum, Waterfall (SDLC).

J2EE Technologies

Servlets, JSP, JDBC, EJB, JSF, JavaBeans, XSD, JAX-RS, JAX-WS, SOAP, WSDL, JMS

Web Technologies

HTML, DHTML, XML, CSS, jQuery, JavaScript, TypeScript, AJAX, Bootstrap, JSON, Angular, React, Node.js, Express.js.

Frameworks

Spring, Spring MVC, Spring IOC, Spring Boot, Spring AOP, Spring Web Flow, Hibernate, JPA, Struts, JSF, Log4J, JUnit, Apache

Web Services

SOAP, RESTful.

Databases

SQL, MySQL, MongoDB, Oracle, Cassandra.

O-R Mapping

Hibernate, JPA

Testing Tools/Others

JUnit, Mockito, Log4J, Postman, Swagger.

Build Tools

Maven 3.5.0, Ant, Gradle.

Version Control Tools

Git, GitHub 2.12.0, SVN, Bitbucket.

Application/Web Servers

Apache Tomcat, WebLogic Server, WebSphere Application Server, JBoss.

IDE Tools

Spring Tool Suite, Eclipse, IntelliJ, NetBeans.

Operating Systems

Windows, Linux, Unix, macOS

Cloud Computing

Azure, AWS, GCP

EDUCATION

Bachelor’s in computer science engineering. 2007 - 2011

Master’s in computer science 2012 – 2013

PROFESSIONAL EXPERIENCE

Client: Deloitte, Harrisburg, PA. Sep 2022 – Present

Role: Full Stack Java Developer

Responsibilities

Actively participated in requirement gathering and story pointing within sprints, following the Agile methodology.

Developed the application using the Spring Framework, which leverages a model-view-controller architecture, and configured Dependency Injection. Implemented cloud-native applications using Spring Cloud.

Involved in the Development of Spring Framework Controllers.

Developed the application using the Spring Core module and POJOs in Eclipse, and worked on messaging services.

Used HTML, CSS, TypeScript, JavaScript, jQuery, and Ajax for developing the web interface.

Developed the user interface using JSP, JSP tag libraries (JSTL), HTML5, CSS, and JavaScript to simplify the complexities of the application.

Spearheaded the development of modern web applications using Angular 13, leveraging its latest features and enhancements to deliver responsive, dynamic, and highly interactive user interfaces.

Designed and developed RESTful style Web Service layer.

Integrated Spring DAO for data access using Hibernate.

Used Hibernate ORM tools which automate the mapping between SQL databases and objects in Java.

Created hibernate mapping files to map POJO to DB tables.

Wrote SQL queries for JDBC prepared statement and used callable statements to call stored procedures.

Implemented unit tests for Angular components using the Karma and Jasmine frameworks.

Developed comprehensive test suites using Jasmine for unit testing JavaScript and TypeScript code, ensuring the quality and reliability of software products.

Implemented AI algorithms within a Big Data environment, leveraging technologies like PySpark and Hive for data processing and analysis.

Implemented SOAP message handlers to intercept, process, and manipulate SOAP messages at various stages of the communication lifecycle, enhancing functionality and extensibility of the SOAP-based infrastructure.

Conducted regression and performance testing on Big Data systems, including Hive and Spark clusters, to ensure optimal performance and scalability.

Designed and implemented relational and dimensional data models for large-scale data warehouses, incorporating concepts like normalization and star schema.

Utilized Hadoop and HDFS for distributed storage and processing of large datasets, optimizing data workflows and job performance.

Integrated SonarQube with CI/CD pipelines (such as Jenkins, GitLab CI/CD, or Azure DevOps) to automate code quality checks as part of the development workflow.

Conducted code reviews and static code analysis using SonarQube to identify and remediate code smells, bugs, vulnerabilities, and security vulnerabilities.

Defined and enforced quality gates in SonarQube to maintain code quality standards and ensure adherence to best practices across projects.

Developed and maintained Unix scripts to automate data pipeline processes, enhancing efficiency and reducing manual intervention.

Leveraged Couchbase and HBase for real-time data storage and retrieval, ensuring high availability and scalability of distributed databases.

Collaborated with cross-functional teams to integrate AI models and analytics solutions into existing data platforms, driving actionable insights and business value.

Participated in code reviews and provided technical expertise in areas like PySpark optimization, Hive query performance tuning, and data modeling best practices.

Mentored team members on Big Data technologies and best practices, fostering knowledge sharing and skill development within the organization.

Utilized TCP/IP networking protocols for real-time communication between distributed systems, implementing socket programming in Java.

Developed client-server applications using TCP/IP sockets for real-time data exchange, ensuring reliable and efficient communication over networks.
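
The TCP socket programming pattern described above can be sketched with pure JDK networking; this is a minimal loopback echo exchange for illustration, not code from a specific project:

```java
import java.io.*;
import java.net.*;

// Hedged sketch: one-line TCP echo over the loopback interface,
// showing client-server socket programming. Names are illustrative.
public class EchoDemo {

    public static String roundTrip(String message) throws IOException, InterruptedException {
        try (ServerSocket server = new ServerSocket(0)) { // 0 = any free port
            Thread serverThread = new Thread(() -> {
                try (Socket client = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(client.getInputStream()));
                     PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                    out.println(in.readLine()); // echo one line back
                } catch (IOException ignored) { }
            });
            serverThread.start();

            try (Socket socket = new Socket("127.0.0.1", server.getLocalPort());
                 PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(socket.getInputStream()))) {
                out.println(message);
                String reply = in.readLine();
                serverThread.join();
                return reply;
            }
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("ping")); // ping
    }
}
```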

Using Apache Axis, developed a set of SOAP web services to expose payment processing functionality such as authorizing payments, capturing transactions, and handling refunds; defined SOAP endpoints and operations to represent each service method and parameterized the SOAP messages using XML schema definitions.

Implemented WS-Security standards to encrypt and sign SOAP messages, ensuring confidentiality, integrity, and authentication of data exchanged with the payment gateway. Integrated with the payment gateway's authentication and authorization mechanisms, such as OAuth or mutual SSL, to establish a secure communication channel.

Contributed to the continuous improvement of data architecture and infrastructure, evaluating emerging technologies and recommending strategic enhancements.

Implemented asynchronous tests in Jasmine using techniques like asynchronous callbacks, promises, and async/await syntax, ensuring proper handling of asynchronous operations and timing issues.

Used JavaScript frameworks to develop the web UI with AngularJS, using Node Package Manager (NPM) for the web application.

Closely worked with Kafka Admin team to set up Kafka cluster setup on the QA and Production environments.

Implemented backpressure handling strategies in Spring Reactive applications to control the flow of data between producers and consumers, preventing resource exhaustion and overload in high-load scenarios.

Developed Apache Camel routes to consume and expose RESTful APIs, handling requests/responses, authentication, and data transformation to enable communication between diverse systems and services.

Implemented asynchronous messaging patterns using Apache Camel with ActiveMQ, enabling reliable, scalable, and decoupled communication between distributed systems and applications.

Scaled NoSQL databases horizontally to accommodate growing data volumes and user loads, adding and removing nodes dynamically and rebalancing data distribution across clusters.

Implemented high availability and fault tolerance mechanisms in NoSQL databases using features like replication, sharding, and clustering, ensuring data availability and resilience against failures.

Knowledgeable in Kibana and Elasticsearch for identifying Kafka message failure scenarios.

Used Docker, Spring Boot, JBoss, Azure, and Cassandra.

Integrated NoSQL databases such as MongoDB or Couchbase with Angular applications, leveraging their flexible schema design and horizontal scalability to store and retrieve data efficiently.

Developed custom Splunk apps and add-ons to extend Splunk's functionality according to business requirements.

Integrated Spark with Kafka for real-time data ingestion and processing.

Designed and implemented fault-tolerant Spark streaming applications using checkpointing and stateful processing.

Worked on Spark Structured Streaming for processing and analyzing continuous streams of data.

Utilized Spark MLlib for real-time machine learning tasks, such as model training and inference on streaming data.

Orchestrated the design and implementation of SOAP-based APIs, adhering to industry standards and best practices to facilitate interoperability and data exchange with external partners and systems.

Collaborated with stakeholders to define SOAP service contracts, specifying operations, message formats, and error handling mechanisms to ensure clarity and consistency in communication protocols.

Troubleshot and resolved performance bottlenecks in Spark jobs to ensure efficient processing of real-time data.

Developed and maintained real-time data processing pipelines using Hive and Apache Kafka, ensuring high throughput and low latency.

Implemented optimizations in Hive queries and data models, resulting in significant performance improvements.

Implemented security measures and access controls for Hive tables and databases.

Integrated Hive with other big data technologies such as Hadoop, HDFS, and Apache Spark.

Created and maintained documentation for Hive schemas, queries, and data pipelines.

Participated in code reviews and provided constructive feedback for Hive scripts and queries.

Contributed to the design and architecture of data storage and processing solutions using Hive.

Conducted performance tuning and troubleshooting for Hive queries and jobs.

Utilized HiveQL for data manipulation and querying tasks.

Implemented security measures and access controls within Hive environments.

Integrated Hive with other Hadoop ecosystem tools for end-to-end data processing pipelines.

Familiarity with Splunk's development frameworks, including Splunk SDKs and REST APIs.

Implemented offline data persistence in Angular applications using NoSQL databases with client-side caching and synchronization mechanisms, ensuring data availability and consistency in unreliable network conditions.

Utilized AEM's targeting and segmentation capabilities to create personalized content experiences based on user behavior, demographics, and preferences.

Created custom visualizations and data models to enhance data analysis and visualization capabilities.

Configured and optimized Splunk ITSI for monitoring IT infrastructure, applications, and services.

Designed and implemented workflows using Camunda BPM to streamline business operations, manage task assignments, and enforce business rules and policies.

Developed custom plugins, extensions, and integrations for Camunda BPM to meet specific business requirements and enhance functionality.

Created comprehensive documentation including process diagrams, technical specifications, user guides, and training materials to facilitate knowledge transfer and ensure effective use of Camunda BPM within the organization.

Developed service-oriented dashboards and KPIs to provide insights into the health and performance of IT services.

Implemented real-time data processing solutions using Hadoop and Spark technologies.

Developed and implemented real-time data processing solutions using Hadoop ecosystem technologies.

Enhanced system performance by optimizing MapReduce jobs and fine-tuning Hadoop cluster configurations.

Collaborated closely with the payment gateway provider's technical team to exchange WSDL files and establish the contract between the two systems, following best practices for versioning and documentation of the SOAP web services to facilitate future updates and maintenance.

Collaborated with cross-functional teams to design and deploy scalable and fault-tolerant data pipelines.

Conducted troubleshooting and debugging of Hadoop applications to ensure smooth operation and minimal downtime.

Integrated Splunk ITSI with ITSM tools for end-to-end visibility and monitoring of IT operations.

Implemented various search algorithms (e.g., binary search, breadth-first search, depth-first search) and sorting algorithms (e.g., quicksort, mergesort) for efficient data retrieval and organization.
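
One of the search algorithms named above, binary search, can be sketched as follows (an illustrative, self-contained example; the class name is hypothetical):

```java
// Hedged sketch: iterative binary search over a sorted array, returning
// the index of target or -1 if absent. Names are illustrative.
public class Search {

    public static int binarySearch(int[] sorted, int target) {
        int lo = 0, hi = sorted.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2; // avoids overflow of (lo + hi)
            if (sorted[mid] == target) return mid;
            if (sorted[mid] < target) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] data = {2, 3, 5, 7, 11, 13};
        System.out.println(binarySearch(data, 7)); // 3
        System.out.println(binarySearch(data, 4)); // -1
    }
}
```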

Leveraged graph algorithms (e.g., Dijkstra's algorithm, Floyd-Warshall algorithm) to solve problems involving network analysis, shortest path finding, and connectivity analysis in diverse applications.

Contributed code enhancements and bug fixes to open-source projects, demonstrating practical application of algorithmic and data structure knowledge in real-world software development.

Hands-on experience in administering Snowflake accounts, managing users, roles, and privileges, and configuring security policies to ensure data protection and compliance.

Proven track record in performance tuning and optimization within Snowflake, including query optimization, resource utilization monitoring, and fine-tuning of virtual warehouses for optimal performance.

Proficiency in Apache Camel framework, leveraging its capabilities for integration purposes, message routing, and transformation.

Developed data streaming pipelines using algorithms and data structures optimized for handling continuous streams of data in real-time, enabling real-time analytics, monitoring, and decision-making in dynamic environments.
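
A minimal sketch of the kind of stream-optimized structure described above: a fixed-size sliding-window average with O(1) updates, suitable for real-time monitoring. The window size and values are illustrative assumptions, not from a specific pipeline:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hedged sketch: fixed-size sliding-window average over a data stream.
// Names and parameters are illustrative.
public class SlidingAverage {

    private final int capacity;
    private final Deque<Double> window = new ArrayDeque<>();
    private double sum = 0.0;

    public SlidingAverage(int capacity) { this.capacity = capacity; }

    // O(1) per element: evict the oldest reading once the window is full.
    public double add(double value) {
        window.addLast(value);
        sum += value;
        if (window.size() > capacity) sum -= window.removeFirst();
        return sum / window.size();
    }

    public static void main(String[] args) {
        SlidingAverage avg = new SlidingAverage(3);
        avg.add(1); avg.add(2); avg.add(3);
        System.out.println(avg.add(4)); // (2 + 3 + 4) / 3 = 3.0
    }
}
```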

Deployed applications on Azure and monitored them using AppDynamics.

Designed and developed scalable, secure, and highly available web applications using Java and GCP technologies.

Installed (in GUI and silent modes), configured, and maintained IBM WebSphere Network Deployment Server.

Deployed EARs, WARs in the WebSphere Application Servers.

Implemented a generic interface to Hibernate criteria API for UI search functionality.

Involved in designing and implementing caching functionality for developing and accessing the database using Elasticsearch.

Configured security measures in MuleSoft, including OAuth, SSL, and encryption.

Designed and developed MuleSoft integration solutions for seamless communication between applications and systems.

Implemented error handling mechanisms in MQ to manage and log errors during message processing.

Moved the existing monolithic architecture to microservices and migrated them to the AWS cloud.

Managed AWS infrastructure as code using Terraform.

Currently working on the FitNesse tool, which aids in Test-Driven Development (TDD).

Environment: Java 17, J2EE, JSP 2.3, PL/SQL, AngularJS 1.8, Spring Framework 5.0, Spring Boot, EJB 3.2, Kafka, Angular 13, JMS, JNDI, Oracle 21, XML, SOAP, JUnit 4, Apache Camel, WebSphere 8.5, GCP, Hibernate 5.4, Microservices, AWS, Azure, Elasticsearch, Terraform, JDBC, MS SQL Server 2019, JESS, RESTful web services, SOA design patterns, Jira, MySQL, Cassandra, LDAP, NoSQL

Client: San Bernardino County, San Bernardino, CA Feb 2021 – Aug 2022

Role: Full Stack Java Developer

Responsibilities

Used Spring Boot for the application development.

Used Spring MVC and Dependency Injection for handling presentation and business logic.

Consumed Web Services to interact with other external interfaces to exchange the data in different forms by using RESTful service.

Implemented RESTful and SOAP based Web Services and used Soap UI for testing.

To maintain loose coupling between layers, published the business layer as services and injected necessary dependent components using Spring IOC, and published cross-cutting concerns like logging, user interface exceptions, and transactions using Spring AOP.

Developed user-friendly GUI interfaces and web pages with client-side validation using React 17, HTML, SCSS, and CSS.

Created RDS servers for data storage and EC2 instances on AWS for running backend servers.

Worked on the orchestration layer for communicating between the Microservices and persisting the data in the Cassandra database.

Performed deployment of applications on AWS.

Worked on Amazon AWS cloud services, including writing Lambda functions for producing data from different sources, S3 for storing data, SNS, SQS, RDS, IAM for security, Cloud


