
Real-Time Backend Developer

Location: Columbia, MD, 21046
Posted: February 25, 2025


Resume:

Uday Chaitanya

Java Backend Developer

Email: ****************@*****.*** Phone: 667-***-****

SUMMARY:

Java Backend Developer with 7 years of experience in designing, developing, and implementing scalable, high-performance, and secure enterprise-level applications using Java/J2EE technologies.

Developed distributed data processing applications using Apache Spark (RDDs, DataFrames, and Datasets) for batch and real-time analytics.

Optimized Spark jobs by tuning executors, partitions, and caching strategies, reducing processing time by 30-50% for large-scale datasets (see the sketch following this summary).

Developed dynamic front-end components using React, integrating seamlessly with RESTful APIs for real-time data updates.

Worked on state management in React using Redux, improving UI responsiveness and user experience.

Implemented Spark Streaming for real-time data ingestion and processing, ensuring low-latency analytics and real-time decision-making.

Built and deployed ETL pipelines with Apache Spark and Kafka, improving the efficiency of data ingestion into NoSQL databases.

Integrated Cassandra as a scalable data store for Spark applications, ensuring high availability and fast retrieval of large datasets.

Designed column-family data models in Cassandra, optimizing read/write performance for transactional and analytical workloads.

Developed and optimized big data processing pipelines using Apache Spark, ensuring efficient handling of large-scale data transformations.

Worked with digital media formats such as MP4, HLS, and DASH, optimizing video streaming performance in cloud environments.

Implemented video transcoding and adaptive bitrate streaming solutions for improved media delivery.

Designed and implemented microservices architecture using Java Spring Boot, enhancing system modularity and scalability.

Built and maintained real-time data processing pipelines leveraging Kafka for event-driven architectures.

Worked extensively with NoSQL and search technologies like Cassandra and Solr, optimizing data storage and retrieval for high-performance applications.

Designed and developed scalable, multi-tier web applications in Java and Python, integrating with data platforms to support analytics-driven solutions.

Optimized SQL and NoSQL queries, ensuring high performance in large-scale datasets stored in Cassandra, Solr, and Azure CosmosDB.

Designed and developed highly available, scalable systems, ensuring low-latency data processing for analytics and reporting.

Implemented CI/CD pipelines for automated testing and deployment, streamlining development and release cycles.

Led performance tuning and debugging efforts in big data applications, identifying and resolving bottlenecks in Spark jobs and Kafka consumers.

Integrated GCP Big Data technologies like BigQuery and Dataflow, supporting large-scale analytics and machine learning workloads.

Followed Agile methodologies, participating in daily stand-ups, sprint planning, and code reviews to ensure high-quality development.

Led the development and planning of web applications that interact with large-scale data lakes and data platforms, ensuring robust and efficient data processing.

Hands-on expertise in developing RESTful APIs, implementing Object-Oriented Programming (OOP) principles, and optimizing performance for backend systems.

Built RESTful web services using Java (J2EE) and Python (Flask/Django) to facilitate seamless data integration and enable low-code/no-code experimentation platforms.

Experienced in containerization and orchestration using Docker, Kubernetes, and OpenShift, with a strong focus on deploying and scaling Spring Boot applications.

Adept in Front-End Technologies, including HTML, CSS, Bootstrap, JavaScript, and frameworks like Angular, ensuring seamless integration between frontend and backend.

Extensive experience in database design and integration, working with relational databases such as Oracle, MySQL, PostgreSQL, and NoSQL databases.

Proficient in implementing multi-threaded programming with the Executor Framework, optimizing concurrent processing and enhancing application performance.

Skilled in version control systems like Git and implementing CI/CD pipelines using Jenkins, automating build, test, and deployment processes.

Experienced in cloud-native applications, leveraging tools like AWS EC2, S3, and Kubernetes YAML configurations for efficient deployments and scaling.

Adept at following Agile methodologies and contributing to sprints with continuous improvement through collaborative teamwork.

Skilled in monitoring and logging solutions using the ELK Stack (Elasticsearch, Logstash, Kibana) and Prometheus, enabling proactive issue detection and resolution.

Proficient in integrating messaging systems with JMS, Kafka, and RabbitMQ for real-time data processing and asynchronous communication.
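
The Spark optimization work summarized above can be illustrated with a minimal Java sketch of a batch job that repartitions on the join key, broadcasts a small dimension table, and caches a reused result. The paths, column names, and partition counts below are hypothetical placeholders rather than values from any actual project.

```java
import static org.apache.spark.sql.functions.broadcast;
import static org.apache.spark.sql.functions.col;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class OrderEventBatchJob {
    public static void main(String[] args) {
        // Executor cores/memory are normally passed via spark-submit; shuffle parallelism is tuned here.
        SparkSession spark = SparkSession.builder()
                .appName("order-event-batch")
                .config("spark.sql.shuffle.partitions", "200")
                .getOrCreate();

        // Large fact table: repartition on the join key so the shuffle is evenly distributed.
        Dataset<Row> events = spark.read().parquet("/data/order_events")       // placeholder path
                .repartition(200, col("customer_id"));

        // Small dimension table: broadcast it so the join avoids shuffling the large side.
        Dataset<Row> customers = spark.read().parquet("/data/customers");      // placeholder path

        Dataset<Row> enriched = events.join(broadcast(customers), "customer_id");

        // Cache the enriched result when it will be reused by multiple downstream actions.
        enriched.cache();

        enriched.groupBy("customer_id").count()
                .write().mode("overwrite").parquet("/data/out/order_counts");  // placeholder path

        spark.stop();
    }
}
```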

TECHNICAL SKILLS:

Programming Languages

Java 17, Java 11, Java 8, Core Java, J2EE, XML, SQL, PL/SQL

Middleware

J2EE, JBoss 4.2.3/4.3.0

Web Development

RESTful APIs, JSON, XML, JSP/Servlets

Cloud Technologies

AWS (EC2, S3, RDS, Lambda), Microsoft Azure

Scripting Languages

UNIX Shell Scripting

Frameworks and Libraries

Spring Boot, Hibernate, Spring Security, Apache Kafka

Monitoring and Testing Tools

Kibana, Grafana, JUnit, Mockito, Sonar, Fortify

IDEs/Utilities

Eclipse, PuTTY, IntelliJ, NetBeans

Databases

MySQL, PostgreSQL, MongoDB, Redis, Oracle 10g/11g

Build & Project Tools

Maven, Gradle, Bamboo, JIRA, SQL Developer

Process & Concepts

Agile, SCRUM, SDLC, Object-Oriented Analysis and Design, Test-Driven Development, Continuous Integration

Containerization Tools

Docker, Kubernetes, OpenShift

Version Control

Git, GitHub, GitLab

Methodologies

Agile, Waterfall, TDD (Test Driven Development)

CI/CD Tools

Jenkins, Docker, Kubernetes

Testing Tools

HtmlUnit, Jasmine, Mockito, BDD, API testing, Selenium

Servers

Tomcat, WebSphere, App Services

Other Skills

Microservices Architecture, Database Optimization, API Design, Agile/Scrum Methodologies

PROFESSIONAL EXPERIENCE:

Client: Procal Technologies 08/2023 – 04/2024

Project: Scalable Order Management System for E-Commerce

Role: Java Backend Developer

Location: New Jersey

Responsibilities:

Involved in all phases of the Software Development Life Cycle (SDLC), including analysis, design, development, testing, and deployment of the scalable order management system.

Proficient in developing scalable APIs using Spring Boot and Kotlin, ensuring seamless integration and efficient performance.

Implemented Azure-based cloud solutions, utilizing services like Azure Data Factory, Azure Functions, and CosmosDB for scalable data processing.

Implemented secondary indexes and materialized views in Cassandra to enhance query performance and reduce processing overhead.

Developed full-text search and indexing solutions using Apache Solr, improving search capabilities for large-scale unstructured data.

Designed and implemented high-performance NoSQL data models in Cassandra, optimizing read and write patterns for scalability and low-latency access.

Developed custom data partitioning and replication strategies in Cassandra to ensure fault tolerance and high availability in distributed environments.

Implemented Cassandra secondary indexes, materialized views, and custom queries to enhance searchability and data retrieval efficiency.

Debugged and optimized Cassandra query performance using CQL tracing, compaction strategies, and read/write consistency tuning.

Tuned Solr indexing and query performance, implementing sharding, replication, and caching strategies for high availability.

Integrated Solr with Spark for distributed search analytics, enabling real-time insights and intelligent search recommendations.

Leveraged Solr faceting and query parsing to build advanced search filters and ranking algorithms for enterprise applications.

Developed GCP Big Data solutions, integrating BigQuery and Dataflow for data analytics and processing.

Designed and optimized SQL and NoSQL queries, improving database performance and query execution times.

Strong expertise in object-oriented programming with experience in Kotlin and other languages like Java and C#.

Hands-on experience with Azure Cloud Services, including Azure Functions and Azure PaaS, for deploying and managing cloud-native applications.

Designed and developed RESTful APIs using Spring Boot for managing order creation, payment processing, and inventory updates.

Implemented database interactions using Hibernate and optimized complex SQL queries to enhance database performance.

Designed and optimized NoSQL data models to handle structured, semi-structured, and unstructured data efficiently.

Experience with data pipeline orchestration tools such as Apache Airflow to manage and automate data workflows efficiently.

Integrated machine learning models into data pipelines, leveraging technologies such as Spark and Kafka to enhance data processing capabilities.

Integrated Apache Kafka for asynchronous communication, enabling real-time updates to inventory and shipping services (see the sketch following this responsibilities list).

Improved system reliability and performance by implementing caching mechanisms using Redis, ensuring quick access to high-frequency data (see the caching sketch at the end of this project entry).

Containerized the application using Docker and deployed it on Kubernetes clusters, ensuring scalability and fault tolerance during peak traffic.

Automated build, test, and deployment workflows using Jenkins, reducing manual intervention and deployment time by 40%.

Adept in conducting coding exercises, showcasing expertise in writing clean, maintainable, and efficient code following best practices.

Developed and implemented robust error handling and logging mechanisms to monitor API performance and streamline troubleshooting.

Collaborated with front-end developers and QA teams to ensure seamless integration of backend services with the user interface.

Actively debugged and resolved production issues, providing 24x7 support to maintain uninterrupted operations.

Created proof of concepts (POC) to evaluate new tools and technologies for future scalability and performance enhancements.

Followed best coding practices using Java, Spring Boot, and Microservices architecture, ensuring maintainable and high-quality code.
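
As referenced in the Kafka bullet above, the following is a minimal sketch of how an order-creation endpoint might publish an event for downstream inventory and shipping services, assuming Spring Boot with spring-kafka. The controller, the order-events topic name, and the raw JSON payload are hypothetical simplifications; persistence via Hibernate is elided.

```java
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/orders")
public class OrderController {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public OrderController(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Accepts an order and publishes an event so inventory and shipping services can react asynchronously.
    // Persistence via JPA/Hibernate would normally happen here before the event is published.
    @PostMapping
    public ResponseEntity<String> createOrder(@RequestBody String orderJson) {
        kafkaTemplate.send("order-events", orderJson);   // hypothetical topic name
        return ResponseEntity.accepted().body("Order received");
    }
}
```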

Impact:

Reduced order processing time by 50% through optimized APIs and asynchronous service processing.

Enhanced system scalability, supporting a 200% increase in traffic during high-demand periods like flash sales.

Achieved 99.9% system uptime by implementing proactive monitoring and containerized deployments.

Environment: Java, Spring Boot, Hibernate, RESTful APIs, MySQL, Redis, Apache Kafka, Docker, Kubernetes, Jenkins.
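
The Redis caching noted in this project can be sketched with Spring's cache abstraction, assuming spring-boot-starter-data-redis, @EnableCaching, and a Redis-backed CacheManager are configured. The service, cache name, and data shape are illustrative only.

```java
import java.io.Serializable;

import org.springframework.cache.annotation.CacheEvict;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

@Service
public class ProductCacheService {

    // Hot product lookups are served from Redis after the first read.
    @Cacheable(value = "products", key = "#productId")
    public ProductView getProduct(String productId) {
        return fetchFromDatabase(productId);              // placeholder for a repository call
    }

    // Evict the cached entry whenever stock changes so readers never see stale data.
    @CacheEvict(value = "products", key = "#productId")
    public void updateStock(String productId, int newQuantity) {
        writeToDatabase(productId, newQuantity);          // placeholder for a repository call
    }

    private ProductView fetchFromDatabase(String id) { return new ProductView(id, 0); }

    private void writeToDatabase(String id, int quantity) { }

    // Default Redis cache serialization uses JDK serialization, hence Serializable.
    public static class ProductView implements Serializable {
        public final String id;
        public final int stock;

        public ProductView(String id, int stock) {
            this.id = id;
            this.stock = stock;
        }
    }
}
```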

Client: Infosys 08/2020 – 09/2022

Project: Employee Management System for Enterprise HR Operations

Role: Java Backend Developer

Location: Hyderabad, India

Responsibilities:

Involved in all phases of the Software Development Life Cycle (SDLC), including requirement gathering, analysis, design, development, testing, and deployment of the Employee Management System.

Developed responsive front-end interfaces using JavaScript technologies such as Angular and Node.js, ensuring an optimal user experience for data analytics applications.

Spearheaded the creation of new services and system enhancements, ensuring high availability and fault tolerance.

Conducted performance tuning and debugging, identifying and resolving bottlenecks in distributed data systems.

Developed Cassandra-based caching and indexing layers, improving query performance by 30-40% for large-scale distributed applications (see the Cassandra sketch at the end of this project entry).

Integrated Cassandra with Apache Spark, enabling real-time data processing and analytics for large datasets.

Designed and implemented Cassandra schema evolution strategies, ensuring backward compatibility and seamless database migrations.

Utilized Cassandra logging, diagnostics, and monitoring tools to troubleshoot performance bottlenecks and optimize database queries.

Worked in an Agile environment, collaborating with cross-functional teams to deliver high-quality, production-ready code.

Worked extensively with SQL databases to perform data manipulation, query optimization, and integrate back-end services with relational databases such as PostgreSQL and MySQL.

Experienced in queue-based messaging systems like RabbitMQ for efficient asynchronous communication and event-driven architectures.

Implemented AWS cloud solutions using services like EC2, Lambda, S3, and RDS, ensuring scalability, security, and high availability of web applications.

In-depth knowledge of Linux/Unix environments, with proven ability to develop, deploy, and troubleshoot applications on these platforms.

Developed and maintained data-driven web applications for analytics and compliance purposes, supporting business needs and regulatory requirements.

Designed and developed backend services using Spring Boot and Hibernate to manage employee records, roles, and payroll processing efficiently.

Implemented Spring Security for secure access control, ensuring data protection and compliance with industry security standards (see the sketch following this responsibilities list).

Developed cross-platform mobile applications using React Native for Android, ensuring responsiveness and a smooth user experience.

Built and exposed RESTful APIs to enable seamless integration with front-end applications and third-party systems like payroll services.

Developed and optimized .NET and Kotlin-based applications, demonstrating proficiency in full-stack development.

Extensive experience in API development and testing, with a focus on RESTful services and microservices architecture.

Optimized database performance in PostgreSQL by creating efficient SQL queries, utilizing indexing, and ensuring minimal query latency.

Containerized the application using Docker, ensuring consistency across development, testing, and production environments.

Automated CI/CD pipelines with Jenkins, reducing manual intervention and ensuring smooth and consistent deployments.

Deployed the application on AWS EC2 and used S3 for secure document storage, including contracts and payroll-related files.

Monitored application performance and tracked error rates using AWS CloudWatch, improving response times by 30% through proactive optimization.

Provided technical support during production deployments and actively resolved critical issues to ensure system reliability.

Collaborated with cross-functional teams to streamline HR workflows and enhance system usability.
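
A minimal sketch of the role-based access control described in the Spring Security bullet above, written against Spring Security's current SecurityFilterChain style (the project itself may have used the earlier WebSecurityConfigurerAdapter API). The endpoint paths and role names are hypothetical.

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.Customizer;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity;
import org.springframework.security.web.SecurityFilterChain;

@Configuration
@EnableWebSecurity
public class SecurityConfig {

    // Role-based access control for the HR endpoints; paths and roles are illustrative only.
    @Bean
    public SecurityFilterChain filterChain(HttpSecurity http) throws Exception {
        http
            .authorizeHttpRequests(auth -> auth
                .requestMatchers("/api/payroll/**").hasRole("HR_ADMIN")                 // hypothetical role
                .requestMatchers("/api/employees/**").hasAnyRole("HR_ADMIN", "MANAGER") // hypothetical roles
                .anyRequest().authenticated())
            .httpBasic(Customizer.withDefaults());
        return http.build();
    }
}
```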

Impact:

Automated repetitive HR tasks, reducing operational workload by 40% and improving team efficiency.

Ensured compliance with data security regulations by implementing robust encryption and access control measures.

Scaled the system to handle over 10,000 employees across various departments while maintaining consistent performance.

Environment: Java, Spring Boot, Spring Security, Hibernate, RESTful APIs, PostgreSQL, Docker, Jenkins, AWS (EC2, S3).
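
The Cassandra modeling and query work referenced in this project can be illustrated with a small DataStax Java driver (4.x) sketch that reads from a denormalized, query-first table. The keyspace, table, and column names are hypothetical.

```java
import com.datastax.oss.driver.api.core.CqlSession;
import com.datastax.oss.driver.api.core.cql.PreparedStatement;
import com.datastax.oss.driver.api.core.cql.ResultSet;
import com.datastax.oss.driver.api.core.cql.Row;

public class EmployeeLookup {
    public static void main(String[] args) {
        // With no contact point configured the driver falls back to 127.0.0.1:9042;
        // the keyspace and the denormalized employees_by_department table are hypothetical.
        try (CqlSession session = CqlSession.builder()
                .withKeyspace("hr")
                .build()) {

            // Prepared statements are parsed once and reused, keeping read latency low.
            PreparedStatement byDepartment = session.prepare(
                    "SELECT employee_id, name, role FROM employees_by_department WHERE department = ?");

            ResultSet rows = session.execute(byDepartment.bind("ENGINEERING"));
            for (Row row : rows) {
                System.out.printf("%s %s (%s)%n",
                        row.getString("employee_id"), row.getString("name"), row.getString("role"));
            }
        }
    }
}
```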

Client: Wipro 10/2018 – 07/2020

Project: Online Banking System for Secure Transaction Management

Role: Java Developer

Location: Hyderabad, India

Responsibilities:

Involved in all phases of the Software Development Life Cycle (SDLC), including requirement gathering, analysis, design, development, testing, and deployment of a secure online banking system.

Designed and developed RESTful APIs using Spring Boot to manage core banking operations, including customer account creation, fund transfers, and loan application processing.

Designed and developed scalable, fault-tolerant distributed systems using Java Spring Boot and Apache Spark, ensuring efficient batch and stream processing.

Built high-performance ETL pipelines leveraging Apache Spark (RDDs, DataFrames, Datasets) to process large-scale data.

Designed and developed highly optimized Solr indexing pipelines, ensuring efficient full-text search and fast retrieval of large-scale unstructured data (see the Solr query sketch at the end of this project entry).

Implemented Solr sharding, replication, and caching strategies to improve query performance and scale search operations across distributed clusters.

Debugged Solr query performance issues using profiling tools, optimizing query execution times by 40% through fine-tuned schema definitions.

Developed custom Solr analyzers, tokenizers, and query parsers to enhance text processing and search accuracy.

Implemented real-time event streaming and data ingestion using Kafka, improving data throughput and system responsiveness.

Collaborated with cross-functional teams, including data engineers and product designers, to build high-performance data analytics web applications.

Built database schemas and implemented efficient data models using Hibernate and MySQL, ensuring transactional integrity and minimizing query response times.

Integrated Apache Kafka for real-time transaction processing and asynchronous communication between services, ensuring timely customer notifications and seamless inter-service communication.

Implemented caching mechanisms with Redis to optimize frequently accessed data such as account balances, transaction histories, and loan repayment schedules, improving response time for critical banking operations.

Ensured robust security measures by implementing Spring Security for secure authentication and authorization, protecting sensitive customer data and complying with industry regulations.

Containerized the application using Docker, enabling consistent environments across development, testing, and production stages, and deployed the system on AWS EC2 for high availability.

Managed database storage with AWS RDS for scalable, reliable, and secure data management and utilized AWS S3 for document storage, such as loan agreements and customer KYC documents.

Set up CI/CD pipelines with Jenkins to automate build, testing, and deployment workflows, significantly reducing manual errors and deployment time.

Configured a centralized logging and monitoring solution using the ELK Stack (Elasticsearch, Logstash, Kibana) to track system performance, detect anomalies, and troubleshoot production issues efficiently.

Conducted unit testing and integration testing using JUnit and Mockito to ensure high-quality code and adherence to functional requirements (see the test sketch following this responsibilities list).

Actively debugged and resolved production issues, providing on-call support during critical operations to maintain system reliability and minimize downtime.

Created proof-of-concept (POC) solutions to evaluate and integrate advanced technologies such as API gateways and service mesh for improved scalability and security.

Ensured the application adhered to PCI DSS compliance standards for financial transactions, including secure data storage, encryption, and audit logging to meet regulatory requirements.

Worked closely with cross-functional teams, including frontend developers, QA testers, and DevOps engineers, to ensure seamless integration and delivery of the banking system.
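
The JUnit and Mockito testing referenced above can be sketched as follows; TransferService, AccountRepository, and InsufficientFundsException are simplified stand-ins written for this example, not code from the actual banking system.

```java
import static org.junit.jupiter.api.Assertions.assertThrows;
import static org.mockito.ArgumentMatchers.anyString;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import java.math.BigDecimal;
import org.junit.jupiter.api.Test;

class TransferServiceTest {

    private final AccountRepository accounts = mock(AccountRepository.class);
    private final TransferService service = new TransferService(accounts);

    @Test
    void rejectsTransferWhenBalanceIsInsufficient() {
        when(accounts.balanceOf(anyString())).thenReturn(new BigDecimal("10.00"));

        assertThrows(InsufficientFundsException.class,
                () -> service.transfer("ACC-1", "ACC-2", new BigDecimal("50.00")));
    }

    @Test
    void debitsAndCreditsOnSuccessfulTransfer() {
        when(accounts.balanceOf("ACC-1")).thenReturn(new BigDecimal("100.00"));

        service.transfer("ACC-1", "ACC-2", new BigDecimal("50.00"));

        verify(accounts).debit("ACC-1", new BigDecimal("50.00"));
        verify(accounts).credit("ACC-2", new BigDecimal("50.00"));
    }
}

// Simplified stand-ins so the test compiles on its own; real classes would live in the main source tree.
interface AccountRepository {
    BigDecimal balanceOf(String accountId);
    void debit(String accountId, BigDecimal amount);
    void credit(String accountId, BigDecimal amount);
}

class InsufficientFundsException extends RuntimeException {}

class TransferService {
    private final AccountRepository accounts;

    TransferService(AccountRepository accounts) { this.accounts = accounts; }

    void transfer(String from, String to, BigDecimal amount) {
        if (accounts.balanceOf(from).compareTo(amount) < 0) {
            throw new InsufficientFundsException();
        }
        accounts.debit(from, amount);
        accounts.credit(to, amount);
    }
}
```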

Impact:

Reduced transaction processing time by 30% through optimized API design, caching strategies, and database query improvements.

Enhanced system security by achieving full compliance with PCI DSS standards, ensuring safe and secure financial transactions for all users.

Scaled the system to support a user base of over 100,000 customers while maintaining 99.9% uptime, even during high-traffic periods, through effective scaling strategies and proactive monitoring.

Increased system reliability and availability by implementing robust CI/CD pipelines and automated testing, reducing deployment-related errors by 40%.

Improved operational efficiency for bank employees by automating loan application workflows and customer account management, reducing manual effort by 35%.

Environment: Java 17, Spring Boot, Microservices, REST Web Services, Kafka, Docker, Kubernetes, OpenShift, Angular 10, AWS, XML, XSLT, HTML, Tomcat, Oracle 11g, Maven, Git.
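
A small SolrJ sketch of the kind of search and faceting described in this project; the Solr URL, collection, and field names are placeholders, and the Http2SolrClient assumes SolrJ 8 or later.

```java
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.Http2SolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrDocument;

public class TransactionSearch {
    public static void main(String[] args) throws Exception {
        // Base URL points at a hypothetical "transactions" collection on a local Solr node.
        try (SolrClient solr = new Http2SolrClient.Builder("http://localhost:8983/solr/transactions").build()) {
            SolrQuery query = new SolrQuery("description:wire AND status:FAILED");
            query.setFields("txn_id", "amount", "timestamp");
            query.addFacetField("status");     // faceting backs the search filters described above
            query.setRows(20);

            QueryResponse response = solr.query(query);
            for (SolrDocument doc : response.getResults()) {
                System.out.println(doc.getFieldValue("txn_id") + " -> " + doc.getFieldValue("amount"));
            }
        }
    }
}
```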

Client: Wipro 07/2016 – 09/2018

Project: Inventory Management System for Retail Operations

Role: Java Backend Developer

Location: Hyderabad, India

Responsibilities:

Involved in all phases of the Software Development Life Cycle (SDLC), including requirement gathering, analysis, design, development, testing, and deployment of the Inventory Management System.

Designed and implemented high-performance ETL pipelines using Apache Spark, processing terabytes of structured and unstructured data.

Developed and optimized Spark applications using Core, SQL, and Streaming APIs, ensuring efficient data processing in batch and real-time workflows.

Refactored and debugged Spark jobs for performance optimization, reducing execution times by 40% through partitioning, caching, and broadcast joins.

Designed and implemented backend services using Spring Boot to handle inventory updates, supplier management, and stock auditing efficiently.

Developed RESTful APIs and microservices to support seamless integration between big data pipelines and enterprise applications.

Utilized Spark Streaming to process real-time data and integrate it with data stores like Cassandra and Solr for quick lookups.

Integrated Solr with Apache Spark, enabling distributed search analytics and machine learning-driven recommendations.

Designed and implemented Solr schema configurations, defining field types, analyzers, and indexing strategies to improve search relevancy.

Developed RESTful APIs for querying Solr indexes, enabling seamless integration with microservices-based architectures.

Implemented fault-tolerant Spark Streaming solutions, leveraging checkpointing and stateful transformations to ensure real-time data consistency.

Tuned Spark configurations (executor cores, memory, shuffle partitions) to optimize resource utilization and prevent job failures.

Developed custom UDFs (User-Defined Functions) and UDAFs (User-Defined Aggregate Functions) to extend Spark SQL capabilities for complex business logic (see the UDF sketch at the end of this project entry).

Integrated Apache Spark with Kafka, enabling real-time ingestion and processing of streaming data with millisecond latency (see the streaming sketch following this responsibilities list).

Debugged Solr indexing failures and search anomalies, using logs, monitoring tools, and custom diagnostics to resolve issues quickly.

Refactored and optimized Solr-based search workflows, improving efficiency and reducing system load by 30%.

Worked extensively with Azure cloud services, deploying and managing big data solutions using Azure Synapse, Data Lake, and Azure Functions.

Integrated MongoDB as the primary database for flexible and scalable storage of product catalogs and supplier information, supporting dynamic data structures.

Built dynamic web interfaces with React, utilizing modern front-end development practices for intuitive and engaging user experiences.

Leveraged RabbitMQ for asynchronous messaging to synchronize real-time inventory updates across multiple retail locations, ensuring data consistency and accuracy.

Improved system performance by implementing caching mechanisms using Redis, optimizing frequently accessed data such as product details and stock levels for faster retrieval.

Built and exposed RESTful APIs to enable seamless integration with front-end applications and third-party logistics systems, enhancing overall system usability.

Containerized the application using Docker to ensure consistency across environments and deployed containerized services on Kubernetes clusters for scalability during peak sales.

Automated CI/CD pipelines using Jenkins, enabling seamless deployment of new features and bug fixes, reducing downtime during updates.

Monitored system health and resource usage through Prometheus and visualized operational metrics with Grafana, ensuring proactive issue detection and resolution.

Collaborated closely with QA and DevOps teams to debug and resolve critical issues, ensuring system reliability and optimal performance.

Provided technical support during production rollouts and actively resolved deployment-related issues to maintain uninterrupted operations.

Conducted proof-of-concept (POC) experiments for integrating advanced tools like real-time analytics dashboards and predictive inventory systems.

Followed coding standards and best practices using Java, Spring Boot, and Microservices architecture, ensuring a modular and maintainable codebase.
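
The Kafka-to-Spark ingestion referenced above can be sketched with Spark Structured Streaming in Java. The broker address, topic, and output paths are placeholders, and the Parquet sink stands in for the Cassandra/Solr targets mentioned in the responsibilities.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;
import org.apache.spark.sql.streaming.Trigger;

public class InventoryStreamJob {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder().appName("inventory-stream").getOrCreate();

        // Read inventory change events from Kafka; broker address and topic are placeholders.
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092")
                .option("subscribe", "inventory-events")
                .load();

        // Kafka delivers keys/values as binary; cast the value to a string before parsing downstream.
        Dataset<Row> decoded = events.selectExpr("CAST(value AS STRING) AS payload");

        // Checkpointing gives the stream fault tolerance and exactly-once guarantees for supported sinks.
        StreamingQuery query = decoded.writeStream()
                .format("parquet")
                .option("path", "/data/inventory/stream")               // placeholder output path
                .option("checkpointLocation", "/data/inventory/checkpoints")
                .trigger(Trigger.ProcessingTime("30 seconds"))
                .start();

        query.awaitTermination();
    }
}
```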

Impact:

Reduced stock discrepancies by 40% through the implementation of real-time inventory tracking and reconciliation mechanisms.

Improved API response times by 25% with optimized database queries and effective caching strategies, enhancing the user experience for internal and external systems.

Achieved 99.8% system availability, supporting over 50 retail locations with consistent high-traffic operations during peak sales periods.

Increased operational efficiency by automating supplier order processing and inventory distribution, reducing manual effort by 30%.

Environment: Java, Spring Boot, Hibernate, RESTful APIs, MongoDB, RabbitMQ, Redis, Docker, Kubernetes, Jenkins, AWS (EC2, S3), Prometheus, Grafana.
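
A brief sketch of registering and applying a custom Spark SQL UDF in Java, as referenced in this project's responsibilities; the normalization rule and dataset paths are hypothetical.

```java
import static org.apache.spark.sql.functions.callUDF;
import static org.apache.spark.sql.functions.col;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.api.java.UDF1;
import org.apache.spark.sql.types.DataTypes;

public class SkuNormalizerJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("sku-normalizer").getOrCreate();

        // Register a UDF that normalizes SKU codes; the rule is a stand-in for real business logic.
        spark.udf().register("normalize_sku",
                (UDF1<String, String>) sku -> sku == null ? null : sku.trim().toUpperCase(),
                DataTypes.StringType);

        Dataset<Row> products = spark.read().parquet("/data/products");        // placeholder input path

        products.withColumn("sku_normalized", callUDF("normalize_sku", col("sku")))
                .write().mode("overwrite").parquet("/data/products_clean");    // placeholder output path

        spark.stop();
    }
}
```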

Education Details:

Master's (M.P.S.): Data Science, University of Maryland, Baltimore County, 09/2022 – 05/2024

Bachelor's degree: Electronics and Communication

University: GVP College of Engineering, 06/2012 – 05/2016


