Lohita Deadeepya Mulukuri
Email: ***********@*****.*** Phone: 469-***-**** LinkedIn: Lohita Deadeepya
Senior Software Engineer
PROFESSIONAL SUMMARY:
Software Engineer with 7+ years of hands-on experience building full-stack enterprise applications using Java, Spring Boot, Angular (versions 2 through 14), TypeScript, and AWS, with strong command of Docker and Kubernetes for deploying containerized microservices in scalable cloud environments.
Specialized in developing cloud-native, secure, and responsive applications using Spring Boot (REST APIs) and Angular, orchestrating deployments via Kubernetes and Docker, and integrating with AWS services like EKS, EC2, S3, and RDS.
Designed and developed robust RESTful and GraphQL APIs for scalable enterprise applications using Spring Boot and Node.js.
Proficient in developing front-end UIs using Angular, React, TypeScript, JavaScript, HTML5, CSS3, Bootstrap, and RxJS.
Experienced in Python development for backend services and automation, and in implementing AI/ML models for predictive analytics using TensorFlow and Scikit-learn.
Built and maintained data pipelines using Apache Spark, PySpark, and Google Pub/Sub, supporting ETL, analytics, and data transformation workflows.
Built and deployed microservices and containerized apps using Docker, managed through Kubernetes clusters on AWS EKS and Azure Kubernetes Service.
Hands-on experience working with AWS (Lambda, EC2, S3, CloudWatch, SNS, SQS) and Azure (Functions, AKS) for building, hosting, and monitoring cloud-native services.
Developed and secured applications using OAuth 2.0, JWT, RBAC, and Spring Security to enforce authentication and authorization.
Implemented caching strategies using Redis and optimized data storage using PostgreSQL, Oracle, and MongoDB.
Applied design patterns like DAO, DTO, Factory, and MVC to build maintainable and testable codebases.
Integrated Apache Kafka and RabbitMQ for reliable messaging and asynchronous communication across distributed microservices.
Automated CI/CD pipelines using Jenkins, GitHub Actions, and GitLab CI, improving build, test, and deployment workflows.
Applied Infrastructure as Code (IaC) principles using Terraform and AWS CloudFormation to automate cloud provisioning.
Developed unit, integration, and E2E test suites using JUnit, Mockito, Cypress, Jest, and Selenium to ensure code quality and system reliability.
Actively participated in Agile/Scrum ceremonies including sprint planning, reviews, retrospectives, and daily stand-ups.
Collaborated with cross-functional teams and product stakeholders to deliver business-critical solutions on time.
Implemented real-time observability and monitoring using Prometheus, Grafana, New Relic, and ELK Stack for system performance and logging.
Delivered multiple successful projects in telecom, logistics, healthcare, and public sector, aligning technical solutions with business goals.
Known for strong communication, curiosity, analytical thinking, and a growth mindset, with a keen interest in learning new tools and technologies.
TECHNICAL SKILLS:
Programming Languages: Java, JavaScript, Python, TypeScript, SQL, Shell/Bash scripting (Unix)
Front End: Angular, JavaScript, TypeScript, CSS3, HTML5, AJAX, Bootstrap
Back End: Spring Boot, Node.js, GraphQL, Django
Container Technologies: Docker, Kubernetes
SDLC Methodologies: Agile, Waterfall, Test-Driven Development
Databases: Oracle, DB2, PostgreSQL, MySQL, MS SQL Server
Application Servers: Jetty, Apache Tomcat, IBM WebSphere
AWS Cloud: EC2, S3, EKS, RDS, CloudWatch, SNS, SQS, Lambda, IAM, CloudFormation
Build & Source Code Management: Maven, Git, Bitbucket
CI/CD Platforms: GitHub Actions, Jenkins, Artifactory, GitLab CI/CD, Terraform, Azure DevOps
IDEs and Tools: IntelliJ, Visual Studio Code, Eclipse, Postman, JIRA
Operating Systems: Windows, Ubuntu (Linux), macOS
Other: Splunk, Salesforce CRM, Apache Kafka, RabbitMQ, PySpark, TensorFlow, Scikit-learn, Prometheus, Grafana, ELK Stack
WORK EXPERIENCE:
United Parcel Service (UPS) Remote, USA September 2023 - Present
Senior Software Engineer
Project Description: UPS is a global leader in logistics and package delivery services. I contributed to the modernization of an internal package tracking system, migrating a legacy monolithic platform to a microservices architecture to streamline operations, enhance scalability, and improve maintainability.
Role & Responsibilities:
Developed and modernized logistics and tracking applications by designing scalable microservices using Java (versions 11 to 17) and Spring Boot, optimized for package routing, tracking status updates, and delivery confirmation.
Built secure and efficient RESTful and GraphQL APIs to enable seamless integration between internal systems and mobile/web platforms used by drivers and customers.
Migrated legacy monolithic architecture to microservices, improving deployment flexibility, fault isolation, and scalability across shipment workflows.
Developed dynamic UIs for shipment status, routing maps, and delivery timelines using React.js and Angular 14, enhancing user experience for support teams and operations.
Designed schema and managed real-time data transactions for shipment tracking using PostgreSQL, Redis, Oracle, and MongoDB.
Implemented caching and rate-limiting using Redis to handle high-frequency tracking requests without overloading backend systems.
Created and optimized Kafka-based messaging flows to support event-driven shipment updates, integration with sorting centers, and anomaly detection in package flow.
Designed and implemented real-time data pipelines using Kafka, Google Pub/Sub, and PySpark to process and analyze package scan data from distribution hubs.
Integrated AI/ML models using TensorFlow and Scikit-learn for predicting potential delivery delays based on historic data, weather patterns, and route congestion.
Deployed applications using Docker and managed them through Kubernetes on AWS EKS, ensuring auto-scaling and high availability for logistics-critical systems.
Built and maintained CI/CD pipelines with Jenkins and Terraform, automating build, test, and deployment for containerized services and infrastructure provisioning.
Strengthened platform reliability by fine-tuning APM tools like New Relic and Prometheus, identifying performance bottlenecks in API and backend services.
Implemented secure authentication and authorization using OAuth 2.0, JWT, and OpenID Connect for internal applications and APIs.
Hardened system security by enforcing OWASP best practices, mitigating risks like SQL injection, CSRF, and XSS in logistics management interfaces.
Conducted rigorous testing using JUnit, Cypress, Selenium, and Jest, ensuring application reliability during nationwide shipment spikes and peak delivery times.
Collaborated across DevOps, QA, and product teams in an Agile environment, supporting on-time delivery of enhancements for shipment visibility and logistics reporting.
Technologies Used: Java, Spring Boot, Node.js, Python, Django, React.js, Angular 14, HTML5, CSS3, TypeScript, RESTful, GraphQL, SOAP, PostgreSQL, MongoDB, Redis, MarkLogic, Oracle, Kafka, Google Pub/Sub, PySpark, AWS, Docker, Kubernetes, OpenShift, Jenkins, GitLab CI/CD, Terraform, Git, Bitbucket, Prometheus, ELK Stack, SonarQube, OAuth 2.0, JWT, OWASP, Jira, Confluence
United States Department of Agriculture Dallas, TX (Remote) August 2022 - September 2023
Senior Software Engineer
Project Description: USDA is a federal agency responsible for developing and implementing policies on farming, agriculture, and food safety. I supported the development of a web application to manage internal workflows by building user-facing interfaces and integrating backend services.
Role & Responsibilities:
Developed secure internal platforms using React.js, Angular 12+, Spring Boot, and Django for policy tracking, regulatory approvals, and data collection.
Designed and built GraphQL and RESTful APIs to streamline frontend/backend data flow, improving access control and reducing redundant data calls.
Implemented OAuth 2.0, JWT, and CSRF protection for sensitive federal workflows and internal policy management tools.
Integrated WebSockets for real-time updates in audit and task-tracking dashboards used by USDA officials and administrative staff.
Built reusable UI components and forms using React.js Hooks, Context API, and Angular, applying lazy loading, tree shaking, and virtual DOM optimizations to improve frontend performance and reduce load times.
Architected and deployed secure cloud infrastructure using Terraform, AWS CloudFormation, and Helm, ensuring reproducibility and compliance with federal standards.
Managed containerized deployments using Docker and Kubernetes, deploying across AWS, Azure AKS, and GCP Kubernetes Engine as part of a hybrid cloud strategy.
Streamlined CI/CD processes using GitHub Actions, GitLab CI/CD, and Azure DevOps, ensuring version control and traceability of infrastructure and application changes.
Developed asynchronous messaging flows using Kafka and RabbitMQ, supporting audit logging, document processing, and regulatory triggers.
Designed and tuned PostgreSQL, MongoDB, and Elasticsearch databases for high availability, supporting search-heavy internal portals.
Built sharded, replicated database setups to ensure failover protection and high availability for mission-critical workflows.
Used Prometheus, Grafana, and the ELK stack to monitor service health and logs, enabling proactive issue detection across distributed systems.
Applied CQRS and event sourcing patterns to decouple read/write models in compliance-heavy transactional systems.
Collaborated in cross-agency Agile teams, contributing to scalable digital transformation initiatives within the USDA.
Implemented unit, integration, and end-to-end tests using Jest, Mocha, JUnit, Cypress, Selenium, and Playwright, ensuring code reliability and maintainability.
Led peer code reviews and knowledge-sharing sessions, improving code quality, maintainability, and team productivity.
Worked in Agile teams using Scrum and Kanban, actively participating in daily stand-ups, sprint planning, and retrospectives.
Provided technical mentorship and best practice guidance, ensuring adoption of clean, maintainable, and scalable codebases.
Technologies Used: React.js, Angular 12+, TypeScript, JavaScript, Redux, NgRx, Node.js, Spring Boot, Python, Django, RESTful APIs, GraphQL, PostgreSQL, MongoDB, Redis, Elasticsearch, Kafka, RabbitMQ, AWS (Lambda, EC2, S3), GCP (Cloud Functions, Pub/Sub), Azure (Functions), Docker, Kubernetes, GitHub Actions, GitLab CI/CD, Terraform, Helm, Prometheus, ELK Stack, SonarQube, Jira, Confluence.
Centene Corporation Dallas, TX (Remote) May 2022 - August 2022
Senior Software Engineer
Project Description: Centene Corporation is a healthcare enterprise that provides services to government-sponsored health programs. I helped develop a provider-facing application used by hospitals to verify patient insurance eligibility and streamline treatment authorization processes.
Role & Responsibilities:
Developed Single Page Applications (SPAs) using React.js and React Native, ensuring responsive design and seamless cross-platform navigation.
Built and maintained secure microservices using Spring Boot and GraphQL, enabling efficient data retrieval and streamlined backend operations.
Integrated .NET and Java-based microservices, facilitating cross-platform communication and interoperability.
Implemented authentication and authorization using OAuth2, JWT, and Okta-based Single Sign-On (SSO) for secure user access management.
Developed Kafka-based real-time streaming pipelines for processing patient insurance requests and used RabbitMQ for asynchronous background processing.
Designed and optimized PostgreSQL and MongoDB database schemas, ensuring high availability and efficient query performance.
Created reusable React components and form modules with integrated validations, improving development speed and maintainability.
Developed RESTful and GraphQL APIs using Node.js and Spring Boot, ensuring seamless frontend-backend communication.
Deployed applications on AWS and Azure, leveraging Docker containers, Kubernetes orchestration, and Terraform for infrastructure automation.
Built and maintained CI/CD pipelines using Jenkins, Azure DevOps, and Terraform, automating build, test, and deployment workflows.
Worked closely with QA and DevOps teams to implement automated testing and continuous integration, ensuring high software reliability.
Participated in Agile development processes, including sprint planning, code reviews, and technical mentoring to enhance development efficiency.
Implemented RBAC (Role-Based Access Control) and data-level security to comply with HIPAA standards and secure sensitive patient data.
Technologies: React.js, React Native, Angular, JavaScript, TypeScript, Node.js, Spring Boot, GraphQL, .NET, Kafka, RabbitMQ, PostgreSQL, MongoDB, OAuth2, JWT, Okta (SSO), AWS (EC2, S3), Azure, Docker, Kubernetes, Terraform, Jenkins, Azure DevOps, Swagger, Postman, Jira, Confluence.
Verizon Boston, MA, USA October 2020 - May 2022
Senior Software Engineer
Project Description: Verizon is a leading telecommunications provider delivering wireless, IoT, and edge computing solutions. I contributed to the development of a cloud-based platform that processed real-time telemetry data from smart IoT devices. The system generated automated alerts based on custom thresholds configured by device owners and enabled secure customer account management. Additionally, I worked on building service models to help third-party providers deploy and manage their applications through Verizon’s Multi-access Edge Computing (MEC) infrastructure, distributed across regional towers for low-latency access.
Role & Responsibilities:
Contributed to the development of a cloud-native platform that processes telemetry data from smart IoT devices, enabling real-time alerting based on customer-defined configurations such as temperature or motion thresholds.
Designed and implemented Spring Boot microservices for customer account creation and management, supporting a subscription-based model comparable to consumer platforms such as Netflix.
Developed a Service Model Interface for onboarding third-party service providers to Verizon MEC (Multi-access Edge Computing), allowing services to be deployed on edge nodes distributed across infrastructure like cell towers for low-latency access.
Created secure and scalable RESTful APIs to expose device metrics, user preferences, and alerting configurations to the frontend and external systems.
Built intuitive and responsive React.js user interfaces for customers to monitor device activity, view telemetry trends, and configure alert preferences.
Integrated Apache Kafka to support real-time ingestion and streaming of telemetry data from distributed IoT devices.
Applied JWT authentication and Role-Based Access Control (RBAC) to protect APIs and ensure secure access to user and organization-level data.
Designed efficient data schemas and handled data persistence using PostgreSQL, optimizing performance for querying high-frequency sensor data.
Containerized backend services with Docker and deployed to Kubernetes clusters on AWS (EKS) for scalable, reliable service delivery.
Implemented automated CI/CD pipelines using Jenkins, supporting seamless code integration, testing, and deployment.
Technologies: React.js, JavaScript, Spring Boot, REST APIs, Apache Kafka, PostgreSQL, JWT, RBAC, Docker, Kubernetes, AWS (EKS, EC2, S3), Jenkins, HTML5, CSS3, Jira, Confluence
Siemens Healthineers Bangalore, India February 2019 - July 2019
Senior Software Engineer
Project Description: Siemens Healthineers is a global medical technology company focused on diagnostic imaging, laboratory diagnostics, and digital healthcare solutions. I contributed to a data processing system that enabled integration and automation of patient imaging data across hospital systems.
Role & Responsibilities:
Developed interactive UI components using Angular to visualize medical imaging data and facilitate user interaction for diagnostic workflows.
Built and integrated backend services using Java and Python to handle secure transmission and transformation of imaging metadata.
Utilized Apache Kafka for reliable messaging and real-time data flow between hospital systems and diagnostic modules.
Collaborated with cross-functional teams to automate healthcare workflows, reducing manual intervention and improving diagnostic turnaround time.
Implemented backend analytics using Django to track system performance and data processing metrics for hospital IT teams.
Wrote unit and integration tests using JUnit and PyTest to ensure system accuracy and maintainability in a high-compliance healthcare environment.
Technologies Used: Angular, Java, Python, Django, RESTful APIs, Apache Kafka, PostgreSQL, JUnit, PyTest, Git, Jenkins, Docker, Jira, Confluence
Plintron Global Technologies Chennai, India February 2016 - February 2019
Software Engineer
Project Description: Plintron Global Technologies is a telecom solutions provider supporting mobile virtual network operations worldwide. Starting as an intern and later transitioning to a full-time developer, I contributed to building a customer billing and lifecycle management system tailored for telecom service providers.
Role & Responsibilities:
Developed and maintained modular microservices using Spring Boot to manage customer onboarding, billing cycles, and usage analytics.
Built user-facing dashboards with React for telecom partners to manage subscribers, view usage reports, and process billing adjustments.
Migrated existing monolithic services to Pivotal Cloud Foundry (PCF), enabling better scalability, monitoring, and deployment automation.
Implemented Kafka-based event processing pipelines to manage telecom usage data and trigger billing workflows in real time.
Deployed applications to AWS infrastructure using EKS and CloudFront, ensuring high availability and performance in production.
Participated in code reviews and Agile ceremonies, contributing to continuous improvement of system design and development practices.
Technologies Used: React.js, Java, Spring Boot, RESTful APIs, Apache Kafka, PostgreSQL, Pivotal Cloud Foundry (PCF), AWS (EKS, CloudFront), Docker, Git, Jenkins, Jira, Confluence, Agile
EDUCATION
B.Tech in Computer Science from Gandhi Institute of Technology and Management, Visakhapatnam, India.
M.S. in Computer Science from Southern Arkansas University, Magnolia, AR, USA.
Trainings and Certifications:
Pursuing the AWS Certified Solutions Architect certification.