Senior Java/Python Backend Engineer with React Frontend

Location:
Philadelphia, PA
Posted:
March 11, 2026

Contact this candidate

Resume:

Sai Ayyapusetty

**************@*****.*** 301-***-**** www.linkedin.com/in/ayyapusetty-sai

Professional Summary:

7+ years of IT experience spanning all phases of the Software Development Life Cycle (SDLC), including planning, analysis, design, implementation, testing, and maintenance of multi-tier, distributed, secure enterprise applications using Java and Python.

Proficient in various Spring modules, including Spring Core, Spring ORM, Spring Boot, and Spring Security, and experienced in developing microservices using Spring Boot with RESTful APIs.

Proven expertise building microservice-based backend architectures that are scalable, fault-tolerant, and aligned with modern RESTful design principles.

Skilled in PostgreSQL schema design, query optimization, and integrating data-intensive financial systems.

Strong command of frontend development with ReactJS, implementing reusable UI components, hooks, and state management to deliver responsive, high-performance interfaces.

Experienced in CI/CD pipelines with Jenkins, GitLab, Docker, and AWS CodeDeploy for automated build, test, and cloud deployments.

Experienced in big data processing using Apache Spark, developing batch and real-time data pipelines with Spark SQL and Structured Streaming to process large-scale financial datasets efficiently.

Solid understanding of software design patterns, unit testing (JUnit/Mockito), and Agile collaboration with UX, QA, and DevOps teams to ensure performance, quality, and maintainability.

Technical Skills:

Programming Languages

Java, Python, Node.js, C

Web Technologies

ReactJS, AngularJS, HTML, DHTML, JavaScript, jQuery, CSS, AJAX, Dojo, XML, Web Services (REST, WSDL)

Frameworks

Hibernate, JPA, Spring (IoC, AOP, MVC, Boot, ORM, Dependency Injection)

Database Environments

MongoDB, Cassandra, CockroachDB, Oracle SQL, MySQL

Web/Application Servers

WebLogic, Apache Tomcat, JBoss

Webservice Technologies

REST, GraphQL

Cloud Technologies

AWS, Azure

Operating Systems

Windows (all variants), UNIX

Messaging Services

RabbitMQ, ActiveMQ, Apache Kafka

Version Control Tools

GitLab, Git, Bitbucket, CVS, SVN

Professional Experience:

Client: Freddie Mac, McLean, VA Sep 2024 – Present

Role: Sr Full Stack Developer

Project Summary:

Working on financial applications designed for risk analytics, portfolio management, and capital forecasting. These applications process large-scale financial data, execute credit risk models, and provide real-time scenario analysis for investment decisions and regulatory compliance. The system integrates various financial services, pricing models, and data workflows to deliver data-driven insights for business strategies.

Responsibilities:

Developed and maintained Java-based applications supporting financial risk models, capital forecasting, and loan valuation calculations across distributed systems

Designed and optimized RESTful APIs using Java and Spring Boot for secure, high-throughput data exchange between risk analytics services and external financial modelling platforms

Implemented event-driven and streaming architectures using Kafka on AWS MSK for real-time data transformations, aggregations, windowed operations, and enrichment of financial datasets

Built observability and monitoring dashboards using Splunk, custom Python log parsers, CloudWatch metrics, and JVM diagnostics to track service availability, API latency, error trends, and memory utilization
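A minimal sketch of such a log parser, in Python. The log format, field names (`api`, `latency_ms`, `status`), and endpoint paths here are illustrative assumptions, not the actual Splunk or CloudWatch schema:

```python
import re
import statistics

# Assumed log line format (illustrative):
# "2026-03-11T10:00:00 INFO api=/risk/score latency_ms=123 status=200"
LOG_PATTERN = re.compile(
    r"api=(?P<api>\S+)\s+latency_ms=(?P<latency>\d+)\s+status=(?P<status>\d{3})"
)

def parse_latencies(lines):
    """Extract per-endpoint latencies and 5xx error counts from raw log lines."""
    latencies, errors = {}, {}
    for line in lines:
        m = LOG_PATTERN.search(line)
        if not m:
            continue  # skip malformed lines rather than failing the whole batch
        api = m.group("api")
        latencies.setdefault(api, []).append(int(m.group("latency")))
        if m.group("status").startswith("5"):
            errors[api] = errors.get(api, 0) + 1
    return latencies, errors

def summarize(latencies):
    """Compute mean and max latency per endpoint for dashboard export."""
    return {api: {"mean": statistics.mean(v), "max": max(v)}
            for api, v in latencies.items()}
```

A parser like this feeds aggregated numbers to a dashboard instead of raw log volume, which keeps the monitoring pipeline cheap.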

Designed UI prototypes and application skeletons using AngularJS, HTML5, CSS, and DHTML, and developed responsive web interfaces for risk and loan valuation workflows.

Developed Apache Spark batch processing jobs using Spark SQL and DataFrames to process and transform large mortgage datasets stored in Amazon S3.

Enforced authentication and authorization across microservices using Spring Security, JWT, and OAuth2, securing internal analyst access and system-to-system integrations via REST and JDBC

Designed and implemented Oracle PL/SQL stored procedures, functions, and triggers to support risk calculations, capital forecasting logic, and large-scale batch processing

Built Python-based ETL utilities to cleanse, transform, and validate large mortgage datasets before ingestion into AWS RDS, Aurora, and MongoDB, improving pipeline reliability and performance.
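A minimal sketch of the cleanse-and-validate step of such an ETL utility. The record schema (`loan_id`, `balance`, `origination_date`) is an assumed example, not the actual mortgage dataset layout:

```python
from datetime import datetime

# Assumed minimal schema for illustration
REQUIRED_FIELDS = ("loan_id", "balance", "origination_date")

def cleanse(records):
    """Validate and normalize raw mortgage records; return (clean, rejected).

    Rejected rows are kept rather than dropped silently so they can be
    logged or replayed, which is what makes the pipeline auditable.
    """
    clean, rejected = [], []
    for rec in records:
        if not all(rec.get(f) not in (None, "") for f in REQUIRED_FIELDS):
            rejected.append(rec)
            continue
        try:
            row = {
                "loan_id": str(rec["loan_id"]).strip(),
                "balance": round(float(rec["balance"]), 2),
                "origination_date": datetime.strptime(
                    rec["origination_date"], "%Y-%m-%d").date().isoformat(),
            }
        except (ValueError, TypeError):
            rejected.append(rec)  # bad type or date format
            continue
        clean.append(row)
    return clean, rejected
```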

Designed and optimized ETL pipelines to move structured and semi-structured data from relational databases into Amazon S3 and Glacier for large-scale processing and long-term archival.

Integrated Angular Router for navigation across modules, secured routes with guards, and optimized state management using Angular services.

Integrated Kafka producers and consumers with Spring Boot microservices and downstream systems including S3, RDS, and DynamoDB to support batch and streaming ingestion use cases

Leveraged AWS Lambda for data transformation, aggregation, and scheduled processing tasks, reducing manual batch workloads and improving turnaround times by approximately 40%
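The shape of such a Lambda aggregation function, as a sketch. The `handler(event, context)` signature is the standard AWS Lambda Python entry point; the event payload shape (`records` with `type` and `amount` keys) is an assumption for illustration:

```python
def handler(event, context):
    """AWS Lambda entry point: aggregate record amounts by type.

    Assumed event shape (illustrative):
        {"records": [{"type": "fee", "amount": 10.0}, ...]}
    """
    totals = {}
    records = event.get("records", [])
    for rec in records:
        key = rec.get("type", "unknown")
        totals[key] = totals.get(key, 0.0) + float(rec.get("amount", 0))
    return {"totals": totals, "count": len(records)}
```

Moving aggregation like this into Lambda lets it run on a schedule (e.g. via EventBridge) instead of as a manually kicked-off batch.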

Troubleshot production performance issues by correlating logs and metrics across Kafka, application services, and databases, significantly improving triage and root-cause analysis efficiency

Wrote and tuned complex SQL and PL/SQL queries to extract, validate, and reconcile large datasets used in credit risk models and capital forecasting dashboards

Managed Jenkins-based CI/CD pipelines, automating builds, unit tests, Docker image creation, vulnerability scans, and deployments to AWS environments

Implemented Spark batch jobs to transform, cleanse, and aggregate structured and semi-structured datasets before loading into AWS S3 and RDS.

Implemented fault-tolerant Spark pipelines with checkpointing and structured streaming to ensure reliability of mortgage processing workflows.

Optimized Docker images using multi-stage builds and lightweight base images, reducing image size and accelerating deployments to EC2 and EKS.

Built scalable Spark ETL pipelines to cleanse, enrich, and aggregate financial datasets before loading them into AWS RDS and MongoDB.

Enhanced API reliability by implementing centralized exception handling, interceptors, and validation layers within Spring MVC and Spring Boot services

Deployed microservices on AWS EC2 and Kubernetes clusters, integrating CloudWatch alarms and health checks to monitor system performance and availability.

Configured API Gateway and VPC networking to securely expose services, manage traffic routing, and control access between internal and external systems

Implemented Spark Structured Streaming pipelines integrated with Kafka to process real-time mortgage transaction data and perform windowed aggregations.

Implemented infrastructure as code using AWS CloudFormation and Terraform to provision EC2 instances, VPCs, security groups, MSK clusters, and database resources consistently across environments.

Developed unit and component test cases using Jasmine and Karma to validate AngularJS controllers, UI workflows, form validations, and business rules

Automated financial report generation using Python, scheduling jobs to aggregate risk metrics and deliver reports to business stakeholders
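A minimal sketch of such a report generator using only the standard library. The metric fields (`portfolio`, `var_95`) are assumed names for illustration, not the actual risk schema:

```python
import csv
import io

def risk_report(metrics):
    """Aggregate risk metrics per portfolio and render them as CSV.

    Assumed input shape (illustrative):
        [{"portfolio": "P1", "var_95": 1.0}, ...]
    """
    by_portfolio = {}
    for m in metrics:
        by_portfolio.setdefault(m["portfolio"], []).append(m["var_95"])

    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["portfolio", "avg_var_95", "worst_var_95"])
    for portfolio, values in sorted(by_portfolio.items()):
        writer.writerow([portfolio,
                         round(sum(values) / len(values), 2),
                         max(values)])
    return buf.getvalue()
```

A scheduler (cron, Airflow, or a Lambda trigger) would call this periodically and mail or upload the CSV to stakeholders.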

Collaborated closely with product owners, risk analysts, and QA teams to gather requirements, validate functional and non-functional behaviour, and resolve issues across development, QA, and production environments

Environment: Java 8/11, Spring (Core, Security, JDBC), Angular 14, RESTful APIs, MongoDB, AWS, Spark, Jenkins, GitLab, MongoDB Compass, Oracle PL/SQL, Apache Tomcat, Apache HTTP Server

Client: E-Trade, CA May 2022 – Aug 2024

Role: Sr Full Stack Developer

Project Summary:

The goal of the project was to improve the functionality and user experience of the E-Trade application. The scope encompassed both backend and frontend enhancements, including optimizing database queries, improving frontend responsiveness, and implementing serverless computing for scalability.

Responsibilities:

Designed and implemented event-driven microservices using Java and Spring Boot, enabling seamless communication between loan-valuation and risk-analytics services

Developed and enhanced SQL stored procedures and functions to handle portfolio management and trading data in CockroachDB

Used GraphQL to aggregate and orchestrate data from multiple Spring Boot microservices, reducing over-fetching and minimizing client-side API calls

Built responsive, user-friendly web interfaces using HTML5, JavaScript, jQuery, AJAX, and DataTables for dynamic data presentation

Implemented Azure App Services and Azure Functions for serverless workloads, supporting event-driven financial operations

Implemented Splunk dashboards to monitor end-to-end performance of trading and portfolio-management microservices, tracking API latency, order-processing times, and system throughput during peak trading hours

Implemented asynchronous programming patterns using Promises and async/await to efficiently manage concurrent API calls and real-time data updates
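The production code used JavaScript Promises; the same concurrent-fetch pattern can be sketched with Python's asyncio for brevity. The quote-fetching function and its return shape are stand-ins for real API calls:

```python
import asyncio

async def fetch_quote(symbol):
    """Stand-in for a real market-data API call; simulates I/O latency."""
    await asyncio.sleep(0.01)
    return {"symbol": symbol, "price": 100.0}

async def fetch_all(symbols):
    # gather() schedules all requests concurrently instead of awaiting
    # them one after another, so total wall time is ~one round trip.
    return await asyncio.gather(*(fetch_quote(s) for s in symbols))

quotes = asyncio.run(fetch_all(["AAPL", "TSLA"]))
```

The JavaScript equivalent is `Promise.all` over an array of fetches, awaited once.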

Supported integration of Hadoop-processed data with Kafka, Cassandra, and CockroachDB for hybrid batch and streaming use cases

Tuned Kafka consumer performance by optimizing poll intervals, batch sizes, commit strategies, and concurrency to reduce lag and improve throughput

Integrated CockroachDB for distributed transactional workloads, improving horizontal scalability and fault tolerance

Designed and implemented robust server-side applications using Spring MVC, ensuring high performance and responsiveness to frontend requests

Built interactive dashboards and data visualizations using React with D3.js and Recharts, enabling analysts to derive insights from complex financial data

Configured Azure API Management to securely expose REST and GraphQL endpoints, enforcing throttling, authentication, and access control

Developed Spark batch and streaming applications to process high-volume trade transactions and market data feeds with low latency

Integrated Hibernate and JPA for ORM-based data access with CockroachDB, managing complex entity relationships and transactional consistency

Leveraged Python with Pandas and NumPy to preprocess market and trade data for real-time analytics dashboards built in React

Used Java 8 features such as Lambda expressions, Stream API, and functional interfaces to improve performance of bulk data operations

Integrated Kafka pipelines with CockroachDB for real-time ingestion of streaming data, ensuring scalability and fault tolerance

Worked extensively with GraphQL schemas, resolvers, and queries to support UI-driven data requirements for trading and portfolio dashboards

Containerized Django and React applications using Docker and managed deployments through Kubernetes for consistency across environments

Deployed Spark jobs on AWS EMR clusters and configured resource allocation, executor memory, and parallelism settings to ensure efficient cluster utilization

Developed Spark Structured Streaming applications integrated with Kafka to process real-time market data streams with low latency.

Created custom reusable React component libraries and leveraged jQuery plugins for drag-and-drop, widgets, menus, and form handling

Implemented automated testing frameworks including JUnit and Mockito to ensure application quality and reliability within CI/CD pipelines

Designed and implemented real-time streaming applications using Kafka Streams, enabling complex data processing and analytics

Configured RabbitMQ message brokers for inter-service communication between Spring Boot microservices and Node.js event processors, improving decoupling and reliability

Deployed applications through Jenkins-based CI environments to support continuous integration and development testing

Worked with Kubernetes to define deployments, services, and scaling strategies for containerized applications in cloud environments

Configured Git integration with Jenkins CI pipelines and automated containerization workflows using Ansible and Kubernetes

Automated builds and deployments using Azure DevOps pipelines, integrating unit tests, code quality checks, and container image publishing

Implemented Python-based monitoring utilities to track Azure Functions and Kafka pipelines, generating alerts and performance metrics
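The threshold-check core of such a monitoring utility, as a sketch. The metric names and limits shown are illustrative assumptions:

```python
def check_thresholds(metrics, limits):
    """Return alert messages for any metric exceeding its configured limit.

    Assumed shapes (illustrative):
        metrics = {"consumer_lag": 1200, "error_rate": 0.1}
        limits  = {"consumer_lag": 1000}
    Metrics without a configured limit are ignored.
    """
    alerts = []
    for name, value in metrics.items():
        limit = limits.get(name)
        if limit is not None and value > limit:
            alerts.append(f"ALERT: {name}={value} exceeds limit {limit}")
    return alerts
```

In practice the metrics dict would be populated from Kafka consumer-group lag queries and Azure Functions invocation counts, and non-empty results would be routed to a pager or chat channel.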

Performed capacity planning and load testing on Kafka clusters to handle peak financial transaction volumes

Environment: Java, GraphQL, Spring Boot, Microservices, ReactJS, REST, Hibernate, Kafka, RxJS, HTML5, CSS3, JavaScript, SQL, Cassandra, DOM, Docker, Jenkins, Azure, EJB, IntelliJ, Oracle, Swagger, Agile, Windows

Pixel Dream, India (Client: ICANOTES) June 2018 – July 2021

Role: Full Stack Developer

Responsibilities:

Worked on Angular and Java to develop end-to-end modules for the loyalty and rewards platform, implementing RESTful APIs on the backend and dynamic dashboards on the frontend for seamless user interaction.

Utilized Angular Router for navigation and route-level authorization between user, campaign, and admin modules.

Optimized JavaScript performance by implementing lazy loading, code splitting, and minimizing DOM manipulation for faster rendering of trading dashboards

Created Python data-ingestion modules to feed healthcare claim events into Cassandra and PostgreSQL, streamlining large-batch processing.

Developed the application using Spring MVC and Spring Web Flow, and added asynchronous features using AJAX.

Developed Spring Data Cassandra repositories for seamless integration with Spring Boot microservices, reducing query latency.

Developed Angular components, services, and modules to manage campaign creation, reward tracking, and user interactions efficiently

Optimized SQL and PL/SQL execution plans to reduce query latency and improve batch job performance.

Worked with the Java Collections API for handling data objects between the business layers and the front end.

Developed the presentation layer using JSP, HTML, and CSS; client-side validations were implemented using JavaScript, jQuery, and AngularJS.

Utilized AWS EC2 and S3 to host services and manage object storage, achieving cost-efficient cloud integration.

Deployed Angular builds through Jenkins pipelines and hosted static assets on AWS S3 for scalable and reliable access.

Implemented extensive JavaScript to create global templates reusable across JSP pages, and produced client-side validation through JavaScript and AJAX for asynchronous communication.

Applied J2EE design patterns such as MVC, Singleton, and Data Access Object in the architecture, and used JDBC within DAO classes to persist data to the Oracle database.

Implemented time-series data storage in Cassandra for audit logs and event tracking, supporting efficient queries on billions of records.

Collaborated in CI/CD pipelines to automate AngularJS build and deployment processes through Jenkins and AWS S3 hosting.

Performed unit testing using JUnit, ensuring robustness and minimizing production defects.

Developed Python scripts for asynchronous job scheduling to support daily batch reconciliations and real-time claims reporting
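The reconciliation core of such a batch job, as a sketch. The record shape and the `claim_id` key are assumptions for illustration:

```python
def reconcile(source, target, key="claim_id"):
    """Compare two batches of records by key.

    Returns keys present in `source` but missing from `target`, and keys
    present in both whose records differ, so a daily job can report both
    dropped and corrupted claims.
    """
    src = {r[key]: r for r in source}
    tgt = {r[key]: r for r in target}
    missing = sorted(set(src) - set(tgt))
    mismatched = sorted(k for k in set(src) & set(tgt) if src[k] != tgt[k])
    return {"missing": missing, "mismatched": mismatched}
```

A scheduler would run this nightly against the upstream feed and the warehouse, emailing the result when either list is non-empty.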

Enhanced system performance by tuning long-running PL/SQL queries and indexing strategies.

Environment: Java, JSP, JavaScript, AJAX, AngularJS, PL/SQL, Maven, Servlets, EJB, Agile, MVC, JSON, JDBC, Oracle 8i, JUnit, CSS, WebLogic Server, Eclipse.

Education:

Master’s degree – University of Maryland Baltimore County

Bachelor's degree – GITAM University


