INTRO –
Hi, I’m Pavan Kalyan Bommena, a Senior Java Full Stack Developer with over 10 years of experience specializing in Java, J2EE, XML, and Python. On the front end, I have experience with HTML, CSS, React, Angular, TypeScript, and JavaScript frameworks, and I’ve integrated these with backend services, including Hibernate-based persistence layers, to provide seamless user experiences. I have hands-on experience with AWS services like Lambda, EC2, S3, CloudFormation, and API Gateway, which I have leveraged to build and optimize cloud-native applications.
I have a strong foundation in Core Java, including the Executor framework, lambda expressions, and the Date-Time API, and have implemented multiple microservices using Spring Boot, Spring Security, and Spring Cloud. My expertise extends to both synchronous and asynchronous messaging systems, utilizing technologies like Kafka, SQS, and SNS for reliable communication between microservices.
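To illustrate, here is a small self-contained example combining the Executor framework, lambda expressions, and the Date-Time API (purely illustrative, not project code):

```java
import java.time.Duration;
import java.time.Instant;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class CoreJavaDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        Instant start = Instant.now();

        // Submit tasks as lambdas and collect their futures.
        List<CompletableFuture<Integer>> futures = List.of(1, 2, 3, 4).stream()
                .map(n -> CompletableFuture.supplyAsync(() -> n * n, pool))
                .toList();

        int sum = futures.stream().mapToInt(CompletableFuture::join).sum();

        // Date-Time API for measuring elapsed time.
        System.out.printf("sum=%d, took=%s%n", sum, Duration.between(start, Instant.now()));
        pool.shutdown();
    }
}
```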
In addition to backend development, I am proficient in building and optimizing RESTful and GraphQL APIs, ensuring efficient data retrieval and seamless communication between frontend and backend systems.
I have worked extensively with various relational and NoSQL databases, including Oracle, MySQL, PostgreSQL, DynamoDB, and MongoDB. My experience also includes automating deployment and infrastructure management using AWS CloudFormation, Terraform, and CI/CD tools like Jenkins, ensuring that applications are deployed reliably and can scale dynamically.
Additionally, I am skilled in implementing security features such as OAuth 2.0 and JWT for authentication and authorization.
I have worked in Agile environments, participating in Scrum and Kanban methodologies. I actively engage in all Agile ceremonies such as sprint planning, retrospectives, and daily stand-ups to ensure the smooth and timely delivery of projects.
Project
In my recent role at Vanguard, I worked on the development of the Returned Post Office (RPO) Management and Address Verification System. The primary goal of this project was to automate the process of handling returned mail and address verification in compliance with SEC regulations, while streamlining account restrictions based on address status.
As part of the backend development team, I designed and implemented microservices using Java and Spring Boot. These microservices interacted with AWS services such as Lambda, SQS, SNS, and DynamoDB to manage account restrictions and update investor information in real time. I was responsible for ensuring that the system could scale effectively by using AWS serverless architecture with Lambda, and I leveraged DynamoDB to store account restrictions with a Time to Live (TTL) attribute for automatic expiration of records.
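For illustration, here is a hedged sketch of how a restriction record with a TTL attribute might be written using the AWS SDK for Java v2 (the table name, attribute names, and 30-day expiry policy are hypothetical, not Vanguard's actual schema):

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.Map;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
import software.amazon.awssdk.services.dynamodb.model.PutItemRequest;

public class RestrictionWriter {
    private final DynamoDbClient dynamo = DynamoDbClient.create();

    /** Stores an account restriction that DynamoDB TTL will expire automatically. */
    public void putRestriction(String accountId, String reason) {
        // TTL attributes must hold an epoch-seconds timestamp; here the record
        // expires 30 days after creation (hypothetical policy).
        long expiresAt = Instant.now().plus(30, ChronoUnit.DAYS).getEpochSecond();

        dynamo.putItem(PutItemRequest.builder()
                .tableName("account-restrictions")            // hypothetical table name
                .item(Map.of(
                        "accountId", AttributeValue.fromS(accountId),
                        "reason",    AttributeValue.fromS(reason),
                        "ttl",       AttributeValue.fromN(Long.toString(expiresAt))))
                .build());
    }
}
```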
I also integrated external services like LXNX for address verification, which helped automate the process of updating investor addresses and removing account restrictions. The system was designed to process real-time updates and run scheduled batch jobs via ECS for expired records. Additionally, we implemented fraud and ineligibility checks to ensure only valid accounts underwent address verification.
One of the key highlights of this project was the use of GraphQL to design efficient APIs for the frontend to query and retrieve relevant data with minimal requests. This helped optimize data retrieval and improve the system’s performance. Additionally, by leveraging AWS services and Spring Boot, we were able to reduce operational overhead and enhance system reliability and scalability.
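A minimal sketch of that pattern, using Spring for GraphQL annotations (the query, type, and field names are illustrative assumptions, not the production schema):

```java
import org.springframework.graphql.data.method.annotation.Argument;
import org.springframework.graphql.data.method.annotation.QueryMapping;
import org.springframework.stereotype.Controller;

@Controller
public class InvestorGraphqlController {

    /** Hypothetical projection of investor data; mirrors the GraphQL type. */
    public record Investor(String id, String name, String addressStatus) {}

    /**
     * Backs a schema entry like: type Query { investor(id: ID!): Investor }
     * The client asks only for the fields it needs, e.g.
     *   query { investor(id: "123") { name addressStatus } }
     */
    @QueryMapping
    public Investor investor(@Argument String id) {
        // In the real service this would call the data layer; stubbed here.
        return new Investor(id, "Jane Doe", "VERIFIED");
    }
}
```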
This project significantly reduced manual intervention, increased operational efficiency, and ensured compliance with industry regulations. It also provided a scalable framework that can accommodate future enhancements as Vanguard continues to expand its services.
The deployment process for the Vanguard RPO Management and Address Verification System leveraged a highly scalable and reliable cloud-based architecture, ensuring seamless integration of various system components. Kubernetes (EKS) was utilized to manage the deployment of containerized microservices, providing automated scaling, load balancing, and self-healing capabilities. Using Docker, the application components, including Java-based microservices and batch jobs, were containerized to ensure consistent environments across development, staging, and production.
Deployment pipelines were automated using Jenkins, integrated with Terraform to provision and manage infrastructure on AWS. Kubernetes manifests, written in YAML, defined resource configurations, including Pods, Services, Deployments, and ConfigMaps, enabling smooth orchestration of workloads. Ingress Controllers managed external traffic, routing it to the appropriate services within the Kubernetes cluster, while AWS Elastic Load Balancer (ELB) ensured high availability and efficient traffic distribution.
The batch jobs, running on Amazon ECS, were triggered on a scheduled basis to process expired records from DynamoDB, while microservices deployed on Kubernetes interacted with SNS and SQS for real-time messaging. To enhance security, sensitive information such as database credentials and API keys was stored in AWS Secrets Manager and injected into the Kubernetes Pods as environment variables.
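As an illustration of the messaging side, here is a minimal sketch of an SQS consumer, assuming Spring Cloud AWS (the queue name and listener class are hypothetical):

```java
import io.awspring.cloud.sqs.annotation.SqsListener;
import org.springframework.stereotype.Component;

@Component
public class AddressUpdateListener {

    /**
     * Consumes address-verification events from SQS; the queue name is a
     * hypothetical placeholder. Spring Cloud AWS handles polling, message
     * conversion, and deletion on successful return.
     */
    @SqsListener("address-verification-events")
    public void onAddressUpdate(String payload) {
        // Parse the event and update the account restriction accordingly.
        System.out.println("Received address update: " + payload);
    }
}
```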
Continuous monitoring was implemented using AWS CloudWatch and Prometheus, integrated with Grafana for visualization and real-time alerting. Logs from Kubernetes clusters were shipped to Elasticsearch for centralized analysis and debugging, ensuring quick resolution of issues. This automated and robust deployment strategy not only reduced manual overhead but also ensured compliance, high availability, and scalability, meeting Vanguard’s business and regulatory requirements.
By leveraging Kubernetes and a DevOps-centric approach, the deployment process was optimized for performance, resiliency, and future scalability.
Project Overview:
The project focuses on managing and sharing content, data, and client information for CME financial advisors and sales teams. It integrates multiple systems, such as Seismic and TeamSite, to provide access to marketing materials and financial data. The content is pulled from different sources and made available for decision-making through user-friendly interfaces. Data governance ensures that third-party data is processed and stored securely. The system streamlines workflows, enabling sales teams to order content, track materials, and enhance client interactions.
The project revolves around content management, data governance, and access management for CME’s financial advisors and sales teams. These teams need access to client-related data, asset information, market research, and marketing content to assist in their advisory roles and decision-making. The project integrates multiple applications and external vendors for handling and processing this data efficiently.
Key Components:
1. Client Profile (FDE):
Description: This component displays client-related data, including asset information and relevant financial data.
Data Sources: Data is pulled from internal sources like XDE and XD3, as well as third-party systems, to provide comprehensive client profiles.
Technology:
- Backend: Spring Boot microservices expose APIs to fetch and process client data (see the sketch after this component).
- Frontend: Angular for the user interface, providing an interactive, dynamic experience for displaying client and asset data.
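A minimal sketch of such an endpoint (path, type, and fields are illustrative assumptions):

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/clients")
public class ClientProfileController {

    /** Hypothetical view combining client and asset data for the Angular UI. */
    public record ClientProfile(String clientId, String name, double totalAssets) {}

    /** Returns the aggregated profile that the FDE screen renders. */
    @GetMapping("/{clientId}/profile")
    public ClientProfile getProfile(@PathVariable String clientId) {
        // In production this would merge data from XDE/XD3 and third-party
        // sources; stubbed here for illustration.
        return new ClientProfile(clientId, "Acme Client", 1_250_000.00);
    }
}
```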
2. Data Governance (CDAO):
Description: The CDAO team processes and manages incoming data packages from third-party vendors such as Morgan Stanley and UBS. The data is refined and stored in an S3 bucket for further consumption.
Data Processing:
- ETL jobs are implemented to clean and process incoming data, ensuring it is ready for use.
- Data updates can cause delays in the system, affecting the timely consumption of updated data.
Technology:
- Backend: Python for ETL jobs and data-processing scripts that interact with AWS services like S3 and AWS Lambda for serverless data processing.
- AWS S3: Storage for processed data, allowing easy access for further use by the application.
3. Content Management (Seismic, TeamSite, XSV):
Description: Seismic is the content management platform that enables the sales team to access marketing materials and relevant content for clients. The content is pulled into Seismic through TeamSite and synchronized via XSV (SYNCHER).
Workflow: Sales executives can browse, track, and share content, both electronically and physically, with clients.
Technology:
- Backend: Spring Boot microservices integrate Seismic, TeamSite, and XSV through APIs and synchronization mechanisms.
- Database: MySQL or PostgreSQL for managing content metadata, orders, and user interactions.
- Frontend: Angular for smooth content browsing and order tracking.
4. Order Management:
Description: The Literature Order History (LOL) system tracks orders and content requests made by the sales team. This allows them to request materials for clients and ship the content.
Technology:
- Backend: Spring Boot microservices to manage order statuses and connect with Seismic’s ordering system (see the sketch after this component).
- Database: SQL databases (e.g., MySQL or PostgreSQL) to track orders, store content metadata, and manage order histories.
- Frontend: Angular for an intuitive order-tracking interface for sales teams.
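A minimal sketch of how order tracking might be modeled with Spring Data JPA (entity, fields, and status values are hypothetical):

```java
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import java.time.Instant;
import java.util.List;
import org.springframework.data.jpa.repository.JpaRepository;

/** Hypothetical order record tracked in MySQL/PostgreSQL via JPA. */
@Entity
public class LiteratureOrder {
    @Id @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;
    private String contentId;   // Seismic content identifier
    private String requestedBy; // sales executive placing the order
    private String status;      // e.g. PLACED, SHIPPED, DELIVERED
    private Instant createdAt = Instant.now();

    protected LiteratureOrder() {} // required by JPA

    public LiteratureOrder(String contentId, String requestedBy) {
        this.contentId = contentId;
        this.requestedBy = requestedBy;
        this.status = "PLACED";
    }
}

/** Spring Data repository for querying a sales executive's order history. */
interface LiteratureOrderRepository extends JpaRepository<LiteratureOrder, Long> {
    List<LiteratureOrder> findByRequestedByOrderByCreatedAtDesc(String requestedBy);
}
```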
5. Financial Advisor Content Management:
Description: This component supports financial advisors working with external clients, such as Registered Investment Advisors (RIAs), bank advisors, and broker-dealers. It allows them to access Vanguard’s marketing materials, fund information, and educational content to share with clients.
Technology:
- Backend: Spring Boot services to integrate content from various sources (e.g., Seismic, XSV).
- AWS: Leverages AWS S3 for content storage, ensuring scalability and fast content delivery to financial advisors.
- Frontend: Angular user interfaces providing seamless content access for advisors.
- Microservices: Expose APIs for integrating content management and financial tools, ensuring smooth access and security.
6. Workflow:
Description: The workflow allows the sales team to select and order content from Seismic. It ensures that the right materials are shared with financial advisors to support their client interactions.
Technology:
- Backend: Spring Boot microservices to handle content management processes and workflow automation.
- Integration: Seamless integration with Seismic, TeamSite, XSV, and AWS S3 to ensure content is correctly synchronized and shared.
- Database: SQL databases (e.g., MySQL) to store and track orders, content, and user interactions.
Tech Stack:
1. Backend:
- Spring Boot for microservices development, managing business logic, and handling requests to and from external vendors and internal data systems.
- Java 11/17 for backend services, leveraging features such as lambda expressions, streams, and enhanced performance.
- Python for ETL jobs, data processing, and interfacing with AWS services.
- AWS services like S3, Lambda, EC2, CloudWatch, and IAM for scalable, cloud-native solutions.
- Microservices architecture to enable decoupled, scalable service interaction, ensuring high availability and flexibility.
2. Frontend:
- Angular (latest version) to develop dynamic single-page applications (SPAs) for client profiles, content browsing, and order management.
- Bootstrap and Material Design for responsive, modern UI components.
3. Database:
- MySQL/PostgreSQL for relational data storage, managing orders, content metadata, user data, and client information.
- AWS S3 for storing large datasets and files, such as marketing content and financial data.
4. Integration:
- The CDAO team processes data through ETL pipelines, refining and storing it on S3 for further use by the system.
5. Security:
- OAuth 2.0 and JWT for securing access to applications, ensuring only authorized users can access sensitive data.
- AWS IAM for managing permissions and access to resources in AWS.
6. CI/CD:
- Jenkins, Maven, and GitLab for continuous integration and deployment, ensuring efficient code release cycles.
7. Testing:
- JUnit and Mockito for unit testing backend services (see the sketch after this list).
- Cypress for end-to-end testing and frontend reliability.
- SonarQube for code quality analysis and maintainability.
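A minimal example of the JUnit 5 and Mockito style used for backend unit tests (the service and repository here are hypothetical stand-ins):

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;

class OrderServiceTest {

    /** Hypothetical collaborator and service under test. */
    interface OrderRepository { String statusOf(long orderId); }
    static class OrderService {
        private final OrderRepository repo;
        OrderService(OrderRepository repo) { this.repo = repo; }
        String describe(long id) { return "Order " + id + ": " + repo.statusOf(id); }
    }

    @Test
    void describesOrderUsingMockedRepository() {
        // Mock the repository so the test exercises only the service logic.
        OrderRepository repo = mock(OrderRepository.class);
        when(repo.statusOf(42L)).thenReturn("SHIPPED");

        assertEquals("Order 42: SHIPPED", new OrderService(repo).describe(42L));
    }
}
```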
Role as a Java Full Stack Developer:
As a Java Full Stack Developer on this project, I am responsible for:
- Backend Development: Developing Spring Boot microservices to handle content management, client profile data, and order management, and to integrate with third-party vendors.
- Frontend Development: Building dynamic UIs with Angular to display financial data, client information, and marketing content.
- Data Integration: Writing ETL scripts in Python to process data from third-party vendors, refine it, and store it on AWS S3 for further use.
- Security and Authentication: Implementing OAuth 2.0 and JWT for secure access management to sensitive data and services (see the sketch after this list).
- Collaboration: Working with cross-functional teams to gather requirements, integrate services, and ensure the proper flow of data across the content management, data governance, and order management systems.
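A hedged sketch of a typical Spring Security resource-server setup for OAuth 2.0/JWT (the paths are illustrative; the issuer URI would live in application.yml under spring.security.oauth2.resourceserver.jwt.issuer-uri):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity;
import org.springframework.security.web.SecurityFilterChain;

@Configuration
@EnableWebSecurity
public class SecurityConfig {

    /**
     * Requires a valid JWT (issued by the configured OAuth 2.0 provider)
     * for every request except health checks.
     */
    @Bean
    public SecurityFilterChain filterChain(HttpSecurity http) throws Exception {
        http
            .authorizeHttpRequests(auth -> auth
                .requestMatchers("/actuator/health").permitAll() // illustrative exemption
                .anyRequest().authenticated())
            .oauth2ResourceServer(oauth2 -> oauth2.jwt(jwt -> {}));
        return http.build();
    }
}
```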
CME - I currently work as a Senior Java Developer at CME Chicago, with a primary focus on Java, Python, and Spring Boot. In my role, I have incorporated Golang into the development of microservices deployed on Google Cloud Platform (GCP). One of the major projects I contributed to involved designing scalable microservices for real-time processing and transaction handling within a financial application. My focus was on enhancing the speed and reliability of transaction processing.
In this project, I helped build and optimize a data pipeline using Apache Kafka and Google Pub/Sub, contributing to real-time processing improvements and ensuring seamless integration with backend systems. I also utilized Golang to implement low-latency services, with one of the core modules being developed using Golang to maximize performance.
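To illustrate the Kafka side of that pipeline, here is a minimal consumer sketch using the plain Apache Kafka Java client (the broker address, topic, and group ID are placeholders, not production values):

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class TransactionConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "txn-processor");           // illustrative group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("transactions")); // illustrative topic name
            while (true) {
                // Poll in a tight loop and hand each record to downstream processing.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("key=%s value=%s offset=%d%n",
                            record.key(), record.value(), record.offset());
                }
            }
        }
    }
}
```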
As part of the migration to GCP, I designed and implemented data pipelines utilizing BigQuery, Dataflow, and Pub/Sub. In addition to my backend development work, I collaborated closely with the DevOps teams to integrate CI/CD pipelines, ensuring efficient and seamless application deployment. This role has given me extensive hands-on experience with Golang, leveraging its strengths to optimize critical performance components in real-time applications.
From an AWS perspective, I work across a range of services. I use AWS Lambda for real-time processing in serverless applications, and Amazon EKS to orchestrate Docker containers in a scalable way. We also made extensive use of API Gateway to expose RESTful APIs securely and ensure reliable access to the microservices. For event-driven messaging I work with Amazon SQS; on the storage side, we utilize Amazon S3 and DynamoDB; and for monitoring, I work with AWS CloudWatch.
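As a concrete example, here is a hedged sketch of the kind of Java Lambda handler that consumes SQS events (the class name and wiring are illustrative, not actual production code):

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;

public class SqsEventHandler implements RequestHandler<SQSEvent, Void> {

    /**
     * Invoked by AWS Lambda for each batch of SQS messages; a failed batch
     * returns to the queue for retry per the queue's redrive policy.
     */
    @Override
    public Void handleRequest(SQSEvent event, Context context) {
        for (SQSEvent.SQSMessage message : event.getRecords()) {
            context.getLogger().log("Processing message: " + message.getBody());
            // Business logic (e.g., updating DynamoDB) would go here.
        }
        return null;
    }
}
```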