
Software Development Spring Boot

Location:
Piscataway, NJ
Salary:
165000
Posted:
June 11, 2025


Resume:

Irfan Ahmed

Cell: 908-***-**** Email: **********.***@*****.***

LinkedIn: https://www.linkedin.com/in/irfan-ahmed-1353b941/

Professional Summary

Seasoned professional with more than a decade of experience in software analysis, design, development, and support of web and enterprise applications in financial and telecom domains such as Evault ERP, retirement planning, and telecommunications billing.

Experienced in event-driven microservices design and 12-factor app development standards.

Experienced in software architecture and applying best practices, with practical experience developing large-scale applications.

Experienced in cloud-native environments such as AWS and Spring Cloud, with hands-on experience using Cognito, SQS, S3, EC2, RDS, Lambda, Zuul, Eureka, and Hystrix.

Experienced in software development life cycle models such as Waterfall and Agile (Scrum and Kanban).

Experienced in both Windows and Linux based development environments.

Proficient in Java 8 features such as lambdas, functional interfaces, the Stream API, and Collection API improvements.

Experienced in performance enhancement and code optimization techniques.

Experienced in design and implementation of web-based applications using the Struts, Spring, Spring Boot, and JSF frameworks.

Experienced in design and implementation of Hibernate/Spring Data JPA entities and repositories, and involved in developing the service layer architecture.

Excellent knowledge of distributed applications; experienced in developing middleware based on WSO2 ESB for integration in SOA environments and WSO2 API Gateway for API-based front ends.

Excellent experience developing APIs with Java-based enterprise technologies and tools (Java 8 and above), microservices architecture, Spring, Apache Kafka, Apache Camel, and REST.

Experienced in designing REST/SOAP-based services with authentication and authorization mechanisms.

Experienced in mock-based testing using JUnit.

Experienced in building environments on CI/CD pipelines using Jenkins, Azure Pipelines.

Experienced in Node.js for interactive web client components.

Expertise in the React JS framework for developing single-page applications (SPAs).

Experienced working with React components, forms, events, keys, router, and Flux concepts.

Experienced in working with Redux architecture, applying OOP concepts to improve web app performance.

Designed and implemented a real-time data streaming pipeline using Apache Kafka for ingesting time series data from various sources. Documented the data streaming architecture, processes, and best practices for knowledge sharing within the team.

Experienced Java Engineer with 15+ years in building high-performance, distributed microservices, specializing in real-time data processing using Java 17, Spring Boot, Kafka, and multithreading. Exposure to financial services workflows including US Treasury trading (RFQ, D2D/D2C models) and a strong grasp of latency-sensitive, fault-tolerant design patterns suitable for electronic trading platforms.

Experienced in building modern front-end applications using React, Angular, and Vue.js, leveraging component-based architecture and reusable UI elements.

Proficient in integrating front-end applications with RESTful APIs and WebSocket-based real-time communication using client-server architectural principles.

Skilled in JavaScript and TypeScript with strong understanding of ECMAScript standards and front-end performance optimization techniques.

Familiar with modern API design standards including REST and GraphQL, ensuring efficient client-server data interactions.

Hands-on experience with cloud platforms such as AWS and Azure for deploying scalable and resilient front-end and backend services.

Proficient in implementing and optimizing Elasticsearch and the ELK stack (Elasticsearch, Logstash, Kibana) to enhance data retrieval, visualization, and analysis capabilities. Adept at developing scalable search solutions, troubleshooting performance issues, and ensuring the seamless integration of log management systems.

Technical Skills

Web Server-Side Technologies: Spring Boot, Spring MVC, Struts, JSF and JSP/Servlets

Web Client-Side Technologies: Node.js, jQuery, React JS, Angular 7/12/13, JavaScript (ES6+), TypeScript, Dojo.

Languages: Java, JavaScript, TypeScript (with OOP and generics support).

Cloud: Spring Cloud, AWS (EC2, Lambda, S3, RDS, Cognito), Azure (Kubernetes, DevOps Pipelines), GCP (basic exposure)

Aspect-Oriented Programming: AspectJ, Spring AOP

Application Frameworks: Spring Boot, Spring MVC

ORM Framework: Spring Data JPA/Hibernate.

Middle Tier Technologies: WSO2 ESB.

Design Patterns: Singleton, DAO, Observer, Command, Proxy, Façade, Builder, Factory

Unit Test Framework (TDD): JUnit

Operating Systems: Linux, Windows

Databases: MySQL, PostgreSQL, MS SQL Server, AWS RDS, Oracle and Cassandra.

IDE: IntelliJ IDEA, Eclipse, STS, NetBeans, and the WSO2 integration designer for designing ESB and mediation flows.

Version Control Tools: Git, SVN

Build Tools: Maven, Gradle, ANT

Web/Application Servers: WebSphere, WebLogic, JBoss.

Servlet Container: Apache Tomcat (6, 7, 8, 9).

Reporting: Jasper Reports, iReport

Bug Tracking: JIRA

Messaging Systems: IBM WebSphere MQ, ActiveMQ, Kafka, SQS.

Log Management and Analysis: ELK (Elasticsearch, Logstash, Kibana).

API & Communication: RESTful APIs, GraphQL, WebSockets, JSON over HTTP.

Professional Experience

Company: TCS (UPS client) Dec 2022-Present

Full-Stack Java Developer (Team Lead / Architect)

NPT (Network Planning Tool) is a tool to schedule and manage the delivery of packages throughout the country. Load-picking centers, driver scheduling, and route calculation information are managed by this tool, along with the whole network of locations and delivery centers. Microservices are built using Spring Boot and Java 11, with Angular as the front end and OpenShift as the container management tool. Working as Tech Lead on the North American team.

Key Responsibilities:

Design and implement components and the back end according to requirements.

Writing code using cutting-edge J2EE, Spring Boot, and Angular 13 technologies.

Integrating back-end code with multiple database types such as Couchbase and SQL Server.

Optimizing algorithms using Java lambda expressions and the Stream API.

Assisting the lead developer and other developers in composing the load-testing plan and directly writing scripts to implement it.

Writing reusable Angular components, injectable services, and request/response interceptors.

Assisting the project management office and testers in creating test plans and test scenarios covering new code and changes to existing code.

Providing input to the technical lead and architect on development solutions; monitoring and supporting high-volume batch file and web service integrations.

Designing applications based on cutting-edge technologies; conceptualizing and creating wireframes, graphical templates, and other graphical elements for various Angular components and Node.js templates.

Designed and integrated front-end applications with real-time WebSocket-based backends for dynamic user interfaces.

Developed and consumed GraphQL APIs for selective data retrieval, improving payload efficiency and reducing front-end complexity.

Utilized TypeScript in Angular and React applications to enhance maintainability and catch compile-time errors early.

Designed and implemented a real-time data streaming pipeline using Apache Kafka for ingesting time series data from various sources.

Developed Kafka producers and consumers to ensure data ingestion and processing reliability and scalability.

Leveraged Kafka Streams to perform real-time data transformations, aggregations, and filtering for time series data.
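
A minimal sketch of the kind of Kafka Streams topology this involved, assuming illustrative topic names, a one-minute window, and simple per-key sums (the actual project used its own topics, serialization, and aggregation logic):

    import java.time.Duration;
    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.KeyValue;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Grouped;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.Produced;
    import org.apache.kafka.streams.kstream.TimeWindows;

    public class TimeSeriesAggregator {

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "timeseries-aggregator");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.Double().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // Raw readings keyed by source id; the value is a numeric measurement.
            KStream<String, Double> readings =
                    builder.stream("timeseries-raw", Consumed.with(Serdes.String(), Serdes.Double()));

            // Filter bad samples, then roll readings up into one-minute per-key sums.
            readings
                .filter((sourceId, value) -> value != null && !value.isNaN())
                .groupByKey(Grouped.with(Serdes.String(), Serdes.Double()))
                .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(1)))
                .reduce(Double::sum)
                .toStream()
                .map((windowedKey, sum) -> KeyValue.pair(windowedKey.key(), sum))
                .to("timeseries-1m-sums", Produced.with(Serdes.String(), Serdes.Double()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
            streams.start();
        }
    }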

Led the optimization of Kafka configuration settings, resulting in a 50% increase in message throughput and allowing the system to handle a higher volume of data.

Implemented Kafka partitioning strategy enhancements, enabling seamless scaling of the platform to accommodate a 3x growth in data volume without performance degradation.

Implemented Kafka replication improvements, reducing message loss by 75% during broker failures, thereby increasing system reliability and data integrity.

Integrated MongoDB as a storage layer for time series data, optimizing data retrieval and storage mechanisms.

Implemented data serialization and deserialization techniques to ensure data consistency and compatibility between Kafka and MongoDB.

Designed and maintained data schema for MongoDB collections to efficiently store time series data.

Collaborated with cross-functional teams to troubleshoot and resolve data processing issues and optimize system performance.

Ensured data pipeline security and access controls in compliance with company policies.

Monitored and fine-tuned the Kafka and MongoDB clusters for optimal performance and scalability.

Documented the data streaming architecture, processes, and best practices for knowledge sharing within the team.

Integrated Spring Boot microservices with Amazon RDS (MySQL/PostgreSQL) for scalable, cloud-hosted relational data storage.
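
As a rough illustration, the RDS integration comes down to pointing the Spring datasource at the RDS endpoint and exposing data through Spring Data JPA; the entity, repository, and property names below are assumptions for the sketch (Spring Boot 3 / jakarta.persistence shown):

    // application.properties (values are placeholders):
    //   spring.datasource.url=jdbc:postgresql://<rds-endpoint>:5432/npt
    //   spring.datasource.username=${DB_USER}
    //   spring.datasource.password=${DB_PASSWORD}
    //   spring.jpa.hibernate.ddl-auto=validate

    import jakarta.persistence.Entity;
    import jakarta.persistence.GeneratedValue;
    import jakarta.persistence.GenerationType;
    import jakarta.persistence.Id;
    import java.util.List;
    import org.springframework.data.jpa.repository.JpaRepository;

    @Entity
    public class Shipment {
        @Id
        @GeneratedValue(strategy = GenerationType.IDENTITY)
        private Long id;
        private String trackingNumber;
        private String status;

        protected Shipment() {}   // required by JPA

        public Long getId() { return id; }
        public String getTrackingNumber() { return trackingNumber; }
        public String getStatus() { return status; }
    }

    // Spring Data JPA derives the query from the method name; an index on "status"
    // in the RDS schema keeps this lookup fast for high-throughput APIs.
    interface ShipmentRepository extends JpaRepository<Shipment, Long> {
        List<Shipment> findByStatus(String status);
    }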

Designed and optimized database schema on RDS to support high-throughput APIs, reducing query latency by 30%.

Configured IAM-based access, automatic backups, and multi-AZ deployments for high availability and disaster recovery.

Designed and optimized multi-threaded Java microservices for near real-time processing in a latency-sensitive environment, leveraging Kafka for event-driven architecture—similar to RFQ-based interactions in US Treasury trading.

Integrated with internal systems in a D2D-like model, supporting asynchronous communication and failover resilience, aligning with Treasury market reliability expectations.

Participated in architecting request-response APIs in a microservices environment with throughput, concurrency, and SLA focus, akin to high-volume quote handling in electronic trading.

Tools & Technologies:

Java, J2EE, Spring Boot, Docker, OpenShift, Angular 7.0.x/12.0.x/13.0.x, Node.js 12.14.x/14.15.x, Spring Tool Suite, JDK 17, JDK 11, MongoDB, Apache Kafka.

Company: TCS (Barclays Bank) RFT NYK

Sr. Java Developer Full Stack

RFT (Risk, Finance, and Treasury) is the group responsible for managing risk, financial operations, and treasury activities within the bank. Microservices are built using Spring Boot and Java 11, with React JS as the front end and ADO (Azure DevOps) for CI/CD.

Key Responsibilities:

Developed and maintained React-based front-end modules for treasury and regulatory reporting dashboards.

Built Spring Boot microservices to support real-time risk calculations, market data ingestion, and reporting APIs.

Implemented OAuth2/JWT-based security across microservices and integrated with enterprise SSO.
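
A minimal sketch of that kind of resource-server setup, assuming Spring Boot 3 / Spring Security 6; the permitted paths are illustrative, and the issuer is supplied via spring.security.oauth2.resourceserver.jwt.issuer-uri:

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.security.config.Customizer;
    import org.springframework.security.config.annotation.web.builders.HttpSecurity;
    import org.springframework.security.web.SecurityFilterChain;

    @Configuration
    public class ResourceServerConfig {

        @Bean
        SecurityFilterChain securityFilterChain(HttpSecurity http) throws Exception {
            http
                // Health checks stay open; everything else requires a valid token.
                .authorizeHttpRequests(auth -> auth
                    .requestMatchers("/actuator/health").permitAll()
                    .anyRequest().authenticated())
                // Validate incoming JWTs against the enterprise identity provider (SSO).
                .oauth2ResourceServer(oauth2 -> oauth2.jwt(Customizer.withDefaults()));
            return http.build();
        }
    }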

Utilized Kafka for event-driven communication between trade systems and risk engines.

Developed and optimized PostgreSQL queries and stored procedures for reporting services.

Designed reusable React components for interactive visualizations and risk factor breakdowns.

Integrated Prometheus + Grafana for monitoring microservices and implemented alerting for SLA breaches.

Participated in Agile ceremonies (daily standups, sprint planning, retrospectives) and worked closely with product owners in UK and India.

Tools & Technologies:

Java 17, Spring Boot, Spring Security, Spring Data JPA, RESTful APIs, React.js, Redux, JavaScript (ES6+), HTML5, CSS3, Bootstrap, Material-UI, Kafka, RabbitMQ, Eureka, Feign Client, OpenFeign, ELK Stack (Elasticsearch, Logstash, Kibana), Prometheus, Grafana, Splunk, OAuth2, JWT, SSO Integration, JUnit 5, Mockito, Postman, Swagger/OpenAPI.

Company: TCS (Backbase platform, Texas Farm Credit Bank)

Sr. Java Developer Full Stack

Customized and extended Backbase Digital Banking Platform for retail and commercial users.

Built responsive React components for customer onboarding, account management, and loan servicing workflows.

Developed Spring Boot microservices to integrate with core banking systems and external credit bureaus.

Implemented RESTful APIs for digital document management, payments, and loan applications.

Integrated Elasticsearch for quick access to customer financial history and transaction search.
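
A hedged sketch of that kind of transaction search using the Elasticsearch high-level REST client (appropriate for the 7.x line); the index and field names are assumptions:

    import java.io.IOException;
    import org.apache.http.HttpHost;
    import org.elasticsearch.action.search.SearchRequest;
    import org.elasticsearch.action.search.SearchResponse;
    import org.elasticsearch.client.RequestOptions;
    import org.elasticsearch.client.RestClient;
    import org.elasticsearch.client.RestHighLevelClient;
    import org.elasticsearch.index.query.QueryBuilders;
    import org.elasticsearch.search.builder.SearchSourceBuilder;

    public class TransactionSearch {

        public static void main(String[] args) throws IOException {
            try (RestHighLevelClient client = new RestHighLevelClient(
                    RestClient.builder(new HttpHost("localhost", 9200, "http")))) {

                // Transactions for one customer whose description mentions "loan payment".
                SearchSourceBuilder source = new SearchSourceBuilder()
                    .query(QueryBuilders.boolQuery()
                        .filter(QueryBuilders.termQuery("customerId", "12345"))
                        .must(QueryBuilders.matchQuery("description", "loan payment")))
                    .size(20);

                SearchRequest request = new SearchRequest("transactions").source(source);
                SearchResponse response = client.search(request, RequestOptions.DEFAULT);
                response.getHits().forEach(hit -> System.out.println(hit.getSourceAsString()));
            }
        }
    }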

Worked on CI/CD pipelines using Azure DevOps and Docker-based deployments on Kubernetes.

Ensured accessibility and cross-browser compatibility for React UI components.

Collaborated with Backbase Solution Architects and TFCB stakeholders to align features with regulatory and compliance needs.

Tools & Technologies:

Java 8, Java 11, Spring Boot, Spring Data JPA, RESTful APIs, Backbase SDK, DBS Journeys, Identity & Access, Widget Development, React.js, Redux, JavaScript (ES6+), HTML5, CSS3, Bootstrap, Material-UI, Kafka, Eureka, Feign Client, ELK Stack (Elasticsearch, Logstash, Kibana).

Company: Macrosoft (AT&T, Bedminster Office, https://www.att.com/) June 2019-Dec 2022

Senior Java Developer Full Stack (Team Lead / Architect)

ENMT is a user-experience web portal for AT&T users. The portal hosts many applications on a common code base of back-end Spring Boot APIs and serves as a custom reporting tool for upper management, with complex front-end templates and back-end database queries. The tool was migrated from a JSP-based Spring MVC implementation to a Spring MVC back end with an Angular front end, and it runs as a Dockerized deployment managed by Kubernetes on the Azure cloud.

Key Responsibilities:

Design and implement components and the back end according to requirements.

Senior developer on the team, working on-site in the client's office with direct communication with the client.

Writing code using cutting-edge J2EE, Spring Boot, and Angular 7 technologies.

Integrating back-end code with multiple database types such as MySQL, Oracle, Hadoop, and Snowflake.

Optimizing algorithms using Java lambda expressions and the Stream API.

Assisting the lead developer and other developers in composing the load-testing plan and directly writing scripts to implement it.

Implemented SSO (single sign-on) security for the application according to AT&T standards using Spring Security.

Designed and implemented WebSocket-based real-time alerting system to deliver hurricane and severe weather service notifications to AT&T-connected devices across affected regions.

Built scalable WebSocket endpoints using Spring Boot and STOMP protocol, enabling persistent, low-latency connections with mobile and web clients.
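
A minimal version of that STOMP broker configuration; the endpoint path and destination prefixes are illustrative, and the real deployment sat behind AT&T's TLS and session-validation layers:

    import org.springframework.context.annotation.Configuration;
    import org.springframework.messaging.simp.config.MessageBrokerRegistry;
    import org.springframework.web.socket.config.annotation.EnableWebSocketMessageBroker;
    import org.springframework.web.socket.config.annotation.StompEndpointRegistry;
    import org.springframework.web.socket.config.annotation.WebSocketMessageBrokerConfigurer;

    @Configuration
    @EnableWebSocketMessageBroker
    public class AlertWebSocketConfig implements WebSocketMessageBrokerConfigurer {

        @Override
        public void registerStompEndpoints(StompEndpointRegistry registry) {
            // Clients connect here; SockJS is a fallback for browsers without native WebSocket.
            registry.addEndpoint("/ws/alerts").setAllowedOriginPatterns("*").withSockJS();
        }

        @Override
        public void configureMessageBroker(MessageBrokerRegistry registry) {
            // Server pushes alerts to /topic/** destinations; clients send to /app/** destinations.
            registry.enableSimpleBroker("/topic");
            registry.setApplicationDestinationPrefixes("/app");
        }
    }

Alert producers can then push region-specific payloads with SimpMessagingTemplate.convertAndSend("/topic/alerts/" + region, payload).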

Integrated with internal emergency alert APIs and external data sources such as NOAA feeds, broadcasting location-specific alerts using geo-tagged WebSocket sessions.

Leveraged Kafka for ingesting high-volume alert payloads and Redis for managing distributed client session states to ensure reliable message routing across WebSocket clusters.

Ensured resilience by implementing failover mechanisms and fallback polling for clients with intermittent connectivity, maintaining compliance with disaster recovery protocols.

Developed front-end listeners in Angular for real-time UI updates, including progressive disclosure of alerts, countdowns, and actionable buttons (e.g., “Acknowledge” or “Redirect to Safety Tips”).

Collaborated with security and compliance teams to enforce encrypted communication (TLS) and session validation for sensitive regions and critical alert levels.

Writing reusable Angular components, injectable services, and request/response interceptors.

Assisting the project management office and testers in creating test plans and test scenarios covering new code and changes to existing code.

Providing input to the technical lead and architect on development solutions; monitoring and supporting high-volume batch file and web service integrations.

Designing applications based on cutting-edge technologies; conceptualizing and creating wireframes, graphical templates, and other graphical elements for various Angular components and Node.js templates.

Designed and integrated front-end applications with real-time WebSocket-based backends for dynamic user interfaces.

Developed and consumed GraphQL APIs for selective data retrieval, improving payload efficiency and reducing front-end complexity.

Utilized TypeScript in Angular and React applications to enhance maintainability and catch compile-time errors early.

Developed and maintained real-time data processing solutions using Apache Kafka, ensuring seamless data flow and low-latency processing for critical applications.

Implemented Kafka Connect connectors for integrating various data sources into the Kafka ecosystem, facilitating data ingestion and synchronization.

Implemented Kafka storage optimizations, resulting in a 40% reduction in disk space utilization and associated cost savings in cloud infrastructure.

Implemented Kafka security protocols and access controls, achieving compliance with industry regulations (e.g., GDPR, HIPAA) and enhancing data protection measures.

Used Ansible to automate the provisioning and configuration of servers, networking devices, and cloud infrastructure.

Used Confluent Control Center for Kafka monitoring and observability.

Leveraged GraphQL to design and optimize APIs, enabling clients to fetch tailored data efficiently and reducing over-fetching issues.
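
One way to expose such a query with Spring for GraphQL, sketched here against an illustrative schema (type Query { report(id: ID!): Report }); the types shown are not the actual project model:

    import org.springframework.graphql.data.method.annotation.Argument;
    import org.springframework.graphql.data.method.annotation.QueryMapping;
    import org.springframework.stereotype.Controller;

    @Controller
    public class ReportController {

        // Resolves the "report" query; clients select only the fields they need,
        // which is what avoids the over-fetching seen with fixed REST payloads.
        @QueryMapping
        public Report report(@Argument String id) {
            return new Report(id, "ENMT daily summary");   // placeholder lookup
        }

        public static class Report {
            private final String id;
            private final String title;

            public Report(String id, String title) {
                this.id = id;
                this.title = title;
            }

            public String getId() { return id; }
            public String getTitle() { return title; }
        }
    }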

Implemented data serialization and deserialization techniques to ensure data consistency and compatibility between Kafka and MongoDB.

Designed and maintained data schema for MongoDB collections to efficiently store time series data.

Collaborated with cross-functional teams to troubleshoot and resolve data processing issues and optimize system performance.

Ensured data pipeline security and access controls in compliance with company policies.

Monitored and fine-tuned the Kafka and MongoDB clusters for optimal performance and scalability.

Implemented and optimized Elasticsearch and the ELK stack (Elasticsearch, Logstash, Kibana) to enhance data retrieval, visualization, and analysis capabilities; developed scalable search solutions, troubleshot performance issues, and ensured seamless integration of log management systems.

Successfully deployed and configured the ELK stack for centralized logging and real-time analytics, reducing the mean time to identify and resolve issues by 40%.

Utilized Logstash to create robust data ingestion pipelines, handling various data sources and formats, resulting in a more efficient log management process.

Developed comprehensive Kibana dashboards for real-time monitoring and visualization of application metrics and logs, facilitating better decision-making and operational insights.

Designed and implemented optimized file transfer mechanisms using AWS services (e.g., S3 Transfer Acceleration, AWS DataSync, and multi-threaded upload/download strategies) for multi-TB datasets between us-east-1 and eu-west-1 regions.
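
A rough Java sketch of the multipart, parallel-upload side of this, using the AWS SDK v1 TransferManager; the bucket, key, and part sizes are placeholders, Transfer Acceleration must also be enabled on the bucket itself, and DataSync plus the Boto3 tooling sat alongside this:

    import java.io.File;
    import com.amazonaws.regions.Regions;
    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.transfer.TransferManager;
    import com.amazonaws.services.s3.transfer.TransferManagerBuilder;
    import com.amazonaws.services.s3.transfer.Upload;

    public class LargeFileUploader {

        public static void main(String[] args) throws InterruptedException {
            AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                    .withRegion(Regions.US_EAST_1)
                    .withAccelerateModeEnabled(true)   // uses the S3 accelerate endpoint
                    .build();

            TransferManager tm = TransferManagerBuilder.standard()
                    .withS3Client(s3)
                    .withMultipartUploadThreshold(64L * 1024 * 1024)   // multipart above 64 MB
                    .withMinimumUploadPartSize(16L * 1024 * 1024)      // 16 MB parts in parallel
                    .build();

            Upload upload = tm.upload("my-dataset-bucket", "datasets/large-file.bin",
                    new File("/data/large-file.bin"));
            upload.waitForCompletion();   // blocks until all parts finish, with built-in retries
            tm.shutdownNow();
        }
    }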

Tuned TCP window size, parallel streams, and multi-part uploads to maximize throughput across long-haul networks.

Used VPC endpoints, Direct Connect, and S3 Transfer Acceleration to reduce latency and increase reliability for inter-region transfers.

Leveraged AWS DataSync and custom Boto3-based scripts to automate and monitor transfers of large files (10GB+) with integrity verification and retry logic.

Worked closely with network and security teams to ensure firewall/NACL configurations, MTU size adjustments, and encryption in transit using TLS.

Implemented performance testing and benchmarking tools (e.g., iperf3, s3cmd, CloudWatch) to analyze bottlenecks and improve transfer speed by 30-40%.

Designed and optimized multi-threaded Java microservices for near real-time processing in a latency-sensitive environment, leveraging Kafka for event-driven architecture—similar to RFQ-based interactions in US Treasury trading.

Integrated with internal systems in a D2D-like model, supporting asynchronous communication and failover resilience, aligning with Treasury market reliability expectations.

Participated in architecting request-response APIs in a microservices environment with throughput, concurrency, and SLA focus, akin to high-volume quote handling in electronic trading.

Tools & Technologies:

Java, J2EE, Spring Boot, Docker, Kubernetes, AWS, Azure, Angular 7.0.x/12.0.x, Node.js 12.14.x/14.15.x, Tomcat 9.0, Spring Tool Suite, JDK 8, JDK 11, MongoDB, Apache Kafka, Elasticsearch 7.17.x, Kibana 8.10.x, Logstash 8.5.x

Company: Verizon Wireless (https://www.verizonwireless.com/) July 2018-June 2019

Senior Java Developer

Key Responsibilities:

Working on the Self-Service Framework (SSF). SSF is a server-side framework used by Android, iOS, and web clients to collect data from systems of record, acting as a middle layer between customer-facing applications and the systems of record. Verizon is building an equivalent version of this product on microservices. Also worked on the AngularJS-based Carousel portal for user interaction and on communication with billing for the Customer Profile, Billing High, and Unexpected Charges services.

Responsible for the ongoing migration from SSF Spring-based REST services to microservices, based on an understanding of the business flows.

Resolved performance issues by fixing service timeout and query performance problems, and performed code reviews for new enhancements.

Enhanced existing services and wrote new APIs as needed using Spring's RESTful architecture.

Fixed cloud configuration and deployment issues.

Designed and implemented WebSocket-based real-time chatbot communication between mobile clients and backend SSF services, enabling persistent bi-directional messaging for enhanced customer engagement.

Used STOMP over WebSocket protocol with Spring WebSocket on the server side, allowing seamless communication with Angular and Android front-end clients.
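
A simplified sketch of the server-side message handling for such a chatbot channel; the destinations and echo reply are placeholders for the real bot and billing integrations:

    import java.security.Principal;
    import org.springframework.messaging.handler.annotation.MessageMapping;
    import org.springframework.messaging.handler.annotation.Payload;
    import org.springframework.messaging.simp.SimpMessagingTemplate;
    import org.springframework.stereotype.Controller;

    @Controller
    public class ChatController {

        private final SimpMessagingTemplate messagingTemplate;

        public ChatController(SimpMessagingTemplate messagingTemplate) {
            this.messagingTemplate = messagingTemplate;
        }

        // Clients send text frames to /app/chat; the reply goes back on the sender's
        // private queue so each authenticated session sees only its own conversation.
        @MessageMapping("/chat")
        public void handle(@Payload String text, Principal principal) {
            String reply = "Echo: " + text;   // placeholder for the bot / billing-service lookup
            messagingTemplate.convertAndSendToUser(principal.getName(), "/queue/replies", reply);
        }
    }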

Developed secure WebSocket endpoints with session validation, enabling encrypted customer-specific communication and stateful interactions.

Implemented a message broker-based architecture to handle chatbot routing, session timeouts, and fallback to asynchronous messaging in case of connection loss.

Created a message history service using Spring Data JPA to persist chat messages for auditing and customer service reviews.

Leveraged Redis Pub/Sub for message distribution across multiple WebSocket server nodes to maintain scalability and load balancing.
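
Roughly, each node subscribes to a shared Redis channel and relays whatever arrives to its own local WebSocket sessions; the channel and destination names below are illustrative:

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.data.redis.connection.RedisConnectionFactory;
    import org.springframework.data.redis.core.StringRedisTemplate;
    import org.springframework.data.redis.listener.ChannelTopic;
    import org.springframework.data.redis.listener.RedisMessageListenerContainer;
    import org.springframework.messaging.simp.SimpMessagingTemplate;

    @Configuration
    public class ChatFanOutConfig {

        // Publishers on any node call chatRedisTemplate.convertAndSend("chat-events", json).
        @Bean
        StringRedisTemplate chatRedisTemplate(RedisConnectionFactory factory) {
            return new StringRedisTemplate(factory);
        }

        // Every node listens on the same channel and forwards messages to the WebSocket
        // sessions connected to it, so delivery does not depend on which node a client hit.
        @Bean
        RedisMessageListenerContainer chatFanOut(RedisConnectionFactory factory,
                                                 SimpMessagingTemplate ws) {
            RedisMessageListenerContainer container = new RedisMessageListenerContainer();
            container.setConnectionFactory(factory);
            container.addMessageListener(
                    (message, pattern) -> ws.convertAndSend("/topic/chat", new String(message.getBody())),
                    new ChannelTopic("chat-events"));
            return container;
        }
    }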

Integrated chatbot events with Kafka producers for downstream analytics and customer behavior tracking.

Fixed old test cases and developed new test cases by understanding business flow.

Configured and deployed SSF micro-services on development and QA environments.

Worked on billing services like Customer Profile, Billing high, Unexpected Charges.

Communicated with both onshore and offshore development teams for easy integration with SSF microservices.

Provided technical guidance and best practices to management and development teams about API gateway for micro-services deployment.

Migrated on-premises Oracle DB to Amazon RDS PostgreSQL, reducing maintenance overhead and improving scalability.

Tuned RDS PostgreSQL configurations (work_mem, autovacuum, etc.) to handle 10M+ records efficiently.

Identified and fixed service deployment issues using Eureka service registry.

Mentored team for using Git and helped environment teams for stabilizing CI/CD build pipelines.

Designed and optimized multi-threaded Java microservices for near real-time processing in a latency-sensitive environment, leveraging Kafka for event-driven architecture—similar to RFQ-based interactions in US Treasury trading.

Integrated with internal systems in a D2D-like model, supporting asynchronous communication and failover resilience, aligning with Treasury market reliability expectations.

Participated in architecting request-response APIs in a microservices environment with throughput, concurrency, and SLA focus, akin to high-volume quote handling in electronic trading.

Tools & Technologies:

Java, J2EE, Spring Boot, AWS EC2, AWS S3, AWS Lambda, Spring Cloud, Spring Data JPA, Spring AOP, RESTful web services, SOAP services, JDK 1.8, Agile, Oracle, JUnit, Angular.

Company: Systems Limited (https://www.systemsltd.com/) Feb 2018 - April 2018

Sr. Java/Software Developer – Technical Lead (Assistant Architect)

Project: Oneload

Url: https://www.oneloadpk.com/

Key Responsibilities:

Involved in the application architecture decisions. Implemented the multiple design patterns to resolve the business issues.

Involved in the review and analysis of functional specifications, requirements clarifications, defects, etc.

Involved in the analysis and design of the initiatives using Rapid Application Development.

Developed and maintained responsive and dynamic user interfaces using Vue.js with Vuex for state management, facilitating seamless interaction and data binding between front-end components and RESTful services.

Implemented reusable Vue components, lifecycle hooks, and custom directives to enhance UI reusability and maintainability.

Performed deployment of applications on the JBoss 7.2 Application Server.

Developed Use Case Diagrams, Object Diagrams and Class Diagrams in UML using Rational Rose.

Used Spring Boot and Spring Cloud for the development of REST web services.

Used Spring Cloud Netflix Eureka Server for web service registry and discovery.

Used Spring Cloud Netflix Zuul as API gateway.
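
Both pieces amount to small Spring Boot applications; a minimal sketch using Spring Cloud Netflix as it was used at the time, with an illustrative route shown in the gateway comment (the two classes are separate deployables):

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.cloud.netflix.eureka.server.EnableEurekaServer;
    import org.springframework.cloud.netflix.zuul.EnableZuulProxy;

    // Service registry: services register here and discover each other by logical name.
    @SpringBootApplication
    @EnableEurekaServer
    class RegistryApplication {
        public static void main(String[] args) {
            SpringApplication.run(RegistryApplication.class, args);
        }
    }

    // Edge gateway: routes external traffic to registered services, e.g.
    //   zuul.routes.orders.path=/api/orders/**
    //   zuul.routes.orders.serviceId=order-service
    @SpringBootApplication
    @EnableZuulProxy
    class GatewayApplication {
        public static void main(String[] args) {
            SpringApplication.run(GatewayApplication.class, args);
        }
    }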

Implemented ORM in the persistence layer using Hibernate framework in conjunction with Spring Aspect Oriented Programming (AOP) functionality.

Involved in implementing validations, exception handling.

Used Hibernate for Object Relational Mapping (ORM) and data persistence.

Involved in Spring and Hibernate Integration.

Created a Docker script for RTV; Docker-based deployment was on the project's future roadmap.

Created customized Kibana dashboards tailored to various stakeholders, enhancing the visibility of key performance indicators and operational metrics.

Mentored the team on Java 8 and how to use the Stream API effectively.

Led a project to migrate the existing search functionality to Elastic Search, resulting in a more efficient and user-friendly search experience for end-users.

Implemented Logstash configurations to aggregate and parse logs from multiple sources, standardizing log formats and improving the ability to troubleshoot issues across the organization.

Automated routine maintenance tasks and monitoring setups using scripting and orchestration tools, reducing manual intervention and operational overhead.

Established best practices for managing Elasticsearch indices, including index lifecycle management (ILM) policies and regular backups, to ensure data integrity and compliance.

Tools & Technologies:

Java, J2EE, Spring Boot, Spring Cloud, Hibernate 5.x, JBoss 7.2, Eclipse IDE, SOAP, WSDL, JDK 1.8, Agile, SQL, JDeveloper, JUnit, Hibernate, Swagger REST WS documentation, NodeJS, ELK (7.7.x, 7.0.x, 7.x.x).

Company: EWS Pvt Ltd. (https://ewsystemsinc.com/) Oct 2013 – Nov 2017

Senior Java/Software Developer (Technical Team Lead)

Project: Tone Networks USA

Url: https://tonenetworks.com/

TONE Networks is a unique service for busy women looking to better their personal and professional lives. With hundreds of short-form videos from credentialed experts, it serves as a guide to important questions as users navigate life's path. The application provides a platform for users to interact with leading industry mentors to help address psychological challenges. A list of videos populated based on user interests is played, and a chat window can be used to contact the mentor.

Key Responsibilities:

Detail design, Development, and implementation of the system.

Managed a team of four people for the project, created the estimations for efforts required to produce end-to-end solution and led the technical solution development. Responsible for the code reviews and code optimization to deliver the code as per the project quality standards.

Implemented application on AWS EC2, AWS S3, AWS Cognito instances.

Integrated PayPal, Stripe as payment gateways.

Integrated MailChimp as third-party email template provider.

Integrated JWPlayer as third-party video player and video management tool.

Developed the SRS using the Dropwizard framework and a microservices architecture approach, building RESTful APIs with the Jersey (JAX-RS) specification.
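
A minimal Dropwizard + Jersey sketch in that style; the application name, resource path, and response shape are illustrative, and the real services backed onto JDBI:

    import io.dropwizard.Application;
    import io.dropwizard.Configuration;
    import io.dropwizard.setup.Environment;
    import java.util.Collections;
    import java.util.Map;
    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    public class ToneApplication extends Application<Configuration> {

        public static void main(String[] args) throws Exception {
            new ToneApplication().run(args);   // typically started with: server config.yml
        }

        @Override
        public void run(Configuration config, Environment environment) {
            // Jersey (JAX-RS) resources are registered on the embedded Jetty server.
            environment.jersey().register(new VideoResource());
        }

        @Path("/videos")
        @Produces(MediaType.APPLICATION_JSON)
        public static class VideoResource {

            @GET
            @Path("/{id}")
            public Map<String, String> byId(@PathParam("id") String id) {
                // Placeholder; the real implementation reads from JDBI-backed storage.
                return Collections.singletonMap("id", id);
            }
        }
    }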

Developed and maintained responsive and dynamic user interfaces using Vue.js with Vuex for state management, facilitating seamless interaction and data binding between front-end components and RESTful services.

Implemented reusable Vue components, lifecycle hooks, and custom directives to enhance UI reusability and maintainability.

Implemented several design patterns such as Singleton, Factory, Builder, and Adapter to resolve the most common design issues.

Worked as developer of use cases from scratch.

Used Gradle as the build and deployment automation tool.

Tools & Technologies:

Dropwizard, JDBI, RESTful microservices, Jersey, Gradle, AWS Kinesis, AWS Cognito, AWS S3, AWS EC2, Google Guice, Stripe Payment API, PayPal Payment API, JWPlayer API, Vue.js.

Project: NYPD USA

Development of the File Picker application. NYPD uses an application to upload traffic-violation tickets to a server. This Spring Boot-based application picks up the evidence files, resizes, renames, and archives them, and then places them on the designated server for NYPD.

Key Responsibilities:

Offshore Sr. Java Developer in Lahore Office.

Developed the stand-alone application using Spring Boot.

Developed complete Applications from scratch.

Deployed a fat jar on the server.

Customized exception handling to recover from server-down scenarios.


