
Fixed Income Payment Processing

Location:
New Jersey
Posted:
July 10, 2025


Resume:

Name: Srinivas S

Email: srinivasjava***@gmail.com

Cell No: +1-657-***-****

Professional Summary:

Sun Certified Programmer with 15+ years of cross-functional experience in Java/J2EE enterprise application development, delivering high-quality applications to Morgan Stanley, Barclays, Citi, UBS, and Legg Mason. Designed and developed multi-threaded, low-latency, high-volume Java applications; Enterprise Application Integration; Enterprise Service Bus and messaging infrastructures; RIA applications using Flex, GWT, GXT, EXT JS, and Angular JS; Java/J2EE monitoring and performance tuning; and Open Source Technologies. Knowledge of Fixed Income, Wealth Management, Credit Risk, and Payment Processing in the finance domain.

An Actimize professional with experience in Actimize Remote Banking, the AML (SAM) Solution, and InAuth, and in the design, development, and implementation of Actimize projects and products for various financial services clients.

Worked on champion/challenger configuration to perform tuning while the system is taking live traffic.

Experience in Actimize Risk Case Management, including configuring alert views, DART queries, alert summaries, workflows, dashboards, Policy Manager, and platform lists.

10 years of expertise in core Java, including collections, multi-threading, and garbage collection.

9 years of financial development experience in electronic trading systems covering market data intake, order and trade management for commodities, and settlement between different counterparties. Products include fixed income, equity, foreign exchange, and payment processing.

8 years of expertise in Hibernate and Spring for building financial applications.

6 years of expertise in developing applications using EJB, EJB3, JPA.

10 years of expertise in SQL, JDBC, data modeling, and RDBMS (Oracle, Sybase, SQL Server, DB2).

5 years of expertise in JMS for processing asynchronous messages using Message Driven Beans.

5 years of expertise in integrating different applications using MQSeries, TIBCO, the WebLogic JMS provider, and the JBoss MQ provider, and sending messages to different systems.

4 years of experience in developing applications using web services and SOAP with Axis and JBossWS.

Strong in designing and architecting applications using various Java/J2EE design patterns and RUP methodologies.

2 years of expertise in developing applications using JBoss Cache for coherence and replication of data between different nodes in an application.

Expertise in using Java design patterns: creational, structural, and behavioral patterns.

4 years of expertise in developing applications using XML, XSLT, DOM, SAX, StAX, TrAX, XPath, and XQuery.

4 years of expertise in writing Stored Procedures Using PL/SQL and TSQL.

6 years of expertise in developing applications using Struts, WebWork, JSF, Oracle ADF, GWT.

1 year of expertise in developing applications using Adobe Flex 3.2.

3 years of expertise in developing applications using GWT, EXT-GWT (GXT)

2 years of expertise in developing applications using EAI technologies: Spring Integration and TIBCO BW.

1 year of expertise in developing and designing portlets using WebLogic Portal's Page Flows.

10 years of expertise in developing and deploying applications using JBoss, WebLogic, WebSphere, and Apache Tomcat.

Proven skills in relationship management with clients and in effectively guiding teams during the project development life cycle, delivering the product within time, cost, and quality parameters.

Attended internal reviews and provided status of current work in progress as necessary.

Assisted other team members with assigned tasks or project issues wherever possible.

Led the design and development of Kotlin-based microservices for critical banking operations such as account management, payment processing, and fraud detection.

Good understanding of standards and basic concepts related to SSL.

Education Qualifications:

B.E Computer Science (Karnataka University, Dharwad)

MBA (Madras University, Chennai)

Certifications:

Sun Certified Programmer for the Java 2 Platform (SCJP)

JBoss Certified Advanced J2EE Developer

Oracle Certified Programmer for SQL & PL/SQL (OCP-I)

Oracle Certified Programmer for Architecture and Administration (OCP-II)

Skillset:

Windows 98, Windows NT/2000, UNIX, MCP, Linux, C, Java 1.4, PL/SQL, T-SQL, XML, XSLT, HTML, DHTML, UML, MQSeries, TIBCO Rendezvous, J2EE 1.4 (Servlets, JSP, EJB, JMS, JTA), RMI, EJB 2.0/3.0, JMX, JDBC, JFC, OOAD, RUP, 3-DVE, Tomcat 4.0/5.0, Struts 1.2/2.0, WebWork, Spring 1.2, Hibernate 2.0/3.0, Axis 1.3, Ant, WebLogic 6.1/8.1, WebSphere, JBoss 3.0/4.0, Eclipse 3.1/3.2, TogetherJ 5.5/6.0, RAD 6.1, Struts Studio 4.5, Rational Rose, Rational RequisitePro, Rational Architect, JMeter, Visual Basic 6.0, Swing, Oracle 9i & 8i, MS SQL Server, JCS, JBoss Cache, Ehcache, Coherence, DB2, DMS-II, VSS 6.0, CVS, Subversion, Rational ClearCase.

Projects Summary:

Morgan Stanley, 1 New York Plaza, NY

Project #1: CMAT (Capital Market Automatic Trading) / GOMS 12/2021 – Present

Lead Full Stack Java Developer

CMAT is a trading application for various fixed income products including corporate bonds, municipal bonds, agency bonds, Treasuries, zero-coupon bonds, convertibles, non-dollar-denominated bonds, CDs, and structured products. The system allows a trader to handle all security transactions including pricing, order handling against internally or externally owned bonds, offering bonds, bidding on bonds, responding to RFQs, and integrating with ECN markets (TradeWeb, TMC, BondPoint) using CMONE; the CMONE FIX Gateway (QuickFIX) provides the pub/sub and ECN connectivity part of the system.

GOMS (Order Management System) is a global, distributed fixed income trading and workflow management platform that provides consistent, reliable, high-performance execution, internal markets, auto-quoting, auto-execution, and automated real-time pricing. It is primarily responsible for handling all trading actions on orders, beginning at new order creation. GOMS is a centralized application for different financial products such as equities, options, fixed income, and mutual funds, and keeps track of an order's various transition states from order creation onwards. The primary users are financial advisors who create orders and monitor their order statuses to make meaningful business decisions. GOMS's job is to collect the events (order transitions) and apply rules according to business logic.

The Actimize Anti-Money Laundering (AML) software solution screens accounts company-wide for suspicious activity. The Actimize solution allows Morgan Stanley to track activity for a customer from a centralized view. It provides detection and reporting on all activity and mitigates the resulting exposure to Morgan Stanley by meeting regulatory AML program requirements. It also allows Morgan Stanley to improve filing processes and Currency Transaction Reporting (CTR).

Responsibilities:

Used Java 17 features such as sealed classes, utility methods, nest-based access control, and local-variable syntax for lambda parameters.

Utilized Java features such as lambda expressions for evaluating and comparing collection data, the Stream API for bulk operations on collections to improve application performance, and parallel operations on collections for efficient sorting and to parallelize otherwise sequential stream processing.
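
A minimal sketch of the kind of Stream API usage described above; the Trade record and the grouping logic are illustrative, not taken from the actual codebase:

    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;

    public class TradeStats {
        // Hypothetical value type for illustration
        record Trade(String symbol, double notional) { }

        // Group trades by symbol and sum notionals; parallelStream() is worthwhile
        // only when the collection is large enough to offset the overhead.
        static Map<String, Double> notionalBySymbol(List<Trade> trades) {
            return trades.parallelStream()
                    .collect(Collectors.groupingBy(Trade::symbol,
                            Collectors.summingDouble(Trade::notional)));
        }
    }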

Implemented Apache Camel to process FIX messages with a Kafka cluster setup.

Implemented REST web services, consuming and exposing services with Camel. Used Camel components (File, HTTP, JMS, Kafka).

Designed and developed GraphQL APIs with Node.js for real-time financial data access, enabling front-end teams to query only required fields, optimizing performance and reducing payload.

Built and maintained Node.js microservices to support backend trading and compliance modules, aligned with enterprise-grade scalability and reliability requirements.

Used Java Spring Boot for core business logic and Node.js to develop complementary microservices for audit logging, user session handling, and caching mechanisms.

Implemented Kafka producer and consumer applications on a Kafka cluster set up with the help of ZooKeeper.

Implemented reprocessing of failed messages in Kafka using offset IDs.
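
An illustrative sketch of offset-based replay with the plain kafka-clients consumer; the broker address, topic name, partition, and starting offset are hypothetical:

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.TopicPartition;

    public class FailedMessageReplayer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");   // assumed local broker
            props.put("group.id", "fix-replay");
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                TopicPartition tp = new TopicPartition("fix-messages", 0);  // hypothetical topic
                consumer.assign(List.of(tp));
                consumer.seek(tp, 42_000L);                                 // offset of first failed message
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> rec : records) {
                    // re-run the original processing logic for each replayed record
                    System.out.printf("replaying offset %d: %s%n", rec.offset(), rec.value());
                }
            }
        }
    }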

Created Java-based Kafka producer API to capture live-stream data into various Kafka topics.

Developed and implemented a Kafka-based message processing system, including reprocessing failure messages.

Developed and maintained Java-based pricing and risk analytics systems for Fixed Income products such as bonds, treasuries, and swaps.

Collaborated with architects and stakeholders to modernize core banking applications by introducing Kotlin for improved maintainability and performance

Integrated Node.js microservices with backend Java systems to enable real-time trading data processing and improve response times.

Developed producer and consumer applications utilizing the Spring Kafka API and created data pipelines from API streaming gateway REST services.

Used Apache Kafka to handle order messages and ensure reliable communication between different components of the trading system.

Worked with Kafka messaging to decouple applications by separating the sending and receiving of data.

Integrated Kotlin with existing Java-based systems, ensuring smooth interoperability and efficient migration of legacy modules

Developed custom GraphQL directives for authorization and caching logic to enhance API behavior while keeping schema definitions clean and declarative.

Implemented functionality to send and receive JMS messages on IBM MQ asynchronously.
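
A minimal sketch of the asynchronous receive side, assuming Spring's JMS support configured against an IBM MQ connection factory; the queue names and payload handling are illustrative:

    import org.springframework.jms.annotation.JmsListener;
    import org.springframework.jms.core.JmsTemplate;
    import org.springframework.stereotype.Component;

    @Component
    public class PaymentMessageBridge {

        private final JmsTemplate jmsTemplate;

        public PaymentMessageBridge(JmsTemplate jmsTemplate) {
            this.jmsTemplate = jmsTemplate;
        }

        // Asynchronous receive: the listener container invokes this for each message
        @JmsListener(destination = "DEV.QUEUE.PAYMENTS.IN")   // hypothetical queue name
        public void onPayment(String payload) {
            String enriched = payload.trim();                  // placeholder for real processing
            jmsTemplate.convertAndSend("DEV.QUEUE.PAYMENTS.OUT", enriched);
        }
    }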

Designed and developed RESTful APIs using Node.js for secure and scalable communication between internal systems and external clients.

Led the design and development of big data pipelines using the Hadoop ecosystem (HDFS, MapReduce, Hive) to process high-volume trading and market data for regulatory reporting.

Integrated Java-based microservices with Hadoop to enable real-time ingestion and batch processing of trade surveillance data.

Hands-on experience deploying Java/Spring Boot microservices on AWS (EC2, S3, RDS) and Azure (App Services, Key Vault) to build scalable banking solutions.

Designed and implemented CI/CD pipelines using Jenkins, GitLab CI, and Docker for automated build, test, and deployment of financial applications.

Containerized enterprise Java applications using Docker and orchestrated deployments on Kubernetes (EKS/AKS) for high availability and resilience.

Developed and maintained microservices architecture using Spring Boot, Kafka, and REST APIs for modules like AML, Payments, and KYC.

Integrated API Gateways (Kong, Apigee) with OAuth2 and JWT to secure internal/external APIs in multi-tenant banking platforms.

Experienced in JMS messaging to exchange information in a more reliable and asynchronous way in enterprise applications. Used IBM MQ and Apache Camel.

Worked on fixed income trade lifecycle events including trade entry, allocation, confirmation, settlement, and reporting.

Engineered modular GraphQL schemas and resolvers for trade surveillance data, improving API efficiency and enabling faster product iteration cycles.

Utilized the QuickFIX library to generate FIX protocol messages, ensuring precise and consistent communication among trading partners.
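
An illustrative QuickFIX/J sketch of building and sending a FIX 4.4 New Order Single; QuickFIX/J 2.x (java.time-based timestamp fields) is assumed, and the session and field values are hypothetical:

    import java.time.LocalDateTime;
    import quickfix.SessionID;
    import quickfix.SessionNotFound;
    import quickfix.field.*;
    import quickfix.fix44.NewOrderSingle;

    public class OrderSender {
        // Builds and sends a buy order for a hypothetical security identifier.
        public static void send(SessionID sessionId) throws SessionNotFound {
            NewOrderSingle order = new NewOrderSingle(
                    new ClOrdID("ORD-0001"),
                    new Side(Side.BUY),
                    new TransactTime(LocalDateTime.now()),
                    new OrdType(OrdType.LIMIT));
            order.set(new Symbol("912828XG0"));     // hypothetical security id
            order.set(new OrderQty(100));
            order.set(new Price(99.875));
            quickfix.Session.sendToTarget(order, sessionId);
        }
    }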

Implemented a microservice architecture, with Spring Boot-based services interacting through a combination of REST and SOAP.

Worked on Java functional programming concepts like Streams, Functional Interfaces, new date time API and lambda expressions.

Implemented a JMS listener on MQ, routing functionality, and invocation of the corresponding web services using Apache Camel.
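
A minimal Apache Camel sketch of this pattern (JMS listener, content-based routing, invocation of a downstream service); the queue and endpoint URIs are illustrative:

    import org.apache.camel.builder.RouteBuilder;

    public class TradeRoutes extends RouteBuilder {
        @Override
        public void configure() {
            // Consume from an MQ-backed JMS queue, route by message content,
            // and invoke the corresponding downstream REST endpoint.
            from("jms:queue:TRADE.IN")                              // hypothetical queue
                .choice()
                    .when(xpath("/trade[@type='allocation']"))
                        .to("http://goms-internal/api/allocations") // hypothetical endpoint
                    .otherwise()
                        .to("http://goms-internal/api/trades");
        }
    }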

Implemented trade lifecycle management modules to support trade capture, enrichment, validation, and booking for Fixed Income securities

Implemented custom Decision Rules, Forms, and Alerts for AML and fraud detection use cases.

Created Actimize plugins (Java-based) to handle complex data enrichment and validation logic.

Worked on data ingestion pipelines: transforming and loading data from upstream systems to Actimize.

Installed Actimize code on AIS and RCM servers.

Developed functions, channels, and executable plans based on business requirements.

Built new servers in PROD, QA, and R&D environments with Actimize code installations.

Upgraded AIS, IFM, and ActOne versions as needed. Customized vendor-provided code, including:

Alert display changes, DART, workflows, alert views, user/role creation, and GUI-related modifications.

Created users and assigned roles through the RCM GUI.

Created and activated rules in relevant policy types on AIS servers.

Developed custom ActOne plugins: RFI mail to internal exchange, Platform list validation plugin, Post-step change plugin

Tuned performance for Java microservices interacting with Actimize engine and MongoDB/Oracle DB.

Designed and deployed Spring Boot-based wrappers for Actimize rule testing and simulation environments.

Designed and implemented shared TypeScript interfaces and types between frontend (React) and backend (Node.js/GraphQL) to ensure consistency and reduce integration issues.

Collaborated with data engineering and DevOps teams to manage and monitor Hadoop clusters using Cloudera Manager, ensuring high availability and fault tolerance.

Core Java with the concurrency API was used extensively for parallel processing and chunk processing.
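
A minimal sketch of the chunked parallel-processing pattern referred to above, using the java.util.concurrent API; the chunk size and task body are illustrative:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.*;

    public class ChunkProcessor {
        // Splits the input into fixed-size chunks and processes them in parallel.
        public static <T> void processInChunks(List<T> items, int chunkSize,
                                               java.util.function.Consumer<List<T>> work)
                throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(
                    Runtime.getRuntime().availableProcessors());
            List<Callable<Void>> tasks = new ArrayList<>();
            for (int i = 0; i < items.size(); i += chunkSize) {
                List<T> chunk = items.subList(i, Math.min(i + chunkSize, items.size()));
                tasks.add(() -> { work.accept(chunk); return null; });
            }
            pool.invokeAll(tasks);   // blocks until all chunks complete
            pool.shutdown();
        }
    }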

Developed a component to receive inbound data from different source systems in different formats (fixed length, XML) and normalize it to the GOMS format using the Apache Camel Smooks data format.

Developed a component to send formatted (GOMS JSON) trade data to the GOMS REST endpoint to capture enriched data; the response from the REST endpoint is published to MQ using Apache Camel.

The CMONE Gateway (QuickFIX) is used to communicate with different ECNs for trade requests/executions and offerings using FIX messages.

Involved in developing several FIX messages: New Order Single, Execution, and Confirmation.

Implemented multiple Angular components compatible with the latest versions of TypeScript and the Angular CLI.

Integrated with Bloomberg and Reuters APIs to fetch real-time market data (e.g., yield curves, credit spreads) for accurate Fixed Income pricing models.

Configured various routes, directives, and components for grid, pagination, conditional validations, templating, dynamic loading, lazy loading.

Developed secure RESTful APIs in Kotlin adhering to financial regulatory standards (e.g., PCI DSS, SOX, GDPR).

Experience in working with advanced JavaScript such as ECMAScript 6.

Used the Angular Router to turn the application into a single-page application.

Enhanced application performance through Angular 14 component-based development in view of Angular framework transitions.

Implemented data validation and reconciliation frameworks using Java and Hadoop for accurate trade lifecycle analytics across asset classes.

Developed Angular views to bind models with the DOM and synchronize data with the server as a single-page application (SPA).

Developed unit tests using Karma and Jasmine and followed strict patterns for unit test cases with Jasmine.

Consumed REST services using Angular HTTP and performed various REST HTTP operations for data retrieval and updates.

Transformed a monolith into microservices using Java 17 and Spring Boot via the 12-factor app methodology.

Used Spring Cloud with distributed tracing (Spring Cloud Sleuth, Zipkin), centralized logging (Splunk), and Netflix OSS framework components such as Eureka (service discovery) and Hystrix (fault tolerance) for microservices.

Used Spring Boot, which is radically faster for building cloud microservices and developing Spring-based applications with very little configuration.

Developed microservices and APIs using Spring Cloud, Spring Security, Spring Boot, and Spring Integration.

Migrated legacy ETL jobs to Hadoop-based workflows using Sqoop and Hive, enabling seamless data transfer from RDBMS to HDFS.

Developed a service using Spring Cloud Function which processes data sent to an Amazon S3 bucket and Amazon Kinesis streams using an inbound Lambda that parses, validates, calculates, and stores the data in PostgreSQL RDS. Outbound Lambda processing generates output files sent to the legacy system.
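
An illustrative Spring Cloud Function sketch of the inbound parse/validate step; the payload format and Payment record are hypothetical simplifications of the actual S3/Kinesis payloads, and persistence to PostgreSQL RDS is noted only in a comment:

    import java.util.function.Function;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class InboundFunctions {

        // Hypothetical simplified payload: "id,currency,amount"
        record Payment(String id, String currency, double amount) {
            static Payment parse(String line) {
                String[] f = line.split(",");
                return new Payment(f[0], f[1], Double.parseDouble(f[2]));
            }
            boolean isValid() { return amount > 0 && !currency.isBlank(); }
        }

        // Registered as a Spring Cloud Function; in the real flow this is wired to the
        // AWS Lambda adapter and the accepted record is persisted to PostgreSQL RDS.
        @Bean
        public Function<String, String> processPaymentRecord() {
            return line -> {
                Payment p = Payment.parse(line);
                return p.isValid() ? "ACCEPTED:" + p.id() : "REJECTED:" + p.id();
            };
        }
    }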

Developed a Java API to interact with Amazon SNS to send emails to users.

Involved in designing and deploying a multitude of applications utilizing almost the entire AWS stack (including EC2, S3, AMI, Route 53, RDS, SNS, SQS, IAM), focusing on high availability, fault tolerance, and auto-scaling.

Used Spring JDBC to retrieve, update, and insert data in the Oracle database with the use of stored procedures.

Gulp is used for builds and Bower is used for managing dependencies.

JSON requests/responses from the UI are processed using Spring Boot RESTful features with a microservices architecture, and Spring Data JPA is used to save/retrieve the data in the backend.

Oracle Coherence is used to store reference data.

Designed and developed microservices for various activation and management activities in the One Network flow.

Implemented asynchronous processing using Kotlin coroutines to handle high-throughput transactions with minimal latency.

Proficient in utilizing build automation tools such as Gradle for streamlined build processes and deployments across different environments.

Developed custom Docker container images and executed tasks related to Docker tagging and pushing these images to designated servers.

Built and maintained Docker container clusters managed by Kubernetes, using Linux, Bash, Git, and Docker.

Worked on creating Docker container images, tagging and pushing images, and integrating Spring Boot; created Docker containers and Docker consoles for managing the application lifecycle.

Used Kubernetes for automated deployments, scaling, and management of applications across clusters.

Designed and reviewed the test scenarios and scripts for given functional requirements and automated test cases in Junit and Mockito.

Extensive experience with Splunk log management, including the creation of dashboards, monitoring, reporting, and email notifications to users, and integrating Splunk with a wide variety of legacy data sources.

Migrated legacy JavaScript codebase to TypeScript as part of modernization efforts, enhancing IDE support, refactoring safety, and long-term maintainability.

Utilized the software development kit (SDK) for Splunk to create unique Splunk apps that extended the functionality of Splunk and allowed for integration with other tools and systems.

Used Kubernetes for managing containerized workloads and services.

Utilized Docker and Kubernetes for containerization and orchestration of microservices, enhancing deployment efficiency and resource management.

Implemented CI/CD pipelines using Jenkins and Git, automating build, test, and deployment processes to streamline development workflows

Used Grafana and Spring Boot Actuator to create centralized dashboards to monitor the performance, status, and health of all applications and pods in all environments.

Parsed, formatted, and generated SWIFT messages using OpenStack.

Proficient in designing scalable APIs using GraphQL with robust resolver logic, schema stitching, and performance optimizations.

Hands-on expertise in Node.js for building high-performance, event-driven backend services integrated with microservices architecture.

Developed cloud-native applications with Docker, Kubernetes, and CI/CD pipelines, ensuring fast and reliable deployments.

Solid experience working with both SQL (PostgreSQL) and NoSQL (MongoDB) databases, including data modeling and optimization.

Tools/Environment:

Java 17, concurrent package, lambda expressions, streams, J2EE, web services, JMS, XML (DOM, SAX, XPath, StAX, XSLT), IBM MQ, Kafka 2.0, Spring Boot, Apache Camel 2.2, Spring Batch, Spring Data JPA, EHCache, JProfiler, Angular 6, Bower, Gulp, Eclipse, WebStorm, GIT/Stash (Bitbucket), Jenkins, Confluence, Jira, SourceTree, Swagger, Maven, Windows, Linux, DB2, SQL Server, Angular 14, Coherence.

Citigroup, Jersey City

Project #2: UNO Trade Surveillance / Trade Hub 9/2018 – 12/2021

Sr. Lead Java Developer - Actimize Integration

UNO is a trade surveillance application which centralizes trading controls and monitors trading activity to avoid trading misconduct and market abuse across the trade lifecycle, covering all asset classes (Equity, FX). It enforces financial regulations such as MiFID, the EU Market Abuse Directive, REMIT, the ESMA Guidelines, and the Dodd-Frank Act. It is a re-write of Actimize's Enterprise Risk Case Management product.

Trade Hub is a front-office application that allows buy-side clients to trade a broad portfolio of financial instruments around the clock, on any market, and with a wide range of counterparties. The application integrates multiple dealers and multiple asset classes onto a single-screen format for electronic order routing and facilitates liquidity aggregation across exchanges and other liquidity pools.

Responsibilities:

Led the development and modernization of UNO Trade Surveillance and Trade Hub applications using Java 11, React, and Spring Boot, ensuring alignment with global financial regulations like MiFID II, ESMA, and Dodd-Frank.

Collaborated with Nice Actimize teams for integration and resolution of production issues in modules like SAM, WLF, KYC, and RCM; deployed and configured AIS/RCM servers.

Rewrote and enhanced legacy Actimize Enterprise Risk Case Management product, integrating NTLM for enterprise SSO access and building Actimize rules and data models in collaboration with architects.

Participated in Agile/Scrum processes, involving sprint planning, daily stand-ups, and retrospectives to deliver high-quality features and bug fixes.

Developed dynamic and modular React.js UI components integrated with backend microservices for real-time trade monitoring across asset classes (Equity, FX).

Utilized Angular 13 for front-end development in Trade Hub; implemented routing, components, directives, pipes, and services for responsive UI.

Worked on upgrading UI from Angular 9 to Angular 13, improving performance and compatibility with latest browser standards.

Contributed to CI/CD pipelines for Node.js components using Jenkins and GitLab, automating builds, tests, and deployments across environments.

Integrated RESTful APIs using Spring Boot, tested services with Postman, and ensured data accuracy and API security using 42Crunch and OAuth2.0.

Integrated Java applications with fixed income order management systems (OMS) and electronic trading platforms (e.g., Tradeweb, MarketAxess).

Implemented Kafka producers and consumers for real-time streaming and processing of FIX messages in a distributed architecture using Apache Camel and Zookeeper.

Designed and maintained Microservices architecture with centralized config (Spring Cloud Config), service discovery (Eureka), and load balancing.

Developed secure RESTful APIs in Kotlin adhering to financial regulatory standards (e.g., PCI DSS, SOX, GDPR).

Used AWS S3 for application log retention and lifecycle management; worked on containerization using Docker and orchestration with Kubernetes (kubectl).

Implemented multithreading, ExecutorService, Locks, and semaphores to improve processing efficiency and application throughput.

Conducted unit testing using JUnit and Mockito, integrated CI/CD pipelines using Jenkins, and ensured code quality with SonarQube.

Integrated API Gateways (Kong, Apigee) with OAuth2 and JWT to secure internal/external APIs in multi-tenant banking platforms.

Created and published Swagger/OpenAPI documentation for seamless API onboarding and testing.

Applied DevSecOps practices by embedding security tools (SonarQube, OWASP ZAP, Snyk) in CI/CD workflows for early vulnerability detection.

Enforced TLS encryption, input validation, and secure token handling across all financial microservices.

Managed secrets and credentials securely using HashiCorp Vault and Azure Key Vault in production environments.

Used Kotlin with Spring Boot to deliver scalable solutions for loan processing, customer onboarding, and real-time account notifications.

Utilized Git/Bitbucket for version control and branching strategies; collaborated across Dev, QA, and Data Science teams to address schema changes, JSON formats, and analytics integration.

Processed large JSON-based data payloads and persisted using Spring Data JPA and Oracle SQL; mapped incoming data sources to Actimize models.

Ensured real-time display of liquidity aggregation across markets by integrating multi-asset, multi-dealer trading into a single screen for Trade Hub.

Delivered visually rich charts and dashboards using Plotly and ECharts for executive trade oversight and alert generation.

Developed UI modules following a model-driven approach using reusable Angular components for CRUD operations and custom table handling.

Configured and deployed applications in Unix environments using Autosys for job scheduling and JIRA for project tracking and ticket management.

Demonstrated strong problem-solving and critical-thinking skills to meet tight deadlines in high-pressure regulatory reporting environments.

Tools/Environment: Java 11, concurrent package, lambda expressions, streams, CompletableFuture, Spring Boot, Spring Batch, Spring Data JPA, Hibernate, Angular 13, React.js, TypeScript, HTML5, CSS3, Plotly, ECharts, OAuth2.0, Jprofiler, JAX-B, JAX-RS, SQL, Oracle, BitBucket/Stash, ServiceNow, Autosys, Eclipse, IntelliJ, Unix/Linux, JUnit, Mockito, Log4J, Docker, Kubernetes, AWS (S3, EC2), Swagger, Microservices, Apache Kafka, Apache Camel, Zookeeper, SonarQube, Jenkins, JIRA, 42Crunch API Security, NICE Actimize (SAM, RCM, WLF, KYC), GemFire Cache

Citigroup, 111 Wall Street

Project #3: UNO / Payment Processing Engine (Statemachine) 9/2015 – 9/2018

Sr. Java Developer

World Link Next Gen is a re-write of an existing mainframe application (World Link) using J2EE. The application's main purpose is to send payments across 160 countries with multi-currency capabilities. Payments can be sent via different types of disbursement modes (ACH, Wire, Mass Pay, Remote Check, Cash). The application has the capability to book FX deals in multiple currencies and across multiple counterparties. The system also calculates the corresponding settlement date with the counterparty after considering weekends and holidays.

This application is divided into three parts: UI (User Interface), Payment Processing Engine (Statemachine), and Report Engine.

Payment Processing Engine (Statemachine): the Statemachine processes payments from different types of files (fixed length, XML, SWIFT) and MQ messages (SWIFT). Payments are routed to different workflow components based on routing logic. Workflows are configured in the Spring container and initiated by a thread pool. All workflows are monitored by MBeans. The Processing Engine has the following modules: Transaction Initiation, Sanctions, FX Engine, Funding, and Disbursement.

Responsibilities:

Designed and developed a multi-tier, custom-built, workflow-based web application.

Implemented an adaptor which receives trade alerts from the CEP engine and sends them to a TIBCO EMS queue for further processing by the Message Bus. The Message Bus was built using Spring Integration to dequeue XML trade messages from the TIBCO EMS queue and send them to different channels for parsing, enrichment, and persisting into the database.

The XSLT and TrAX APIs were used to convert different formats of XML to our standard XML format, and the StAX API was used for parsing XML messages.
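
A minimal TrAX sketch of the XML normalization step described above; the stylesheet and file names are illustrative:

    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class SwiftToStandardXml {
        // Applies an XSLT stylesheet to convert an inbound payment message
        // into the application's standard XML format.
        public static void transform(String inFile, String outFile) throws Exception {
            TransformerFactory factory = TransformerFactory.newInstance();
            Transformer transformer = factory.newTransformer(
                    new StreamSource("swift-to-standard.xslt"));   // hypothetical stylesheet
            transformer.transform(new StreamSource(inFile), new StreamResult(outFile));
        }
    }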

Core Java with the concurrency API was used extensively for parallel processing and chunk processing of trade execution data and market data, converting it to JSON and storing it in MongoDB.

Used Java 8 Lambda expressions and Stream API to support functional-style operations on streams of elements.

Worked on Java 8 functional programming concepts like streams, functional interfaces, the new date/time API, and lambda expressions.

JProfiler is used to profile the application (the memory, CPU, and thread views are used).

Implemented grids, tree UI components, forms, and pages using AngularJS, JavaScript, HTML, and CSS. Used AngularJS services, controllers, and directives.

The Angular Material framework was used for UI components; Jemview-router was used to separate views into tiers and change application views based on state.

Typeahead.js was used for building robust typeaheads; underscore-query, a query API for Underscore and Lodash similar to MongoDB's, was used to search objects.

Moment is used for parsing, formatting, and validating dates; the Lodash utility library is used for arrays, objects, and strings.

Jasmine is used for testing controllers, services, and directives.

Gulp is used for builds and Bower for managing dependencies. Consumed REST services using Angular HTTP and performed various REST HTTP operations for data retrieval and updates.

Used Spring Boot, which is radically faster for building cloud microservices and developing Spring-based applications with very little configuration.

Developed microservices and APIs using Spring Cloud, Spring Security, Spring Boot, and Spring Integration.

Designed and implemented different microservices and their Dockerfiles.

Created and maintained Docker images and Docker containers.

Configured Docker containers and created Dockerfiles for different environments.

Docker Swarm is used for container orchestration.

Amazon IAM was used to maintain user credentials; involved in creating custom IAM policies for various groups defined within the organization.

Developed a Java API to interact with Amazon SQS and Amazon SNS to send emails to users.

Created a Docker machine in an Amazon EC2 instance and created Docker containers in


