
Solution Architect Software Development

Location:
Chester Springs, PA
Posted:
March 20, 2024


Sarabhaiah Polakam ad4gkv@r.postjobfree.com 484-***-****

Summary

●A seasoned senior solution architect and leader with over 18 years of experience in the financial technology sector. During my career at Bank of America, Fidelity, Citi Bank and Merrill Lynch, I have led successful multi-function digital transformation technology projects, including leading teams through the software development life cycle, managing liaisons with all partnering teams, and formulating architecture frameworks, reference policies and system/business processes.

●A change champion and motivator with strong technical, analytical and interpersonal skills. Diversified experience in design, planning, system implementation and execution. Strong written and oral communication skills.

●Strong experience with enterprise-wide, large-scale, highly complex integration projects.

●Focused on providing cost-effective enterprise architecture solutions that add value to the business and its customers, presenting architecture diagrams to leadership teams.

●Excellent experience in the varied roles of Solution Architect, Technical Manager and Technical Lead, designing, planning and implementing solutions for complex projects built around SOA.

●Master of large-scale deployments and migrations, with extensive experience working in matrix organizations and delivering solutions.

●Excellent experience creating project and deployment plans covering system builds, security, application, storage and networking, extensive testing at all phases, performance tuning, acceptance, and successful project closure, while accounting for the associated risks and dependencies.

●Partnered with analysts and developers to establish high-level time and effort estimates based on client requirements during project kickoffs, refining the estimate for each iteration.

TECHNICAL SKILLS

Languages: Java (up to Java 17), JavaScript

D.B.M.S: MongoDB, Cassandra, Oracle, Sybase, MySQL and PostgreSQL

Operating Systems: Linux and Windows

App Servers: WebSphere, WebLogic, JBoss, Apache Web Server, Tomcat

Frameworks: Spring, Spring-Data, Spring-Boot, Hadoop, Spark, Kafka, Minio-S3, Hibernate

IDE Tools: Eclipse, IntelliJ, Visual Studio Code

CI/CD Tools: Bitbucket, Confluence, JIRA, Bamboo, Jenkins, LightSpeed, Tekton, Harness

Cloud Technologies: Kubernetes, HAProxy Ingress, Istio, Docker, YAML, HELM, OpenShift, LightSpeed, Harness, Tekton, Tanzu, Google Cloud, Native Cloud

Professional Experience

DPO, Citi Bank, Jersey City, New Jersey, USA Aug 2022 – Present

Position Title: Sr. Application/Solution Architect Lead

Project: DPO (Digital Product Onboarding)

Description: Product onboarding for customers at TTS, Citi Bank currently relies on manual, paper-based tasks that not only increase effort but are also expensive and time-consuming. Multiple forms requesting repetitive information, paper questionnaires, original identity documents, and paper copies of bills that are mostly delivered electronically these days create many points of friction and lead to significant drop-out mid-journey. The onboarding process often takes up to 72 weeks to complete.

DPO is a multiphase initiative to deliver a good customer onboarding experience, reduce the onboarding journey time as much as possible, and make onboarding easy through self-service.

Environment: Intellij, Java17, Confluent Kafka, APIm, Helix Blue Prints, OpenShift, Spring Boot Microservices, MongoDB Atlas, COIN, Vault, CyberArk, Angular, Micro Front End (Module Federation - MFE), TypeScript, Free-Marker, Solace, State-Machine

Role: Responsibilities included:

●Interact with business stakeholders to understand the current process and pain points

●Design Domains and Identify Partner Systems

●Design Onboarding Journey Steps and Data Points

●Design and implement Domain Micro Services

●Design & Implement UI Screens for Onboarding Journey Steps

●DPO data model creation in MongoDB Atlas

●Mongo Change Streams handling for CRUD operations

●Kafka Producers & Consumers for Business Events and Event Sourcing

●CQRS implementation for writes & reads

●Saga pattern for distributed transactions

●Event-Driven Architecture

●LightSpeed CI/CD pipelines

●Istio mesh network for traffic management

●mTLS, TLS, HPA, canary releases, circuit breakers, fault injection and rate limiting in the OpenShift Kubernetes environment

●Vault implementation for Secret Management

●Environment Specific Variables
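Several of the bullets above (CQRS, the Saga pattern, event sourcing) describe one coordination style. As a rough plain-Java sketch of the Saga idea only — no Spring, Kafka, or MongoDB here, and the step names are invented for illustration, not the actual DPO services:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal orchestration-style Saga sketch: each step has an action and a
// compensating action; when a step fails, completed steps are undone in reverse.
public class SagaSketch {
    interface Step { void execute(); void compensate(); }

    static List<String> log = new ArrayList<>();

    static Step step(String name, boolean fails) {
        return new Step() {
            public void execute() {
                if (fails) throw new RuntimeException(name + " failed");
                log.add("exec:" + name);
            }
            public void compensate() { log.add("undo:" + name); }
        };
    }

    // Run steps in order; on failure, compensate completed steps in reverse order.
    static boolean run(List<Step> steps) {
        List<Step> done = new ArrayList<>();
        for (Step s : steps) {
            try {
                s.execute();
                done.add(s);
            } catch (RuntimeException e) {
                for (int i = done.size() - 1; i >= 0; i--) done.get(i).compensate();
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // Hypothetical onboarding steps: the third fails, so the first two roll back.
        boolean ok = run(List.of(
            step("createProfile", false),
            step("openAccount", false),
            step("issueCredentials", true)));
        System.out.println(ok + " " + log);
        // false [exec:createProfile, exec:openAccount, undo:openAccount, undo:createProfile]
    }
}
```

In a real deployment the steps would be event-driven service calls and the compensations published as business events; the in-memory list here only shows the control flow.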

DXC, Wells Fargo, Charlotte, North Carolina, USA Jul 2020 - Jul 2022

Position Title: Sr. Kubernetes Solution Architect Lead

Project: ADMF (Authorized Data Management Framework) – Data Xchange

Description: The ADMF (Authorized Data Management Framework) – Data Xchange microservices product is a self-service microservices framework that provisions data consumption via APIs. It is used by all downstream applications to consume data from authorized data sources. To accomplish this, the solution uses a microservices-based architecture to provision data from authorized data sources and integrates with ADMF – DataLens to capture lineage and audit controls.

Environment: Intellij, Java11, Confluent Kafka, Apigee, Istio Ingress, Kubernetes, Tanzu, AWS, YAML, Rest API, API Marketplace, Spring Boot, Spring Cloud, Spring Data JPA, Eureka Registry, Config Server, Admin Server, Sleuth, Zipkin, Spring Cloud Load Balancer, Feign Client, Maven, Jenkins, GIT, Oracle, Udeploy.

Role: Responsibilities included:

●Designed and implemented microservices using the Spring Boot and Spring Cloud frameworks.

●Migrated VM-based microservices to Tanzu Kubernetes.

●Designed and implemented management services, including the service registry, admin server, exchange gateway and config server.

●Implemented a domain service to source auto-loan cost-of-funds domain data from Oracle DB and publish it to Confluent Kafka topics using an asynchronous API.

●Implemented API gateway proxies using Apigee and worked on integration with API Marketplace for consumer registration and subscription to API products.

●Implemented common reusable services, including the data authorization control service, Oracle local data authorization service, data catalog service and data availability service.

●Implemented the Oracle local data authorization service, which verifies whether the caller has access to the requested dataset using roles defined in Oracle DB.

●Implemented the data authorization control service, which decides which data authorization service to invoke depending on the data platform and the type of source access control.

●Implemented the data catalog service, which publishes metadata about the requested dataset.

●Integrated with the data availability service, which verifies the availability of the dataset for the requested business date and returns the required details about the data available for consumption.

●Implemented admin service to monitor the status and metrics of all the microservices.

●Worked on distributed tracing for microservices using Sleuth and Zipkin.

●Implemented microservice-to-microservice communication using Feign clients.

●Implemented microservice load balancing using Spring Cloud LoadBalancer.

●Implemented error-handling patterns across the microservices.
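The error-handling bullet above typically includes retries against flaky downstream services. Spring Cloud and resilience libraries provide this out of the box; the plain-Java wrapper below is a simplified stand-in for the idea, not the ADMF implementation, and the "503" failure is invented for illustration:

```java
import java.util.function.Supplier;

// Simplified retry-with-backoff wrapper for a flaky downstream call.
public class RetrySketch {
    static <T> T withRetry(Supplier<T> call, int maxAttempts, long backoffMillis) {
        RuntimeException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.get();
            } catch (RuntimeException e) {
                last = e;
                try {
                    Thread.sleep(backoffMillis * attempt);  // linear backoff between attempts
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                }
            }
        }
        throw last;  // attempts exhausted: propagate the final failure
    }

    public static void main(String[] args) {
        int[] calls = {0};
        // Hypothetical downstream call that fails twice, then succeeds.
        String result = withRetry(() -> {
            if (++calls[0] < 3) throw new RuntimeException("503 from downstream");
            return "dataset-available";
        }, 5, 10);
        System.out.println(result + " after " + calls[0] + " attempts");
        // dataset-available after 3 attempts
    }
}
```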

TAHOE, Bank Of America, Pennington, NJ Apr 2017 – Jun 2020

Website: www.bankofamerica.com

Position Title: Sr. Solution Architect

Environment: Java8, Spark, Spark-SQL, RESTful Web Services, Spring, Spring Boot, Spring-Data, Spring Integration, Apache Kafka, MongoDB, SQL Server, Oracle, DB2, AVRO, Parquet.

TAHOE is one of the technology initiatives BofA launched to pull data from various LOB data sources and ingest cleansed, transformed, AVRO-formatted datasets into the Hadoop file system. TAHOE further processes the data based on LOB requirements, stores it as Bronze, Silver and Gold data to generate reports with the visualization tool Tableau, and exposes the data to various LOB applications via common RESTful services. TAHOE abstracts the complexities of Hadoop and related big data technologies from LOB technology teams, democratizing the big data stack across the organization: it delivers archival, operational and analytics solutions to LOB users by concentrating on LOB requirements and abstracting the technology aspects away. TAHOE integrates non-intrusively with BofA's ML & AI platform, PHOENIX.

Role: Responsibilities included:

●Implemented microservices using Java, Spring Boot and Spring Cloud

●Designed and implemented the Data-Transport, Data-Ingest and Data-Chef components

●Designed and implemented the seed file schema to embed LOB requirements as seeds for the data components

●Designed and implemented the seed file reader

●Designed and implemented Spark jobs to cleanse, format and compress data

●Implemented Spark connectors to DB2, Oracle and SQL Server to pull data at rest

●Implemented Kafka producers and consumers to capture metadata for various data-processing events

●Authored shell scripts for various LOB jobs, run via AutoSys JIL files.
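The cleanse/format step above runs as Spark jobs in TAHOE; as a rough stand-in for the shape of such a transformation, here is the same drop-malformed-rows-and-normalize pipeline in plain Java streams over an in-memory list (the row format and field count are invented for illustration):

```java
import java.util.List;
import java.util.Locale;
import java.util.stream.Collectors;

// Toy cleanse-and-transform pipeline: drop blank and malformed rows,
// then normalize the surviving rows. In TAHOE this shape runs as a
// distributed Spark job over LOB datasets rather than a local stream.
public class CleanseSketch {
    static List<String> cleanse(List<String> rawRows) {
        return rawRows.stream()
            .map(String::trim)
            .filter(r -> !r.isEmpty())                 // drop blank rows
            .filter(r -> r.split(",").length == 3)     // drop malformed rows
            .map(r -> r.toUpperCase(Locale.ROOT))      // normalize case
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> raw = List.of("  acct1,usd,100 ", "bad-row", "", "acct2,eur,50");
        System.out.println(cleanse(raw));
        // [ACCT1,USD,100, ACCT2,EUR,50]
    }
}
```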

TMS, Citi Bank, New Castle, DE Jul 2015 – Mar 2017

Website: www.citi.com

Position Title: Solution Architect

Environment: J2EE, Java, Restful Web Services, eHCache, Spring, SpringBoot, Spring-Cloud, Hibernate, SQL Server, Oracle, Paho, IoT, Apache DBCP, SiteMinder, LDAP, Active Directory, Jersey.

TMS is a third-party vendor (Transcentra) application customized and implemented to cater to Citi's needs. It was implemented by cloning the base product almost 25 years ago, and the vendor product has since taken many directions that Citi did not absorb. When Citi decided to adopt the vendor's new TMS product, this became a challenge, as Citi had also made many customizations of its own over the past 20+ years.

To retrofit the deviations and absorb new features from the vendor, Citi started the TMS 2011 Upgrade Project. Implementing multi-factor authentication by leveraging the existing Citi SSO was the biggest challenge in this project, as TMS is a thick-client-centric application whereas Citi SSO supports only browser-based web applications.

Role: Responsibilities included:

●Analyze the legacy TMS System to determine various business functions

●Analyze the Citi Customized TMS System to determine the customization functions

●Perform Gap Analysis

●Identify Deprecated Software Libraries and propose new Libraries

●Author approach document

●Author Architecture Design Document

●Prepare requirement matrix and use cases

●Author HLD, MLD & LLD for Gateway SSO services

●Implement Spring Boot-based RESTful services for SSO

●Perform SiteMinder Configuration

●mod_jk configuration to route requests from the HTTP server to the app server

●Enable SSL port on HTTPServer

●Compile, Build & Deploy application code.

●Apache Kafka-based pub/sub messaging implementation

●JSON-based request/response modeling
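The Kafka pub/sub bullet above can be reduced to its core shape without a broker: an in-memory topic dispatching each message to every registered subscriber. Kafka adds durability, partitioning and consumer groups on top of this basic pattern; the topic and message names below are illustrative only:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// In-memory pub/sub sketch: publishers send to a named topic, and every
// subscriber registered on that topic receives each message.
public class PubSubSketch {
    private final Map<String, List<Consumer<String>>> topics = new HashMap<>();

    void subscribe(String topic, Consumer<String> handler) {
        topics.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
    }

    void publish(String topic, String message) {
        topics.getOrDefault(topic, List.of()).forEach(h -> h.accept(message));
    }

    public static void main(String[] args) {
        PubSubSketch bus = new PubSubSketch();
        List<String> received = new ArrayList<>();
        bus.subscribe("sso-events", received::add);                  // consumer 1
        bus.subscribe("sso-events", m -> received.add("audit:" + m)); // consumer 2
        bus.publish("sso-events", "login-ok");
        System.out.println(received);
        // [login-ok, audit:login-ok]
    }
}
```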

DFP, BOFA, Newark, DE Sep 2011 – Jun 2015

Website: www.bankofamerica.com

Position Title: Solution Architect

Environment: J2EE, Java, SOA, JMX, JMS, MQ, RESTful Web Services, Struts2, Dynacache, Servlets, JSP, FreeMarker, Spring, Hibernate, Spring Integration, Spring Tool Suite, Apache Kafka, Hadoop, MapReduce, HDFS, Hive, Pig, Flume, Sqoop, HBase

The Device Fingerprint (DFP) is one of the niche technology applications at Bank of America. DFP's primary function is to collect various parameters of the device from which online banking users log in. Whenever a user logs into their online account, a JavaScript collector gathers the device parameters and sends the data to the server side, where it is analyzed to detect compromised scenarios. The processed data is then transferred to the data layer using big data transmission tools. Once the data is stored in HDFS, it goes through internal bank strategies to detect fraudulent activity. Sqoop then loads the data from HDFS into Oracle via AutoSys jobs. From Oracle, the data is sent to the case management system, Flash, which is built on the Java/J2EE technology stack.

Role: Responsibilities included:

●Interacting with business users

●Analysis of functional specifications

●Developed APIs using JMS and IBM MQ Series for XML messaging between several applications. The XML messages received from these queues are used by several services within the application to integrate them appropriately.

●The bulk user creation service uses the above MQ Series APIs, getting user information from a third-party single sign-on application.

●Single sign-on integration for broker validation: after brokers log in, their information is retrieved from the SSO server; once validated, brokers are allowed to do regular business.

●Involved in the configuration of Struts and Hibernate into a Spring application and used Spring's DAO support to build Hibernate DAO classes.

●Used Object/Relational mapping tool Hibernate to achieve object persistency.

●Design and development of Batch Process Service, which processes several batch jobs and sends out different formats of XML to different applications.

●Created SOAP clients to consume web services that create turn-by-turn directions, and created XML schema files using XML Spy.

●Wrote and Configured XML based rules that drive the system.

Borneo Components & Rendering Engine BOFA, Newark, DE Apr 2008 – Aug 2011

Website: www.bankofamerica.com

Position Title: Tech Lead/Tech Manager/Solution Architect

Environment: J2EE core components, J2EE, Java, SOA, ESB, JMX, JMS, MQ, web services, XML, Spring 2.5, GigaSpaces, Dynacache, Servlets, JSP, Struts2, FreeMarker, WebSphere 6.1, RAD 7.0, Perforce, Oracle, Hibernate, JUnit, JMeter, Drools

Provided leadership, solution engineering, project coordination, business requirements analysis, requirement specification and project documentation for highly complex projects, applying enterprise architecture principles and delivering technology solutions within a repeatable process framework. Outlined the tasks to be accomplished, wrote the technical requirements, and coordinated, persuaded and motivated teams to perform activities on target dates. Provided the unique capability to resolve issues that inevitably arise during project execution through quick but thorough analysis, presenting options to stakeholders and bringing issues to quick closure.

Role: Responsibilities included:

●ESB and IFW services implementation to fetch SOR data

●Restful web services implementation for customer profile data

●Rendering Engine design and implementation

●Concurrent component design and implementation

●Cache component design and implementation

●JMS based real-time cache refresh implementation

●JMX based dynamic properties component design and implementation

●Oracle Change notification implementation for real time cache refresh

●Stub framework design and implementation

●Widget framework design and implementation

●Tagging framework design and implementation

●Rules framework design and implementation

●Logging and Exception handling framework design and implementation

●Act as IT architect & manager through the architecture & engineering phases of Borneo components.

●Successfully led the Rendering Engine from Architecture, Infrastructure and Operations perspective.

●Design, document, publish and coordinate the implementation of an extremely diverse variety of applications, infrastructure and services necessary to support application development requests, especially the Countrywide and Merrill Lynch transitions.

●Working with other teams on definition and implementation of IT infrastructure and operations for major undertakings.

●Analyze business and technology needs to create solutions to meet these needs, and ensure adequate capacity, scalability, security, and flexibility.

●Developed strategies to align IT technology, people and process strategies with business capability roadmaps.

●Managed technology and vendor selection and consultant selection processes.

●Enable timely quality solution delivery by developing multiple input and output templates based on industry best practice and standards for designing highly available, cost effective solutions aimed at meeting business requirements.
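The JMS-based real-time cache refresh and Oracle change-notification bullets above share one core shape: a read-through cache that invalidates entries when an external change event arrives. A minimal sketch of that shape, with the JMS/Oracle notification plumbing omitted and all names invented for illustration:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Read-through cache with external invalidation: onChangeNotification()
// plays the role of a JMS message or Oracle change-notification callback.
public class RefreshingCacheSketch {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final Function<String, String> loader;

    RefreshingCacheSketch(Function<String, String> loader) { this.loader = loader; }

    String get(String key) {
        return cache.computeIfAbsent(key, loader);  // load from the source on a miss
    }

    // Called when a change event for this key arrives (e.g. off a JMS topic).
    void onChangeNotification(String key) {
        cache.remove(key);  // the next get() re-reads from the source of record
    }

    public static void main(String[] args) {
        Map<String, String> sourceOfRecord = new ConcurrentHashMap<>(Map.of("rate", "1.05"));
        RefreshingCacheSketch cache = new RefreshingCacheSketch(sourceOfRecord::get);
        System.out.println(cache.get("rate"));   // loaded from the source
        sourceOfRecord.put("rate", "1.07");
        System.out.println(cache.get("rate"));   // still the stale cached value
        cache.onChangeNotification("rate");
        System.out.println(cache.get("rate"));   // refreshed value after invalidation
    }
}
```

The design choice shown here is invalidate-on-notify rather than refresh-on-notify: the cache only pays the reload cost for keys that are actually read again.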

Net-Access Re-ARCH, Bank of America, Newark, DE Apr 2006 – Apr 2008

Position Title: Sr. S/W SOA Developer

Environment: J2EE, Java, SOA, ESB, JMX, web services, XML, Spring 2.0, GigaSpaces, Servlets, JSP, Struts 2.0, WebSphere 6.1, RAD 7.0, Visio, SiteMinder, Oracle, Informix, Sybase, JUnit, JMeter and Exceed

Website: www.bankofamerica.com

Description: NARA is an initiative arising from the BOA and MBNA merger. The project serves service-oriented financial data between two web applications, WEAS and Netaccess. As part of this project, several web services were developed at both the Netaccess and WEAS ends: all common authentication-related web services are implemented on the WEAS end and all transactional-data web services on the Netaccess end. Two separate projects, CommonAuth and WSMW, were initiated for this purpose.

The CommonAuth module mainly deals with users, roles, authentication and authorization. The total MBNA user base was divided into transparent and non-transparent users, and all non-transparent users migrated to the WEAS data center, GAID. When a new user comes for enrollment, Netaccess invokes a web service to find out whether the user exists in GAID, maintaining unique usernames across the BOA and Netaccess user bases. User creation, deletion and update activities are done through these web services.

WSMW is responsible for fetching all non-transparent users' transactional data, such as account details, balance transfers, bill pay, eForm details and address updates. A few of these services are Tuxedo services invoked via Jolt from the web service layer. WSMW retrieves all account details of the logged-in user and sends them as a SOAP response to WEAS. WSMW also stores these details in an intermediate caching server and reads the cached data whenever Netaccess requires it later for detail screens.

Parallel processing using asynchronous beans to improve system response time, a resource environment provider to hold environment- and application-specific properties, a Spring-based workflow for transactions, distributed session management and pre-fetched caching using GigaSpaces, a loosely coupled data services layer, a homegrown Jolt pool for Tuxedo calls, an event framework to log tech events, and high-performance on-demand web services were among the important features implemented during the NARA Gen1 project.

Role: Responsibilities included:

●Business user interaction in gathering requirements

●System study and prepare requirements.

●HLD & LLD preparation based on the requirements matrix

●Use cases, class diagrams and activity diagrams preparation

●WSDL preparation using existing XSD

●Homegrown web service framework to facilitate generic implementation.

●ThreadPool implementation for asynchronous functionality processing.

●Web Services development end to end.

●Homegrown Jolt pool, a wrapper over the BEA JoltPool, implementing failover and failback mechanisms.

●Java utility development to write and read the serialized java objects to and from cache server.

●Oracle 10g Blob usage as a fallback option for Cache.

●Preparation of health checks for all services using MBean extensions.

●Spring-based workflow.

●Ant build-script preparation for multiple modules that are part of a single EAR application.

●Deployment plans preparation.

●Deployment support for Websphere administrators.

●Stress testing using a homegrown, JMeter-based test driver; tuning to improve the product's response times.

●Bug fixes and production support.

●Reviews and code walkthroughs at peer level.

●Providing customer support and maintenance for the earlier releases of the product in parallel with development.
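The asynchronous-bean and thread-pool bullets above map directly to modern java.util.concurrent. A minimal sketch of the fan-out/join shape, where the submitted task is an invented stand-in for a Tuxedo/Jolt backend call:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;

// Fan out independent backend lookups on a thread pool and join the results
// in order -- the same shape NARA achieved with WebSphere asynchronous beans.
public class ParallelFetchSketch {
    static List<String> fetchAll(List<String> accounts) {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            // Submit one task per account; each stands in for a backend call.
            List<Future<String>> futures = accounts.stream()
                .map(a -> pool.submit(() -> "details-for-" + a))
                .collect(Collectors.toList());
            List<String> results = new ArrayList<>();
            for (Future<String> f : futures) {
                try {
                    results.add(f.get());  // join in submission order
                } catch (InterruptedException | ExecutionException e) {
                    throw new RuntimeException(e);
                }
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        System.out.println(fetchAll(List.of("acct1", "acct2")));
        // [details-for-acct1, details-for-acct2]
    }
}
```

Joining on the futures in submission order keeps the response deterministic even though the calls run concurrently.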

Client: DAPC & SPAN, Sony Electronics, San Diego, US Mar 2005 – Apr 2006

Position Title: Sr. S/W Analyst

Environment: Java, Servlets, JSP, Struts, WebSphere 6.0, RAD 6.0, Visio, Hibernate 3.1, Struts-Menu, Display Tag, SQL Navigator, LDAP, SiteMinder, PL/SQL, Oracle, JUnit, VSS, Eclipse 4.0.3 and Hummingbird

Website: http://servicesales.sel.sony.com

Description: Sony Electronics has two web site ordering systems. End users primarily use the DAPC Service Sales web site to order accessories; dealers and distributors primarily use the SPAN application to order parts and accessories. The purpose of this project was to develop a new DAPC web site for end users. The BSSC group was also considering using the DAPC shopping cart for its parts orders as a feed from its web site. The site includes functions such as search, listing by category, rush-order tracking and customer information, and is designed to allow the WRPC business groups to maintain specified content. Dealers and distributors currently place orders using the SPAN application, a real-time online parts ordering system. Credit card transactions are handled by encrypting the sensitive information.

The middle-tier design is based on the Struts architecture. The authentication module is developed using the Sony Common EJB and SiteMinder, with LDAP used for the authorization module. An Action class and Manager class combination is used for functional activities. A DataSource provides the connection pool for Oracle database activities. The shopping cart module is developed in Java.

SQL*Loader scripts and Unix shell scripts were developed to load inventory and order data arriving as flat files from mainframe systems. PL/SQL stored procedures were developed for inventory status, order status and tracking, and order placement.

The transaction management would be taken care by the components itself by using the transaction operations exposed by the database connection object. As far as possible the database operations constituting the transaction unit will be accommodated within the same method. In cases where it spans across methods, the connection object would be passed as a parameter and transaction management will be done in the component itself.

An HTTP servlet session object tracks the session; it has a timeout period and redirects to the login page once timed out. URL rewriting is adopted in order to maintain the session when users have cookies disabled in their browser.

Struts Tiles-based web pages were developed as the web client. Struts-Menu is used for the role-based dynamic menu system, Display Tag is used to display complex search records, and Hibernate is used for data persistence.

Role: Responsibilities included:

●Development of different business components using Struts, RAD 6.0, WebSphere 6.0, Java, EJB, JMS, JAXP, Unix, SQL*Loader and XML.

●Struts-based Action servlet development for the different actions in the customer center web application.

●Struts-based model component development for database connectivity and business logic.

●Struts-based view page development using Tiles, JSP pages and tag libraries.

●Dynamic menu system developed based on Struts-Menu.

●Display Tag used extensively to display fetched records.

●SQL*Loader and Unix shell scripts used to load data into the Oracle database.

●Cryptic used in generation of the order mail.

●Hibernate used for persistence.

●Providing application support and troubleshooting a variety of problems, typically with minimal business impact.

●Performing/supporting user acceptance testing and unit testing

●Writing concise and clear technical documents

●Bug fixing and Unit testing.

●Reviews and code walkthroughs at peer level.

●Communicating with the clients to get requirements / clarifications.

●Providing customer support and maintenance for the earlier releases of the product in parallel with development.

●Worked on Performance Tuning for improving the response times of the product.

Education

M.E. in Engineering

Anna University (CEG), Chennai, India


