JOSEPH CHO
Freehold, NJ *****
Cell: 848-***-**** Email Id: **********@*****.***
Joseph Cho’s blog - http://josephcho1.wordpress.com/
https://github.com/JosephCho/D3-Labs
LinkedIn - https://www.linkedin.com/in/joseph-suk-young-cho-28744947/
SUMMARY
I have worked for more than 20 years as a Sr. Architect, Full Stack Developer, Lead Developer, and Technical Manager, covering everything from the frontend to the backend using cutting-edge technologies. My favorite backend languages and frameworks are Python, Java, C++, C, C#, Spring Boot and the Spring Framework, and the MEAN and MERN stacks, while my favorite frontend tools are HTML5, CSS3, Bootstrap 5+, JavaScript (ES5, ES6), jQuery, Angular (2, 4, 6, 11, 14, 15+), and React 17+ with TypeScript.
Across these companies I have accumulated extensive IT application design and development experience, beginning with rapid prototyping and then following a regular, systematic development approach. I have worked in top-ranking organizations, including financial companies such as Bank of America, BNY Mellon, and JPMC (JPMorgan Chase & Co.), and non-financial companies such as MetLife Insurance, Infosys Limited, Cognizant Technology, AT&T Bell Labs, Lockheed Martin (with Secret security clearance), and IBM. I have led several large-scale projects, managing onshore and offshore (India) teams, designing distributed systems on the AWS public cloud, streaming data flows with Kafka, and performing data design and data modeling, implementing projects with design patterns using UML, SysML, BPM, and object-oriented analysis and design tools, with emphasis on scalability, performance, and availability.
Thus, I have hands-on experience implementing microservices, RESTful APIs, and event-driven architectures with SQL and NoSQL databases and CI/CD systems and tools such as Jenkins, Docker, and Kubernetes, effectively leading teams of 10-25 people through the software development life cycle (SDLC) using Agile/Scrum, RUP (Rational Unified Process), and SDF (Software Development Framework) processes at CMM levels 3, 4, and 5.
Nowadays, I am developing and studying AI (Artificial Intelligence) and ML (Machine Learning) for financial applications. I recently took the first step toward an AI/ML certificate with META (the developer of the famous PyTorch) via ELVTR - https://elvtr.com/certificate/87b2dad8c60a2677e1b68f5960c55afc - and I plan to pursue the two-year Black Belt Plus Program certificate from Analytics Vidhya.
TECHNICAL SKILLS
Languages: Python 3.12+ with its libraries (NumPy, Pandas, SciPy, Scikit-learn, Tkinter, Matplotlib, Seaborn, and SQLite), Go(lang) 1.8.3+, C, C++, C#, Pro*C, Visual C#, COBOL 6+, FORTRAN 77, Java (8, 11, 17+) and Scala, Perl, PHP, MATLAB, Ruby on Rails, LEX/YACC, AWK, Unix shell scripting (ksh, csh, bash, etc.)
Frontend: HTML5 (with Photoshop, Illustrator, Flash, PageMaker, Dreamweaver), CSS3, Less, Sass, Bootstrap 5+, JavaScript (ES5 and ES6, ES2015/ES2016), TypeScript, jQuery, AngularJS 1.x, Angular (2, 4, 6, 10, 14, 17+), Vue 2.7+, React 18+ (JSX, the Babel JavaScript compiler, Webpack, Redux, Material UI, Material-Table, and JSS), Ember.js, D3.js, AJAX, JSON, data visualization (D3, Flot, Flask framework, charts, maps, interactive graphics)
Backend & Java: Java Spring Framework with MVC RESTful web services, Java Swing, Java concurrency/multithreading, J2EE architecture programming with EJB (session and entity beans), JSP, ASP, ASP.NET, Java Servlets, Java Applets, DWR, XML, HTTP/HTTPS, SOAP, REST, SOA, WSDL, JAXB, JAX-RS, communication protocols
Design & Process: OOAD (Object-Oriented Analysis and Design with UML and distributed patterns), SysML, jBPM, BPMN 2.0
Big Data: Apache Hadoop ecosystem (HDFS, MapReduce, Pig, Hive, HBase, YARN, Sqoop, Oozie), Apache Kafka, Storm, Ansible, Flume, ZooKeeper
Testing: Test-Driven Development (TDD) with Jenkins, JUnit, TestNG, JaCoCo, Mockito, EasyMock, PowerMock, JBehave, Cucumber, Mocha, Jasmine, Pit, and Selenium 2.x for web testing (Selenium RC and WebDriver)
Stacks & Security: XAMPP (Cross-Platform (X), Apache (A), MariaDB (M), PHP (P), and Perl (P)), LDAP, Active Directory, SSO (single sign-on), SAML (Security Assertion Markup Language), Apigee, Spring MVC
Databases: Teradata Database using data warehousing with OLAP (Online
Analytical Processing) and ETL (Extraction, Transformation, and
Loading) tools such as Oracle Warehouse Builder and Informatica;
AT&T's Daytona (Cymbal) Big Data DB, which is compatible with the
Teradata Database; relational SQL DBs - MySQL, MariaDB, Oracle,
Informix, Sybase; and NoSQL - ArangoDB, MarkLogic, MongoDB,
Cassandra, Couchbase, AWS DynamoDB
Software: Agile/Scrum, LAMP stack (Linux, Apache, MySQL, PHP),
MEAN stack (MongoDB, Express, AngularJS, NodeJS),
MERN stack (MongoDB, Express, NodeJS, ReactJS), DevOps,
SharePoint to communicate, collaborate, and control content with dynamic sites, secure file sharing, and integrated content management.
Big Data (Hadoop 3+, HBase, Hive, Scala, Sqoop, Flume) development
using Spark (SQL, Streaming, MLlib machine learning, GraphX) –
Splunk can be used for monitoring and searching through big data
Spring Boot 2+ and 3+, Hibernate, Oracle, RESTful web services, Git,
Angular (1.x, 2, 4, 6, 11, 14+), React 17+ with Hooks, and EhCache
Data Warehouse - Teradata Database using data warehousing with
OLAP (Online Analytical Processing) and ETL (Extraction,
Transformation, and Loading) tools such as Oracle Warehouse Builder
and Informatica.
Kafka Streaming – KTable, KStream, and ksqlDB
Data Lake – Datawarehouse, Cloud
AI/ML – Deep Learning (DL) (generative and discriminative models),
Reinforcement Learning (RL), machine learning frameworks
(TensorFlow, PyTorch, JAX, Jasper, Runway, and several other AI
tools), ChatGPT LLM, Databricks
Generative AI Models - The success of generative AI models lies in producing visual content, including images, videos, and music, from text content. Three model families make this possible: VAEs (Variational Autoencoders), GANs (Generative Adversarial Networks), and Transformers & autoregressive models.
VAEs capture the underlying structure of data and can generate new samples by sampling from a learned latent space.
GANs introduce a competitive framework between a generator and discriminator, leading to highly realistic outputs.
Transformers excel at capturing long-range dependencies, making them well-suited for generating coherent and contextually relevant content.
Time Series Modeling, Neural Network Modeling, AWS Deep Learning AMIs, LLM fine-tuning using PyTorch and OpenAI ChatGPT
I hold an AI/ML certificate from Facebook (META) ELVTR:
Facebook (META) ELVTR certifies successful completion of
"Product Management for AI (Artificial Intelligence) and ML
(Machine Learning)"
url - https://elvtr.com/certificate/87b2dad8c60a2677e1b68f5960c55afc
NLP & Time Series - Mankind has always sought the ability to predict the
future. Time, being the fourth dimension of our world, makes all the data
generated in the world time series data. In the current age of data, the
time series collected by businesses grow larger and wider by the minute.
Combined with the explosion in the amount of data collected and the
renewed interest in ML, the landscape of time series analysis has changed
considerably. Key areas include AutoRegressive Integrated Moving Average
(ARIMA), regular and irregular time series, and time series analysis.
AWS - AWS security paradigm implementation: holistic perimeter
assessment of VPC, S3, SNS, SQS, CloudTrail, EKS, ECS, and
AWS Fargate containers, plus mitigation for AWS Lambda and
Step Functions. Building scalable and fault-tolerant CI/CD
pipelines using AWS automation (AWS CodeCommit,
CodeBuild, CodeDeploy, CodePipeline, and CloudFormation) for
microservices; Snowflake on AWS
Test-Driven Tools - Selenium and Robot, and SDET (Software
Development Engineer in Test) technology, which goes well
beyond the TDD and BDD testing ideas – SDET is
an increasingly popular title for software testers,
but one with substantive differences from a traditional
QA role: SDETs have programming and technical
testing skills, abilities, and know-how that IT
organizations can use to complement more
conventional testers.
SonarQube for code coverage and testing,
code quality analysis, code complexity analysis,
CI/CD integration, and reporting.
Fortify - advanced security testing, static code
analysis, and integration with build systems.
Container – Docker, Vagrant, Kubernetes, AWS ECR (Elastic Container Registry), ECS (Elastic Container Service), EKS (Elastic Kubernetes Service)
Security - OWASP Top 10, WS-Security, WS-Policy, SAML, XML
Signature and XML Encryption, ADFS, and
OAuth 2.0 API for Java services – SaaS, IaaS, and PaaS
Automation development for unit testing, integration testing, pipeline build and test, and deployment using Gaia CI/CD – Docker and Kubernetes
Distributed Message Computing – IBM MQ, RabbitMQ, ActiveMQ, and Kafka
Revision Control System - Bitbucket, SourceTree, Bamboo, Git, Subversion, Jira, Rational ClearCase, Rational ClearQuest, Rational Suite
Spring - Spring Framework 5.0+, and SpringBoot 2.0 +, 3.0+
Spring Framework (Microservices, MVC REST, IoC and DI, AOP,
Database, Transaction, Hibernate, Data JPA, Security, JMS, Batch,
Web Services, and Integration),
Public Cloud Systems – AWS, Google Cloud Platform, Gaia, etc.
Visualization – Microsoft Power BI, Prometheus, Neo4j
PROFESSIONAL EXPERIENCE
Accenture May 2021 – Present
Associate Manager - Senior Application Development, Java and Python Architect, Full Stack Developer
META Platforms (Facebook) and Its Workplace platforms project
-Rewrite Facebook menus that were written in old React code. These menus were Home, Video, Friends, Notifications, and others. The old React code used state, properties, and older coding styles. To improve performance and readability, the new code was written using JSX, TypeScript, and React Hooks such as useState, useEffect, useContext, useRef, useReducer, useCallback, useMemo, and memo.
-There was a large amount of Facebook's own confidential data stored in Microsoft SQL Server that could not be opened to customers. For security reasons, this data was forbidden from being stored in a formal local DB such as MySQL, PostgreSQL, or any kind of NoSQL DB. For this purpose, Facebook developed a global Microsoft SQL Server with LDAP (Lightweight Directory Access Protocol), a protocol that helps users find data about organizations, persons, and more. LDAP has two main goals: to store data in the LDAP directory and to authenticate users to access the directory. It also provides the communication language that applications require to send and receive information from directory services; a directory service provides access to where information on organizations, individuals, and other data is located within a network. However, even though this design is good because different organizations across the world can access the LDAP data, the number of retrieved columns for each record could exceed 10,000, making it impractical to extract each organization's needed columns directly. Thus, I developed a Python 3.12+ program and its libraries to view the retrieved data graphically, temporarily store the selected columns into a local DB without a formal DB schema, and process this stored DB to supply data for Facebook menus and other areas. The Python libraries were NumPy, Pandas, SciPy, Scikit-learn, and several others; the GUIs were Tkinter, Matplotlib, Seaborn, and others; the DB was SQLite.
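A minimal sketch of the select-and-stage step described above; the column names and rows here are hypothetical stand-ins for the confidential LDAP result set:

```python
# Minimal sketch: stage selected columns from a wide LDAP result set into
# SQLite without a formal schema. Column names and rows are hypothetical.
import sqlite3
import pandas as pd

# Stand-in for a wide LDAP record set (the real one has 10,000+ columns).
records = pd.DataFrame({
    "uid": ["jcho", "asmith"],
    "org": ["workplace", "ads"],
    "mail": ["jcho@example.com", "asmith@example.com"],
    "dept_code": [101, 202],
})

# Keep only the columns this organization needs.
selected = records[["uid", "org", "mail"]]

# pandas.DataFrame.to_sql creates the table on the fly - no schema up front.
conn = sqlite3.connect(":memory:")
selected.to_sql("ldap_staging", conn, index=False)

# Downstream consumers (e.g. menu code) read from the staging table.
rows = conn.execute("SELECT uid, mail FROM ldap_staging").fetchall()
print(rows)
```

The in-memory database stands in for the on-disk SQLite file the real pipeline would use.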
-Develop an AI/ML (Artificial Intelligence/Machine Learning) model to predict Facebook customers' expectations and provide the best service for their needs. Fortunately, gigantic amounts of data had accumulated in Microsoft SQL Server in LDAP (Lightweight Directory Access Protocol) format. To develop a useful mathematical model, among the three main ML learning paradigms - supervised, unsupervised, and reinforcement learning - I converted all existing data into a supervised learning model to predict future customers' expectations and provide the best service for them.
The key steps for this mathematical development are as follows:
(1) Data Scrubbing (an umbrella term for manipulating data in preparation for analysis):
Dropping non-clustered data – remove abnormal data compared to the majority of the data
Dimension Reduction – 3-dimensional or 2-dimensional plots are easier to interpret, using the reduction technique of PCA (Principal Component Analysis, also known as general factor analysis)
(2) Split Validation
A crucial part of machine learning is partitioning the data into two separate sets
using a technique called split validation. The first set is called the training data and is used to build the prediction model. The second set is called the test data and is kept in reserve and used to assess the accuracy of the model developed from the training data. The training and test data is typically split 70/30 or 80/20 with the training data representing the larger portion. Once the model has been optimized and validated against the test data for accuracy, it’s ready to generate predictions using new input data. To perform split validation in Python we can use train_test_split from Scikit-learn, which requires an initial import from the sklearn.model_selection library.
(3) Evaluation
The next step is to evaluate the results. The method of evaluation depends on the scope of the model - specifically, whether it is a classification or regression model. My model can use the accuracy score, calculated as the number of cases the model classified correctly divided by the total number of cases. The accuracy score is 1.0 if all predictions are correct, and 0 if all cases are predicted incorrectly.
(4) Optimization
The final step is to optimize the model. For clustering analysis techniques, this might mean going back and modifying the number of clusters, or changing the hyperparameters of a tree-based learning technique.
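The split-validation and evaluation steps above can be sketched with Scikit-learn; the synthetic dataset is a stand-in for the real, confidential Facebook data:

```python
# Sketch of split validation (step 2) and accuracy evaluation (step 3).
# The synthetic dataset stands in for the real (confidential) data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Synthetic supervised-learning data: 500 samples, 10 features, 2 classes.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)

# Split validation: 70/30 train/test partition, training data the larger part.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, random_state=42)

# Build the prediction model on the training data only.
model = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)

# Evaluation: correct predictions divided by total cases (1.0 = all correct).
accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"accuracy on held-out test data: {accuracy:.2f}")
```

The decision tree is one example classifier; the same split/evaluate pattern applies to any of the supervised techniques named above.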
-Implement an enhanced Java architecture renovation for existing microservices to include synchronous/asynchronous, blocking/non-blocking, reactive programming, etc.
-Develop Facebook Back-end Workplace Platform by using Java 17+ with Multithreading (Executor Service as well as Fork Join model) for low latency, and Python 3.9+ with Click tool
-Develop the Facebook Workplace frontend GUI using C# and React 17+ with Hooks (useState, useEffect, useMemo, useCallback, memo, useReducer), and also develop a Tkinter graphics package using Python libraries such as Pandas and NumPy to explore the alternative of using the Quick Click Master tools without splitting the tool into a frontend (React or Angular) and a backend (Python 3.9+ and Java 17+).
-Prototype development of a new frontend approach using the Microsoft Azure AG (Agnostic Grid) approach, which is independent of the React and Angular platforms.
-Develop Facebook backend Workplace debugging tools for ACNTOOLS to get data from LDAP3 and the Facebook LDAP server, using Python 3.9+ with Flask and Java 17+, along with related jobs that store the data into a SQLite3 DB.
-Develop a Wagtail prototype project to compare with Django CMS (Content Management System).
The two most popular Python Content Management Systems (CMS) are django CMS and Wagtail.
When two CMSs offer many of the same benefits, it is hard to know which platform - django CMS vs Wagtail - best serves our needs. In my experience, Wagtail offers experienced developers control over the smallest details of their site, and Google loves to use it now. Since I have a WordPress background, the similarities in positioning are striking.
Accordingly, if I want such control, Wagtail may be the better choice. It takes significant knowledge to extend a website's functionality with Wagtail, but a confident developer will have no problems and will enjoy greater flexibility in some areas of the site's design.
In conclusion, both django CMS and Wagtail use the same Python programming language and Django framework, so whichever I choose, I'll enjoy many of the same benefits.
However, the best choice comes down to my needs:
1. django CMS is perfect for enterprises that want to set up an amazing-looking, content-driven website
2. Wagtail offers additional options for experienced developers who feel confident extending their site's capabilities
-Prototype development of Azure Logic Apps to be used together with existing Azure Functions (which can be written in Java, C#, or Node) for the reasons below:
1. Logic Apps can be configured within a web portal and can execute their logic without programming any code. This is simpler to administer, as I only need to manage the properties of the logic blocks in my own workflow; many (over 200) connectors are plug-and-play, so Logic Apps are hardly limited.
2. They offer a stateful, declarative development approach supported by a large collection of connectors, an Enterprise Integration Pack for B2B scenarios, and custom connectors.
3. Many other good points.
-Gather Facebook Workplace customers' data and statistics into a DB for machine learning to predict future customer behavior, which is used for AI (Artificial Intelligence). Decision trees, confusion matrices, hierarchical clustering, logistic regression, and grid search techniques are used for this machine learning. From this, our team can generate a rules-based system built on facts and rules, a simplified form of AI.
-The key feature of Facebook Workplace work is developing Acntools, which can be run in batch mode to diagnose and debug the Facebook Acntools Workplace; this code was rewritten in Go(lang) for the distributed network service.
-The Quick Click Master tool is a Python package for creating a beautiful command-line interface in a composable way with as little code as necessary.
-The Microsoft Active Directory tool is used for access to several different domains via SCIM (System for Cross-Domain Identity Management), a standardized definition of two endpoints: a /Users endpoint and a /Groups endpoint. It uses common REST verbs to create, update, and delete objects, and a pre-defined schema for common attributes like group name, username, first name, last name, and email. The schema approach for each user's unique demands in GraphQL is handled in MSAL (Microsoft Authentication Library).
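A minimal sketch of a SCIM 2.0 user resource and the REST verbs used against the standardized endpoints; the base URL is a hypothetical placeholder and no network call is made:

```python
# Sketch of a SCIM 2.0 user resource (RFC 7643 core schema) and the REST
# verbs used against the /Users and /Groups endpoints. The base URL is a
# hypothetical placeholder; no network call is made here.
import json

SCIM_BASE = "https://scim.example.com/v2"  # hypothetical tenant endpoint

# Pre-defined schema for common attributes (username, name, email).
new_user = {
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
    "userName": "jcho",
    "name": {"givenName": "Joseph", "familyName": "Cho"},
    "emails": [{"value": "jcho@example.com", "primary": True}],
}

# Common REST verbs map onto the two standardized endpoints:
#   POST   {base}/Users        - create a user
#   PATCH  {base}/Users/{id}   - update a user
#   DELETE {base}/Users/{id}   - delete a user
#   POST   {base}/Groups       - create a group
create_request = ("POST", f"{SCIM_BASE}/Users", json.dumps(new_user))
print(create_request[0], create_request[1])
```

In the real integration these requests would carry an OAuth bearer token and be issued against the Active Directory provisioning endpoint.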
-Prototype development of AWS and Azure public cloud systems from the local data of the Meta Platforms project
-Develop manipulation of millions of /Users and /Groups using Kubernetes and the Docker container product Keycloak, which is used for passwordless authentication, and streaming of producers and consumers via Kafka Confluent Cloud using KTable, KStream, and ksqlDB
-Develop test automation using SDET (Software Development Engineer in Test) technology, which goes well beyond the TDD and BDD testing ideas – SDET is an increasingly popular title for software testers, but one with substantive differences from a traditional QA role: SDETs have programming and technical testing skills that IT organizations can use to complement more conventional testers
-From Snowflake data on AWS, visualize and report the data using Microsoft Power BI
-Using SharePoint, our team can communicate, collaborate, and control content with several different teams, with secure file sharing and integrated content management to maximize productivity.
CPF (Chicago Pacific Founders) Medical Insurance project
-Implement microservices from the current huge monolithic system
-Develop CPF medical platform with cutting edge technology
-Strong experience in user interface design and development using React.js, workflows, Webpack, Enzyme, Redux, Flux, and NPM, with experience integrating AEM flows and templates. In-depth knowledge of JavaScript, CSS, HTML, and other front-end languages. Knowledge of code versioning tools such as Git and SVN. Able to work independently as well as lead sizable teams. Excellent communication skills.
-Work as an Angular 14 and React 17+ (with Hooks) frontend and Node backend developer
-AG (Agnostic) Grid development, which is independent of the React and Angular platforms.
-Use the Microsoft Azure platform with Visual Studio Code
-Using the Snowflake and Star Schemas in data modeling.
-Leveraged S3, Apache Parquet, and Apache Iceberg to store and process large volumes of data efficiently.
-Implement several routines for Collection and Views from ArangoDB NoSQL
-Implement GraphQL Apollo server via ArangoDB NoSQL
-Big Data (Spark, Hadoop 3+, MapReduce, HDFS, Hbase, Hive, Sqoop, Flume) Conversion to AWS by using Java 8 and Python
TELADOC HEALTH Project
-Job Summary: The Identity & Access Management Developer will work remotely and collaboratively to design, develop, and deploy the client’s Identity and Access Management solutions. The Identity & Access Management Developer is expected to demonstrate expert-level proficiency in IAM applications, preferably open-source solutions such as Keycloak, as well as in several languages and technologies required for integration such as Java.
-Designs, documents, and implements automation of Identity Management workflows. Working with IT Systems Engineering, IT Application, as well as with other application owners, provides Identity Services for both on premises and Cloud-based services.
-Go(lang) is used as the replacement for the existing infrastructure code, for bug fixes and performance enhancement
-Passwordless authentication - discovery and research on the technical design (biometrics - Face ID, Touch ID, fingerprint) and its development using the IntelliJ IDEA tool, combined with Postman and Keycloak based on Microsoft Azure Active Directory, and the Java 18.0 JDK.
JP Morgan Chase (JPMC) Bank June 2019 – August 2020
VP (Vice President) as a full stack developer and Sr. Architect
- Develop a microservices approach from the existing traditional monolithic deployment, adding new microservices technologies such as Zuul and reactive programming.
- Using SharePoint, we can keep several teams, groups, and other organizations on track, feeding usage and access information to dashboards or scheduled reports so we are always in the loop – centralized reports, actionable reports, monitoring of team policy compliance, and several others.
- JPMC started to implement microservices from its existing monolithic application. Once a monolith is decomposed into several independent, decoupled microservices, a big issue arises: how the microservices communicate when they need to send and receive messages among themselves.
The best solutions for this issue include: (1) brokers such as RabbitMQ, (2) RPC (Remote Procedure Calls), (3) REST APIs, (4) Apache Kafka, and (5) several others. Years ago, JPMC helped create AMQP, the open protocol behind the famous RabbitMQ, which is open source and used by a large number of companies such as Bloomberg, Zalando, WeWork, and Wunderlist across industries that rely on microservice-based architectures. JPMC now wants to develop another kind of MQ, along the lines of IBM's MQ product, for the purpose of selling it. This product is based on the AMQP, MQTT, and STOMP protocols.
- Develop IBM Message Queue (MQ) series communication software using Java 8+, Python 3.9+, C#, Connexion, Swagger, and Flask via IntelliJ IDEA; Django; Jenkins CI/CD; microservices and Jira ticket manipulation; Git, Bitbucket, Gradle, Maven, Ant; Spring MVC; and Spring Boot 2.5+.
- Develop an automation tool for MDBs (message-driven beans) to support the IBM MQ platform with PCF (Programmable Command Formats) and REST
- Develop the frontend GUI using Swagger, Flask, and React JS for the SaaS (Software as a Service) product of the IBM MQ series, using HTML5, CSS3, Bootstrap 5, JavaScript ES6, and Angular 10+
- Develop a Wagtail project to manage the IBM MQ platform GUI
- API development of IBM MQ infrastructure as SaaS using GraphQL and API Gateway, which go beyond SOAP and REST – GraphQL is a new, challenging answer to REST's deficiencies and problems.
- Using Java 1.8+ and interactive Java with JShell in Java 9+, convert several existing legacy Java products to multithreading (Executor Service as well as the Fork/Join model) for low latency: RabbitMQ messaging to the Executor Service model, which can be managed more transparently, and the Fork/Join multithreading model for large-scale, compute-intensive applications and the IBM MQ and JMS messaging systems.
- For this Java development, TDD (Test-Driven Development – (1) black-box testing, (2) white-box testing) and BDD (Behavior-Driven Development) approaches are used – the TDD approach serves development, while BDD is an agile process designed to keep the focus on stakeholder value throughout the whole project. For unit testing, JUnit, TestNG, and Mockito are used; for BDD, JBehave and Cucumber are used with the customers.
- Develop MQAdmin.api and UMCAdmin.api using Django platform
- Develop the security system of the SaaS middleware product for Internet use by applying OWASP Top 10, WS-Security, WS-Policy, SAML, XML Signature and XML Encryption, and ADFS, using Go(lang).
- From Manual to Automation Development for unit testing, Integration testing, pipeline build and its testing, and deployment using CI/CD of Gaia Jenkins, Google Cloud Platform, and AWS – Docker and Kubernetes for PaaS (Platform as a Service)
- Big Data (Hadoop 3+, HBase, Hive, Scala, Sqoop, Flume) development using Spark (Spark SQL, Spark Streaming, MLlib machine learning), Kafka, and Python 3.9+
- Container buildup – Docker, Vagrant, Kubernetes, AWS – EMR (Elastic MapReduce), ECR (Elastic Container Registry), ECS (Elastic Container Service), EKS (Elastic Kubernetes Service)
- Frontend development using a microservices approach with Java/J2EE, Spring Boot 2.4+, Hibernate, Oracle, RESTful web services, and Git
- Develop LDAP for JPMC – IDAnywhere and Easy IDA built on ADFS (Microsoft Active Directory Federate Service)
- Cloud Development using Terraform by Go(lang) as Infrastructure as Code along with JPMC Gaia Cloud Foundry for IaaS (Infrastructure as a Service), PaaS (Platform as a Service), and SaaS (Software as a Service) through Apigee and API Gateway
- Prototype development of a sandbox project for Deep Learning (DL) (generative and discriminative models), Reinforcement Learning (RL), machine learning frameworks (TensorFlow, PyTorch, JAX, Jasper, Runway, and several other AI tools), and machine learning using AWS DeepRacer
- Prototype Development of AWS Machine Learning, DeepRacer, CloudWatch by using Python BOTO3 and AWS CDK
- Prototype machine learning model development using AWS Sumerian as a gateway to AWS's 23 existing machine learning products such as Amazon SageMaker and Amazon Rekognition, which can be realized in VR (Virtual Reality) and AR (Augmented Reality) with visual animation and realistic simulation.
- AWS security paradigm implementation: holistic perimeter assessment of VPC, S3, SNS, SQS, CloudTrail, EKS, ECS, and AWS Fargate containers, plus mitigation for AWS Lambda, Step Functions, etc.
- Building Scalable and Fault-Tolerant CI/CD Pipeline by using AWS Automation (AWS CodeCommit, CodeBuild, CodeDeploy, CodePipeline, and CloudFormation) for Microservices
- Conversion of old legacy BIG DATA code in HBase to AWS Spark EMR (Elastic MapReduce), storing into DynamoDB and Redshift, and managing and deploying it with AWS containers such as Elastic Container Service and Elastic Kubernetes Service.
- End-to-end development from client side to server side using the LAMP stack (Linux,
Apache, MySQL, PHP), MEAN stack (MongoDB, Express, AngularJS, NodeJS), and MERN
stack (MongoDB, Express, NodeJS, ReactJS)
INFOSYS LIMITED December 2018 – May 2019
Work for Bank of America project as a full stack developer and Sr. Architect
- Big Data (Hadoop 3+, HBase, Hive, Scala, Sqoop, Flume) development using Spark, Kafka, and Python
- Frontend development using a microservices approach with Java/J2EE, Spring Boot 2.3.3+, Hibernate, Oracle, RESTful web services, Git, Angular 4+, and React JS
- Develop a microservices approach from the existing traditional monolithic deployment, splitting it into several meaningful chunks to be deployed independently and communicate via the HTTP protocol.
- Prototype development of migration from Oracle relational DB to AWS DynamoDB and RDS
- AWS security detection, prevention, automation, and solutions using Python BOTO3 and Go(lang).
- AWS DevOps, Continuous Delivery, Automation of Delivery Product by using Python BOTO3 and Go(lang).
- AWS Cloud Training at AWS Loft in NYC area by using Python BOTO3, and Go(lang).
COGNIZANT TECHNOLOGY February 2017 – October 2018
BNY Mellon Project: May 2018 - October 2018
Sr. Full Stack Developer for DevOps and