
Service Server Data

Location:
San Jose, CA
Salary:
145K
Posted:
April 05, 2023


MawChin Lee

Skype ID: live:mawchin**** · Google Hangouts (Gmail) · Email: adwcnz@r.postjobfree.com
LinkedIn: https://www.linkedin.com/pub/mawchin-lee/79/875/4ba
1421 Stone Creek Drive, San Jose, CA 95132, U.S.A. · US Citizen · Phone: 1-408-***-****

SUMMARY

● Programming in Java 7/8/9/11 (Core Java, J2EE/J2SE/J2ME - Java Micro Edition); JavaScript, Python, Go, C/C++, C# (.NET, ASP.NET, ASP.NET Core, EF Core, ADO.NET), Scala (SBT, Akka actors and messaging), Python 2.7-3.8, bash shell scripting; Eclipse/IntelliJ IDEs.

● Use VS Code with extensions and Visual Studio Express to program Go, C/C++, C#, HTML5, Node.js, React and Angular.

● C/C++, bash shell scripting, Eclipse/IntelliJ IDE, VC++ object-oriented programming (abstraction, encapsulation, inheritance, polymorphism). Linux C/C++/C++14 std::thread, STL container collections, C/C++ pthreads, Boost libraries (ASIO, bind, collections, multi-threading and resource synchronization). C/C++ build tools: make, CMake, Makefile.

● GNU gcc/g++ (C/C++) on Linux and/or Windows systems (install Cygwin or MinGW from the SourceForge page alongside the Eclipse IDE for C++ to get gcc/g++ 6.3.0 or a later collection version, and use the Chocolatey package manager with Windows PowerShell to install make); Xcode (installed via the App Store / Apple ID) for Objective-C/C++ and Swift development on both iOS and macOS.

● Experienced in J2EE/Core Java and JavaScript messaging: AMQP (RabbitMQ), Apache Kafka brokers/ZooKeeper (distributed, partitioned, topic-based publisher/subscriber message queue programming), the ELK stack (Logstash, Elasticsearch, Kibana), MQTT IoT (Eclipse Mosquitto/Paho, HiveMQ), and IBM MQ. Work experience with the Spring Tool Suite MVC framework (dependency injection, AOP, MVC, bean XML configuration) and Hibernate 4. Solr/Lucene search library (inverted index / hash-table index writer, reader and query; B-tree). Solr features: near-real-time indexing; advanced full-text, faceted (narrowing and refining search results by applying filters), geospatial and hit-highlighting (quickly locating important words or phrases in the document being viewed) search capabilities; optimized for high-volume traffic/performance; standards-based open interfaces (XML, JSON and HTTP CRUD REST API); can run standalone or sharded across a multi-node cloud; comprehensive administration UI; easy monitoring/logging; highly scalable, fault tolerant and flexible.
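As a minimal sketch of the topic-based publish/subscribe style described above (assuming the kafka-clients library; the broker address, topic name and message values are placeholders, not from any specific project):

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class TopicPublisherDemo {
    public static void main(String[] args) {
        // Placeholder broker address and serializers for illustration only.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish one record to a partitioned topic; subscribers consume it independently.
            producer.send(new ProducerRecord<>("events", "order-42", "created"));
        }
    }
}
```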

● Microservices (µservices) using Spring Boot or the Spring Framework with JSP or JSF (event/tag and embedded UI components): structures an application as a collection of loosely coupled services (the application is organized as a set of small, automated/independent cloud-native software components – speed in design, safety in change, at scale and in harmony – communicating via messages or through a gateway). Design patterns adopted can include creational patterns (object-creation mechanisms), structural patterns (assembling objects and classes into larger structures) and behavioral patterns (concerned with algorithms and the assignment of responsibilities between objects), as well as domain-driven or event-driven patterns (coupled with CQRS handling / async operations). Maven build tool, RESTful API gateway, OOP, AOP, DI, Spring Framework / Spring Boot and Spring Cloud (Spring Session), Netflix OSS stack (Eureka client discovery; Zuul URL mapping/filtering, security and authentication/authorization with OAuth/OAuth2 and SAML single sign-on; Ribbon client-side load balancing; Hystrix distributed tracing and circuit breaker) ecosystem. Service mesh (istiod) – configuration, discovery, certificates – with Pilot/Citadel, Envoy proxy, ingress gateway and other additions to handle microservice communication. MVC, bean XML configuration, Spring Data JPA, Hibernate, NoSQL databases Cassandra and MongoDB, PostgreSQL, Oracle, Tableau (business intelligence and visual analytics for SQL-related DB operations), graph databases, Salesforce customer relationship management (CRM). Neo4j features:

-1) Uses an easy, SQL-like query language (Neo4j CQL) that makes retrieval, traversal and navigation of connected data simple and fast.

-2) Data model (property graph, flexible schema) − Neo4j follows the native property graph data model.

-3) ACID properties − Neo4j supports full ACID (Atomicity, Consistency, Isolation, and Durability) rules.

-4) Uses native graph storage with a native GPE (Graph Processing Engine), supports exporting query data to JSON/XLS formats, and provides a REST API.

-5) Supports two kinds of Java APIs for developing Java applications: the Cypher API and the Native Java API.
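A minimal sketch of the Cypher-from-Java approach mentioned in item 5 (assuming the official neo4j-java-driver 4.x; the bolt URL, credentials and Person/Movie data are placeholders invented for illustration):

```java
import org.neo4j.driver.AuthTokens;
import org.neo4j.driver.Driver;
import org.neo4j.driver.GraphDatabase;
import org.neo4j.driver.Result;
import org.neo4j.driver.Session;

public class Neo4jCypherDemo {
    public static void main(String[] args) {
        // Connection details are placeholders for illustration only.
        try (Driver driver = GraphDatabase.driver("bolt://localhost:7687",
                AuthTokens.basic("neo4j", "password"));
             Session session = driver.session()) {
            // Traverse connected data with a Cypher query.
            Result result = session.run(
                "MATCH (p:Person)-[:ACTED_IN]->(m:Movie) RETURN p.name AS name LIMIT 5");
            while (result.hasNext()) {
                System.out.println(result.next().get("name").asString());
            }
        }
    }
}
```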

● Amazon cloud platform (AWS) – AWS CLI, EC2 (VMs), CloudFormation, S3 (buckets, file upload, static hosting, etc.), AMI/IAM, Lambda functions (serverless), Serverless Framework REST APIs, API Gateway web services, Route 53 (domain names and DNS setup), Certificate Manager, Step Functions, CloudFront (CDN), Cognito user pools and security, SNS, SQS, SES (Simple Email Service), CloudWatch/CloudTrail, Kinesis, DynamoDB – these AWS services are orchestrated through roles/permissions, events, customized application events, mail, S3 and the AWS SDK to achieve the desired Lambda functionality. AWS QuickSight (BI), AWS X-Ray, MongoDB Atlas, ECS/EKS, AWS SDK. AWS Glue (AWS-managed ETL service): uses crawlers/classifiers/triggers to predefine the Data Catalog (a metadata table repository) extracted from data sources (S3, RDS, Redshift, JDBC, DynamoDB) and written to data targets (one classifier per table) in S3, RDS, Redshift, JDBC or DynamoDB; crawler jobs can be defined and run on a schedule or via job scripts (PySpark [Python] or Scala). AWS Glue can be pipelined with AWS Athena (AWS query service for data analysis) and AWS Redshift (data warehouse and data analytics service) to perform data science jobs, or merged with message-queue ingest streaming (Kafka) for data pipeline jobs. AWS IoT – device gateway (MQTT message broker), thing registry, thing shadows, rules engine, security (certificates, policies, etc.); the AWS IoT Device SDK provides an MQTT (Mosquitto-style) client for pub/sub and for defining topics with the AWS IoT MQTT message broker, and the rules engine can be configured to invoke AWS Lambda functions. AWS Glue and IoT may need to be orchestrated with other AWS services.
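A minimal sketch of a Java Lambda handler of the kind such an event-driven pipeline might invoke (assuming the aws-lambda-java-core library; the event shape and the "source" key are hypothetical, not from any specific project):

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import java.util.Map;

// Hypothetical handler: logs the triggering event and echoes back one field.
public class PipelineEventHandler implements RequestHandler<Map<String, Object>, String> {
    @Override
    public String handleRequest(Map<String, Object> event, Context context) {
        context.getLogger().log("Received event: " + event);
        // "source" is an assumed key for illustration only.
        Object source = event.getOrDefault("source", "unknown");
        return "processed event from " + source;
    }
}
```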

● Microsoft Azure cloud – Azure console (portal), Windows PowerShell, Azure CLI, ARM/API for service deployment and management. VMs (an EC2-like approach, connected to via SSH or RDP); Azure Storage – File/Queue/Blob (like AWS S3)/Table/Disk – with Azure Storage Explorer; Azure Functions (trigger, event, action), which can use Blob storage for images; Logic Apps (workflows); Web Apps and Mobile Apps; Application Gateway (client-side load balancing); Application Insights (monitoring/log analysis); HDInsight – Blob/Data Lake storage, ETL (Data Factory), Hadoop, Spark, Kafka, Hive LLAP (persistent daemons providing an I/O layer and in-memory caching for low-latency queries), HBase, IoT, plus useful additions for data visualization/monitoring/tracing/logging such as Grafana, Prometheus and Kiali; Azure Front Door (CDN); Azure Notifications and Notification Hubs (push messages via PNSes, etc.); IoT Hub (similar to AWS IoT) and Azure IoT Edge (a device-focused runtime that lets users deploy, run and monitor containerized Linux workloads); Event Hubs (concurrent real-time data ingestion, connectable to the Kafka ecosystem); Azure IAM access control (role/policy based, including RBAC); Cosmos DB (multi-model with turnkey global distribution – SQL (Core), MongoDB, Cassandra, Azure Table and Gremlin (graph) APIs – spanning NoSQL and relational databases/tables); SQL databases (SQL Server, MySQL, MariaDB, Postgres); Azure Monitor (based on Dynatrace and log analytics); TLS/SSL certificate binding for applications; security – Microsoft Active Directory; ACI/AKS; Azure SDK.

● Google Cloud Platform – Google console, gcloud (command-line CLI), gsutil (a Python application for storage operations), Cloud IAM (accounts/roles/ACLs); VMs, Compute Engine / App Engine, Google Kubernetes Engine (GKE), Cloud Functions, Google API Gateway, firewall/VPC rules for load balancing, Cloud DNS, Cloud CDN (speed/performance; policy/action, allow/deny for HTTPS/HTTP routing), Google Cloud Operations (formerly Stackdriver) for monitoring/logging/debugging/error reporting, Google Pub/Sub (queue with Kafka connector) and Cloud Tasks, Cloud SQL, Cloud Spanner, Google Cloud Storage (GCS buckets), Datastore (document storage), Cloud Bigtable / BigQuery, Google Dataproc, Cloud Key Management Service (KMS) and Cloud Armor.

● Strong public key infrastructure (PKI) work experience for software integrity, authentication and authorization/availability – PGP, ASN.1, Base64, Okta, SSO, MFA, cryptography, OpenSSL, TLS/SSL; UML for writing software design specifications/documentation and test plans across the software design lifecycle.

● OpenAPI (Swagger.io – Swagger Editor with JSON/YAML, Swagger UI), HttpClient, GraphQL – REST API calls to fetch/update data from the server side.
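A minimal sketch of such a REST fetch (assuming Java 11's built-in java.net.http client; the endpoint URL and resource path are placeholders, standing in for an OpenAPI-documented endpoint):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RestFetchDemo {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // Placeholder URL for illustration only.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example.com/v1/items/42"))
                .header("Accept", "application/json")
                .GET()
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + ": " + response.body());
    }
}
```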

● Working experience with Big Data, Hadoop applications and the Hadoop ecosystem – ETL, Hadoop, Hive, HBase, Pig, Sqoop; Hortonworks (Ambari HDP cluster management), Confluent (Avro Schema Registry) and Cloudera distributions; ML; the Spark ecosystem: Spark Core RDD API (transformations/actions), Spark SQL (Dataset and DataFrame APIs), Spark Streaming (direct streaming and Structured Streaming with its unbounded SQL-table approach), MLlib (machine learning); HDFS filesystem programming, the HDFS distributed file system and Linux server networking.

● Designed and implemented a 3D visualization front-end plug-in subsystem for big-data clustering machine learning using the Eclipse Java IDE. Working knowledge of computer vision (Intel OpenCV 3.0, C/C++) – image pattern gathering and feature analysis/extraction/detection such as line, edge, contour, region and circle detection, plus feature detection through supervised classifier training. OpenGL (C/C++) – transformations (scale, translate, shear, projection, perspective view, 2D/3D mapping), lighting/shading, texture mapping, etc.

● Machine learning – content-based/collaborative filtering recommendation systems; deep learning – CNNs, PCA/KPCA/ICA, Haar cascade classifiers, SVM with HOG (histogram of oriented gradients) classifiers, and Bayesian theory – Monte Carlo / MCMC (related to naïve Bayesian inference). NumPy/SciPy/Pandas (Python and libraries) and scikit-learn. Programming languages and platforms: Java [7/8/9/11], C/C++/C#, Node.js / Express framework, Spark with Scala or Java/Maven dependencies, Python 3.7.0, JavaScript (ES5/ES6), Go, AngularJS/Node.js/HTML5 with JavaScript, and Linux bash shell scripting. OS: macOS Sierra/High Sierra, Unix/Linux (Ubuntu 16.04-19.10 or above), Red Hat OpenShift, CentOS, Windows 10. Architecture/design/code

API design /Cloud computing

EDUCATION

1976 National Cheng Kung University, Tainan, Taiwan [NCKU] – BS in Engineering Science
1979 National Chiao Tung University, Hsinchu, Taiwan [NCTU] – MS in Management Science
1987 Auburn University, Auburn, Alabama – MS in Computer Science and Engineering

PROFESSIONAL EXPERIENCE

Collins Aerospace (through Bentley Global Resources, Oldsmar, FL) May 2022 – Nov 2022
Senior Principal Engineer / System Architect

Worked on the OSGi (Open Services Gateway initiative) framework – OSGi bundle/feature projects.

Back-end development on Windows 10 / Linux (Ubuntu) platforms with Java 7/8/9, JavaScript and Linux bash scripts (Ubuntu); NetBeans 8.0/13.0, Eclipse IDE, Apache Maven, TortoiseSVN/Git; SQL/Oracle database programming in the XAAMR ARINC integrator; and AMQS – next-gen MQ (IBM MQ-style message queues, queue clusters, channels, topics/topic nodes, Type B messages/services for aviation systems), ACARS (Aircraft Communications Addressing and Reporting System).

Used OSGi with Karaf as the container; created/used bundles and features, services, the service registry, modules, security, and lifecycle management.

OSGi supports remote installation and invocation, and starting/stopping bundles and features.

Oracle and Tableau – business intelligence and visual analytics for SQL-related DB operations; Salesforce customer relationship management (CRM).

Developed in a Spring Boot environment with OpenAPI (swagger.io), Kubernetes (CRDs/operators) and Docker.

Microservice REST API design with security authentication/authorization (JWT tokens for Bearer authorization – OAuth1/OAuth2, SAML, SSO, LDAP, PKI); JUnit tests (@RunWith, Mockito) and security.
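A minimal sketch of the @RunWith/Mockito testing style referred to above (JUnit 4 with Mockito; the AccountRepository and AccountService classes are hypothetical, defined inline only for illustration):

```java
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.when;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;

@RunWith(MockitoJUnitRunner.class)
public class AccountServiceTest {

    // Hypothetical collaborator to be mocked.
    interface AccountRepository {
        String findOwner(long accountId);
    }

    // Hypothetical class under test.
    static class AccountService {
        private final AccountRepository repository;
        AccountService(AccountRepository repository) { this.repository = repository; }
        String ownerOf(long accountId) { return repository.findOwner(accountId).toUpperCase(); }
    }

    @Mock
    private AccountRepository repository;

    @Test
    public void returnsUpperCasedOwner() {
        when(repository.findOwner(42L)).thenReturn("alice");
        AccountService service = new AccountService(repository);
        assertEquals("ALICE", service.ownerOf(42L));
    }
}
```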

Completed an archived-message advanced-search bundle project.

Bank of America (through Infosys, full time, Richardson, TX) Feb 2021 – Mar 2022
Senior Java Architect – Spring Framework / Microservices

Worked on application performance, capacity and planning development (APM-like) using Java/JavaScript, with Jenkins CI/CD.

Spring Framework / microservice backend APIs and Oracle DB (JDBC) on Windows 10 and Ubuntu platforms.

Performed Java software architecture – using UML for abstraction (module decomposition) to form the Java system structure and its behavior; carried out system domain requirements, analysis/design and implementation based on quality attributes: availability, interoperability, modifiability, performance, security, testability and usability.

Designed and implemented a backend REST API set for onboarding financial service applications to the Cisco AppDynamics / Dynatrace cluster (server/agent) APM monitoring tools and Performance Center APM. It provides create system, create application, update system, update application, update system state, update application state, etc. Its main purpose is to collect various performance metrics – JVM, hardware resources, memory, network traffic usage/failures – for financial/banking software performance measurement via an async event profile mechanism.

Used VS Code with extensions to program HTML5, Node.js, React and Angular.

Development tools used: Windows and Unix environments; Java 7/8/9, Git, TortoiseGit, GitHub, Tomcat server, Oracle DB JDBC, Tableau (business intelligence and visual analytics for SQL-related DB operations), Salesforce customer relationship management (CRM), Eclipse, IntelliJ IDE.

Experience converting audio/.wav files to text using Veed third-party software or transcription services such as Happy Scribe for customer relationship management (CRM) purposes.

T-Mobile, Bellevue, WA (through HCL America, full time, Sunnyvale, CA) Mar 2020 – Jan 2021
Senior Solution Architect – Full Stack Project Development

Concentrated on Java 7/8/9/11 microservices, the microservice cloud/ecosystem, and SOAP XML service/client development.

Wrote code in Java 7/8 for payment systems/methods, developed database repository APIs, and used Splunk query/search for project/product support.

Payment security APIs concentrating on credit/debit card fraud for CNP/CP transactions (CVV, PIN, zip code, billing address, device fingerprint), using SOAP XML to carry card data. Neo4j graph database.

Used Postman and SonarQube for REST API testing and code coverage.

Programmed Kafka streaming/connectors; Tableau (business intelligence and visual analytics for SQL-related DB operations); Salesforce customer relationship management (CRM).

Experience converting audio/.wav files to text using Veed third-party software or transcription services such as Happy Scribe for CRM purposes.

Programmed SOAP web services – WSDL, UDDI, XML (XSD) – using Java and the Eclipse IDE.

Tools used: Eclipse, STS, IntelliJ IDE; JavaScript with CI/CD; Node.js (Express/Webpack), Mocha and Chai testing, NPM; Angular 4/7/8 with RxJS and NgRx, AngularJS, React; JUnit tests (@RunWith, Mockito); security – OpenSSL SSL/TLS.

Amazon cloud platform – EC2 (PaaS), S3 (file upload, static hosting, etc.), Lambda functions, Serverless Framework REST APIs, API Gateway web services, Route 53 (domain name and DNS setup), Certificate Manager, CloudFront, Cognito user pools and security, SNS, SQS, CloudWatch/CloudTrail and DynamoDB operations, AWS SDK.

Working knowledge of Docker (root filesystem image repositories, Dockerfiles, containers, Docker client and daemon) and Kubernetes (container orchestration platform).

Data pipeline extract-transform-load (ETL) data processing.

Wipro / Visa, Foster City, CA (full time) Mar 2019 – Mar 2020
Technical Lead – Full Stack Project Development

● Coded and took product responsibility for completing an 8-month, Linux-based financial management system from start to finish.

● The UI is based on HTML5 elements, JavaScript, Ajax and jQuery, then converted to Angular 6/Angular 7 HTML web pages via Node.js/NPM, Angular CLI, SCSS/TypeScript and the Visual Studio Code editor. The UI module is controlled through a landing page (state-machine transition flow) – currency, exchange rate – with RxJS (HTTP/HTTPS), NgRx (actions, reducers, selectors, store/state/configuration), RxJS observables, observers and subscriptions, server state-transition mapping, and deployment to Tomcat 8.2. The UI module includes services, routes, modules, models, directives, constants, components, SCSS, assets, store, containers, etc. Angular uses two-way data binding and TypeScript for UI element manipulation; the UI adopts HTML5, SCSS or CSS and TypeScript for rendering, and uses the assets concept for JavaScript, Ajax and jQuery compatibility. Both Angular and React use Node.js, NPM and a CLI for the hosting environment.

● Amazon cloud platform – EC2 (PaaS), S3 (file upload, static hosting, etc.), Lambda functions, Serverless Framework REST APIs, API Gateway web services, Route 53 (domain name and DNS setup), Certificate Manager, CloudFront, CloudWatch/CloudTrail and DynamoDB operations. Working knowledge of CI/CD: Docker (root filesystem image repositories, Dockerfiles, containers, Docker client and daemon) and Kubernetes (container orchestration platform). Working knowledge of Azure Active Directory (using LDAP).

● Completed a full-stack web project implementation covering several major service modules and design concerns, such as:

-1) User interface module: depicts how the user will use/manipulate the whole application; user management and session management

-2) Programming technology used to design the business user interface

-3) How to host the UI – Tomcat, Jetty or Node.js?

-4) Which web service mechanism is used to provide the APIs

-5) Any design patterns to follow/adopt?

-6) Security Processing management

-7) Database selection /installation

-8) Client registration/discovery, API routing and security, client-side load balancing, monitoring and circuit breakers for microservice management, or use of a service mesh (Istio) for microservices

● ELK stack (Logstash, Elasticsearch, Kibana) programming; familiar with the Solr/Lucene search library (inverted index / hash-table index writer, reader and query; B-tree); testing and research in graph databases – Neo4j.

● React uses a component-reuse design approach: props (input attributes) and state (every component has an initial state used to store, modify or delete props within the component). React uses one-way data flow and may use JSX (HTML5, JavaScript) as its rendering mechanism; Flux or Redux for reducers, dispatch, store, actions/subscriptions; the component lifecycle (mount, update, etc.) can be managed around the render method. React Router and/or HTML5 links (JSX) provide the routing mechanism. To consume an HTTP REST API, HttpClient, Ajax requests, XMLHttpRequest and/or the Axios module from Node.js NPM can be used. Jest and Enzyme are used for testing React code.

● Salesforce customer relationship management (CRM)

● The service layer adopts a Spring Boot-based mini microservice server with TLS/SSL certificate authentication for HTTPS REST APIs, iText PDF generation/parsing, secured sessions, SAML single sign-on, user management/passwords, and global @PreAuthorize method authorization on the secured session layer; it provides HTTPS REST APIs.

● Backend resources adopt a MySQL 8.05 database using MyBatis ORM, Hibernate and JdbcTemplate, with a Spring JDBC HikariCP data source for MySQL usage (see the sketch below). DB schema design needs to consider constraints, normalization (1NF-3NF) and sequences; DB repository API design should consider data mappings (one-to-one, one-to-many, etc., using set, list or map), locking (optimistic or pessimistic) and cache levels for DB performance.
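A minimal sketch of wiring a HikariCP data source for MySQL with Spring's JdbcTemplate (assuming HikariCP and spring-jdbc on the classpath; the JDBC URL, credentials, pool size and accounts table are placeholders, not the project's actual configuration):

```java
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import org.springframework.jdbc.core.JdbcTemplate;

public class MySqlDataSourceDemo {
    public static void main(String[] args) {
        // Placeholder connection settings for illustration only.
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:mysql://localhost:3306/appdb");
        config.setUsername("app_user");
        config.setPassword("change_me");
        config.setMaximumPoolSize(10); // pool sizing is an assumed tuning choice

        try (HikariDataSource dataSource = new HikariDataSource(config)) {
            JdbcTemplate jdbc = new JdbcTemplate(dataSource);
            // Hypothetical table; demonstrates a simple query through the pooled data source.
            Integer count = jdbc.queryForObject("SELECT COUNT(*) FROM accounts", Integer.class);
            System.out.println("accounts: " + count);
        }
    }
}
```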

● Completed a Kafka (Apache, Hortonworks and Confluent distributions) integration project for an ELK cluster stack.

● Object-relational mapping with myBatis.xml schemas for the MySQL database; also familiar with Hibernate ORM mapping and JdbcTemplate.

Nortek Security and Control, San Jose, CA Sep 2018 – Mar 2019
Senior Full Stack Engineer

macOS and Linux (Ubuntu) development environments; smart home / Z-Wave sensor controller / security platform. Mainly worked in GNU C/C++ with static (.a) and shared (.so) libraries/executables built with make/CMake and the Eclipse IDE for C/C++, for security and MQTT IoT related software products. Java (Eclipse / Spring Framework STS) for REST API microservices – web servers; Python scripting with MAPI and Jira; Linux and bash script programming.

MQTT IoT message queue / Eclipse Paho client programming in C/C++, Java and Go for security- and authentication-related applications.

React / Node.js programming. Designed a time-series data visualization utility.

Wrote code for a security camera using the OpenCV 4 library – SVM/HOG classifiers, Haar cascade classifiers and template matching for image detection/recognition, plus face, facial feature and pedestrian detection.

OpenCV/OpenGL programming. Amazon cloud platform – EC2 (PaaS), S3 (file upload, static hosting, etc.), Lambda functions, Serverless Framework REST APIs, API Gateway web services, Route 53 (domain name and DNS setup), Certificate Manager, CloudFront, CloudTrail and DynamoDB operations.

Working knowledge of Docker (root filesystem image repositories, CI/CD, Dockerfiles, containers, Docker client and daemon) and Kubernetes (container orchestration platform).

Working knowledge of Azure Active Directory.

Developed Canny line / edge / contour image detection algorithm.

Deep learning frameworks such as TensorFlow, Theano and Caffe for multi-array computation via algorithms such as the Laplace and Gaussian operators, the fast Fourier transform and convolution mechanisms (kernel matrix and anchor point), transforming the image pixel matrix to extract image pattern features and/or transformations.

Wrote code for support vector machines and PCA/KPCA/ICA for supervised learning with discriminative classifiers and dimensionality reduction methods.

Fujitsu Network Communications (Sunnyvale, CA) Feb 2017 – Aug 2018
Senior Software Engineer

[Network & Telecommunication Domain]

Linux – Ubuntu / CentOS server development environment and SDN platform.

Mainly worked with the Go language, Java (Eclipse / Spring Framework STS), C/C++ PKI, OpenSSL, Linux and bash script programming, Python, and Hadoop big-data machine learning using Java/JavaScript.

Designed a full-blown Go web server, described below:

This Go server talks to a third-party server using an HTTP client and exposes REST APIs for the third-party web server to consume; on the other side it also uses remote procedure calls over SSH (SSH dialing) to connect to a remote XML-RPC web service hosted on (connected to) a NETCONF network device to acquire network device configuration information. The Go web server uses a configuration module (package) in YML format to control how it sets up its HTTP/HTTPS ports and IP addresses and server/client mutual authentication; the TLS certificate file names (PKI key-pair file names) – the OpenSSL tool coupled with an openssl.conf file can be used to generate the key pair, create and sign the certificate, and some encryption/decryption (Base64 and AES-SHA256 encode/decode) is used – along with the preset worker counts, queue sizes, thread timeout values, logging file name and all other Go web server configuration information are included in this configuration file (config.go).

The Go web server initializes and sets up the logger file and configuration file information inside the package main init function (func init). Within the main function of package main, it initializes a new server and uses a TCP listener (possibly coupled with tcpKeepAliveListener / a listener wrapper) to bind to the web server's IP address in either HTTPS or HTTP mode, then starts the web server to wait for REST API calls issued from the third-party web server (the web server is handled with Go concurrency: goroutines with signals, system calls, sync.WaitGroup, channel shutdown, cleanup, done, panic and recover operations). When the Go web server receives an exposed REST API call, it checks CORS (pre-flight OPTIONS request) authentication and also checks the TLS certificate file (X.509 cert and key pair). If everything validates, the Go web server uses a dispatcher to dispatch (the worker pool, queue and sync channel are passed from here) the related REST APIs to their proper routes and proceeds to handle the REST API requirements (GET/POST/UPDATE/DELETE – CRUD) as REST operations inside the HTTP default handler and HTTP handlers through the HTTP mux (worker and queue sizes are passed down here through the configuration file). The Go database repository and service APIs are handled here. Note that the dispatcher invokes the needed workers (threads) and commands to connect to the XML-RPC service and exchange data (network device configuration information and settings) between the web server and the XML-RPC server.

Depending on each REST API requirement, it prepares the related named struct types needed for the remote procedure call data structures (named structs embedding XML and JSON message data) – some of the named struct types, interfaces, function commands and APIs are wrapped inside the data/lib packages – and initiates the remote RPC call to open and set up an SSH session with the XML-RPC web server (port 830), exchanging hello-message handshakes to advertise capabilities to each other; it then uses a request/response command mechanism for data exchange coupled with XML marshalling and unmarshalling. This Go server contains the packages main, configuration (YML), data, http, netconf, lib, server and session; each package lives in a separate directory as the Go language specifies. The Go web server runs on Ubuntu and uses Linux bash script files to control its operation.

Programmed Hadoop Hive / Spark SQL analytic and aggregate functions such as rank, dense_rank, percent_rank, cume_dist and ntile for machine learning applications.

Successfully completed/implemented several microservice modules using the Spring Boot framework / J2EE Model-View-Controller (MVC) design pattern, coupled with Cassandra DB services, to produce/consume REST APIs over HTTP/HTTPS (a minimal sketch follows).
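A minimal sketch of a microservice REST endpoint of this kind (assuming Spring Boot's spring-boot-starter-web; the /devices resource and its in-memory list are hypothetical stand-ins, not the actual Cassandra-backed modules):

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
public class DeviceServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(DeviceServiceApplication.class, args);
    }
}

@RestController
@RequestMapping("/devices")
class DeviceController {
    // In-memory store standing in for a Cassandra-backed repository.
    private final List<String> devices = new CopyOnWriteArrayList<>();

    @GetMapping
    public List<String> list() {
        return devices;
    }

    @PostMapping
    public String add(@RequestBody String name) {
        devices.add(name);
        return "added " + name;
    }
}
```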

Programmed WebSockets – web client and Jetty WebSocket server, user connection lists, sockets – for full-duplex communication via the HTTP/HTTPS upgrade protocol over TCP stateful streaming message connections.
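A minimal sketch of such a full-duplex client (using the standard javax.websocket client API rather than Jetty's native API, and assuming a client implementation such as Tyrus on the classpath; the ws://localhost:8080/echo endpoint is a placeholder):

```java
import java.net.URI;
import javax.websocket.ClientEndpoint;
import javax.websocket.ContainerProvider;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.WebSocketContainer;

@ClientEndpoint
public class EchoWebSocketClient {

    @OnOpen
    public void onOpen(Session session) throws Exception {
        // Send a text frame as soon as the upgrade handshake completes.
        session.getBasicRemote().sendText("hello over websocket");
    }

    @OnMessage
    public void onMessage(String message) {
        System.out.println("server said: " + message);
    }

    public static void main(String[] args) throws Exception {
        WebSocketContainer container = ContainerProvider.getWebSocketContainer();
        // Placeholder endpoint; a Jetty WebSocket server could host it.
        container.connectToServer(EchoWebSocketClient.class, URI.create("ws://localhost:8080/echo"));
        Thread.sleep(2000); // give the asynchronous callbacks time to run in this demo
    }
}
```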

Working knowledge of deep machine learning for image pattern analysis (unsupervised) and computer vision feature extraction/transformation and classification (supervised) via nonlinear cascades of multiple processing layers (GPU intensive), such as Caffe (a convolutional neural network framework in C++/Python) and TensorFlow (numerical computation via data-flow graphs, with multi-dimensional array tensors on the graph edges and mathematical expressions at the graph nodes).

Able to perform CI/CD programming in the Docker container / Kubernetes ecosystem.

Working Knowledge of hypervisor virtualization computing environment.

Design/programming platforms adopted can be open-source Java platforms (Spring STS MVC, Spring Boot, Tomcat, Eclipse Jersey, JBoss development SDK, Node.js scripting) or a Boost / C/C++11+ platform.

Medtronic Inc. (Coral Creek Road, Louisville, CO) May 2015 – May 2016
Surgical Technology (Navigation) Software Technical Lead

Successfully wrote code to port, design/redesign and debug a major client/server service module – the StealthLink service – from a cranial Q-based application to a RabbitMQ / service-framework-based application within Medtronic Navigation's new S8 Stealth application service framework.

S8 is a Linux-based (Ubuntu), mixed C++/Python/Java AMQP message-protocol application which uses Qt 5/6 for the user interface/visualization (segmentation, modeling, exam, volume, DICOM) and RabbitMQ as its main message-based service.

It adopts the Boost C++ async I/O socket service and Boost C++ libraries (v1.54-1.57), the STL, Python, a Postgres database, the RabbitMQ Pika client library and the C++ libq library, JSON tree message parsing, GitHub source control, C++11, and the bjam build tool as its main development tools.

The StealthLink server is implemented as a service within the S8 service framework; it can also run in standalone mode.

The server exposes its C++ data object class library so that third parties can integrate this library and design third-party medical applications which link to S8.

The StealthLink server also spawns multiple Unix POSIX threads via the Boost async I/O service to let clients connect and request the data objects the server exposes, so that the StealthLink server can relay those data objects back to the client.

All communication between the client and the StealthLink server is handled by the Boost async I/O service and the Boost C++ libraries.

It supports blocking and non-blocking message I/O protocol services; the client can also subscribe to the needed data objects from the StealthLink server, which are treated as events that are notified through the S8 StealthLink server and delivered back to the client.

This is implemented in a manner similar to the OSGi and CORBA technology approach.

Worked on adopting Cassandra and MongoDB for future NoSQL database access; Java/Python module design.

Perform


