Richard Katz.
PROFESSIONAL EXPERIENCE:
Data Engineering Architect, with client PACKT Ltd. (Richard Katz, Author) Jan 2021 to present
Advanced Computing Math and Data Pipeline Architecture (2025-2026)
Machine learning work with Spark ML in multiple languages, especially Julia and Python; I am also an expert in Java, Scala, and Python for machine learning. This phase started in 2023 and has grown and evolved since. Currently I am working with Julia and its MATLAB-related mathematics systems, which I am finding exciting. Julia complements Python: the two work together, and I have both running in JupyterLab. Julia also gives great extension capability for working with symbolic/Spark mathematical object systems through SymPy.jl and Symbolics.jl. Deriving math solutions in Julia adds a MATLAB-like capability that greatly improves upon and adds to what Python can do, with a Mathematica and Lisp heritage. Altogether, Julia is more capable and runs faster than Python. Julia compiles on top of LLVM, which also enables AI connections. Working with Julia, we are able to jump into a future-facing system.
Data Engineering Architectural Overview (2022-2024). I completed this part of the project, and it is already in draft. Data engineering has been my main concern at UC Berkeley and Autodesk, so I constructed an overview of the data engineering field as a whole. This architectural overview helps people understand the general process of data engineering, but also what software is used in data engineering and where. The overview also covers the various methods of data integration: obtaining data from multiple sources using keys and patterns. In this way we can see how strategies are developed to serve the main data engineering targets: ML, AI, and BI. I am documenting these efforts in cooperation with client PACKT Ltd. to show the field's evolution and industry changes.
Analytics and Logic (earlier work, before 2025). I have written work describing analytic logic languages (see CetusLinks/Prolog) as a major added capability, and even an alternative to what we commonly refer to as "AI," since AI really includes aggregated data that is not necessarily gathered systematically. Describing and using logic provides a set of meaningful upgrades that make AI actually intelligent instead of random.
Google via Info Objects, San Jose, CA May 2020 to Oct 2020
Software and Data Engineer
●Worked remotely, directly for Google. I learned and worked with GCP/gcloud. I helped Google modernize and convert their Python-based mapping system, the Geographic Information System (GIS) platform. Google basically taught me, and I did the work on Google Cloud Platform (GCP) plus Python 3.
●Built a data graphics/data visualization application on the new Python 3 infrastructure, worked on converting the application from the previous Python 2.6/2.7 to the current 3.7, and upgraded documentation.
Python 3.7, GCP, GoLang, Google App Engine, Gcloud, on Chrome Remote Desktop (CRD)
LoanPal (contract), Roseville, California Nov 2019 to Jan 2020
Technical Data Engineer (Scala)
●Worked on the data pipeline at startup LoanPal, a PayPal offshoot, setting up its Spark data analytics application in Scala Spark and the Lift app server. Set up a data aggregation system using Airflow and a CI/CD system using Jenkins. The company then moved to Roseville, CA.
Autodesk via Rose International, San Francisco Sep 2018-Apr 2019
Senior Data Engineer, Spark Data/Performance Engineer (MLOps), REST API
●AWS high-performance data pipeline analytics using log analysis with EMR/Spark.
●Working inside Spark, I added capabilities to track input data as required by Autodesk management. This enabled Autodesk to view the source of customer data as it arrived from customers around the world. I built an interface into Spark: I added public variables to the SparkContext, and I built a viewing program outside of Spark in PHP.
●Before the Spark project, my job here actually started with building a RESTful API in a Python framework and building a multi-container reporting system to monitor performance.
●I then started working on Spark performance and performance tracking. Autodesk was processing over 800 petabytes of incoming customer data from around the world through Spark, so keeping track of Spark was highly important. I was brought in to make Spark progress trackable. I set up Spark to collect and identify location data so that operations could tell where the data was coming from, using identifying data inside the SparkContext to track data statistics by location from inside Spark itself.
●Spark MLOps and DevOps. The extension involved working within the JVM memory inside Spark, modifying the SparkContext to add tracking information in JVM storage.
●Strong experience in SQL and relational databases, MLOps, and scaling. Determined ways to work with the AWS architecture to address, connect, and message between containers using infrastructure as code (Terraform, for instance) from EC2 to EMR.
●Built a web dashboard with Django and message processing (MQ).
●Analyzed stream sets and monitoring logs. Used Spark to integrate multiple stream sets from around the world. At Autodesk we received and managed approximately 8 petabytes of data; it came from around the world, arriving at different times, and we ran it through many Spark instances. Basically, I built the ability to monitor and manage this into Spark by adding these capabilities to the SparkContext.
●Worked with the web engineering team as we re-engineered business processes.
●AWS EC2 statistical tracking system (PHP). I wrote a PHP-based web server on an EC2 instance that connected to the EMR Spark server and displayed location data to track EMR work progress.
●AWS EC2 Python Django and Flask API with SQLAlchemy. Worked on a second EC2 instance running an API-based Django and Flask system with an SQLAlchemy back end.
●Ran multiple Spark instances in support of major work to optimize and monitor the performance of very long-running Spark jobs.
●Continuous delivery (ETL). Also worked with and deployed Airflow. Wrote tracking of the KPI output to RDS (AWS RDS Aurora PostgreSQL). Wrote and deployed data graphics developed with PHP plus Google Charts.
●SQLAlchemy: designed and implemented a new SQLAlchemy query group based on Amazon Athena/PostgreSQL. Improved the SQLAlchemy MySQL back end: added a user-selectable query feature. Did Docker deployment for a Chalice (Flask) application on EC2. Upgraded the architecture document using Lucidchart. Built TDD unit tests and continuous integration tests; to do this I did complete configuration and deployment of both EMR and EC2.
PySpark and Python 3.6 on AWS, with Docker, EMR/EC2, Spark/PySpark 2.4, Java 8, Spark SQL, Spark-JDBC, S3, Parquet, Aurora, MySQL, Google Charts, bulk load, Boto3, Lambda, SQLAlchemy, MagicMock, Patch, Jupyter, Eclipse, Visual Studio Code
Stanford University (full-time), Palo Alto Nov 2017 to Apr 2018
Senior Software Engineer
●Designed and implemented an email alert processing system, converting an existing Perl program to Python. Worked especially with end users at Stanford Hospital.
●Implemented DevOps with JIRA; managed and addressed support issues using ServiceNow. Used Microsoft Excel to plan out work.
●Worked to make the email program compatible with both Python 3.6 and 2.7. The original Perl used regex extensively and lacked a menu-driven method to identify types of email. I wrote special regex functions that returned multiple values so as to modularize the code, and I used a dict with a default case to create a proper dispatch menu. Produced output in JSON and wrote it to MongoDB. Documented the conversion in detail on Git using Sphinx, working in Windows PowerShell.
●Scripting in Perl, Python, JavaScript
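The regex-functions-plus-dict-dispatch approach described above can be sketched as follows. This is a minimal illustration, not the actual Stanford code; the pattern names, categories, and subject formats are all hypothetical.

```python
import re
import json

# Hypothetical email categories; each compiled regex stands in for one of the
# modular regex functions that returned multiple captured values.
PATTERNS = {
    "alert":   re.compile(r"ALERT:\s*(?P<code>\w+)\s+(?P<msg>.+)"),
    "billing": re.compile(r"INVOICE\s+#(?P<code>\d+)\s+(?P<msg>.+)"),
}

def classify(subject):
    """Return (category, code, message) for a subject line.

    The dict of patterns replaces a long if/elif chain; the final return
    acts as the 'else' default case of the menu."""
    for category, pattern in PATTERNS.items():
        m = pattern.search(subject)
        if m:
            return category, m.group("code"), m.group("msg")
    return "unknown", None, subject

def to_json(subject):
    """Produce the JSON record that would be written to MongoDB."""
    category, code, msg = classify(subject)
    return json.dumps({"category": category, "code": code, "message": msg})

print(to_json("ALERT: DB42 disk nearly full"))
# {"category": "alert", "code": "DB42", "message": "disk nearly full"}
```

Adding a new email type is then a one-line change to the dict rather than another branch in a conditional chain.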
Education, Cloud and Test Training: As of Stanford, I am AWS certified, but my original cloud training was on Azure, and I have worked directly under Google in GCP.
Debian Linux, bastion host, Python 2.7, Python 3.6, Perl 5, Sphinx, email, pip, Nose, PyLint, PyTest, flake8, MongoDB (NoSQL), JSON, click, AWS, ITIL certification
University of California, Berkeley Jan 2009 to July 2017
Project Leader, Department Group Trainer in Java/JEE App Building and Data Engineering
●As project leader supporting a new department technology, my job was to train other developers and get development moving on the Java/JEE Spring platform. I trained software developers in how to use Java, the JVM, and Spring Boot.
●At the start of the project, introducing the Java/JEE/Spring stack to developers, I coordinated with a multi-university effort called the Kuali Project, a joint project of 60 universities including UC, MIT, and others. Introduced the building of top-grade, highly scalable, highly reliable solutions. Project leader managing up to 8 software developers.
●We started with a small project, the Kuali Ready Dashboard, which monitored security.
●We then used sophisticated design and mapping components, including Elasticsearch and the ELK Stack, to enable simultaneous consideration of multiple models for course design.
●The result was the Berkeley Course Management System (CMS). It integrated multiple dashboard views simultaneously to represent multiple course content sources. This enabled UC to aid the faculty in course design and organization, plus implementation of courses and course materials. This led to another project, in which SIS worked with students to build an additional component called ScheduleBuilder. I trained staff on a development model for Java/JEE applications plus front-end JavaScript and back-end relational database systems.
●Continuous delivery. Worked with Agile methods and Extreme Programming to speed and organize development.
●Service management: analyzed, re-mapped, and re-engineered business processes to integrate with ServiceNow. Also set up Java MQ message processing.
●Security. I also specialized in network security. I obtained ITIL certification and worked in the security (OWASP) group. We set up data governance on relational applications, assisted with JIRA and application management tools, and ANS 522 training. Consulted with Dawn Song of CS on secure healthcare data transfer using REST. Organized integration with ServiceNow to introduce added security.
●Databricks, Spark, Scala. With the people who had worked with me learning Java/JEE and the Spring Framework, we delved into Spark and Hadoop and worked with Spark's numerous solutions. Over time, our Berkeley developer group networked with the Databricks team as Databricks was formed. I obtained the Berkeley AMPLab version of Spark and ran it on my local computer. After Databricks was formed, I and two others took the actual UC Berkeley course on Spark. In the course we covered Matei Zaharia's book on Spark and worked through every algorithm in Spark MLlib, including natural language processing (NLP), classification, regression, clustering, and feature extraction, using Hadoop MapReduce and Pig plus Spark. Worked with Spark and the Parquet columnar file format. Deep learning with the Pig solvers plus TensorFlow, Keras, and PyTorch, and robotic methods including Simultaneous Localization and Mapping (SLAM). We also used Pandas-Spark integration methods to load data into Spark.
●MLOps on Azure: our version of Spark ran on the Azure cloud and used Azure Databricks. Worked with Azure Databricks and Azure Data Factory tools to load data into Spark, using Caltrans data, and developed testing and training sets. We were among Azure's first users of Azure Data Factory (ADF).
●AI, LLMs, LangChain, SageMaker with Python. Worked in a group with AI technologies.
●Patient safety, health IT: Health Information Technology (HIT) in US healthcare. Held discussions with and helped a UC computer science professor on additional security and patient privacy issues in SOA, REST API, and message-based data exchanges. I also studied and consulted with UC Davis Professor Norm Matloff on Spark using his favorite tool, the R language.
●AI class with Peter Norvig and Sebastian Thrun. Along with 65,000 AI enthusiasts, I took the original Norvig/Thrun class in 2011. The class covered the development of AI, principles of safety for automated robotic equipment (including self-driving vehicles), and the Semantic Web (how we store and manipulate AI-related material), with safety emphasized as a core design principle.
Tech expert on JEE, Java 6, 7, 8, 9, GWT, Spring Framework, Spring Security, Spring JDBC, Spring Boot, Lucene, Jenkins, Hudson, Nexus, Jasper, CAS, Elasticsearch (installed), Shibboleth, Tomcat, SOA workflow, Scala, Python, R, Django, Memcached, Oracle, Spark, PySpark, PostgreSQL, Maven, IntelliJ, Rake, Mac OS X, Anaconda, Jupyter, Pandas, NumPy, Numba, Matplotlib, Bokeh, DB-API, psycopg2, VirtualBox, Vagrant, Windows, Ubuntu, CentOS, Red Hat, PeopleSoft.
Wells Fargo (contract), San Francisco, CA Nov 2006 to Sep 2008
Platform Senior Software Architect and Engineer
●Designed and built a successful SaaS SOAP SOA API message-driven microservices system on Linux. Worked in the network cybersecurity group, the Wells Fargo Alerts Team: designed and built integration with existing SOA financial services. As tech lead, designed and architected processing for the Alerts platform and its SOA integration for the Customer Notification Authoring System. Implemented a scalable, parallel-processing, high-performance environment with an integrated Web and JMS architecture. Identified and resolved security issues using Fortify. Designed, coded, and implemented a new user tool (Julius) to create customized content with an expanded domain model, reducing service alert costs for customer banking messages. Implemented 2FA and DKIM solutions.
●We worked with numerous file and data formats: XML, JSON, comma-separated, delimited.
●Software and hardware testing. Also provided architectural long-range planning to the corporate team. Coordinated work between the Alerts Team and other development teams. Developed the domain object model and described business use cases.
WebLogic 8.1, 9.0, WebSphere MQ/MQI, J2EE, JMS, Spring, Velocity/Ajax, Hibernate, Oracle 10i, Toad, Web services, Hermes, MQJ-Explorer, JUnit, Solaris Unix, MyEclipse, Ant, Sparx Enterprise Architect, ERwin, JDK 1.4, 5.
Charles Schwab (contract), San Francisco, CA Apr 2006 - Oct 2006
Senior Engineer/ Developer
●Built an employee stock purchase dashboard. Starting from Ruby, built J2EE applications in Java, Struts, Hibernate, Swing, and Web services for WebLogic and Oracle. Clarified use cases and UI specs. Wrote component specifications in UML 2.0.
WebLogic, Oracle, Ruby on Rails, Web services, Hibernate, Ant, Linux.
Ingenuity Systems, Redwood City, CA Jan 2006 - Mar 2006
Software Architect
●A young software architecture company maturing out of its startup phase, cataloging genomic data. Worked in Java/JEE and XML; configured SQL for Web Ontology bioinformatics systems, a knowledge base, and ontological software.
Spring, ACEGI, Hibernate, iBatis, ORM-based database, Java Web framework, Ant, Maven, Protégé, Eclipse, Swing, yWorks, Oracle 9i, UML, Sparx, JUnit, ER/Studio, Python.
Wells Fargo, San Francisco, CA (contract) Sep 2004 to Nov 2005
EBAM SOA BPEL Business Process Project Lead Engineer
●Defined and built the SaaS SOAP API technical core Java architecture and led a team of 4 developers implementing IBM-based BPEL API and message-based solutions. Worked directly with the IBM development and implementation teams. I organized business process analysis and led design, illustration, and implementation to set up BPM-based business process message routing solutions for the bank, including the design of special monitoring for claims-fraud management.
●Project leader of the team that created proofs of concept, BP prototypes, and the actual implementation. Defined the integrations, design methods, development methods, and target architecture for SOA BPEL solutions using IBM WBI-SF Process Choreographer. Documented the integration architecture, including JMS and Web Service integrations with MQ Series, mainframe systems, and Microsoft-based Web Services.
●Designed and tested high-performance, scalable hardware and software solutions for business and systems using UML 2.0, BPMN, and BPEL tools with design patterns in WebSphere Process Choreographer integration applications (WSADIE 5.1, J2EE, JAX-RPC, BPEL and Web Services, WSDL and XSD) through the complete life cycle: end-to-end testing, Q/A, and production. Designed SOAP fault handlers, logging handlers, process-to-process integration, and XML schema validation, and implemented overall BPEL and Java/J2EE workflows.
IBM WebSphere 5.1.1, Linux Cluster, MQ Series, DB2, WSADIE 5.1, WBI-SF, BPEL, J2EE, EJB, WSDL, SOAP, XSD (XML Schema), UML, BPMN, Sparx Enterprise Architect, JAX-RPC, SAAJ, SoapScope, XML Spy, Andromeda, Hibernate, Maven, Axis, Tomcat, Linux, Eclipse, ClearCase.
USPS via Northrop Grumman, San Mateo, CA July 2003 to Aug 2004
Tech Lead Manager and Architect Lead
●I was tech lead, officially architect, for a multi-team WebSphere project. As a result of our work and proposal with Northrop, the San Mateo center became the central focus of WebSphere projects for the USPS. Led multiple software projects and designed architecture for Web-based and Web Service systems using MDA-based modeling with Java Spring and databases.
●As team leader I supervised multiple teams of 3 to 5 and served as an architectural design resource for WebSphere applications. Provided instructions and assignments using Agile methods. With the group, designed, modeled, prototyped, and developed J2EE Web and Web Service applications using Struts and the Spring framework with Oracle 9i and DB2. Analyzed and prototyped an integrated enterprise retail payment system. Produced models and design documents for WebSphere apps using UML, XMI, and other tools. Worked with IBM to analyze WAS deployment, single sign-on, and XML-based Web Services.
●Trained USPS staff in J2EE, WSADIE, Struts, Web Service technologies, and UML tools.
●Implemented CVS change control with WebSphere WSAD 5.0 and 5.1, and defined security realms, page protection, roles, and sign-on mapping for WebSphere applications. Developed USPS management plans for security and disaster recovery, and a multi-system Active Directory single sign-on solution.
J2EE 1.4/1.3, Spring, Struts 1.1, JAX-RPC, JDBC, EJB, JSP, XML, XSL, XSL-FO, Apache Axis, WebSphere/WSAD 5.0, 5.1, Eclipse UML, Tomcat, JDeveloper, MetaMill, Visio, Describe, UML 1.5-2.0, SPF, DB2 7.1, Oracle 9i PL/SQL, MQ Series, FOP, Krysalis Barcode4J, Ant, Quartz, WAS-RBAC, Active Directory, Oblix, CVS. Retail accounts payable, weighing and rating, logistics.
SBC Global, San Ramon, CA May 2002 - Nov 2002
Front-End Developer
●Built front-end dashboards with Web scripting tools. Designed and implemented a high-volume online Web tool for SBC Yahoo! DSL service conversion/provisioning, built with JSP/J2EE on IBM WebSphere 4.0. Designed, developed, and implemented the Web-based tool in J2EE: JSP-to-EJB interface, MVC architecture, flow control manager, Bean-based components, extensive JavaScript, DHTML, and HTTP. Designed the Web application using MVC and DTO design patterns. Trained other developers on WSAD and Oracle 9i.
WebSphere 4.0, WSAD 4.02, CORBA, JacORB, JSP, JavaScript, ASP, JDeveloper, Perl, Macromedia Dreamweaver, Oracle 9i, BMC Patrol, Tomcat, DHTML, Win 2000, Solaris
Java Skyline, Pacifica, CA Aug 2001 to Mar 2003
Editor-in-chief
●I created and managed an industry-tracking website, JavaSkyline.com, which tracked major advances in Java/JEE, JMS, APIs, and JDBC platforms. The website listed and gave overviews of manufacturers of Java/JEE-related systems, including JDBC drivers, databases, JEE servers, and API and SOA development.
●Wrote articles for and maintained an industry magazine for Java and object-oriented software engineering. Studied and tracked industry news about the software field and the J2EE and Web Service platforms. Developed technical info on SOA, UML, and WS-BPEL. Developed J2EE developer tutorial solutions and training guides for J2EE, Web Services, and IDEs.
Author for Cetus Links: Object-Oriented Prolog. The document lists and describes Prolog and related object-oriented logic languages and how they work, including OCaml, Logtalk, Mercury, Mozart, Gödel, and LIFE, plus the Warren Abstract Machine.
WebSphere, WebLogic, JBoss, Tomcat, Java, J2EE, OCAML, Web services, UML, XML
Amdahl Corporation (Fujitsu), Sunnyvale, CA 1/1998 - 12/1999
Data Engineer: ETL Developer Architect
●Here I first worked with the service bus architecture that we use in data engineering. I developed data extraction, transformation, routing, and data warehousing procedures to feed a Business Intelligence (BI) solution for corporate performance analytics.
●Learned and worked with data mining techniques using Ralph Kimball as a guide. Built an extraction system using Amdahl/Oracle Essbase (a.k.a. Huron, similar to Tableau, now known as Oracle OLAP). Worked with and built SQL star schemas. Used tools to extract data from corporate systems.
●Used data analytics. Built data extractions from Amdahl corporate source data with their ETL system and a DB2 database, using SQL queries written in Object-Star and FoxPro. Loaded data from Object-Star into Essbase (now Oracle OLAP, a BI tool), from which business intelligence cubes were created.
●Worked with Oracle Analytics (Essbase) to generate and build Microsoft Excel workbooks.
Amdahl, DB2, SQL, Object-Star, Essbase, FoxPro, Excel
InsWeb, San Mateo, CA 8/1995- 3/1996
Software Developer Web
●Helped lead the new online insurance startup InsWeb. I and several other C/C++ developers built Insurance Web (InsWeb), an online insurance sales and market statistics tracking system. It enabled insurance customers to run an interactive vehicle insurance rating system.
●The project was designed using a centralized facility service approach with a centralized database core. After I finished the vehicle rating system, I worked with and helped Lee Fesperman build the database facility service, in the C language.
●The company eventually filed patents for the flexible vehicle rating system.
C, C++, Linux, XML/HTML, Excel.
Guidant (Abbott Labs/Boston Scientific), Saint Paul, MN 1/1994- 6/1995
Data Engineer
●ETL data engineering for a pacemaker (embedded system). Produced FDA clinical trial tests for regulatory requirement reporting: I, along with Paul and Randy Smith, came up with workable solutions to build an ISO 13485 FDA reporting system for the Guidant tachycardia product team, and I built the clinical system for FDA reporting. We used object-oriented SQL and resolved complex query problems to achieve this complex data analysis. We extracted data from multiple sources: hospital, patient, intake, doctor, device, operation, observations, complications, data governance, and follow-up. Organized the data and wrote FDA-required analytics. At one point I resolved a complex 7-way SQL join to do this.
●Direct work with HIPAA- and HL7-related data. Set up an electronic medical records database. I and several other developers, working in teams, created the SQL-based data repository and built the clinical reporting system for the FDA. We worked with device product managers. We stored the necessary device and monitoring data to report to the FDA to gain device approval. We built SQL scripts and schemas to load this data into a relational database.
●Wrote and tested queries. I wrote analytical queries in SQL to produce values matching the reports prepared by statisticians for the FDA. These determined patient prognosis and the risks of life-threatening events and/or device failure.
●Worked on the tachyarrhythmia team. We used SQL to create and generate statistical reports on device efficacy and patient survival probability for FDA clinical reports.
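The multi-table joins described above can be illustrated with a small sqlite3 sketch. The table and column names here are hypothetical stand-ins; the actual Guidant schema was far larger, and the real 7-way join simply extends this same pattern to more tables.

```python
import sqlite3

# In-memory database with tiny stand-ins for a few clinical tables
# (hypothetical schema, not the actual Guidant one).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient  (patient_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE device   (device_id  INTEGER PRIMARY KEY, model TEXT);
CREATE TABLE implant  (patient_id INTEGER, device_id INTEGER);
CREATE TABLE followup (patient_id INTEGER, outcome TEXT);
INSERT INTO patient  VALUES (1, 'P001');
INSERT INTO device   VALUES (10, 'TachyModel-A');
INSERT INTO implant  VALUES (1, 10);
INSERT INTO followup VALUES (1, 'stable');
""")

# Join patient -> implant -> device -> followup in a single query,
# correlating each source table on its shared keys.
row = conn.execute("""
    SELECT p.name, d.model, f.outcome
    FROM patient p
    JOIN implant  i ON i.patient_id = p.patient_id
    JOIN device   d ON d.device_id  = i.device_id
    JOIN followup f ON f.patient_id = p.patient_id
""").fetchone()
print(row)  # ('P001', 'TachyModel-A', 'stable')
```

Each additional source (intake, operation, observations, complications) adds one more JOIN clause keyed the same way.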
MS-DOS, SQL, XML, VFP; queries tailored to meet FDA specs
Bechtel Corp/Bechtel Power Division, San Francisco, CA 1/1977 - 6/1986
Project Team Technical Leader, Scientific Engineering Programmer
●Led a highly successful and recognized engineering programming team for the ME101 pipe stress analysis program. Led and managed 4 to 6 programmers. ME101 incorporated the civil structural analysis program Berkeley SAP (a finite element analysis program) written by Klaus-Jürgen Bathe and Ed Wilson; it produced graphics including three-dimensional stress analysis as well as time-series results of seismic events for seismic safety. Using SAP, ME101 provided very rapid analysis of pipe stress and was used to validate the piping systems of nearly 150 nuclear and conventional power plants.
●My work included performance analysis and optimization of engineering software built on SAP, which was made part of Bechtel's ME101 mechanical (piping) engineering system. SAP made it run faster than its predecessor ME632 and faster than what could be achieved with outside service bureaus.
●Hardware and software testing and optimization. Specialized in speed and algorithm optimization. I wrote analytic engineering programs and led a team of engineering programmers validating structural and mechanical engineering programs in Fortran and Assembly language on the Univac 1100/40. The group consisted of 4-6 engineering programmers and required mathematical, programming, and engineering skills.
●The program analyzed structural metrics and structural risk statistics, and the safety and integrity of piping systems, examining the stresses and strains on them. We reported to both Bechtel Corp and Bechtel Power Division. I worked directly with and reported to Pipe Stress Engineering and advised both the pipe stress and structural engineering groups in Bechtel Power.
●Worked in business process re-engineering to automate business process workflows.
●A main task was to improve the speed and optimize this major analytic job.
●Technical Manager, Pipe Stress Engineering Support Group. Fortran, Basic, Assembly language.
Sperry Rand Univac, Arlington, VA and San Francisco, CA 6/1972- 12/1976
Customer Support, Business and Scientific Engineering Programmer
●Systems analysis and business process re-engineering. Helped Sperry customers automate business processes: re-designed and re-engineered business processes to take advantage of new techniques and capabilities. One customer in particular was an engineering company. Learned important adaptive skills as we helped customers re-engineer and automate numerous kinds of processes.
EDUCATION:
●BA, Syracuse University (1964-1968), Maxwell School: Sociology plus math/science and logic; 2 years of calculus, including factor analysis, techniques that we commonly use in real-world data science and analysis.
●Master's level, American University: master's program in Computer Systems and Operations Research (1975-1976). I finished much of a master's degree at American before I was shipped to the West Coast.
●Master's level at UC Berkeley: after coming to the West Coast I took two very high-level courses, first Finite Element Math and Solutions with Berkeley SAP (1979), and later Databricks Spark, Hadoop, Keras, TensorFlow, and Torch. Plus graduate-level Artificial Intelligence with Peter Norvig and Sebastian Thrun.
●Graduate-level economics: a full year of economics at Golden Gate University.
●Graduate degree in Independent Film at JFK University (2004): 18 months with independent film leader Sandra Davis of Canyon Cinema.
●Industry Coursework and Certifications: AWS Certificate, AWS Technical Essentials (at Stanford); SANS Security and OWASP; Security Leadership; ITIL (Berkeley); TDD (Wells Fargo); Java Refactoring; Management Training; Fluid Mechanics Programming (Bechtel); Large Scale and Real Time Systems (Unisys).
Presentations, Papers, Writing
●Current: my current project with PACKT, which I call Python and Mathematics (2024-2025): Mojo, Julia, Rust, SymPy, SciPy.
●Bechtel: Optimization of Scientific Programs, performance measurement and improvement of finite element programs (Bechtel Engineering); Using Scala Actors (Berkeley IST); Spreadsheets for Business (InfoWorld); Servlets & JDBC (Amdahl IT); Implementing Applications in IBM WebSphere (USPS IT); SQL OOP with VFP (Cardiac Pacemakers); Using Cookiecutter at BayPiggies (Python Software Foundation).
TECHNICAL SUMMARY:
●Regulatory experience: I have direct experience building the FDA regulatory processing system for Guidant medical equipment. I also have strong work experience in data engineering: MLOps/Spark/Hadoop, deep learning, medical equipment (HIPAA, FDA) reporting and ETL, process optimization, and project leadership. I also have regulatory experience in education work at Berkeley and Stanford, and computer supplier experience with Sperry.
●Project management experience: I have worked as a project leader and team manager, and as a tech specialist in MLOps. I began as project lead at Bechtel Power and have held lead roles at Wells Fargo, UC Berkeley, and Northrop Grumman/USPS. Along the way I have worked with incredible managers; with their guidance I have achieved some great results in software and specialized in data engineering.
●Data engineering process: overall, I have worked on 7 data science and data engineering projects, including Guidant (Boston Scientific) ISO 13485 FDA product reporting, Amdahl corporate data/Essbase, Wells Fargo, UC Berkeley (HIPAA), Autodesk, and LoanPal, plus highly technical projects at Bechtel. Overview: I have worked with Spark MLOps on Azure as well as AWS. I was taught GCP by Google. I have some experience with LLM and