JAY K GARDNER
Highlands Ranch, CO *****
***@*******.***
OBJECTIVE
My objective is to automate your enterprise so that you can accomplish more with less, increase your organizational agility, and shrink the time needed to deliver value to your customers.
LANGUAGES, SKILLS, KNOWLEDGE
- DevOps – CloudBees Jenkins, Artifactory, Bitbucket, GitHub Enterprise, SonarQube, Veracode, Bamboo, Jira, Elasticsearch, Docker Containers, Docker Desktop, Kubernetes, Digital.ai, DORA metrics, IBM Build Forge, Camunda, Terraform custom providers and modules.
- C#, Java, Go, Perl, VB.net, Terraform HCL, Bash scripting, Batch scripting, T-SQL, YAML, JSP, Servlets, Custom tag libraries, JDBC, JNDI, HTML, CSS, JavaScript, jQuery; coursework in C & C++.
- .Net 1.1-.Net 7, Blazor Server, ASP.net webforms, MVC 4&5, WEBAPI 2.0, Windows Forms Apps, Windows Services, Console Apps, Linq, SOAP Web Services, WCF, WF, MSMQ, SSRS, SSIS, OOA, OOD, SOA, UML, TDD.
- Enterprise Architect Requirements, Activity Diagrams, Use Case Diagrams, Domain Models, Class Diagrams, UI Diagrams, Robustness Diagrams, Deployment Diagrams, Entity Relational Diagrams.
- SQL Server 2000-2022 Enterprise, Stored Procedures, User Defined Functions, Database Design, Oracle, MySQL 5, Postgres, Access, Mongo
- Visual Studio 2000-2022, Visual Studio Code, IntelliJ, JBuilder, Vi, Netbeans, Eclipse.
- GIT, MSTest, XTest, NUnit, Moq, CruiseControl.net, TeamCity, Team Foundation Server, Azure DevOps Server, Subversion, CVS, PVCS, ClearCase, Visual Source Safe.
- Certified ITIL Practitioner (Service Desk, Incident & Problem Management), Six Sigma White Belt, Microsoft Business Intelligence and Design, Forte Enterprise Java training from SynerJ, completed Solaris 2.x System Administration I & II (media-based training).
- Windows Server 2016-2019, Windows XP-11, Ubuntu Server 18.04-22.04, Red Hat 7-8, Unix, HP-UX, Solaris.
WORK HISTORY CHARLES SCHWAB, LONE TREE
Employee – Director, Software Development & Engineering Principal 6/2022-1/2024
- Business Information Model (BIM)
o Schwab utilizes a BIM that provides the skeletal information taxonomy of information-type relationships, ensuring a common data-type nomenclature. The Schwab Infrastructure and Operations (I&O) team's model lacked the infrastructure types, at the required fidelity, that the team needed. I initiated and co-authored the necessary model enhancements, enabling Strategic Engineering information mappings for their various engineering blueprints. These changes also provided a common data map for solutions architects to anchor their data designs.
- JIRA Common Service
o I designed and coded a .NET 7 automation façade in front of the Jira REST API. Requests received by either a REST API or a RabbitMQ listener create Jira requests for creating, updating, or returning the status of a Jira issue. Status of the Jira issue could be reported using either Jira webhook events for near real-time status or API polling with a reasonable backoff strategy. This automation façade enabled Jira workflows to be injected into a larger automation workflow, such as a BPMN workflow.
- REST API GOVERNANCE
o I was one of 4 architects in I&O providing REST API oversight and enforcing Schwab standards. We held weekly office hours to conduct our reviews.
- REST API to Powershell Reference App
o I wrote a REST API that called PowerShell scripts and commands. The Windows system administrators had thousands of scripts they used for “micro automation,” but hundreds of these tasks needed inclusion in higher-level workflows. Wrapping them with this REST API enabled reuse of these PowerShell scripts while still providing API-based AuthN, AuthZ, logging, and API standards.
- ACOP
o I created an “Architecture Center of Practice” (ACOP) to be used by the Process Engineering and Cloud Platform teams within Schwab I&O. This employed a Confluence space containing standard architecture artifacts and guidance on how to use and maintain them.
o Created templates for Use Case, Activity, Service Context and Sequence diagrams.
o Tracked team creation of architecture artifacts using a Jira Kanban board.
- BPMN Vendor Comparison
o Led an effort to evaluate 3 automation workflow vendors: Appian, Camunda, and Argo. I gathered the workflow requirements from our 3 teams and built a scoring document the teams used to evaluate each product against the required criteria and score it. I personally wrote Camunda 7 workflows that proved we could have both manual and automated service requests. I wrote demo services in Java, Node, and C# to demonstrate that the Schwab polyglot environment could easily participate in the workflows. I also showed how to check the workflows into source control and deploy them using a GitOps pipeline, enabling workflow definition versioning. I demonstrated integration with JIRA, showing how workflow control could be passed to JIRA, executed, and then passed back to the Camunda workflow.
- Network Firewall Team Process Analysis
o Participated in the requirements-gathering process for how the Schwab Network Firewall team would process incoming work requests.
o I shadowed various team members and documented their tools and process flows.
o One of three team members who contributed to the final document describing optimization and automation opportunities for the Network Firewall team.
CHARLES SCHWAB, LONE TREE
Employee – Cloud Automation Technical Lead
2/2020-6/2022
- Central Control Plane (CCP)
o This application provided an automation capability where development teams could go to a UI (CMP) and select the infrastructure resources needed for their applications. The CMP forwarded the requests to a backend REST API, where a microservice translated them to Terraform HCL and stored them in Bitbucket within that team’s repo. This triggered an automation pipeline that ran the Terraform init, plan, and apply commands against the HCL in scope. All Terraform state was managed in a remote backend using the Schwab S3 storage product. When the Terraform apply command finished running, the pipeline parsed the Terraform output. RabbitMQ listeners received the output status and informed the front end. The initial resource types supported were caching services, object storage, database instances, and VM instances. I wrote the initial Java implementation of this application and then functioned as the primary solutions architect to enhance and build out the automation platform. I also led the creation of a custom Terraform provider for the Schwab S3 object storage product.
Features
o Plug-n-play architecture for new service type
provisioning and configuration.
o Common downstream adapter architecture with which infrastructure platform teams could integrate.
o The UI (CMP) utilized data-driven forms that were dynamically rendered based on metadata sourced
from the DB and served via cache.
o A common database for capturing and managing
resource inventory.
o A Schwab “product level” API that would aggregate all downstream integrations into asynchronous workflows, producing a fully functional product usable by the development team.
o Resource type versioning management enabling
resource definitions to morph over time without
breaking existing definitions.
o JSON API request validation and transformation to Terraform HCL.
o Provisioning pipeline for processing the Terraform and returning request status.
- Cloud Management Platform (CMP) Vendor Comparison
o I evaluated 3 Cloud Management Platform (CMP) products: Morpheus, CloudBolt, and Scalar.
o I brought each vendor in for demos and used the Gartner CMP capability matrix to create a list of capabilities representing Schwab requirements. I then collaborated with team members to evaluate and score the products before presenting a recommendation to management.
- CARB Member
o I participated as a voting member of the Schwab
“Cloud Architecture Review Board” to review, refine, and approve architecture related requests for
enterprise applications at Schwab. This meeting was a weekly commitment with additional offline review and commentary throughout the week.
- Delivery Pipeline Strategy
o I represented Schwab Infrastructure and Operations, in an enterprise level collaboration creating the
Delivery Pipeline Strategy document that described the new software delivery process and the
recommended tooling for the next 3 to 5 years.
o I wrote updates to 3 sections of this document. The Office of the CTO (OCTO) accepted the strategy document.
VISA, HIGHLANDS RANCH
Employee – Chief Systems Engineer
5/2014-12/2019
- SCM Station – I designed, coded, and operationalized a self-service/automation application providing application resource provisioning and user access management. The platform included an MVC 5 website, a C# console application acting as a message processor, a C# console application acting as a job runner, and a REST web service for system integration. The resulting user experience gave Software Configuration Management (SCM) customers easy onboarding, team and IAM configuration, and build tool resource creation, giving teams an immediate velocity increase. The time to onboard a team onto all the required tools went from 3 weeks to 30 minutes. Features of SCM Station:
o Visa dev teams can onboard/register themselves in SCM Station.
o Teams can order SCM managed application resources from a self-service web page or rest API. From this portal, team admins can create Artifactory repos,
Bitbucket projects and repos, Jenkins folders and
projects, and Azure DevOps Projects.
o Teams can create and manage virtual repos in
Artifactory through SCM Station.
o Eligible Visa users self-provision Windows service accounts to use in pipeline tool integrations. Users can also manage the account password after creation.
o Once these resources are provisioned, the admins can then add users as team members with team defined
roles. These “team defined” roles contain mappings to build tool native IAM permissions. When team
members receive a role entitlement, they are given access to the COTS applications within minutes.
o Team owners execute User Access Revalidations (UAR) for their teams. The system automatically produces all required governance and IAM audit artifacts in support of SOX and SSAE audits. This audit feature reduced the number of people required to do the 3-week audits from 5 to 1.
o Annually, this application processes 10,000+ provisioning and user access requests/configurations, all previously manual at Visa. 98%+ of the application teams within Visa used the tools managed by the SCM Station automation platform.
- CloudBees Advisory Board – As a Visa representative, I participated for two years as a member of the CloudBees Advisory Board. I represented the enterprise challenges faced by Visa using Jenkins.
- IBM Rational Sunset – We received a mandate to stop using IBM Rational tools. I performed the architect role, designing the port of software build jobs from Build Forge projects to Jenkins pipeline jobs. I worked directly with CloudBees on designing the new pipeline patterns. Finally, I worked with a third-party consulting company to complete the project migrations.
- New Tools – I worked to bring in new open-source tools to support SCM software delivery pipelines. I participated in an Engineering Excellence campaign where I gave a presentation to all of Visa Technology staff describing the new stack.
- SCM Station Development Team – I managed the team’s SDLC, tools, standards, documentation, and training.
- SCM Data Mart – I initiated the design, build, and operationalization of a data mart providing observability of SDLC KPIs and other build and platform related metrics. We captured raw data in Elasticsearch indexes and mined it using the Microsoft Business Analysis stack.
VISA, HIGHLANDS RANCH
Contractor, ASCENT
4/2013 – 5/2014
- IBM Build Forge Projects – I wrote about 50 Build Forge deployment projects for the Digital Mobile team. I also designed and implemented a hierarchical environment structure where base-level environments contained common configuration key/value pairs and child environment files inherited them. I wrote a Build Forge project creation script in Perl, as well as scripts that added user access when approved in Remedy.
- Attestation Station – This was a web portal that enabled the application technical contacts (ATCs) of various SCM-managed tools, such as Artifactory, Bitbucket, Subversion, and TFS, to upload user access account and resource data to a central database. It then managed the processing of user access validations and subsequent team owner attestations. Finally, it enabled the attestation information to be easily extracted in the form of audit reports.
AMERICAN DATABANK, DENVER
Contractor, Direct to Client
1/2011-4/2013
- Software Solutions Architect – When I resumed work at American Databank (ADB), the Applicant Management System (AMS) project was in trouble, and the required integration between their three web-based portals did not work. I refactored the application design, finished the coding, and managed a successful deployment of the first production version of AMS. I performed design and code reviews for developers to ensure adherence to standards and design principles.
- ASP.net Membership Provider – I created a custom ASP.net membership provider that allowed single sign-on (SSO) across all AMS web portals.
- Team Foundation Server – I configured the ADB Team Foundation Server environments, projects, and CI builds. In addition, I, along with the Director of IT, developed a new more efficient SDLC process. I documented the usage of TFS and trained the development team.
- Enterprise Architect – I created standards for the usage of the Enterprise Architect design documentation repository.
- Incident Manager – I wrote an incident management system that operations used to manage all AMS related incidents. This integrated with the “Contact Us” functionality of AMS and captured all errors as tickets.
- .net MVC 4 POC – I created an MVC4 POC web application that used Inversion of Control (IOC), existing AMS repository interfaces, the Zurb Foundation CSS framework for responsive website design, and a new custom role provider to integrate MVC4 security (using Fluent Security) with the current AMS ISecure code.
- Tools used – Enterprise Architect 10, Visual Studio 10, ASP.net web forms 3.5, MVC 4.0, Team Foundation Server 2010, SQL Server 2008 Standard.
TRAVELPORT, DENVER (IGT CONTRACTOR)
Contractor, IGT
6/2010 – 12/2010
- Process Engineering – I worked with cross-team leaders to establish and implement change management processes for the provisioning team.
- Deployment Scripting – I wrote a series of scripts that automatically compiled, labeled, and deployed code. I wrote this with a combination of VB scripting and PERL code. This enabled the team to deploy code to a given environment quickly. A set of XML files determined the parameters used to drive the entire process. The scripts integrated TFS as well as the customer’s “home grown” configuration management database.
- Password Manager – I wrote a C# console application that would retrieve and decrypt system credentials used to access system resources at various points along the deployment process.
- Tools Used – VB script, TF command line interface for TFS, custom command line API for change management system, MSBuild, MSTest, MS Word for deployment process and tooling documentation.
AMERICAN DATABANK, DENVER
Contractor, Inceed
3/2010 – 6/2010
- I performed the Software Solutions Architect role for a team of 3 developers.
- The project ported a classic ASP application to the .net 2.0 platform.
- The application consisted of 2 web portals that shared a common SQL Server database.
- I led JAD sessions for the purpose of gathering requirements in the form of use cases.
- Tools – ASP.net 3.5, Visual Studio 2008, SQL Server 2005, LLBLGEN, Enterprise Architect (Sparxsystems), Subversion, CruiseControl.net, Nant, NUnit.
LENDER PROCESSING SERVICES (LPS), BROOMFIELD
Contractor, Modis
5/2009 – 12/2009
- Request Tracker – I coded several feature enhancements, including several reports, for an open-source application named Request Tracker. This was an object-oriented Perl application.
- AIM – LPS ported a client/server application to an ASP.net web application. I wrote 4 web forms for them as well as the business and data access layer classes and interfaces. We used the Microsoft Enterprise Library Application Building Blocks for data access and IOC. We used Linq-to-objects, and Castle.
- Data Feed ETL – I wrote a business-to-business ETL application.
- Tools – Linq-to-objects, Castle proxy, SQL Server 2008 Enterprise, log4net, Team Foundation Server, MSTest, Selenium UI, Enterprise Architect, FTP, ADO.
ONTARGETJOBS, DENVER
Contractor, Robert Half
11/2008 – 4/2009
- WCF Web Services – I wrote .net WCF services in support of the OnTargetJobs SOA infrastructure. These utilized Microsoft Enterprise Libraries for Data Access and Logging. We used Spring.net for dependency injection. I also wrote unit tests for the code.
- DNN Web Pages – I wired up the service to ASP.net pages running within DotNetNuke.
- Application Analysis – I reverse engineered the business analyst requirements and existing code into the Enterprise Architect modeling tool. I imported UI screenshots and requirements and then created diagrams showing the relationships.
UNIVERSITY CORPORATION FOR ATMOSPHERIC RESEARCH (UCAR), BOULDER
Contractor, VOLT
11/2008 – 12/2008
- Ported a satellite weather notification system to a GSM cellular delivery system. This application enabled farmers, mostly in Africa, and other remote parts of the world to receive weather notifications and crop reports. The full application deployment was distributed using an 8-gig flash drive.
- Tools – OO Perl, GSM cellular modems, Drupal.
QUANTITUDE, CENDANT, GALILEO, TRAVELPORT
Employee, Network Management Automation Engineer
1/2003 – 11/2008
- Automation Framework (AF) – I designed and developed the AF architecture, which included a central configuration database (Oracle), a SQL Server data mart, operational adapters, inter-system messaging, and a central processing core.
o DNS Automation – I wrote an adapter in AF using VB.net, ADO, COM+, the Microsoft DNS provider, and WMI to synchronize two large DNS domains, containing 40,000+ network devices, with the configuration data housed in the Voyager configuration database.
o OpenView Automation – I wrote an adapter in AF using the Oracle XML query interface for retrieving device configuration data from the Voyager system. This abstraction decoupled the AF from its primary source of configuration data.
o Device Checker Web App – I wrote a Perl CGI application that queried HP OpenView, the Voyager configuration DB, DNS, and Cisco reporting so that Network Operations could see configuration and status information for network devices such as routers, switches, etc.
o ISDN Testing Application – I wrote an ASP.net application using C#, SOAP, and SQL Server that extracted network device information from CiscoWorks. The application then logged into the routers to test the ISDN backup interfaces. The Network Engineering team could view a web-based report showing which routers had issues.
o Hurricane Reporting Application – I wrote an application that extracted network device information from HP OpenView and created Excel-based reports showing whether the network devices were online. We sent these reports hourly to our customers so that they could see which of their sites were affected by the current hurricane.
o Quantitude Intranet – I worked with a graphics designer and developed an interactive intranet using ASP.net with a SQL Server backend. This allowed authorized authors to create and verify content before publishing it to the intranet. I integrated the intranet with the Xerox DocuShare document management system using its APIs.
o SSL Webservice – I created a .net webservice that other applications used to authenticate clients. The backend was SQL Server, and a real-time synchronization process kept the authentication database updated from another operational system.
o Radius Migration from LDAP to SQL Server – I managed the changes within our Radius servers and configured them to use SQL Server 2005 as the user store. This required extensive coordination with the Global Help Desk and Travelport customers.
o Radius DB Webservice – I created a .net webservice to allow remote management of the Radius authentication database.
o Authception Application – I created an ASP.net application that allowed Global Help Desk analysts to troubleshoot authentication problems with both the Radius and SQL Server databases. This application used SOAP webservices and a SQL Server 2005 backend.
o Zircon Project – I led a small team of developers to develop a configuration provisioning application.
▪ We used Microsoft Visual Studio 2008, Test-Driven Development (NUnit), and XP methodologies, coupled with continuous builds (NAnt & CruiseControl.net), as our development environment.
▪ We developed an application SDK coined “the framework.” It contained reusable code, database structures, and build tools that developers could use to provide messaging (WCF with MSMQ buffer queues and SOAP messaging), workflow (WF), and data storage (ADO to SQL Server) in a modular, consistent pattern.
▪ We practiced OOP in our code base, and I documented the design UML in Enterprise Architect by Sparxsystems.
▪ I wrote downstream adapters that Zircon Core used to communicate with operational systems.
▪ I wrote the entire WCF messaging layer so that it was reusable code in the framework SDK; it used both synchronous and asynchronous messaging patterns. In addition, I utilized the HTTP bridge pattern to achieve transactional “store and forward” for guaranteed delivery. Finally, the message routing was dynamic, based on metadata in the database.
▪ I wrote an application that simulated the messaging interface of Zircon Core so that the web UI developers could develop against the mock messaging, allowing parallel development to a consistent interface.
▪ I used XML files and XSLT to auto-generate code as well as SQL files used to manage the database.
▪ I created the Windows service applications that hosted the Zircon Core and downstream adapter applications.
▪ I created and maintained Microsoft Virtual PC images for our team development environments so that we would have consistency across all development machines.
EDUCATION
MARCH 1987
Received Associate degree in Electronic Engineering Technology, Denver Technical College
JUNE 1998
Received Bachelor’s degree in Electronic Engineering Technology, Denver Technical College
REFERENCES
Provided Upon Request