DON SCOTT
Saint Augustine, Florida
Cell - 904-***-****
********@*****.***
LinkedIn/Google/Yahoo – dscottfl
Most recently, Don served as global head of Data Governance for Cognizant across 34 corporate portfolios and 27 business units for an IT consulting company of 350,000+ associates globally. In this role, Don reported directly to Cognizant’s Chief Data Officer. Data governance responsibilities included establishing structured and unstructured data governance across the entire Cognizant Information Technology estate. The Policy & Strategy team managed all policy artifacts, data minimization work and data custodian work. The Data Governance Ops (DGO) team provided the program management services for rollout of the following global data protection and privacy programs: secure software development lifecycle (SSDLC), securing data at rest (DAR), data access management (DAM), securing data in transit (DIT), system-level credential key management via CyberArk, and log encryption via QRadar. The DGO team was responsible for implementing various custom solutions, including a GDPR solution, a Data Purge Program, the DG 360 key performance metrics and compliance reporting platform, DG Automation, DG Telemetry solutions and AI Domain Learning Models. The DGO team delivered all data inventory, lineage, discovery & classification (DDC), glossary, quality, retention, masking and security programs for Cognizant. Don was the primary liaison from the Data & Analytics department to the global Data Privacy Office (DPO), Infrastructure Services, Corporate Security teams, the Legal Department, all Corporate Governance and delivery teams, HR-IT, and FinOps. Through continuous process improvement and the constant deployment of enhanced automation solutions, the Data Governance program reduced OpEx and CapEx spend between year one and year five by ~70% while simultaneously increasing overall Data Governance capacity by ~300%. The Data Governance program teams always sought to do more with less year-over-year.
Prior to the Data Governance role, Don had a 12+ year tenure at Cognizant in commercial delivery. Don served in a CIO advisory role for many customer accounts during the last four years of his commercial delivery tenure. Don also held the title of Chief Architect and Program Director, delivering very large-scale, multi-million-dollar customer-facing commercial solutions, including programs as large as ~$50M, for many of Cognizant’s largest customers across numerous industries. Don specialized in designing and building n-tiered, enterprise-level distributed client/server solutions on premises and/or in the cloud. Don is well suited to hands-on executive technical program management, operations and delivery roles. Don has extensive current and long-term experience troubleshooting, reverse engineering and profiling architectural problems in large cross-platform, enterprise-wide IT systems. Don has relevant and current practical experience with Agile, DevOps, CloudOps and AIOps delivery methodologies and frameworks across Automated Environment Provisioning (AEP), CI/CD, App Lifecycle Management (ALM), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). Don has practical hands-on experience delivering cloud-native, cross-platform mobile and web solutions to AWS/Azure environments. Don is currently leading (very hands-on) data, model and application engineering development and delivery for multiple leading-edge, multi-cloud AI Large Language Model (LLM) and AI solutions within the Cognizant Corporate IT portfolio.
DETAILED PROJECT EXPERIENCE
Cognizant; Senior Director, Portfolio Owner & Chief Architect (December 7, 2016 - Present)
Corporate Systems & Technology (S&T), Data & Analytics (DnA) group (October 2019 - Present)
Portfolio Owner – Global Head of Data Governance: Direct report to the Cognizant Chief Data Officer. Responsible for delivery of all structured and unstructured data governance and data protection programs. The annual budget for the Data Governance Operations (DGO) team is $5M+/year. Directly accountable for:
Leading the Data Governance Operations, Policy & Strategy Portfolio
~12 different Data Governance value streams
Managing DGO program delivery relationships with 34 corporate portfolios and 27 business units
As many as 180 development and support resources on my team via ~8 direct report managers
Staffing and career development for team resources; regular management reviews and skip level meetings with all team members on a rotating basis across all programs; team retention strategies and policies.
All DGO Quarterly Business Reviews (QBRs)
DGO Weekly Status Reports (WSRs) to executive leadership team and data governance stakeholders
Rolling three-year program delivery budgets and roadmaps
Providing executive dashboards with real time reporting of program delivery objectives
Defining and maintaining documentation about all program engagement processes including…
Data Access Management
Communications Decks
Runbooks/Engagement protocols
Disaster Recovery/Business Continuity Management
Target Operating Model
Business/Technical Architecture
Record of Processing Artifacts
Azure DevOps Project Delivery Boards
Clarizen Enterprise PMO/PPM Reporting Boards
Monthly budget review and burndown planning, requests & ongoing management
Infrastructure build-out and ongoing support and data security
Internal Audit management
Working with legal and procurement departments on vendor/supplier contract renewals for governance compliance
Cross functional support for: Corporate InfoSec threat vulnerability management, systems compliance and data security
Ethical AI Governance board member and advisor
Design and build of multiple AI Domain Learning Models (DLMs) in support of data security and protection objectives
Vendor management
Data Governance Operations (DGO) major value streams in my portfolio included:
Data Discovery & Classification (DDC) Program: The DDC program was the tip of the spear for the overall data governance program. The DDC program implemented platforms for discovering, classifying, labeling, and reporting the location of all sensitive data in structured production data sources. Corporate-defined sensitive data included business, financial (ICFR), healthcare, and personal information (PII). This solution identified risk-based target databases that required elevated data protection and minimization controls (~4,000 production databases out of ~24,000 database instances in the IT estate). The DDC program underwent continuous automation enhancement; the platform would alert the DG operations team when new databases came online or when existing database data models changed. These alerts would automatically add tasks to the Data Governance work backlog to perform discovery work that could trigger other Data Governance program processes such as data minimization, GDPR, data quality, data retention, data masking, data lineage, et al.
This program was supported by ~20 DG team members at a cost of approximately $600k/year. Licensing for years 1-3 of this program was $2.32M. Licensing TCO for years 4-6 was projected at $3.7M. An impending license renewal in FY24 put the DG Ops team on a path to implement a homegrown solution that would replace the licensed COTS product and reduce OpEx for this solution and its support team by approximately $7M over the following six years. The homegrown solution was a hybrid ServiceNow/.NET/OpenAI-based solution.
Global Data Retention Program: The data retention solution performed data purges from databases across all records deemed sensitive to the company, based on record retention schedules and record classes defined by the corporate legal department. 84 record retention classes were correlated to 35 geographic regions in which the company does business. The primary intent of this program was to minimize data for risk management purposes, including potential loss due to cyber theft, litigation, regional regulatory requirements, and audits. There were approximately 500 unique production databases that qualified for management under the data retention program. Company data growth compounded ~7% annually, which is consistent with similarly sized global IT organizations. Through continuous process improvement and automation, a 60+ member data retention team was reduced to ~20 team members focused exclusively on implementing data purges on in-scope databases. The data retention software licensing AMC was ~$500k/year. The operational support and delivery costs were approximately $1M/year (C&B).
Data Minimization Program: The data minimization program was a multi-faceted program focused on driving down the cost of managing, moving and provisioning data. The program included: 1. Data Masking of structured data sources. Data masking required anonymization and/or pseudonymization of copies of production data for use by application testers and/or for analytics purposes. 2. Minimization of data shared from one internal system to another through modification of online and batch processes to reduce sharing of sensitive information. 3. The Data Marketplace project, which democratized access to data and provided a self-service platform that permitted business stakeholders to shop for business data domains usable by business portfolios. Business stakeholders could request access to the business data domains from the data owners. The data owners would determine how much or how little data access would be granted to a business stakeholder and for how long the privileges would be authorized. 4. The Test Data Warehouse project, which automated provisioning and movement of masked data subsets from production source systems to target lower environments for analytics and application testing purposes. From a DataOps perspective, these solutions were tightly coupled with conventional application CI/CD frameworks. Application teams were able to configure their DataOps data pipelines to land refreshed datasets in targeted lower environments. The record of processing (RoPA), from request for data access to closure of the request, was tracked step-by-step for audit purposes due to global data regulatory requirements. The operational support and delivery costs for this ~30-person program were originally ~$600k/year (C&B). Through process optimization and automation, the Data Masking team was reduced to ~10 full-time BAU team members, and their support budget was reduced to ~$200k/year by year 4. Perpetual data masking licenses initially cost about $2M.
The AMC for license and environment support was approximately $250k/year.
Unstructured Data Governance Program: Implemented a pilot program with assistance from Ernst & Young as well as support from the Corporate Security, Enterprise Architecture, and Infrastructure Services teams. Three vendors were down-selected and evaluated for performing data scans and threat vulnerability management (TVM): Informatica Data Privacy Manager (DPM; a.k.a. Secure@Source), MinerEye and BigID. The down-selected long-term solution was designed to support unstructured data sources in the following technology spaces and data sources: ML, AI, NoSQL/Big Data (MongoDB, Dynamo, Hadoop), Enterprise Content Management systems (Microsoft, Box, Alfresco), and Business Intelligence (BI) systems (SAP, Salesforce, Tableau, Microsoft Power BI). Regular data scans for both structured and unstructured systems were planned to span approximately 2,000 unique data sources in years one and two. The roadmap included scanning approximately 4,000 unique data sources in years three and four across all in-scope corporate and business unit systems. As the solution rolled out, it would have scanned ~35TB of unstructured data sources across 25,000+ servers and network-connected devices. Total Cost of Ownership (TCO) of the down-selected solution was budgeted at a ~$5M unlimited perpetual license investment. In out-years 3-10, the minimum OpEx Annual Maintenance Contract (AMC) + cloud infrastructure spend was ~$600k/year. BAU delivery support team spend for years 1-3 was budgeted at ~$500k/year for ~16 full-time associates to support high-volume document ingestion and the authoring and management of regional document policies. Actively building a homegrown AI-driven unstructured data governance (UDG) solution that will perform bespoke UDG functionality for approximately 10% of the annual OpEx infrastructure and personnel costs of any commercially offered solution on the market.
Data Lineage Program: The objective of the data lineage program was to document the origin of in-scope data domains and then track and report the propagation of that data to other data consumers within the IT estate. This is a hybrid homegrown/COTS solution. Since inception this solution has been highly manual in nature and has relied on tribal knowledge. Establishing data lineage for 500 in-scope databases is an intractable problem from an operational perspective due to data sprawl, employee turnover and the limits of tribal knowledge. The Data Governance Operations team is actively automating interfaces to infrastructure services monitoring tools like ServiceNow CMDB to automate the capture and archiving of live data flow telemetry information. This information is stored in Data Governance data lakes by custom-built data governance batch processes. The DG program mapped application-to-server lineage in the SNOW Service Mapping module. Custom DG processes were able to link database servers and database schemas back to applications through the Service Mapping module. These automated processes would generate artifacts that were fed to Informatica Axon, which rendered data lineage artifacts. The DG Telemetry program detected events like the creation of database servers and database schema changes, permitting automated updates to data lineage artifacts. This automation streamlines the manual intake process required to get data lineage down to the table and column level, and minimizes the headcount and operating expenses associated with initially documenting and maintaining data lineage changes over time.
Data Quality Program: The data quality program leveraged near real-time data lineage information to measure, score and improve data quality across the IT estate. Data quality scans were run constantly across multiple business data domains. Data owners and consumers had real-time access to data quality dashboards. As a function of data quality governance practices, the Data Governance Data Quality team would assign data quality improvement tasks to data owners and consumers in order to sustain a high level of confidence in the data assets being monitored. The operational support and delivery costs for this ~15-person program were approximately $400k/year (C&B). The data quality PaaS solution cost approximately $250k/year. The Data Quality team performed AI LLM profiling as part of their core responsibilities in order to detect and remediate observed data model drift.
Data Governance Telemetry (DG Telemetry) Program: This homegrown system was used to detect and publish facts via Azure Service Bus in near real time about the commissioning, change and decommissioning of ~27,000 servers, ~1,300 database servers and ~23,000 databases. This system also detected and notified observers in real time when any schema changed on any database in the IT estate. This Pub/Sub solution was used to orchestrate complex data governance, corporate security, IT asset management and IT operations management processes in near real time between IT Asset Management, Data Governance, Delivery Excellence, Corporate Security and other stakeholders.
Data Governance 360 (DG 360) Program: This homegrown Microsoft Power BI (Business Intelligence) based platform was a reporting dashboard that provided a 360-degree view of Data Governance compliance Key Performance Indicators (KPIs) across all 27 business units and 34 corporate portfolios at Cognizant. The DG 360 dashboard provided an easy-to-understand, top-down reporting platform for hundreds of different data governance metrics related to all data governance and data security programs.
Application Data Inventory Program: This homegrown system provided key tracking information about all corporate and business unit systems that were under the purview of the Data Governance program. There were over 600 systems tracked in this custom ServiceNow (SNOW) based application. This system provided a top-down view for everything from application business owners, technical leads, key application-level server and database information, data governance and corporate security program compliance. This system was used to initiate and manage recurring data governance workflows. Custom point solutions were implemented that monitored the IT estate to detect when new databases came online, changed or went offline. These events triggered custom workflows that spanned the major work streams within the Data Governance portfolio.
Data Subject Rights Management (DSRM) System: This was a homegrown implementation of a General Data Protection Regulation (GDPR) data subject rights solution that could be commercialized by Cognizant. The DSRM system managed intake of requests by data subjects through email, phone and web channels. Fulfillment of Subject Access Requests (SARs) was a combination of manual and automated fulfillment. The solution was a composite architecture which included a combination of custom-built system components and commercial off-the-shelf (COTS) components. The COTS components included: the OneTrust DSAR module, Informatica Enterprise Data Catalog (EDC), Informatica Data Privacy Manager (DPM), UiPath Robotic Process Automation, and ServiceNow IT Service Management (ITSM) modules. This solution paid for itself by automating most of the manual processes that the SAR fulfillment team was performing. Each SAR request from a data subject averaged $4,500 per transaction; the DSRM system drove those average costs down below $1,100 per transaction. Based on DSAR transaction request trend volumes, this system achieved 100% RoI on all sunk costs in approximately 18 months.
Retail Consumer Goods, Travel & Hospitality (December 2016 – October 2019)
Chief Architect for the Cognizant Travel & Hospitality Practice – Led solution architecture, program delivery activities and account development support for all accounts to which I was assigned. Provided CIO advisory services to those accounts. The CIO advisory services were meant to help client organizations harness technology and innovative solutions offered by Cognizant to shape the client’s technology vision, execute their digital transformation journeys, and reinvent their businesses to create enhanced, cost-effective and sustainable value from their technology investments.
Carnival Cruise Lines, Miami, FL (August 2017 – October 2019)
MS Office 365 Global Assessment – (March 2019 – September 2019); As a member of the account team, in conjunction with the Cognizant O365 practice, answered both a request for quote and a longer RFP process for a global Office 365 assessment. The contract value for the sixteen (16) week assessment was approximately $1M. The O365 assessment was conducted on site in Genoa, Italy; Rostock, Germany; Southampton, England; Seattle, WA; Valencia, CA; and Miami, FL. I led the assessment team on the ground at all customer sites. I was the program manager, with support from executive leads from Organizational Change Management (OCM), Cognizant Business Consulting (CBC) and the O365 practices. The delivery teams included resources from Europe, North America and India. The assessment team also had support from the Cognizant Global Alliance team, Microsoft and regional Cognizant client partners.
The assessment team towers included security, network, messaging and collaboration technical teams as well as a business case/financial planning & analysis tower and an Organizational Change Management (OCM) team. The final deliverables included a delivery road map, technical blueprint for delivering O365 globally across all brands, OCM transformation plans and a business case for making the investment in the O365 platform. The business case included budgets for current state messaging and collaboration, O365 implementation costs and timelines and future steady state costs for operational support and licensing of the O365 services across every brand.
In the last month of the assessment, Cognizant was asked to submit proposals and SoWs for an additional $500,000 in Azure AD remediation and O365 DLP security work. An additional $200k of short-term OCM work was signed in October. Due to the estimated $12M total global contract value of the projected implementation work, CarnCorp IT procurement rules compelled Cognizant to compete via RFP for the follow-up work.
Dry Dock Optimization Assessment – (March 2019 – May 2019); Performed a two-person, two-month follow-up assessment of Dry Dock process optimization at the request of the CCL CIO, the EVP of Fleet IT and the head of the CCL PMO. This work documented, at a low level, the current-state dry dock delivery processes and proposed the optimized future-state dry dock processes. This process optimization exercise included creating the architecture blueprint for automating the capture and reporting of Dry Dock delivery Key Performance Indicators (KPIs). Thresholds for optimal performance of planning, engineering and delivery activities for all IT projects were defined, captured and reported. The deliverables included: current-state, low-level program delivery process flows across infrastructure, application upgrade and network engineering projects for dry docks. Processes documented were related to program inception, engineering of solutions, staging delivery of solutions, and the creation, reconciliation and tracking of Bills of Materials (BoM) to support implementation of the infrastructure and networking projects. Processes documented and optimized between current and future state included: project scheduling, vendor management, travel planning, post-implementation readouts, and project delivery scoring for delivery resources. RoI calculators were prototyped for IT projects in order to articulate the payback and impact of Dry Dock IT projects.
Dry Dock Program – (May 2018 – September 2018); As a member of the account leadership team, I actively performed a process optimization assessment during a live dry dock in Portland, Oregon. Over two weeks living and working on the ship, the assessment team observed all aspects of managing the 3,000+ construction contractors from 100+ different global companies. The primary objective was to assess the onboarding process and the IT upgrade program that occurred on the ship. The assessment team also worked with the Hotel and bursar’s offices on process improvements and technical solutions that would expedite the onboarding and assignment of cabins to the 3,000+ contractors. The platform and network upgrade assessment led to Cognizant being directed to supply a permanent Dry Dock network delivery team for a term of 3 years, covering ~25 dry dock deliveries spread across ~7 global dry dock locations, as well as supplying onshore network engineers to support delivery of network projects in North America.
Data Warehouse Master Data Management Program – (July 2018 – April 2019); As a member of the account leadership team, I actively pitched concepts around applying data lifecycle management principles to all information architecture at CCL. The client’s executive leadership recognized the value in these concepts. This initiative also led to Cognizant taking over the then-floundering in-flight Master Data Management (MDM) CCL All Brands Group initiative. Cognizant originally pitched the MDM solution and performed the MDM assessment. After 9 months of internal delivery and political challenges, the Director of Information Architecture and the CCL Chief Architect requested CTS to assume and deliver the MDM solution originally proposed to CCL as a function of delivering this program. The global Casino rollout that is the center of the ADF assessment will be a 7-year program for CCL.
GDPR Compliance Program – (August 2017 – August 2019); As a member of the account leadership team, I worked on the proposal and SoW that sold the initial $1M+ solution, and served as solution architect and technical lead for end-to-end design and delivery of an EU GDPR data privacy compliance project. The agile-pod team implemented a solution that specifically supported management and tracking of all GDPR data subject rights requests for the following GDPR rights: Access, Inform, Consent, Object, Rectify, Port and Erase. The technology stack included: Informatica PowerCenter (IPC), .NET 4.7 web applications, Oracle v13, ServiceNow (SNOW), Microsoft Exchange Server, and LDAP AD. The project required implementation of a “product” that could be used across all ten Carnival Cruise Line brands. The solution had a UI component, multiple batch interfaces, and a classic RESTful web services component layer that was used as an API by clients that did not require the various GDPR application interfaces: Data Subject Facing, Customer Care Facing, and Administrative (Admin Application). Under the CarnCorp umbrella, this product solution is actively being offered to the 9 sister brands of Carnival Cruise Lines as a GDPR/CCPA SaaS solution. As a function of expanded regulatory compliance by various US states (California leading) and a couple of foreign countries, this project has expanded to 6 follow-on delivery projects since the GA release in March of 2018. CCPA and Brazil are currently under development.
Darden Restaurants iKitchen – (May 2018 – March 2019); Pitched the technical solution for the RFP response; architect and technical lead for end-to-end design and delivery of the iKitchen program re-platform. This solution was used in all ten Darden brands nationally across 1,800+ restaurants across the United States (Cheddar’s Scratch Kitchen, Olive Garden, LongHorn Steakhouse, Bahama Breeze, Seasons 52, Yard House, The Capital Grille, Eddie V’s). The solution was an iPad-based inventory capture platform that replaced daily paper capture and manual transcription from paper to desktop inventory protocols. The notable capability of this solution was that it could seamlessly perform and synchronize inventory capture regardless of network connectivity. Darden associates were able to digitally take inventory inside walk-in freezers, kitchens, service areas, bar areas and dining areas regardless of the intermittent quality of Wi-Fi network connectivity. This solution increased the accuracy of daily inventory by 80%+ and decreased the time required to perform inventory activities by ~70% for literally thousands of associates across hundreds of stores on a weekly basis. The solution was implemented as an AngularJS/.NET/Xamarin cross-platform application that natively supported iOS, Android and Windows. The agile-pod team worked off of VersionOne. Full-stack DevOps tool chaining was implemented for CI/CD and Automated Environment Provisioning (AEP). The mobile application supported offline large-record data synchronization with the core Oracle 12c database. The platform suite was composed of five separate applications, each decomposed into single-page, responsive web design applications.
HubSpot SOX/Security/Architecture Scalability; Cambridge, MA – (April 2018 – May 2018); Performed an architecture assessment to determine whether HubSpot could scale its current core product, along with its billing system, from a transactional volume supporting $300M/year to $1B/year. HubSpot uses a cloud-native architecture; at the time, they were the 4th largest AWS cloud customer on the East Coast of the US. I consulted directly with AWS in Boston regarding this assessment before, during and after the engagement, along with the CDE Cloud team. I directly interfaced with HubSpot delivery directors, infrastructure VPs, the CIO, the CTO and the VP of software development, along with their direct reports. I reviewed their billing system requirements, information architecture, application design, cloud infrastructure, data access and web services architecture, data models, security, and audit and monitoring capabilities. HubSpot is a unicorn in cloud-native application architecture and DevOps capabilities. This was the first assessment of its type for HubSpot. Cognizant beat out Accenture and Deloitte to perform this assessment. The deliverable outcomes were well received.
Royal Caribbean IT Transformation RFP Pursuit (July 2017 – August 2017) – Solution architect lead for a proposal pursuit of a five-year, $100M RFP for takeover of RCCL application support teams, infrastructure support teams and application development teams. The RFP was submitted to RCCL. The initiative was indefinitely tabled by the client after the CIO left the company, and the RFP was withdrawn by RCCL leadership.
Project Knight a.k.a. HCSC/TMG (January 2017 – July 2017) – Solution architect lead for all technical towers during the due diligence phase of the merger & acquisition of TMG from HCSC by Cognizant. TMG Health is a leading national provider of Business Process Outsourcing solutions for Medicare Advantage, Part D and Managed Medicaid plans. With more than 19 years of experience providing technology-enabled services exclusively to the government market, its knowledge of health plan processes, State & Federal regulatory requirements and the daily challenges plans face within the government market and its book of business to six of the major BCBS