Resume:
Chief Data Architect
John Ormond | Norwell, MA | ******.****@*****.*** | 617-***-****
Professional Summary
Proven data architect with formal hands-on data analytics training, a Master’s degree in Applied Economics from Boston College, and progressive experience in AI data analytics, data architecture, data engineering, data governance, product strategy, and dimensional reporting for advanced analytics across the financial services, banking, and insurance sectors — including MIT Lincoln Laboratory, MassMutual, Bank of America, Arbella Insurance, Liberty Mutual, Progressive Insurance, Hannover Re (Germany), and other Fortune 100 organizations. Proven expertise in architecting scalable enterprise data visions and building cloud-native data products that support real-time processing, AI/ML, analytics, and self-service consumption, while fostering data-as-a-product cultures and robust governance frameworks for holistic, cross-department correlated enterprise reporting. Excel at working with global, multi-level teams of engineers and product managers, driving innovation through modern data stacks (Snowflake, Databricks, Kafka, Airflow, Informatica, virtual API data gateways, data flattening, AWS/Azure), and delivering measurable business value in complex, regulated environments. Advanced education in financial economics with data science, predictive analytics, and AI modeling complements deep technical proficiency in translating business needs into scalable, secure data solutions. Hands-on data modeler and developer; program in SQL, R, Python, and C#. Experienced in most reporting tools; build time-series and other multidimensional dashboards to highlight correlation and causation.
Key Skills
Data Architecture and Data Governance: Expert in data modeling, data engineering, and AI modeling; design Star Schemas, Snowflake schemas, and OLTP/OLAP models (using ERwin, ER/Studio, and other tools); established robust data quality and governance frameworks for regulated industries and built corporate metadata repositories to manage change-control activities. Predictive analytics modeling.
ETL/ELT Pipelines: Excellent hands-on programming skills in SQL, Python, and R. Built scalable workflows with Airflow, Informatica, Python, SQL, SSIS, dbt, and Databricks, integrating structured and unstructured data for real-time analytics, with data flows that stage, merge, cleanse, and push data into star schemas. Experienced with hybrid on-premises and cloud data pipelines, and with complex multi-source data merging and cleansing into star and other schemas. AI model training skills.
API Dataflow Simplification: Built and implemented secure virtual gateways to external APIs, enabling seamless, secure integration of many external sources in varied API formats with existing star schema databases and Snowflake data schemas, aligned with a flattened star schema architecture.
Cloud Platforms: Hands-on proficiency with Azure (SQL Database, Data Factory, Databricks, Voltage encryption) and AWS (S3, Vertica); adaptable to hybrid cloud environments such as Synapse and GCP.
BI & Reporting: Developed Power BI and Tableau dashboards for financial, leasing, and operational insights, enhancing enterprise decision-making.
AI & Predictive Analytics: Expert in data analytics modeling, grounded in an Applied Economics degree from Boston College. Integrated ARIMA, neural networks, and generative AI into data architectures for forecasting and merchant ecosystem optimization. Can model, correlate, and demonstrate causation in the relationships among trends and how they impact each other, and can establish automated AI predictive analytics.
Financial Expertise: Automated ERP/GL/AP/AR integrations, managing $343B balance sheets and optimizing reporting for large-scale financial systems. Managed a $4.5 billion portfolio of assets and liabilities (Reg 126) at Mutual of New York.
Compliance: Ensured SOX, HIPAA, PCI DSS, and GDPR compliance with automated audits, security compliance, logging of all transactions and loads, secure audit trails, and methods such as Voltage source-to-cloud and cloud-to-source SecureData encryption. Voltage end-to-end encryption secures data in transit over networks using strong cryptography (e.g., AES-256 equivalents via format-preserving encryption), preventing interception or breaches; this includes support for secure protocols in hybrid environments and compliance with GDPR data localization (storing EU data within the EU) and ISO 20022 standards.
Collaboration: Worked effectively with cross-functional teams and stakeholders across the organization to bridge silos, unify disparate data sources, and foster a collaborative environment that enables seamless data integration and empowers intelligent, insightful reporting systems.
Led architecture teams in designing and implementing scalable, future-proof data solutions that align closely with business strategies, driving enhanced data accessibility, operational efficiency, and informed decision-making throughout the enterprise. Maintained all financial and SDLC change controls to ensure compliance and data integrity.
Professional Experience
Consultant: Data Engineering, Data Analytics, Data Architecture, Data Pipelines MIT Lincoln Labs, Lexington, MA June 2022 – Present
Architected enterprise data architecture and solutions, integrating structured/unstructured data using ERwin/ER/Studio, aligning with hybrid cloud strategies.
Architected and developed encrypted ETL/ELT pipelines with secure database virtual gateways (making an API callable as a table seamlessly) for scalable, manageable API extraction and management, using SSIS, Python, and SQL, and embedded AI-driven predictive models (ARIMA, neural networks) for budgeting and asset management to meet real-time data processing needs.
Established federated governance frameworks with automated audits, achieving 100% reporting accuracy and compliance with security standards, ready for Data Mesh adoption.
Developed an advanced Cyber Vulnerability data warehouse, integrating eight unstructured data sources into a Star Schema for MS Power BI reporting using Co-pilot 360.
Enriched the data warehouse architecture to embed AI and predictive analytics, making correlation and causation easier to observe in the data reporting function.
Head of Financial Data Architecture / Financial Pipelines Architecture MassMutual, Boston, MA August 2020 – June 2022
Led the automation of a $343B balance sheet by integrating six systems (SAP, Oracle Financials, investment systems, derivatives systems, trading systems) into a Star Schema data warehouse, streamlining ERP/GL/AP/AR reporting for MicroStrategy as part of a reporting data modernization.
Conformed dimensions to reduce reporting hierarchies by 40%, enabling consolidated financial views under GAAP/IFRS, showcasing expertise in scalable data modeling.
Implemented generative AI models for financial forecasting, enhancing cost efficiency and profitability analysis, a skillset transferable to a data product lifecycle.
Built pipelines and controls for loading over $340 billion in assets from more than 23 internal and external data sources.
Built audit schemas and pipeline automation to ensure accurate, consistent data loading and sound data governance, with automated balancing to the corporate balance sheet(s).
Data Analytics and Data Modeling AI DataWarehouse Inc, Boston, MA April 2018 – August 2020
Designed dimensional models with ERwin for predictive analytics, integrating time-series data for real-time financial and operational reporting, aligned with ISO 20022 requirements.
Created Power BI/Tableau dashboards with embedded AI, optimizing decision-making for leasing and operational analytics and demonstrating cross-functional leadership.
VP Data Engineering, Data Architecture and Data Analytics Energi/Hannover Re Insurance, Peabody, MA April 2012 – April 2018
Directed a 30-person data management team to build an 800-table data model using ER/Studio, integrating financial and operational data for real-time analytics in a regulated environment.
Built Asset Liability Models and enabled actionable star schema dimensional reporting.
Built numerous claims triangles and other actuarial financial models for risk management.
Designed and built pipelines for over 30 external feeds with controls and secure encryption.
Developed audit schemas and led teams to ensure proper controls and checks for accurate data loading.
Deployed Azure web apps with API integrations (ISO, NCCI), ensuring SOX compliance and scalability, paralleling Blue Cross’s security and governance needs.
Certified ASP.NET Developer and Data Architect.
Director, Pharma Business Intelligence and Database Team, Commonwealth Medicine, Shrewsbury, MA June 2008 – April 2012
Architected Star Schema systems with ERwin for Tableau analytics, integrating financial and operational data with automated governance, applicable to self-service models.
Managed ETL pipelines with MSSQL and Python, ensuring data quality for compliance, showcasing agile delivery expertise.
Director of Data Warehouse Development, Alliance Data Management (Epsilon), Burlington, MA April 2006 – June 2008
Designed data models for financial applications using ERwin/ER/Studio, supporting high-throughput ETL pipelines for 50TB databases at Bank of America, Pfizer, AstraZeneca, and Progressive Insurance.
Enabled $400M quarterly revenue growth at Bank of America through scalable marketing database solutions, reflecting proven delivery in large-scale architecture and transformations (over 40 TB VLDB on Netezza).
Direct Mail Marketing for Credit Cards: Architected targeted promotional campaigns (e.g., pre-approved credit card offers) to potential customers to boost market penetration. The goal was to acquire new users through data-driven segmentation (e.g., based on credit scores and demographics) to identify prospects, a process supported by my work on large-scale data warehouses and analytics platforms at Bank of America.
Manager/Senior BI Solutions Engineer EMC, Hopkinton, MA 1999 – 2006
Developed very large database Star Schemas for various client reporting architectures, optimizing pipelines for large-scale datasets, and patented a database optimization process that enhanced performance for databases on EMC storage.
Presented advanced technical topics at Oracle World and EMC World, establishing thought leadership in data architecture.
Principal/Senior Data Warehouse Consultant, AnswerThink Consulting (General Motors & Arbella Insurance), Burlington, MA 1997 – 1999
Designed global financial data warehouses for GM, enhancing profitability with multidimensional reporting.
Senior Consultant/Technical Lead, Bank Boston, Boston, MA 1995 – 1997
Architected secure teller transaction systems with ER/Studio, mentoring teams on database design and providing cross-functional leadership.
Data Architect & Analytics Lead, Real Estate Management, The Flatley Company, Boston, MA 1993 – 1995
Architected and built pipelines for a data warehouse for 10M+ square feet of properties, integrating leasing and financial data with Informatica and SQL ETL pipelines.
Reduced lease reporting times by 30% with Tableau dashboards, optimizing portfolio performance and demonstrating data-driven decision-making.
Actuarial Manager/Developer, Mutual of New York (MONY), New York, NY 1990 – 1993
Modeled a $4.5B pension portfolio with APL-based systems, automating ETL for DB2 legacy platforms and ensuring accurate asset/liability reporting.
Education
Master of Science in Applied Economics/Statistics (Data Science Methods), Boston College, Newton, MA 2022; GPA 3.95, Summa Cum Laude
Thesis: "Advanced Economic AI Model Development and Integration into Star Schema Data Models" – Focused on embedding AI for predictive analytics in financial ecosystems.
Bachelor of Science in Computer Science Alfred University, Alfred, NY Four-Year Scholarship, Dean’s List
Significant Applicable Achievements
Recent Hands-On Data Architecture Experience: Architected a comprehensive Financial Reporting Data Lake at MassMutual, automating the management of a $343 billion balance sheet. Designed and integrated a cohesive data architecture that unified six financial investment and accounting systems (SAP, property systems, and others) into a single conformed-dimension Star Schema, streamlining financial reporting processes and enhancing data consistency and accessibility.
Reduced reporting hierarchies by 40% at MassMutual, enhancing financial transparency.
Contributed to $400M in quarterly revenue at Bank of America by architecting and managing an advanced 40-terabyte marketing database campaign on the Netezza platform. The campaign targeted 350 million US consumers using over 2,000 data attributes, enabling highly cost-effective, focused look-alike modeling for direct mail marketing aimed at boosting credit card market penetration: distributing targeted promotional materials, such as pre-approved credit card offers, with data-driven segmentation on factors like credit scores and demographics to identify prospects. This achievement highlights my expertise in large-scale, scalable big data solutions and intelligent reporting, built on extensive data warehouse and analytics platform work at Bank of America.
Secured a U.S. patent for an innovative database optimization process tailored for Network Attached Storage (NAS) at EMC, revolutionizing the management of very large databases by introducing a novel block storage allocation technique. This patented method enhances performance by optimizing data retrieval and storage efficiency through dynamic block sizing and intelligent caching mechanisms, reducing latency by up to 35% and improving throughput for databases exceeding 50 terabytes. This breakthrough demonstrates exceptional leadership in scalable storage solutions, enabling robust support for high-volume data environments and establishing a foundation for advanced data architecture in enterprise systems.
Designed and built a sophisticated Cyber Vulnerability Data Warehouse at MIT Lincoln Labs by architecting and eventually integrating eight unstructured data sources—BigFix, Active Directory (AD), Global Connect, Network, Tenable, and other systems—into a cohesive conformed dimension Star Schema, optimized for Power BI reporting with the assistance of Co-pilot 360. This process required conforming the diverse data by normalizing it and re-establishing primary keys, enabling robust conformed dimension reporting within the Star Schema architecture for seamless data consistency and historical/real-time analysis for Cyber Security/Compliance reviews for over 175,000 assets at MIT Lincoln Labs.