SANTOSH LABADE
*******.******@*****.***
SUMMARY
18+ years of professional experience as an Enterprise Architect and Data Engineer in the design and development of DWH applications, data modeling using ER Studio/Erwin, databases including Oracle, SQL Server, and NoSQL databases such as MongoDB and Cassandra, Power BI, and Microsoft Azure cloud technologies.
Experience in Medallion Architecture, Data Vault, Data Virtualization using Composite/Denodo.
Design and develop Azure Data Factory and Databricks pipelines to migrate data from on-prem to the Azure Cloud and vice versa.
Design and develop DWH solutions using Snowflake and Azure Synapse DW, with ADF pipelines to move data from the ODS layer to Azure Synapse DW.
Design and development of near-real-time data pipelines from Microsoft Dynamics to the ODS layer using TIBCO's Scribe tool.
Experience managing daily/weekly/release builds for the entire solution as continuous integration/continuous deployment (CI/CD), with version control using GitHub in Microsoft Team Foundation Server (ADO).
Experience in the design and implementation of enterprise-wide data strategy, data governance processes, data modeling best practices, data quality, data storage, and data security.
Developed a machine learning model based on predictive analysis for wealth customers of the US Bank client, forecasting their overall liquidity for the next 3 months and highlighting risks and rewards based on exposure to different market conditions.
Developed a program that automates and generates architecture diagrams and the necessary documentation using an AI model on the Intel project.
EDUCATION
Bachelor of Engineering, Pune University, Maharashtra, India.
CERTIFICATIONS
AZ-900
DP-900
Power BI Data Analyst Associate: PL-300
Azure AI-900
Azure Administrator: AZ-104
SKILLS
Databases: Oracle, MS-SQL Server, Snowflake, MongoDB, Cassandra
Languages: SQL-PL/SQL, Python
Cloud Technologies: Microsoft Azure (Azure Data Factory, Databricks, Azure Synapse)
Tools/Software: Erwin/ER Studio/EA Data Modeler, Informatica 8.0, SSIS
Other Tools: Composite/Denodo DV
Project Management Tools: MS Project 2000, MS Visio, VSS
Operating Systems: Windows, UNIX
PROFESSIONAL PROFILE
Intel Inc. July 2022 – Aug 2025
Enterprise Cloud Data Architect
Platform & Tools: SAP, ER Studio, Data Vault Data Modeling Tool, Power BI, Azure Data Factory, Databricks, Azure Synapse, Snowflake, Python
Enterprise Cloud Solutions (ECA) is the enterprise-level initiative to move from SAP S/4HANA to an Azure Cloud (Snowflake/Azure Synapse) solution based on the Medallion and Data Vault architectures, supporting the separation of the Intel Product Design and Intel Manufacturing divisions of Intel Foundry.
As an Enterprise Cloud Data Architect:
Design and support the enterprise-wide logical model to convert the existing on-prem system to a cloud-based system in Azure.
Design and develop the Hub, Satellite, and Link tables.
Help design and develop Azure Data Factory, Databricks, and Delta Lake pipelines to migrate data from on-prem to the Azure cloud based on the Medallion Architecture.
Serve as the Data Integration Services SME and collaborate with business teams.
Manage and support Enterprise Data Taxonomy for Manufacturing Subject Area
Manage and support the Value Cost Chain (VCC like Data Lineage) for Manufacturing
Help manage daily/weekly/release builds for the entire solution as CI/CD, with version control for the Manufacturing subject-area data objects in Microsoft Team Foundation Server (ADO).
Wells Fargo Inc, Chandler, Arizona, USA Mar 2021 - Jul 2022
Enterprise BI Data Architect
Project Enterprise Data Mart
Platform & Tools: SQL Server, SAP, SAP Data Modeling Tool, Power BI
Enterprise Data Mart is the enterprise-level reporting solution for corporate functions, moving and integrating data from SAP S/4HANA and other data sources into a SQL Server Data Mart.
As an Enterprise BI Data Architect:
Design and support the enterprise-wide Logical/Physical Data Model
Collaborate with Business to understand the requirements and chair the compliance board.
Troubleshoot data integration issues with the development team, tune report performance, and design and develop the ETL pipelines.
ARIZONA STATE (Ace Software) Phoenix, Arizona, USA Oct 2019 - Mar 2021
Data Architect
Platform & Tools: SQL Server, SSIS, ER Studio Data Modeling Tool, Power BI, Azure Synapse, Scribe, Microsoft Dynamics
The Guardian Program is a child centered, user-friendly technological solution, which provides quality data and improved processes to support all Department of Child Safety work for the safety of Arizona Children.
As a Data Architect/Sr Data Engineer:
Collaborate with Business to understand the requirements, and design and develop the Logical/Physical Model to convert the existing on-prem system to a cloud-based data warehouse system in Azure.
Load data from different interfaces into SQL Server (VM) using Azure Logic Apps, Azure Data Factory.
Helped manage daily/weekly/release builds for the entire solution as continuous integration/continuous deployment (CI/CD), with version control for the Guardian solution in Microsoft Team Foundation Server (ADO).
Design and development of near-real-time data pipelines from Microsoft Dynamics to the ODS layer using TIBCO's Scribe tool.
Design and development of Azure Data Factory pipelines to migrate data from on-prem to the Azure cloud.
Design and development of the Azure Synapse DW layer for the system's longitudinal BI reports, along with the ODS layer.
Create the mapping documents to get the raw data from disparate sources and move it to the ODS layer of the Data Warehouse.
Design and development of the Guardian interfaces that interact with third parties.
Design and development of SSIS packages, deploying them on Microsoft Azure, and designing and developing the pipeline to load data into Microsoft Dynamics.
WIPRO LTD / US Bank (Minneapolis, MN, USA) May 2018 - Oct 2019
Data Architect
Project Liquidity Dashboard
Platform & Tools: SQL Server, ETL Informatica, Erwin, Tableau, Denodo DV
June 2019 to October 2019
Client's Liquidity Dashboard is an Investment Services Data Warehouse project that provides the client's liquidity balances and builds the analytics tooling to track money movement.
As a Data Architect:
Maintain the logical and physical models and the mapping document for data movement from source to target.
Create raw data sets for the Data Science stories and make modifications per requirements from the Data Scientists.
Help the Tableau team to get the required data for visualization.
EDW (Enterprise Data warehouse) / Bank of The West
Platform & Tools: Oracle 11g/12C, ETL Informatica, Erwin, Denodo
Client's Data Management program is to create an Enterprise Data Warehouse (EDW) to support business intelligence reporting and improve decision support. EDW will be the single source of information retrieval across all services. Using EDW, one can create a holistic view of the organization from various perspectives.
As a Sr Data Modeler:
Working on building WMG Mart & Finance and Risk Mart (FARM) for Federal Reporting and Business Analysis.
Collaborate with Business to understand their need and model Dimensional Model for WMG & FARM.
Created Conceptual, Logical, and Physical dimensional models and worked with the Business, Development, and Testing teams on data flow/lineage, the data dictionary, and transformations and mappings from pre-stage to stage to data store to ODS to EDW to the Wealth & FARM Data Marts.
Ensured data quality across its six key dimensions: accuracy, completeness, consistency, timeliness, validity, and uniqueness.
Served as the Deposits subject matter expert, advising analysts and development teams on EDR deposit gaps relevant to FARM and Regulatory Reporting.
Served as SME for Data Virtualization, delivering the DV layer to the OBIEE Reporting Layer
COGNIZANT TECHNOLOGY SOLUTIONS / Kaiser Permanente (Pleasanton, California) Feb 2006 - May 2018
Manager Projects
Project: CDW (Claims Data warehouse)
Platform & Tools: Oracle 11g/12C, ETL Informatica, Erwin Data Modeling Tool
KP's CDW is a comprehensive, historical database of verified claims that is subject-oriented, integrated, time-variant, and non-volatile. CDW is the national platform for finished claims governed under the Health Plan Data Strategy program.
The CDW vision is to provide a single trusted source of data that reflects the system of record, provide consistent and timely data availability, harmonize data across applications and different source systems, be SOX compliant with appropriate audit controls, and support business decision making with a high level of confidence.
As a SR. Data Modeler:
Accountable for managing both logical and physical dimensional data models for the CDW Data Warehouse using Erwin.
Develop and maintain the latest versions of the data model, data dictionary, data mappings, metadata, and other data architecture design documents.
Suggest alternatives to the customer and make architecture-related decisions whenever needed.
Data Migration
Coordinate with DBA/Infrastructure support team for infrastructure setup
Monitoring database health checks & performance tuning.
Toyota Motor (Torrance, California, USA)
Project: iPlus 4.0
Platform & Tools: Oracle 11g, ETL Informatica, Erwin, OBIEE
The client's project provides comprehensive incentive calculation services to dealers working at both regional and national levels.
As a Data Architect:
Developed Conceptual, Logical, and Physical Data Models using Erwin for OLTP and Data Mart systems.
Develop and maintain the latest versions of the data model, data dictionary, data mappings, metadata, and other data architecture design documents.
Developing Architecture Handbook for the System.
Suggesting alternatives to the customer and making architecture-related decisions whenever needed.
Data Analysis, Data Modeling & Database design.
Novartis Pharma
Project: Novartis-CCEx, GBS-CMF
Platform & Tools: Oracle 11g, ETL-Informatica, Composite DV
Client’s Change Control Excellence covers the rules necessary to transform data from legacy system into format suitable for the Product Lifecycle Management (PLM) system for the Life Science Manufacturing Domain. Cognizant has successfully implemented this project along with on-going support and monthly re-run of the modules.
My responsibilities on this technical project included:
IGM Remediation Activities
Developing Architecture Handbook for the System
Suggesting alternatives to the customer and taking architecture-related decisions
Data Analysis and Database design
Technical and functional aid to the team members.
Providing technical solutions to develop analytical platforms using ETL-Informatica & Oracle 11g
Credit Suisse (Singapore)
Project: Opera
Platform & Tools: Oracle 10g, Unix, ETL-Informatica, Java
OPERA (Online Profitability, Exceptions, Reporting and Adjustments) is an online system for the Client, a leading global financial services company, offering clients financial advice across all aspects of investment banking, private banking, and asset management services.
Responsibilities as an Onsite Consultant:
Coordinating with the PMBAs (Business Users) to prepare the disparate data needed for generating online profitability, exceptions, reporting, and adjustments.
Reporting daily status to the client. Ownership of the offshore deliverables and conducting the peer reviews for the offshore deliverables with the client.
Enhanced the existing data model using Erwin, including creating the logical/physical data models of the system.
Data mapping from source system databases, including T24, text files, Sybase, and MS Access, into the target Oracle database.