
Senior BI/Data Engineer - 17+ Years Enterprise Data Solutions

Location:
Novi, MI
Posted:
February 23, 2026


Resume:

Amir Khan Sr BI / Data Engineer

Detroit, Michigan

***********@*******.***

248-***-****

Professional Summary

Experienced Data Engineer and Business Intelligence professional with 17+ years delivering enterprise-scale solutions across cloud and on-prem environments, spanning cloud platforms, data infrastructure, analytics, data warehousing, and ETL/ELT pipelines. Expert in designing and optimizing data architectures, enterprise data warehouses (EDW), and data lakes, with strong experience in BigQuery, Redshift, and Snowflake migrations that enable self-service analytics and real-time insights. Developed and maintained scalable ETL/ELT pipelines using SQL, Python, and modern data frameworks.

Skilled in GCP BigQuery, dbt (ELT), AWS (S3, Redshift, Glue), Snowflake, Oracle, and Informatica ETL (IICS and IDMC), building scalable, secure cloud data platforms. Proven track record in data governance, lineage, metadata management, and performance optimization; delivered high-impact business intelligence using SAP BusinessObjects, Tableau, Power BI, and AWS QuickSight.

Technical Skills

Data Integration & ETL: GCP BigQuery, dbt, AWS Glue, S3, Redshift, Azure Data Factory, data cleansing, cloud data transformation, ETL/ELT development, Informatica (IDMC, IICS, PowerCenter), Python, Bash, PowerShell, and SQL scripting.

Data Modeling & Warehousing: Dimensional Modeling, Star & Snowflake Schema, Fact & Dimension Tables, Data Lineage, Enterprise Data Warehouse (EDW), Snowflake, Amazon Redshift/RDS, Oracle, SQL Server, PostgreSQL.

Data Analysis & Reporting: Advanced SQL, SQL Optimization, Ad Hoc Reporting, Data Validation, Trend & Pattern Analysis, Real-Time Analytics, Analytical Problem Solving.

Cloud Platforms & Data Lakes: Google Cloud Platform (GCP), Google BigQuery, DBT, GitHub, AWS (S3, Redshift, Glue, RDS, Athena), Snowflake, Data Lakes, Cloud ETL Pipelines, Data Governance, Data Lineage.

BI Tools & Visualization: Amazon QuickSight, Tableau, Power BI, SAP BusinessObjects reports and dashboards, AI-assisted dashboard development, KPI & metrics reporting, data storytelling, and data-driven decision making.

Automation & Orchestration: CI/CD Workflows, Version Control (Git), Automation of Data Loads and Reports.

Professional Experience

Sr Data Engineer - United Wholesale Mortgage, Detroit, Michigan

July 2025 - Jan 2026

●Designed and delivered the end-to-end HMDA (Home Mortgage Disclosure Act) database migration from SQL Server to Google Cloud Platform (BigQuery), modernizing legacy mortgage reporting pipelines into a scalable, cloud-native analytics platform that supports regulatory compliance and enterprise reporting. Designed and implemented governed BigQuery data models across preparation and serving layers, applying dimensional modeling principles and leveraging dbt models and macros to create materialized views and curated datasets that significantly improved query performance while reducing processing costs. Developed reliable ELT pipelines using dbt, migrating complex business logic from legacy systems to a modern BigQuery (GCP) architecture.

●Developed and maintained dbt models, macros, snapshots, and incremental loads to transform raw mortgage data into curated analytics layers, including staging, intermediate, and mart schemas. Built and standardized mortgage domain data models—covering loan, borrower, property, underwriting, and HMDA datasets—to support enterprise analytics and downstream BI consumption. Created governed BigQuery views using dbt to enable ad-hoc analytics and BI tools while enforcing data governance standards, consistency, data lineage, and reuse across the platform. Managed dbt materializations and warehouse scaling configurations to balance performance requirements in GCP-BigQuery.
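The incremental loads described above can be sketched in plain Python. This is an illustrative stand-in for dbt's incremental materialization, not the actual project code; the field names (loan_id, updated_at) are hypothetical.

```python
# Core idea of an incremental load: merge a batch of newly arrived
# records into a curated table, keeping the latest version per business key.

def incremental_merge(curated, batch, key="loan_id", ts="updated_at"):
    """Upsert batch rows into curated rows; the newest timestamp wins."""
    merged = {row[key]: row for row in curated}
    for row in batch:
        existing = merged.get(row[key])
        if existing is None or row[ts] > existing[ts]:
            merged[row[key]] = row
    return sorted(merged.values(), key=lambda r: r[key])

curated = [{"loan_id": 1, "status": "open", "updated_at": "2025-07-01"}]
batch = [
    {"loan_id": 1, "status": "closed", "updated_at": "2025-07-15"},
    {"loan_id": 2, "status": "open", "updated_at": "2025-07-10"},
]
result = incremental_merge(curated, batch)
# loan 1 is replaced by the newer record; loan 2 is appended
```

In dbt itself this pattern corresponds to an incremental model with a `unique_key` config, where only rows newer than the current maximum timestamp are reprocessed.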

●Developed SQL Server and Oracle stored procedures, along with complex ad-hoc SQL queries, to support business and compliance reporting during and after the migration. Produced detailed source-to-target mapping documentation and migrated standardized lookup and reference tables—such as loan purpose, occupancy type, lien status, action taken, race and ethnicity, and geographic attributes (MSA/MD)—into BigQuery to ensure consistent and auditable regulatory reporting. Wrote advanced SQL queries against the Integra database to support HMDA and internal reporting needs.
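Migrating standardized lookup tables, as above, amounts to conforming raw source codes to governed descriptions. A minimal sketch follows; the code values are illustrative placeholders, not actual HMDA reference codes.

```python
# Hypothetical reference table, standing in for a migrated lookup table.
LOAN_PURPOSE = {
    "1": "Home purchase",
    "2": "Home improvement",
    "3": "Refinancing",
}

def conform(records, field, lookup, default="Unknown"):
    """Replace raw codes with conformed descriptions; default unmapped values."""
    out = []
    for rec in records:
        rec = dict(rec)  # avoid mutating the caller's rows
        rec[field] = lookup.get(str(rec[field]), default)
        out.append(rec)
    return out

rows = [{"loan_id": 10, "loan_purpose": 1}, {"loan_id": 11, "loan_purpose": 9}]
conformed = conform(rows, "loan_purpose", LOAN_PURPOSE)
# → first row maps to "Home purchase"; the unmapped code falls back to "Unknown"
```

Flagging the "Unknown" fallback rather than dropping the row keeps the output auditable, which matters for regulatory reporting.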

●Implemented GitHub-based version control and CI workflows for dbt projects, enabling peer reviews, controlled deployments, and traceable change management. Optimized BigQuery performance using partitioning, clustering, materialized views, and efficient SQL patterns. Ensured HMDA data accuracy, completeness, and audit readiness by implementing data quality rules, validation checks, and reconciliations aligned with regulatory and governance requirements.
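The validation checks and reconciliations mentioned above follow a common shape: compare control totals between source and target, and flag rows missing required fields. A minimal sketch, with hypothetical column names:

```python
def reconcile(source_rows, target_rows, amount_field="loan_amount"):
    """Return pass/fail checks comparing a source table to its migrated copy."""
    src_total = sum(r[amount_field] for r in source_rows)
    tgt_total = sum(r[amount_field] for r in target_rows)
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "control_total_match": abs(src_total - tgt_total) < 0.01,
    }

def find_invalid(rows, required=("loan_id", "action_taken")):
    """Flag rows missing required regulatory fields."""
    return [r for r in rows if any(r.get(f) in (None, "") for f in required)]

src = [{"loan_id": 1, "loan_amount": 250000.0, "action_taken": "Originated"}]
tgt = [{"loan_id": 1, "loan_amount": 250000.0, "action_taken": "Originated"}]
checks = reconcile(src, tgt)
bad = find_invalid(tgt)
# checks should all pass and no invalid rows should be found
```

In the actual stack these checks would typically live as dbt tests (unique, not_null, accepted_values) rather than hand-rolled Python.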

Platforms & Environments

●Cloud & Data Platforms: Google Cloud Platform (GCP), Google BigQuery, dbt (models, snapshots, tests, macros), GitHub, SQL Server, ETL/ELT pipelines, data lineage, analytics engineering, CI/CD for data pipelines. SQL & Optimization: advanced SQL, BigQuery performance tuning (partitioning, clustering, views, materialized views).

Cloud Data Engineer

INNOVATIVE Solutions - AWS Premier Services Partner

Aug 2024 - March 2025

Developed and maintained robust data integrations between diverse source systems and enterprise analytics platforms, leveraging cloud services such as Informatica IDMC, AWS Redshift, RDS, S3, Athena, Glue, and Parquet file formats to support comprehensive and near real-time analytics for Respiratory Healthcare Systems and a top U.S. HVAC service provider. Played a key role in data modeling and ETL pipeline development initiatives, improving data preparation workflows, ensuring data consistency, and optimizing data structures for analytical consumption.

Designed and implemented scalable ETL pipelines using Informatica IDMC Cloud Data Integration (CDI) to extract, transform, and load diverse datasets into AWS S3, improving data availability and reliability for downstream reporting and analytics. Architected and developed cloud-native data pipelines using AWS Glue and core AWS data services to efficiently ingest, transform, and load data into Redshift, S3, and RDS. Executed complex data extraction, transformation, and validation processes across Glue, Redshift, RDS, and Athena, building resilient and accurate data pipelines aligned with business requirements.
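The extract/transform/load stages above can be illustrated with a stdlib-only sketch. Real Glue jobs would be PySpark writing Parquet to S3; the field names and sample data here are hypothetical.

```python
import csv
import io

RAW = """device_id,reading,unit
A100,12.5,lpm
A101,,lpm
A102,9.8,lpm
"""

def extract(text):
    """Extract: parse raw CSV into row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing readings, cast numerics for analytics."""
    out = []
    for r in rows:
        if not r["reading"]:
            continue  # data-quality rule: skip incomplete readings
        out.append({"device_id": r["device_id"], "reading": float(r["reading"])})
    return out

def load(rows, sink):
    """Load: stand-in for writing Parquet to S3 or a Redshift COPY."""
    sink.extend(rows)
    return len(rows)

sink = []
loaded = load(transform(extract(RAW)), sink)
# loaded == 2; the row with the missing reading is filtered out
```

Keeping the three stages as separate functions mirrors how Glue jobs are usually structured, which makes each stage independently testable.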

Optimized AWS-based data warehousing solutions and ETL processes within Informatica IDMC and AWS Glue, significantly enhancing performance, scalability, and usability of data for multiple business use cases, including healthcare analytics and sales and revenue reporting. Delivered optimized data models and integrated datasets to support high-impact dashboards and reports in AWS QuickSight and Tableau, ensuring trusted, accessible data for strategic decision-making.

Designed and implemented data architectures that enabled GenAI integrations (including Amazon Q and chatbot solutions), optimizing data delivery and structure to support enhanced data exploration within downstream BI tools. Built and delivered interactive dashboards and analytical reports using Tableau, Power BI, and AWS QuickSight, providing actionable insights and key performance indicators related to sales, revenue, and regional performance trends. Collaborated closely with data engineering and BI teams to ensure ETL processes and data structures supported consistent reporting and accurate insights, while managing and version-controlling ETL scripts, SQL queries, and project artifacts using Git and GitHub to support collaborative development and strong code governance.

Advanced data integration and BI solutions, developing scalable ETL pipelines and data models to support enterprise analytics. Optimized workflows, ensured data quality, and enabled actionable insights through AWS QuickSight dashboards and reporting, helping drive business decisions and operational efficiency.

Data Engineer - Volkswagen Group of America, Michigan

March 2015 - April 2024

Over the years, as a key member of the Data Engineering and Business Intelligence (BI) team at Volkswagen Group of America, I have developed deep expertise in designing, building, and supporting end-to-end data and ETL pipelines for Enterprise Data Warehouse (EDW) solutions. I have worked across the full data lifecycle, from data ingestion and integration to transformation, validation, and front-end consumption via SAP BusinessObjects.

My experience includes architecting and maintaining scalable ETL/ELT frameworks using Informatica Intelligent Cloud Services (IICS) and Informatica Data Management Cloud (IDMC), while simultaneously streamlining the BI layer using SAP Web Intelligence (WebI). I have partnered closely with business stakeholders to translate complex requirements into dynamic WebI reports and interactive dashboards that drive data-led decision-making.

Expertise in SAP BusinessObjects (4.2/4.3), including Information Design Tool (IDT) and Universe Design Tool (UDT). Developed complex Web Intelligence (WebI) reports and dashboards that provided actionable insights to executive leadership. Managed BI launchpad security and scheduled automated report distribution to ensure timely delivery of business-critical data.

Designed, developed, and maintained Enterprise Data Warehouses and Data Lakes supporting large-scale automotive data, including manufacturing, vehicle, dealer, and customer domains. Built and managed end-to-end ETL pipelines using Informatica PowerCenter 10.4.1 and Informatica IICS/IDMC, integrating high-volume data from a wide range of source systems. Developed advanced Informatica transformations, such as Aggregators, Joiners, Lookups, and Routers, to support data cleansing, enrichment, and standardization initiatives.
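The Aggregator and Lookup transformations mentioned above have a simple logical equivalent, sketched below in plain Python. This is illustrative only; the table and column names are hypothetical, not actual Volkswagen data.

```python
from collections import defaultdict

# Source rows (what an Informatica mapping would read from a staging table).
vehicles = [
    {"vin": "V1", "dealer_id": "D1", "units": 2},
    {"vin": "V2", "dealer_id": "D1", "units": 1},
    {"vin": "V3", "dealer_id": "D2", "units": 4},
]
# Lookup source: dealer_id -> dealer name.
dealers = {"D1": "Detroit Metro", "D2": "Chicago North"}

def aggregate_by_dealer(rows):
    """Aggregator: sum units per dealer; Lookup: resolve the dealer name."""
    totals = defaultdict(int)
    for r in rows:
        totals[r["dealer_id"]] += r["units"]
    return [
        {"dealer_id": d, "dealer_name": dealers.get(d, "Unknown"), "units": u}
        for d, u in sorted(totals.items())
    ]

summary = aggregate_by_dealer(vehicles)
# → D1 "Detroit Metro" with 3 units, D2 "Chicago North" with 4 units
```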

Collaborated on architecture reviews, technical validations, and user acceptance testing (UAT) to ensure data accuracy, pipeline reliability, and alignment with business requirements. Actively monitored and troubleshot IICS jobs, task flows, and secure agents, resolving performance bottlenecks and complex data integration issues across environments. Configured and supported secure integrations across AWS (S3, Redshift), Snowflake, and on-premise platforms, including Oracle, SQL Server, and MDM systems.

Implemented robust data quality frameworks and validation rules to ensure trusted, analytics-ready data for downstream reporting and BI consumption. Led production support and operational activities in an Agile environment, ensuring high availability, stability, and continuous improvement of ETL processes and enterprise data platforms.

Provided end-to-end data operations and production support, ensuring the reliability, availability, and performance of enterprise data pipelines and platforms. Monitored ETL workflows, tasks, and batch jobs across cloud and on-premise environments, proactively identifying and resolving issues to minimize downtime. Implemented validation checks, reconciliations, and error-handling processes to maintain data accuracy, consistency, and audit readiness, while supporting business and compliance reporting requirements.


Data Warehouse Developer - Blue Cross Blue Shield of Michigan, December 2012 - February 2015

●Led multiple end-to-end Enterprise Data Warehouse (EDW) initiatives from requirements gathering through deployment, supporting both inbound and outbound data flows across cloud and on-premises environments. Designed and developed complex ETL workflows using Informatica PowerCenter 9.5, including mapping design, transformation logic, and performance tuning to support large-volume data processing. Built and maintained scalable data pipelines for healthcare and automotive clients, integrating monthly and weekly data feeds with automated scheduling through IBM Tivoli and custom UNIX shell scripts.

●Produced comprehensive Source-to-Target Mapping (STTM), Functional Design Documents (FDD), and Technical Design Documents (TDD) to ensure alignment between business requirements and technical implementation. Implemented robust data cleansing, validation, and transformation logic to improve data quality, consistency, and regulatory compliance across clinical and operational datasets. Collaborated closely with Data Architects, Data Modelers, DBAs, and QA teams to coordinate cross-functional development and ensure timely, accurate delivery of project milestones.

●Developed optimized PL/SQL procedures, packages, and functions integrated into ETL workflows, leveraging tools such as SQL Developer, TOAD, and Teradata SQL Assistant for advanced query development and performance analysis. Executed System Integration Testing (SIT) and User Acceptance Testing (UAT) across QA environments, creating detailed test cases to validate ETL logic and data accuracy. Partnered with business stakeholders to gather reporting requirements and translate them into technical specifications and visualization-ready data structures.

●Designed scalable star and snowflake schemas to support multidimensional reporting and analytics across claims, clinical, vehicle telemetry, and supply chain domains. Applied ETL and reporting performance-tuning best practices in accordance with project standards and Informatica development guidelines, ensuring reliable, high-performing, and maintainable data solutions.
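Populating a star schema like those described above reduces to assigning surrogate keys in the dimension and referencing them from the fact table. A minimal, hypothetical sketch:

```python
def build_star(events):
    """Split raw events into a member dimension and a claims fact table."""
    dim_member, fact_claims = {}, []
    for e in events:
        nk = e["member_id"]  # natural key from the source system
        # Assign a surrogate key the first time each natural key is seen.
        sk = dim_member.setdefault(nk, len(dim_member) + 1)
        fact_claims.append({"member_sk": sk, "claim_amount": e["amount"]})
    return dim_member, fact_claims

events = [
    {"member_id": "M1", "amount": 120.0},
    {"member_id": "M2", "amount": 75.0},
    {"member_id": "M1", "amount": 30.0},
]
dim, fact = build_star(events)
# dim holds 2 members; fact holds 3 rows pointing at their surrogate keys
```

Surrogate keys decouple the warehouse from source-system identifiers, which is what makes snowflaking and slowly changing dimensions workable later.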

Business Intelligence Application Lead, General Electric (GE Capital Americas), Michigan, February 2010 - December 2012

Served as a lead contributor to the BI Center of Excellence (BI COE) at GE Capital Americas, supporting enterprise-wide BI initiatives and establishing standardized analytics practices. Played a key role in the rollout of OBIEE 11g, defining BI architecture standards and best practices for scalable, consistent reporting.

Led the full BI SDLC, from requirements gathering and design to development, QA, and production support. Delivered interactive dashboards and analytics using OBIEE 11g and IBM Cognos, and designed Informatica PowerCenter ETL workflows for full and incremental data loads. Managed UAT and QA testing, ensuring data accuracy and performance, and optimized OBIEE RPD models across Physical, BMM, and Presentation layers. Oversaw OBIEE administration and performance tuning, including security, configuration, troubleshooting, and server operations to maintain high system reliability.

BI Developer, Johns Hopkins Applied Physics Lab (JHU APL),

Maryland, February 2010 - November 2010

Project: EBSS

●Contributed to building a next-generation BI and Data Warehouse platform for Applied Physics Lab (APL) business systems, supporting enterprise reporting and analytics. Designed and implemented end-to-end ETL processes to extract, transform, and load data, ensuring reliable and accurate data for downstream consumption.

●Developed OBIEE 10g dashboards, reports, and RPD models, extending OBIA and Oracle E-Business Suite Financials and Procurement modules. Created and maintained project documentation, data standards, and source-to-target mapping specifications to support consistent and governed reporting.

●Designed and customized Informatica PowerCenter 8.6.1 ETL mappings, including SDE/SIL and custom transformations. Enhanced fact and dimension tables with custom Descriptive Flex Fields (DFFs), managed Slowly Changing Dimensions (SCDs), and developed reusable transformations. Built custom Oracle DW tables, workflows, and sessions to support new analytics and conversion data requirements.
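The Slowly Changing Dimension handling mentioned above (Type 2) can be sketched briefly: when a tracked attribute changes, the current row is closed and a new versioned row is opened. Field names (effective_from, effective_to, is_current) are illustrative.

```python
from datetime import date

def scd2_apply(dim_rows, key, new_attrs, today):
    """Apply an SCD Type 2 change: close the current row, open a new version."""
    current = next(
        (r for r in dim_rows if r["key"] == key and r["is_current"]), None
    )
    if current and current["attrs"] == new_attrs:
        return dim_rows  # no change, no new version
    if current:
        current["is_current"] = False
        current["effective_to"] = today
    dim_rows.append({
        "key": key, "attrs": new_attrs, "effective_from": today,
        "effective_to": None, "is_current": True,
    })
    return dim_rows

rows = [{"key": "C1", "attrs": {"city": "Novi"},
         "effective_from": date(2024, 1, 1),
         "effective_to": None, "is_current": True}]
rows = scd2_apply(rows, "C1", {"city": "Detroit"}, date(2025, 6, 1))
# the old row is closed; a second, current row carries the new city
```

In Informatica PowerCenter this logic is typically built from a Lookup against the dimension plus an Update Strategy transformation routing inserts versus updates.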

ETL Developer, July 2005 - June 2009

Clients: Blue Cross Blue Shield, HealthNow, Horizon BCBS, NJ

Project: Business Intelligence (BI)

A multi-year project to build a data warehouse to identify the most profitable or potentially profitable customers for future interaction and to perform claims analysis. Played a major role in understanding the business requirements and in designing, loading, and extracting data into the Data Warehouse from Facets system sources, including Pharmacy Claims, Provider, Medical Claims, Members, BHI, and CRMS for McKesson.

Education

Bachelor's in Computer Science

February 1993 - April 1997, Cambridge College of Computer Science

Training and Certifications:

GCP - Google Cloud (Google)

AWS Certified Cloud Practitioner (AWS)

AWS Technical Essentials (AWS)

AWS Certified Solutions Architect - Associate (AWS)

Oracle Certified Database Administrator - DBA (Oracle Corp), USA

Certification in Oracle Developer and SQL


