EUGENE TURKESTANOV
DATA ENGINEER
Calgary, Alberta
*******@*****.**
LinkedIn: Eugene Turkestanov
PROFILE SUMMARY
Senior Data Engineer with 20+ years of hands-on experience designing enterprise ETL pipelines; energy, distribution, and financial data integrations; and cloud/hybrid data platforms. Expert in ETL design using SSIS, SQL, ADF, Spark, and Python. Proven track record of building star/snowflake models, SCDs, high-volume data ingestion systems, and automated data pipelines in compliance-driven environments (public sector, healthcare-adjacent systems, financial trust accounting). Experienced with data governance, sharing agreements, PII handling, and mentoring junior staff.
Core tools: SQL Server • SSIS • ADF • Azure • APIs • Data Modeling (Kimball) • SCD • ETL Optimization
CORE SKILLS
ETL / ELT Development: SSIS, ADF, Synapse, Spark
Database Development: SQL Server, Oracle, Snowflake, T-SQL, Stored Procedures, Views, UDFs, Azure SQL
Data Modeling: Star/Snowflake, SCD, fact/dimension design, multi-dimensional modeling
Data Integration: API ingestion, JSON/XML pipelines, event-based loads, batch orchestration
Governance & Compliance: Data sharing agreements, PII rules, audit support, lineage
Automation: Workflow Manager, DevOps, schema validation, data quality frameworks, orchestration
Leadership: Mentoring junior staff, sprint planning, technical walkthroughs, stakeholder communication
PROFESSIONAL EXPERIENCE
SENIOR DATA ENGINEER — ROCKPOINT GAS STORAGE
April 2025 – Present
Environment: Azure, SQL Server, Delta Lake, ADF, Fabric, PySpark, Power BI
Designed and delivered large-scale ETL/ELT pipelines using ADF, Fabric Data Pipelines, Synapse, and PySpark.
Engineered dimensional and multi-dimensional models (star, snowflake, SCD) supporting analytics and operational reporting.
Developed high-performance stored procedures, triggers, and UDFs for complex financial and operational datasets.
Built robust ingestion frameworks integrating APIs, SQL Server, Delta Lake, and OneLake sources.
Created Medallion Architecture (Bronze/Silver/Gold) enabling high-quality downstream analytics.
Automated pipeline execution, monitoring, and validation through Fabric orchestration.
Produced technical documentation: lineage, mapping specs, ERDs, governance artifacts.
Created a data quality framework using Azure SQL & Python that integrates Azure OpenAI models for semantic validation, anomaly detection, and automated generation of remediation summaries used in ETL workflows.
Mentored junior data engineers on ETL best practices, SQL tuning, and data modeling.
SENIOR DATA ENGINEER — GOVERNMENT OF ALBERTA (Analytics & Trust Accounting)
June 2021 – March 2025
Environment: Informatica (integration support), ADF, Synapse, SQL Server, Oracle, Delta Lake
Led ETL development across multiple healthcare-adjacent and financial trust domains under strict legislative and governance requirements.
Built complex ingestion pipelines integrating APIs, JSON payloads, Oracle, SQL Server, and cloud storage.
Designed enterprise data warehouse models (star/snowflake, SCD) supporting case management, trust accounting, taxation, and UAT workflows.
Developed auditable ETL frameworks with built-in error recovery, metadata tracking, and schema drift detection.
Conducted data reconciliation across 30+ years of legacy records: trust balances, disbursements, vendor payments, taxation data.
Authored full source-to-target mappings, lineage, data dictionaries, and governance documentation.
Partnered with ministry leaders to refine analytics requirements and translate them into BI semantic models.
Supported UAT, defect triage, reconciliation scripts, and validation datasets for financial compliance.
Automated deployments through Azure DevOps CI/CD, reducing manual promotion errors.
DATABASE DEVELOPER (ETL) — H&R BLOCK CANADA
Nov 2019 – May 2021
Environment: Snowflake, SSIS, Python, ADF, SQL Server
Built advanced ETL pipelines using SSIS + ADF for tax analytics workflows.
Created Snowflake data models for real-time tax reporting, refunds, and client behavior insights.
Developed Python loaders parsing complex multi-nested IRS-style JSON into curated dimensional structures.
Integrated Power BI datasets for executive dashboards and audit teams.
Designed pipelines synchronizing Dynamics 365 Finance entity data (vendors, batches, receipts).
DATA INTEGRATION SPECIALIST — GRID GROUP / CALFRAC
Oct 2018 – Nov 2019
Designed and implemented data integration architecture consolidating operational, accounting, and CRM systems.
Built ADF ingestion workflows processing sensor telemetry (pressures, pump rates, volumes).
Created dimensional models supporting engineering and safety reporting.
SENIOR BI/ETL DEVELOPER — MAWER INVESTMENT MANAGEMENT
Nov 2013 – Oct 2018
Built enterprise DW using Kimball, SCD, ODS, and SSIS.
Engineered ETL frameworks consolidating financial and trade data.
Designed data quality and anomaly detection rules for NAV & settlement processes.
Led Agile ceremonies, mentoring developers and coordinating multi-team UAT.
SENIOR DATABASE DEVELOPER — PROPAK SYSTEMS
2011 – 2013
Delivered SSIS ETL packages transforming manufacturing & engineering datasets.
Developed C#/ASP.NET internal systems backed by SQL Server ETL layers.
DBA / ETL DEVELOPER — CALL GENIE
2007 – 2011
Managed SQL Server & MySQL environments: tuning, recovery, and performance optimization.
Built ETL processes using Pentaho, SQL, and custom scripts.
ETL / DATABASE DEVELOPER (Multiple Earlier Roles)
2000 – 2007
Migrated data from Rocket UniVerse and JD Edwards into SQL Server & Oracle.
Developed ETL scripts, SSIS packages, and reconciliation procedures for financial and trust systems.
Supported web and desktop applications with SQL-based reporting layers.
EDUCATION
Bachelor of Computer Science — Moscow State Industrial University
Diploma, Accounting — Moscow Accounting College