Mid-Level Data Engineer
We're hiring a Mid-Level Data Engineer to build and operate reliable ELT/ETL pipelines on Snowflake + Airflow. You'll consolidate data from databases, REST APIs, and files into trusted, documented datasets with clear SLAs and ownership.
Responsibilities:
Ingest data from RDBMS/APIs/files into Snowflake (batch/incremental; CDC when applicable).
Build modular SQL/Python transformations; handle semi-structured JSON; publish consumer-read tables/views.
Orchestrate Airflow DAGs (dependencies, retries, backfills, SLAs) with monitoring and alerting; a minimal DAG sketch follows this list.
Ensure idempotent re-runs/backfills; maintain runbooks and perform root-cause analysis (RCA) for incidents.
Tune performance & cost in Snowflake (warehouse sizing, pruning; clustering when justified).
Partner with BI/Analytics to refine definitions and SLAs for delivered datasets.
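To make the orchestration and idempotency expectations above concrete, here is a minimal sketch assuming Airflow 2.4+; the DAG id, task name, and load_partition() helper are hypothetical, not references to an existing codebase.

    # Minimal daily DAG with retries, an SLA, and an idempotent per-date load.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def load_partition(partition_date: str) -> None:
        # Hypothetical loader: delete-and-reload (or MERGE) exactly one
        # logical date, so retries and backfills never duplicate rows.
        print(f"loading partition {partition_date}")

    default_args = {
        "retries": 3,                         # automatic retries on failure
        "retry_delay": timedelta(minutes=5),
        "sla": timedelta(hours=2),            # late runs trigger SLA-miss alerts
    }

    with DAG(
        dag_id="daily_ingest",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=True,                         # enables historical backfills
        default_args=default_args,
    ) as dag:
        PythonOperator(
            task_id="load_partition",
            python_callable=load_partition,
            op_kwargs={"partition_date": "{{ ds }}"},  # logical date keys the partition
        )

Keying each run to its logical date is the design choice that keeps re-runs and backfills idempotent.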
Requirements:
2–4 years building production ETL/ELT; strong SQL (joins, window functions) + Python for data tooling.
Snowflake hands-on: Streams/Tasks/Time Travel; performance & cost basics; JSON handling (a Streams/Tasks sketch follows this list).
Airflow proficiency: reliable DAGs, retries/backfills, SLAs; monitoring & alert routing.
Data warehousing/modeling (Kimball/3NF), schema evolution; API integrations (auth, pagination, rate limits, idempotency); a pagination sketch follows this list.
Git-based CI/CD; clear written English; privacy/GDPR basics.
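For the Snowflake line above, here is a sketch of the incremental Streams/Tasks pattern, assuming snowflake-connector-python; the connection values and the raw_events/events_stream/merge_events names are illustrative.

    # Capture changes with a stream, then merge them on a schedule with a task.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="...", user="...", password="...",  # prefer key-pair auth in practice
        warehouse="...", database="...", schema="...",
    )
    cur = conn.cursor()

    # The stream tracks inserts/updates on raw_events since its last consumption.
    cur.execute("CREATE STREAM IF NOT EXISTS events_stream ON TABLE raw_events")

    # The task wakes on a schedule but runs only when the stream has data.
    cur.execute("""
        CREATE TASK IF NOT EXISTS merge_events
          WAREHOUSE = transform_wh
          SCHEDULE = '15 MINUTE'
          WHEN SYSTEM$STREAM_HAS_DATA('EVENTS_STREAM')
        AS
          MERGE INTO events t
          USING events_stream s ON t.event_id = s.event_id
          WHEN MATCHED THEN UPDATE SET t.payload = s.payload
          WHEN NOT MATCHED THEN INSERT (event_id, payload)
                                VALUES (s.event_id, s.payload)
    """)
    cur.execute("ALTER TASK merge_events RESUME")  # tasks are created suspended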
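And for the API-integration line, a sketch of defensive cursor pagination; the /events endpoint, response fields, and bearer token are hypothetical.

    # Pull every page from a paginated API, honoring rate limits.
    import time

    import requests

    def fetch_all(base_url: str, token: str) -> list[dict]:
        session = requests.Session()
        session.headers["Authorization"] = f"Bearer {token}"  # auth
        rows: list[dict] = []
        cursor = None
        while True:
            params = {"limit": 500}
            if cursor:
                params["cursor"] = cursor
            resp = session.get(f"{base_url}/events", params=params, timeout=30)
            if resp.status_code == 429:  # rate limited: back off per Retry-After
                time.sleep(int(resp.headers.get("Retry-After", "5")))
                continue
            resp.raise_for_status()
            payload = resp.json()
            rows.extend(payload["data"])
            cursor = payload.get("next_cursor")
            if not cursor:  # last page reached
                return rows

Loading the result with a MERGE on a natural key (rather than a plain INSERT) is what makes a repeated extract idempotent.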
Will Be A Plus:
iGaming familiarity: stakes, wins, GGR/NGR, RTP, retention/ARPDAU, funnels; responsible gaming (RG) and regulatory awareness.
AI & automation interest/experience: Snowflake Cortex for auto-documentation, semantic search over logs/runbooks, or parsing partner PDFs (with guardrails); a hedged Cortex sketch follows this list.
Exposure to cloud storage (GCS/S3/ADLS), Terraform/Docker, and BI consumption patterns (Tableau/Looker/Power BI).
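On the Cortex point above, a hedged sketch of auto-drafting a table description: it assumes snowflake-connector-python and a Cortex-enabled account, and the table, columns, and model choice are illustrative (model availability varies by region).

    # Ask Cortex to draft documentation; a human reviews before publishing.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="...", user="...", password="...",
        warehouse="...", database="...", schema="...",
    )
    cur = conn.cursor()
    cur.execute("""
        SELECT SNOWFLAKE.CORTEX.COMPLETE(
          'mistral-7b',
          'Write a one-paragraph description of a table named BETS '
          || 'with columns BET_ID, PLAYER_ID, STAKE, WIN, PLACED_AT.'
        )
    """)
    draft = cur.fetchone()[0]  # guardrail: treat as a draft, not a published doc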
What We Offer:
Direct involvement in an established, long-running, and growing project.
Flexible work arrangements.
20 days of vacation.
Truly competitive salary.
Help and support from our caring HR team.