
Cloud Data Platform Engineer

Company:
Lighthouse Technology Services
Location:
Toronto, ON, Canada
Pay:
$40-$45/hr CAD
Posted:
March 11, 2026

Description:

Lighthouse Technology Services is partnering with our client to fill their Cloud Data Platform Engineer position! This is a 12+ month contract opportunity with potential to extend. The role can be remote within Canada, with a preference for 1-2 days on site in Toronto. This role will be a T4 employee of Lighthouse Technology Services. No C2C or subcontracting arrangements will be considered.

What You'll Be Doing:

Partner closely with Data Scientists and Machine Learning Engineers to ensure they have reliable, well-structured data required for model development, experimentation, and analytics.

Design, build, and maintain data ingestion and data pipeline workflows within an AWS-based data ecosystem.

Develop and maintain Apache Airflow workflows used to orchestrate complex data processing pipelines across the organization's data platform.

Write and maintain Python-based workflow code, ensuring pipelines are efficient, scalable, and reliable.

Manage pipeline code through a Git-based repository, performing standard source control activities such as branching, commits, code reviews, and version control best practices.

Support the ingestion of data from external vendors and internal systems by evaluating and implementing appropriate ingestion methods, including:

APIs

SFTP data transfers

Real-time and event-driven ingestion patterns

AWS webhooks, SQS streams, or other streaming approaches

Build and manage pipelines that process and load data into the organization's platform architecture, including workflows that move data into Apache Iceberg hosted in AWS before integrating downstream into Snowflake.

Work within a modern orchestration environment that uses Spark scripts, Kubernetes-based container deployments, and Airflow.

Collaborate with the data platform and infrastructure teams to support CI/CD processes and containerized deployment pipelines, helping identify issues and communicate improvements.

Monitor pipeline performance and reliability, identifying opportunities to improve pipeline efficiency, data availability, and system stability.

Participate in technical discussions around data architecture, ingestion strategies, and real-time data processing approaches as the platform evolves toward more real-time data capabilities.
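The orchestration responsibilities above center on Airflow-style DAGs: tasks declared with upstream dependencies and executed in order. As a rough, framework-free sketch of that core idea (the task names are illustrative, not from this posting, and real pipelines would use Airflow itself):

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Toy pipeline: dependencies mirror a typical ingest -> transform -> load flow.
results = []

def extract():
    results.append("extract")

def transform():
    results.append("transform")

def load():
    results.append("load")

# DAG declared as {task: set of upstream tasks}, analogous to an Airflow
# dependency chain like extract >> transform >> load.
dag = {
    "transform": {"extract"},
    "load": {"transform"},
}
tasks = {"extract": extract, "transform": transform, "load": load}

# Run tasks in dependency order, the way a (serial) scheduler would.
for name in TopologicalSorter(dag).static_order():
    tasks[name]()

print(results)  # ['extract', 'transform', 'load']
```

In Airflow the same shape would be expressed with operators and `>>` dependency arrows inside a `DAG` definition; the scheduler, rather than a loop, decides when each task runs.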

What You'll Need to Have:

Hands-on experience working within an AWS-based data environment.

Strong experience building and maintaining Apache Airflow workflows for orchestrating data pipelines.

Advanced Python programming skills, including experience writing production-level code for workflow orchestration and data processing.

Experience using Git for source control, including core workflows such as branching, commits, merges, and pull requests.

Experience building data ingestion pipelines from a variety of sources such as APIs, SFTP, and streaming/event-driven systems.

Familiarity with Snowflake, including understanding how data pipelines integrate and load data into the platform.

Experience working with modern data pipeline architectures involving tools such as Spark, Kubernetes, or containerized workflows.

Understanding of CI/CD processes and containerization (Docker) in data engineering environments.

Experience supporting or integrating with data lake architectures, ideally including technologies such as Apache Iceberg.

Strong troubleshooting and collaboration skills, with the ability to effectively communicate issues and technical insights with data platform and infrastructure teams.

Ability to work in a fast-paced, collaborative environment supporting data scientists, ML engineers, and platform teams.

Pay Range: $40-45/hr CAD

Questions about any of our jobs? Email us at

View all of our open jobs here: jobs.lhtservices.com
