Summary: 3+ years Python/PySpark (Spark SQL), 3+ years GCP (BigQuery, GCS, Pub/Sub, Cloud Run, Cloud Functions, Cloud SQL), 2+ years Airflow (or similar orchestration), experience with cloud migrations (preferably Snowflake to GCP), strong ETL and SQL skills, and quick adaptability.
Data Engineer (Contract)
Location: 100% Remote
Citizenship: (GC/USC)
Start: ASAP
Duration: Open-ended, with potential FTE conversion
SCOPE OF WORK:
The ideal candidate has completed a Snowflake-to-GCP migration; at minimum, they must have migrated from another cloud environment to GCP.
Must have experience with core ETL tasks and Python/PySpark/SQL.
Ability to pick things up quickly and adapt - very important
Job Details
Strong with PySpark (esp. Spark SQL) for complex transformation pipelines
Hands-on with Airflow for orchestration and BigQuery SQL for querying and data modeling
Solid experience with GCP (BigQuery, GCS, Pub/Sub, Cloud Run, Cloud Functions, Cloud SQL)
Strong ownership mindset - able to work under pressure and balance speed with reliability
Tech Stack:
Python & PySpark (Spark SQL) - 3+ years
Airflow (or any orchestration tool) - 2+ years
Google Cloud Platform (BigQuery, GCS, Pub/Sub, Cloud Run, Functions, Cloud SQL) - 3+ years