What we're looking for: Data Engineer (Only for people located in Latin America)
Responsibilities:
Architect & maintain cloud-based data warehouses and data lakes.
Build and orchestrate reliable, scalable ETL/ELT pipelines.
Optimize batch and streaming workloads, ensuring data quality and security.
Partner with analysts and data scientists to deliver BI, ML, and visualization-ready datasets.
Experience and knowledge:
4–6 years as a Data Engineer, owning production-grade DW and Lake environments at enterprise scale.
Cloud Data Warehouses
Hands-on experience with at least one major cloud analytical platform (Snowflake, Amazon Redshift, Google BigQuery, or Azure Synapse), including dimensional modeling and performance tuning.
Data Lakes & Formats
Design and operation of object-store-based lakes (S3, ADLS, GCS) using columnar formats (Parquet, ORC) and ACID table layers (Delta Lake, Apache Iceberg, Apache Hudi).
Orchestration & Automation
Workflow design with Apache Airflow, Dagster, AWS Glue, Azure Data Factory, or Cloud Composer; CI/CD with Git, Docker, and Terraform (or equivalents).
Data Quality & Observability
Implementation of automated data tests and lineage/monitoring with dbt tests, Great Expectations, OpenLineage, and Prometheus/Datadog, plus encryption and IAM best practices.
Programming & SQL
Advanced SQL and Python proficiency; basic knowledge of Scala/Java or shell scripting is a bonus.
Cloud & DevOps
Practical experience with at least one major cloud provider (AWS, Azure, GCP) and infrastructure-as-code concepts. Relevant certification desirable.
English level B2+ (Please apply ONLY if you meet this requirement).
Compensation & Benefits
Participation in innovative, high-impact regional projects.
People-centric culture with autonomy and continuous learning.
100% remote: work from anywhere.
We pay in US dollars.
You can enjoy all the holidays in your country.
10 days of paid PTO.