
Data Architect

Company:
Exclusive Resorts LLC
Location:
Denver, CO, 80202
Pay:
$135,000 to $150,000 USD per year
Posted:
May 24, 2025

Description:

About the Role

We are seeking a visionary and technically skilled Data Architect to lead the design, implementation, and governance of our data infrastructure. This role is critical to ensuring the integrity, security, and scalability of our data systems. You will be responsible for architecting data solutions that support analytics, AI/ML, and business intelligence across the organization. All data-driven initiatives, including those involving LLMs and machine learning, will run through this role.

Key Responsibilities

Design and maintain scalable data architectures across SQL Server, PostgreSQL, MongoDB/DynamoDB, and Snowflake.

Architect and manage AWS RDS, S3, Data Lakes, Athena, and Glue pipelines.

Lead ETL/ELT processes using tools such as Stitch Data, ensuring data quality, consistency, and accessibility.

Implement and enforce data governance, including PII protection, data lineage, and change tracking.

Collaborate with engineering and analytics teams to support Power BI dashboards and reporting.

Integrate and support AI/ML workloads using AWS Bedrock, SageMaker, and GCP Vertex AI.

Define and maintain systems of truth and ensure data is clean, reliable, and secure.

Ensure all data infrastructure and deployments are managed via Terraform and SFDX, and tracked in CI/CD pipelines.

Develop and maintain data engineering workflows using Python and modern data processing frameworks.

Stay at the forefront of AWS and data technologies, bringing innovative solutions to the team.

Required Qualifications

5+ years of experience in data architecture, data engineering, or a related field.

Deep expertise in:

SQL Server, PostgreSQL, and MongoDB or DynamoDB

AWS RDS, S3, Athena, Glue, Data Lakes

Snowflake, Power BI, and Stitch Data

ETL/ELT concepts and tools

Python for data engineering and automation

Data security, especially around PII

Data cleaning, validation, and transformation

Change tracking and auditability of data systems

Terraform and SFDX for infrastructure and Salesforce deployments

CI/CD pipelines for deployment automation and governance

AWS Bedrock, GCP Vertex AI, and other AI/ML platforms

Strong understanding of systems of truth, data modeling, and data lifecycle management

Excellent communication and collaboration skills

Preferred Qualifications

Experience with data observability and data cataloging tools

Familiarity with data mesh or data fabric architectures

Experience with Spark, Airflow, or other orchestration tools

Full-time
