City: Glendale, CA / Burbank, CA
Onsite/Hybrid/Remote: Onsite (4 days a week)
Duration: 12 months
Rate Range: Up to $96/hr on W2, depending on experience (no C2C, 1099, or subcontracting)
Work Authorization: GC, USC, and all valid EADs (no OPT, CPT, or H1B)
Must Have:
Expert-level SQL (data modeling, optimization, query performance)
Hands-on experience with Snowflake
dbt for data transformation and analytics modeling
Proficiency in Python for scripting and automation
Familiarity with AWS cloud services
Version control using GitHub/GitLab
Experience in Agile/Scrum environments
Responsibilities:
Build and maintain analytical data models and assets within Snowflake, transforming raw data into trusted, consumable datasets.
Partner with Product Managers and stakeholders to translate business requirements into scalable data products.
Develop and manage transformation workflows using dbt and SQL for analytics use cases.
Ensure quality and performance of data assets through query optimization and validation.
Collaborate with Data Architects, SRE, and Platform teams within an Agile pod structure.
Support model deployment, version control, and documentation using Git-based workflows.
Maintain communication and alignment with cross-functional partners, ensuring deliverables meet evolving business needs.
Qualifications:
Bachelor's degree in Computer Science, Information Systems, or a related STEM field (required).
5+ years of experience as an Analytics Engineer or Data Engineer in enterprise-scale environments.
Strong understanding of data warehousing principles, dimensional modeling, and pipeline design.
Working knowledge of AWS ecosystem (e.g., S3, Lambda, Glue).
Strong communication and collaboration skills, with the ability to manage multiple priorities in a fast-paced environment.
Experience working in Agile/Scrum teams with structured DevOps processes (pull requests, merge requests, version control).