City: Burbank, CA
Onsite/Hybrid/Remote: Hybrid (4 days onsite per week, no flexibility)
Duration: 12 Months
Rate Range: Up to $96/hr on W2
Work Authorization: GC, USC, all valid EADs except OPT, CPT, and H1B
Must Have:
- Snowflake
- Snowpark
- Python
- SQL
- Azure Data Factory
- ETL / data pipeline migration
- REST API integration
- Agile / Scrum
- AI-assisted development tools such as Cursor or Microsoft Copilot
Responsibilities:
- Build, refactor, and support enterprise data pipelines for data collection, transformation, and delivery
- Develop and maintain Snowflake-based data solutions using Snowpark, Python, and SQL
- Migrate existing Azure Data Factory pipelines into Snowflake Snowpark solutions
- Join and transform data from multiple source systems for reporting, dashboards, KPIs, and analytics use cases
- Implement infrastructure that supports secure data storage, processing, and retrieval in Snowflake
- Execute work from defined requirements, technical designs, and priorities set by team leads and architects
- Identify delivery risks, technical issues, or blockers and escalate as needed
- Manage assigned tasks and deliverables against project timelines and sprint commitments
- Apply performance tuning and optimization across Python and SQL workflows
- Use AI-assisted development tools to support coding, refactoring, debugging, and documentation while following engineering standards
- Validate AI-generated output to ensure security, quality, performance, and governance requirements are met
- Share AI tool usage patterns and best practices with the broader engineering team
Qualifications:
- 3 to 5+ years of experience in Data Engineering or Data Integration roles
- Strong hands-on experience with Snowflake in a production environment
- Strong hands-on experience with Snowpark pipeline development
- Senior-level Python skills for data engineering and integration workloads
- Advanced SQL skills, including complex transformations and query tuning
- Experience working with Azure Data Factory and translating pipeline logic into Python-based implementations
- Experience migrating ETL or data pipelines across cloud platforms, especially from Azure to Snowflake
- Experience working with REST APIs using Python
- Experience in Agile/Scrum teams with sprint-based delivery
- Understanding of data security, governance, and enterprise engineering standards
- Bachelor's degree or equivalent practical experience
Nice to Have:
- Snowflake Tasks and Streams
- Snowflake warehouse configuration and optimization
- AWS experience
- Azure experience
- CI/CD for data engineering workflows
- Experience with large-scale cloud data migration projects