Key Responsibilities
• Design, develop, and optimize Snowflake data models, schemas, and warehouses
• Build and maintain ELT/ETL pipelines using Snowflake and cloud-native tools
• Write complex, high-performance SQL queries and stored procedures
• Implement and manage Snowflake features such as Streams, Tasks, Time Travel, and Zero-Copy Cloning (see the sketch after this list)
• Optimize performance, cost, and scalability of Snowflake workloads
• Integrate Snowflake with upstream/downstream systems (APIs, data lakes, BI tools)
• Collaborate with data engineers, analysts, and AI/ML teams to enable advanced analytics use cases
• Apply data quality, governance, security, and compliance best practices
• Troubleshoot production issues and support monitoring and performance tuning
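
The Streams and Tasks bullet above is the most code-adjacent responsibility, so here is a minimal sketch of the pattern: a stream capturing change data on a source table, drained by a scheduled task. Everything here is hypothetical (the RAW.PUBLIC.ORDERS source, the ANALYTICS.PUBLIC.ORDERS target, the warehouse name, and all connection parameters are placeholders), and it shows one common way to wire these features together, not a prescribed implementation.

```python
"""Sketch: incremental loading with a Snowflake Stream + Task.

All object names and credentials below are hypothetical placeholders.
Requires: pip install snowflake-connector-python
"""
import snowflake.connector

DDL = [
    # A stream records row-level changes (CDC) on the source table.
    "CREATE OR REPLACE STREAM raw.public.orders_stream "
    "ON TABLE raw.public.orders",
    # A task drains the stream on a schedule, gated so it only runs
    # (and only spends warehouse credits) when changes are pending.
    """
    CREATE OR REPLACE TASK analytics.public.load_orders
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW.PUBLIC.ORDERS_STREAM')
    AS
      INSERT INTO analytics.public.orders (order_id, amount, updated_at)
      SELECT order_id, amount, updated_at
      FROM raw.public.orders_stream
      WHERE METADATA$ACTION = 'INSERT'
    """,
    # Tasks are created suspended; RESUME starts the schedule.
    "ALTER TASK analytics.public.load_orders RESUME",
]

conn = snowflake.connector.connect(
    account="my_account",    # placeholder
    user="etl_user",         # placeholder
    password="...",          # use key-pair auth or SSO in practice
    warehouse="TRANSFORM_WH",
)
cur = conn.cursor()
try:
    for stmt in DDL:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```

In practice the same pattern is often orchestrated through dbt or Airflow (both listed under Required Skills) instead of native Tasks; the stream-plus-gated-task shape is what matters.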
Required Skills & Qualifications
• 5+ years of experience in data engineering or data warehousing roles
• Strong hands-on experience with Snowflake (at least 2-3 years)
• Advanced SQL skills (query optimization, performance tuning)
• Experience with cloud platforms: AWS, GCP, or Azure
• Strong understanding of data modeling (star/snowflake schemas)
• Experience with ETL/ELT tools (dbt, Airflow, Matillion, Fivetran, or similar)
• Familiarity with Python for data processing and automation
• Knowledge of data security, RBAC, and governance in Snowflake (see the sketch after this list)
• Excellent communication and collaboration skills
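
Since Snowflake RBAC is called out above, the sketch below shows the typical grant chain for a read-only analyst role. The role, warehouse, schema, and user names are all hypothetical, and the exact grants (and which admin role runs them) depend on the organization's governance model.

```python
"""Sketch: provisioning a read-only analyst role via Snowflake RBAC.

All names below are hypothetical placeholders.
Requires: pip install snowflake-connector-python
"""
import snowflake.connector

GRANTS = [
    "CREATE ROLE IF NOT EXISTS analyst_role",
    # Compute and namespace access: USAGE is required at every level.
    "GRANT USAGE ON WAREHOUSE analytics_wh TO ROLE analyst_role",
    "GRANT USAGE ON DATABASE analytics TO ROLE analyst_role",
    "GRANT USAGE ON SCHEMA analytics.public TO ROLE analyst_role",
    # Read-only data access, including tables created later.
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.public "
    "TO ROLE analyst_role",
    "GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.public "
    "TO ROLE analyst_role",
    # Hand the role to a (hypothetical) user.
    "GRANT ROLE analyst_role TO USER jane_analyst",
]

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="admin_user",      # placeholder
    password="...",         # use key-pair auth or SSO in practice
    role="SECURITYADMIN",   # creating roles/grants needs an admin role
)
cur = conn.cursor()
try:
    for stmt in GRANTS:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```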
Nice to Have
• Experience supporting AI/ML or analytics-heavy environments
• Exposure to real-time or large-scale data processing
• Experience with BI tools (Tableau, Looker, Power BI)
• Snowflake or cloud certifications
Why Join This Role
• Opportunity to work with OpenAI, supporting cutting-edge AI and data initiatives
• Exposure to large-scale, high-impact data platforms
• Work in a highly collaborative, fast-paced, innovation-driven environment
• Competitive compensation and long-term growth potential