Position Overview
Location Requirement:
Must be local to Columbus, Ohio, or willing to relocate for a long-term, multi-year engagement at the Ohio Department of Medicaid (ODM) offices.
Snowflake Development & Support:
Provide technical support for developing reliable, efficient, and scalable solutions on Snowflake.
Design and develop features using Snowpark for Python, including gathering and iterating on requirements (an illustrative sketch follows this list).
Explore and implement new Snowflake capabilities through proof of concepts (POCs) aligned with business needs.
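For illustration, a minimal Snowpark-for-Python sketch of the connect / transform / persist loop this work involves. All identifiers (connection parameters, the CLAIMS table, column names) are hypothetical placeholders, not actual ODM objects:

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Hypothetical connection parameters; in practice these come from a secrets store.
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# Build a DataFrame lazily over an existing table; nothing executes until an action.
claims = session.table("CLAIMS")
totals = (
    claims.filter(col("STATUS") == "PAID")
          .group_by("PROVIDER_ID")
          .agg(sum_(col("AMOUNT")).alias("TOTAL_PAID"))
)

# Persist the aggregate back to Snowflake.
totals.write.mode("overwrite").save_as_table("PAID_TOTALS_BY_PROVIDER")
session.close()
```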
Performance & Optimization:
Conduct performance tuning of queries and procedures (see the query-history sketch after this list).
Recommend and document Snowflake best practices.
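As a sketch of the tuning workflow, the query below surfaces the slowest queries of the past week so effort goes where it matters. It assumes an existing Snowpark `session` whose role can read the SNOWFLAKE.ACCOUNT_USAGE share; the lookback window and limit are illustrative:

```python
# Note: ACCOUNT_USAGE.QUERY_HISTORY can lag by up to ~45 minutes; use the
# INFORMATION_SCHEMA.QUERY_HISTORY table function for near-real-time checks.
slow_queries = session.sql("""
    SELECT query_id,
           warehouse_name,
           total_elapsed_time / 1000 AS elapsed_s,
           bytes_spilled_to_local_storage,
           query_text
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
      AND execution_status = 'SUCCESS'
    ORDER BY total_elapsed_time DESC
    LIMIT 20
""")
slow_queries.show()
```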
Security & Access Control:
Configure and manage role-based access control (RBAC), virtual warehouses, tasks, Snowpipe, and streams to support diverse use cases (stream-and-task sketch below).
Implement data governance, including row- and column-level security using secure views and dynamic data masking (policy sketch below).
Set up user/query log analysis, history capture, and email alert configurations (alert sketch below).
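To make the stream/task item concrete, a sketch of change capture wired to a scheduled task, reusing the Snowpark `session` from above. Object names (raw_claims, curated_claims, etl_wh) are hypothetical, and the role needs the relevant CREATE TASK and EXECUTE TASK privileges:

```python
# Capture changes on the raw table.
session.sql("CREATE OR REPLACE STREAM claims_stream ON TABLE raw_claims").collect()

# Load new rows on a schedule, but only when the stream actually has data.
session.sql("""
    CREATE OR REPLACE TASK load_claims
      WAREHOUSE = etl_wh
      SCHEDULE = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('CLAIMS_STREAM')
    AS
      INSERT INTO curated_claims (claim_id, amount, status)
      SELECT claim_id, amount, status
      FROM claims_stream
      WHERE METADATA$ACTION = 'INSERT'
""").collect()

# Tasks are created suspended; resume to start the schedule.
session.sql("ALTER TASK load_claims RESUME").collect()
```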
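For the row- and column-level security item, a sketch following Snowflake's documented masking-policy and row-access-policy DDL; policy, table, role, and county values are placeholders:

```python
# Column-level: mask SSNs for everyone outside a privileged role.
session.sql("""
    CREATE OR REPLACE MASKING POLICY ssn_mask AS (val STRING)
    RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
        ELSE 'XXX-XX-' || RIGHT(val, 4)
      END
""").collect()
session.sql("ALTER TABLE members MODIFY COLUMN ssn SET MASKING POLICY ssn_mask").collect()

# Row-level: restrict non-statewide roles to a single county.
session.sql("""
    CREATE OR REPLACE ROW ACCESS POLICY county_rows AS (county STRING)
    RETURNS BOOLEAN ->
      CURRENT_ROLE() = 'STATEWIDE_ANALYST' OR county = 'FRANKLIN'
""").collect()
session.sql("ALTER TABLE members ADD ROW ACCESS POLICY county_rows ON (county)").collect()
```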
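And for monitoring with email alerts, a sketch of a scheduled alert that emails when long-running queries appear. It assumes a pre-created notification integration (ops_email_int here, hypothetical) and a recipient the integration is allowed to send to:

```python
session.sql("""
    CREATE OR REPLACE ALERT long_query_alert
      WAREHOUSE = monitor_wh
      SCHEDULE = '60 MINUTE'
      IF (EXISTS (
        SELECT 1
        FROM snowflake.account_usage.query_history
        WHERE start_time >= DATEADD('hour', -1, CURRENT_TIMESTAMP())
          AND total_elapsed_time > 600000  -- longer than 10 minutes, in ms
      ))
      THEN CALL SYSTEM$SEND_EMAIL(
        'ops_email_int',
        'dba-team@example.com',
        'Snowflake: long-running queries detected',
        'One or more queries exceeded 10 minutes in the last hour.'
      )
""").collect()

# Alerts, like tasks, are created suspended.
session.sql("ALTER ALERT long_query_alert RESUME").collect()
```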
Documentation & Collaboration:
Maintain Snowflake technical documentation in compliance with data governance and security policies.
Engage with the open-source community and contribute to Snowflake’s libraries, such as Snowpark Python and the Snowflake Python Connector.
Mandatory Qualifications
Core Expertise:
Proficiency in data warehousing, data migration, and Snowflake.
Strong background in implementing, executing, and maintaining data integration solutions.
Snowflake Experience:
2–3 years of hands-on experience with the Snowflake platform, including Snowpipe and Snowpark.
Skilled in SnowSQL and PL/SQL, and in writing Snowflake stored procedures in SQL, Python, or Java (a registration sketch follows this list).
Experience optimizing Snowflake performance and performing real-time database monitoring.
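As an illustration of the stored-procedures requirement, a sketch that registers a Python procedure through Snowpark; the name and logic are hypothetical:

```python
from snowflake.snowpark import Session

def row_count(session: Session, table_name: str) -> int:
    """Return the number of rows in the given table."""
    return session.table(table_name).count()

# Type hints drive the inferred signature; pass is_permanent=True plus a
# stage_location to keep the procedure beyond this session.
session.sproc.register(
    func=row_count,
    name="row_count",
    packages=["snowflake-snowpark-python"],
    replace=True,
)

result = session.call("row_count", "CLAIMS")  # invoke like any procedure
```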
Cloud & Database Experience:
4–6 years of hands-on experience with cloud-based databases.
Strong understanding of database architecture, with excellent critical thinking and problem-solving skills.
Big Data & Distributed Computing:
Knowledge of big data tools and platforms, including experience with Hadoop and PySpark.
Familiarity with distributed computing frameworks such as Spark and Dask.
Cloud & Platform Experience:
Hands-on experience with AWS services.
Experience with data migration using Snowflake.
Proficient in Snowpark for Python.
Security & Access Control:
Understanding of security protocols and frameworks such as SAML, SCIM, OAuth, OpenID, Kerberos, and access policies/entitlements.
Required Certifications & Education
Snowflake Certification:
SnowPro Associate (minimum)
SnowPro Core (preferred)
SnowPro Advanced (given highest consideration)
Education:
Bachelor's degree in information technology, computer science, or a related field.