
Senior Data Engineer - DHCF

Company:
ShiftCode Analytics
Location:
Washington, DC, 20022
Posted:
April 01, 2026

Description:

JOB TITLE:

Senior Data Engineer

ENGAGEMENT TYPE:

Contract

WORK ARRANGEMENT:

Hybrid (Onsite & Remote) - 2-5 days onsite per week.

Candidate MUST be local to the DC Metro area.

SHORT DESCRIPTION:

DHCF is seeking a Senior Data Engineer to support the data modernization of its Medicaid data ecosystem.

Visa: US Citizens and Green Card holders only

COMPLETE JOB DESCRIPTION:

1. Position Purpose

The Senior Data Engineer serves as the primary technical engine for the agency's Medicaid data ecosystem.

This role is unique: it requires high-level mastery of our current legacy environment, characterized by SSIS ETL processes managed via Team Foundation Server (TFS), while actively spearheading the execution of our cloud modernization roadmap.

Under the direction of the Lead Data Warehouse Solution Architect, you will ensure the stability of current Medicaid reporting while building the future-state Azure Synapse and Databricks Lakehouse.

2. Key Responsibilities

A. Legacy Maintenance & Operational Excellence (Current State)

• ETL Management: Maintain, troubleshoot, and modify complex SSIS packages handling high-volume Medicaid claims, provider, and member data.

• Version Control: Manage code deployments and branching strategies within TFS, ensuring continuous integration of legacy SQL assets.

• Legacy Reporting Support: Support and optimize SSRS report queries and SSAS tabular/multidimensional models to ensure federal and state compliance reporting remains uninterrupted.

B. Modernization & Migration Execution (Future State)

• Cloud Development: Implement the "Medallion Architecture" (Bronze/Silver/Gold) using Azure Databricks (PySpark/SQL) as designed by the Lead Architect.

• Pipeline Refactoring: Lead the transition of legacy SSIS logic into Azure Data Factory (ADF) and Databricks notebooks.

• DevOps Transformation: Facilitate the migration of source control and CI/CD pipelines from TFS to Azure DevOps (Git).

• Synapse Integration: Build and tune Dedicated and Serverless SQL Pools within Azure Synapse to facilitate advanced analytics and AI-readiness.
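The Medallion (Bronze/Silver/Gold) layering referenced above can be sketched in plain Python over illustrative claims records. This is a conceptual sketch only: the field names and sample data are invented, and in practice each layer would be a Delta Lake table transformed with PySpark in Azure Databricks.

```python
# Conceptual sketch of Medallion layering (Bronze -> Silver -> Gold)
# over illustrative Medicaid claims records.

from collections import defaultdict

# Bronze: raw ingested records, kept as-is (including bad rows).
bronze = [
    {"claim_id": "C1", "provider": "P10", "amount": "125.50"},
    {"claim_id": "C2", "provider": "P10", "amount": "80.00"},
    {"claim_id": "C3", "provider": "P22", "amount": "not-a-number"},
]

def to_silver(rows):
    """Silver: validated, typed records; invalid rows are dropped."""
    silver = []
    for row in rows:
        try:
            silver.append({**row, "amount": float(row["amount"])})
        except ValueError:
            continue  # a real pipeline would quarantine this row
    return silver

def to_gold(rows):
    """Gold: business-level aggregate, e.g. total paid per provider."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["provider"]] += row["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'P10': 205.5}
```

Each layer only ever reads from the layer below it, which is the property that makes Medallion pipelines easy to re-run and audit.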

C. Data Governance & Security

• Medicaid Compliance: Implement Row-Level Security (RLS) and automated data masking for PHI/PII in accordance with HIPAA, CMS MARS-E, and NIST standards.

• Data Quality: Develop automated data validation frameworks to ensure data parity between legacy SQL systems and the new Cloud Lakehouse.
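The automated data masking mentioned in the Medicaid Compliance bullet can be sketched as follows. This is a minimal illustration, not the agency's actual implementation: the field names, salt handling, and record layout are assumptions, and production masking would use a managed secret (e.g. a key vault) and platform-native features.

```python
# Sketch of deterministic masking for PHI/PII columns before data
# reaches analyst-facing layers. A salted SHA-256 keeps joins possible
# without exposing raw identifiers. All names here are illustrative.

import hashlib

SALT = "rotate-me-in-production"  # assumption: a managed secret in practice
PHI_FIELDS = {"member_ssn", "member_name"}

def mask_value(value):
    """Deterministic, irreversible token for a sensitive value."""
    return hashlib.sha256((SALT + str(value)).encode()).hexdigest()[:16]

def mask_record(record):
    """Mask only the configured PHI fields; leave the rest untouched."""
    return {
        k: (mask_value(v) if k in PHI_FIELDS else v)
        for k, v in record.items()
    }

claim = {
    "claim_id": "C1",
    "member_ssn": "123-45-6789",
    "member_name": "Jane Doe",
    "paid_amount": 100.0,
}
masked = mask_record(claim)
print(masked["claim_id"], masked["member_ssn"])
```

Because the masking is deterministic, the same member hashes to the same token across tables, so masked datasets can still be joined.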
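The parity validation described in the Data Quality bullet can be sketched as a row-count and row-fingerprint comparison. The table layout and sample rows below are illustrative assumptions; a real framework would run against live query results from both systems.

```python
# Sketch of an automated parity check between a legacy SQL extract and
# the Lakehouse copy of the same table: compare row counts and per-row
# fingerprints. Column names and data are illustrative.

import hashlib

def row_fingerprint(row, columns):
    """Stable hash of a row's values in a fixed column order."""
    joined = "|".join(str(row[c]) for c in columns)
    return hashlib.sha256(joined.encode()).hexdigest()

def parity_report(legacy_rows, lake_rows, columns):
    legacy_hashes = {row_fingerprint(r, columns) for r in legacy_rows}
    lake_hashes = {row_fingerprint(r, columns) for r in lake_rows}
    return {
        "row_count_match": len(legacy_rows) == len(lake_rows),
        "missing_in_lake": len(legacy_hashes - lake_hashes),
        "extra_in_lake": len(lake_hashes - legacy_hashes),
    }

columns = ["claim_id", "member_id", "paid_amount"]
legacy = [
    {"claim_id": "C1", "member_id": "M1", "paid_amount": 100.0},
    {"claim_id": "C2", "member_id": "M2", "paid_amount": 55.25},
]
lake = [
    {"claim_id": "C1", "member_id": "M1", "paid_amount": 100.0},
    {"claim_id": "C2", "member_id": "M2", "paid_amount": 55.25},
]
report = parity_report(legacy, lake, columns)
print(report)
```

Hashing whole rows keeps the comparison cheap even for large tables, at the cost of not pinpointing which column diverged; a follow-up column-level diff would localize any mismatch.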

3. Competencies for Success

• Technical Agility: The ability to pivot between a 10-year-old SSIS package and a modern Databricks Spark job in the same day.

• Collaboration: Ability to take high-level architectural blueprints from the Lead Architect and translate them into high-performance, production-ready code.

• Attention to Detail: Absolute precision in Medicaid data handling, where an error in logic can impact member benefits or federal funding.

ADDITIONAL JOB REQUIREMENTS:

1. Leads the adoption or implementation of an advanced technology or platform.

2. Expert on the functionality or usage of a particular system, platform, or technology product.

3. Serves as a consultant to clients, guiding the efficient use or adoption of a particular IT product or platform.

4. Creates implementation, testing, and/or integration plans.

5. Demonstrates expertise in a particular IT platform or service, maximizing the value of the agency's IT investment.

MINIMUM EDUCATION/CERTIFICATION REQUIREMENTS:

• Bachelor's degree in Information Technology or a related field, or equivalent experience

• Training or certification in a particular product or IT platform/service, as required

793454 - DHCF Senior Data Engineer

CLIENT REQUIREMENTS

Skills are listed with Required/Highly Desired status and the years of experience desired; the candidate's own years of experience are to be supplied with the application.

1. Maintaining SQL Server (SSIS/SSAS/SSRS) while concurrently deploying Azure cloud solutions. Required, 7 yrs.

2. Expert-level proficiency in SSIS and T-SQL. Advanced proficiency in Azure Databricks (Unity Catalog, Delta Lake) and Azure Synapse. Required, 7 yrs.

3. Deep experience with TFS (Team Foundation Server) and a strong desire to migrate workflows to Git/Azure DevOps. Required, 7 yrs.

4. Mastery of SQL and Python (PySpark). Required, 7 yrs.

5. 6-10 yrs. leading advanced technology projects or service projects. Required, 7 yrs.

6. 6-10 yrs. full system engineering lifecycle. Required, 7 yrs.

7. Bachelor's degree. Required.

8. Experience with Microsoft Purview for data cataloging. Highly Desired, 3 yrs.

9. Extensive experience with Medicaid/Medicare data structures (e.g., MMIS, EDI 837/835, claims processing). Highly Desired, 3 yrs.

10. Microsoft Certified: Azure Data Engineer Associate or Databricks Certified Professional Data Engineer. Highly Desired.

11. Experience in a government or highly regulated environment. Highly Desired, 7 yrs.

12. Accepts in-person interview if requested. Required (Y/N).

13. Accepts hybrid work requirement. Required (Y/N).
