Salary: 130k to 140k
We’re looking for a skilled Data Engineer to join our growing team and help shape the backbone of our data infrastructure. In this role, you’ll design, build, and maintain robust data pipelines using SSIS and other ETL tools, working with large-scale datasets to support analytics and business intelligence.
Key Responsibilities:
Design, develop, and maintain ETL workflows in SSIS to process structured and semi-structured data
Work with large datasets to build scalable data pipelines and optimize data flows across various platforms
Collaborate with data analysts, data scientists, and other engineers to deliver high-quality data solutions
Integrate data from diverse sources (on-prem and cloud) into centralized repositories or data lakes
Ensure data integrity, security, and performance across the pipeline lifecycle
Participate in the development of data warehousing solutions and contribute to data modeling efforts
Requirements:
3+ years of hands-on experience with SSIS and building ETL pipelines
Solid understanding of relational databases (e.g., SQL Server, Oracle) and data warehousing concepts
Experience working with big data technologies (e.g., Hadoop, Spark, Databricks, or similar)
Strong SQL skills and familiarity with scripting languages (e.g., Python or PowerShell)
Knowledge of data governance, version control, and workflow automation tools
Experience with cloud platforms (Azure, AWS, or GCP) is a plus