
AWS Data Engineer

Company:
msysinc
Location:
Remote, OR
Posted:
May 02, 2025

Description:

Title: AWS Data Engineer

Location: Remote

Length: Long term

Restriction: W2 or C2C

Description:

*** Webcam interview; long-term project with an initial PO for 1 year and multi-year extensions *** ***Remote*** Three references required

Project: Data Lake Implementation and Maintenance

The client is looking to build a scalable and efficient data lake on the AWS and Azure platforms to ingest and process a variety of data sources, including ERP data, labor/workforce data, business data, affordability data, health data, and more. The data lake will follow a medallion architecture, with these resources focused on hydrating the bronze and silver layers. The selected data engineering staff will be responsible for designing and implementing the data ingestion workflows, data quality checks, and data transformation pipelines that populate the bronze and silver layers of the data lake. Key requirements include expertise in AWS data services (e.g., S3, Glue, Athena, Redshift), Azure Data Factory, data modeling and optimization, data management, data security, and data governance, all applied in close collaboration with technologists, data analysts, and business stakeholders. The successful vendor will demonstrate a proven track record of building and maintaining large-scale, enterprise-grade data lake solutions.
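As a rough illustration of the bronze-layer hydration work described above, the following is a minimal AWS Glue (PySpark) job sketch. The bucket names, paths, and dataset names are hypothetical placeholders rather than project specifics; an actual job would depend on the client's source systems and naming conventions.

import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job setup: resolve job arguments and create the Glue/Spark contexts.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw JSON files from a landing bucket (hypothetical path) into a DynamicFrame.
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-landing-bucket/erp/"]},
    format="json",
)

# Land the data, unmodified apart from the storage format, in the bronze layer as Parquet.
glue_context.write_dynamic_frame.from_options(
    frame=raw,
    connection_type="s3",
    connection_options={"path": "s3://example-datalake-bucket/bronze/erp/"},
    format="parquet",
)

job.commit()

A production version of this job would typically also partition the bronze data by ingestion date, register the table in the Glue Data Catalog, and record lineage and quality metadata.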

Data engineers will be responsible for designing and implementing data ingestion workflows that pull data from a variety of sources, including government operations, business data, workforce data, and health data; domain expertise in one or more of these areas is preferred. They will build robust, scalable, and efficient ELT (Extract, Load, Transform) patterns using AWS services such as S3, Glue, Athena, and Redshift. The data engineers will also be responsible for designing the data models and managing the schemas for the bronze and silver layers of the medallion architecture, as well as implementing data quality checks, data lineage tracking, and metadata management to ensure data governance and compliance.

Additionally, they will automate data pipeline workflows using tools like AWS Glue, AWS Step Functions, or Apache Airflow (see the orchestration sketch below), and collaborate closely with technical staff, data analysts, data scientists, and business stakeholders. The selected data engineers should have a minimum of 3-5 years of experience building and maintaining large-scale, enterprise-grade data lake solutions on the AWS platform and will be required to maintain comprehensive documentation of the data lake architecture, data models, and data pipeline workflows. Engineers will also support a few existing Azure Data Factory workloads.
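As a sketch of the kind of pipeline automation mentioned above, the following Apache Airflow DAG chains two hypothetical Glue jobs: one hydrating the bronze layer and one transforming bronze into silver. The job names, schedule, and region are illustrative assumptions only, not part of the client's environment.

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

# Daily DAG that runs the bronze ingestion job, then the silver transformation job.
# Job names and region are placeholders; real values depend on the deployed Glue jobs.
with DAG(
    dag_id="datalake_bronze_to_silver",
    schedule="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:

    ingest_bronze = GlueJobOperator(
        task_id="ingest_bronze",
        job_name="erp_bronze_ingest",  # hypothetical Glue job name
        region_name="us-east-1",
        wait_for_completion=True,
    )

    transform_silver = GlueJobOperator(
        task_id="transform_silver",
        job_name="erp_silver_transform",  # hypothetical Glue job name
        region_name="us-east-1",
        wait_for_completion=True,
    )

    # The silver transformation runs only after bronze ingestion succeeds.
    ingest_bronze >> transform_silver

An AWS Step Functions state machine could provide equivalent orchestration; the choice usually comes down to whether the team is already standardized on Airflow.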
