Overview
We are seeking a talented and experienced Data Engineer to join our dynamic team at Daston.
The ideal candidate will possess a strong background in Google Cloud Platform (GCP) technologies or other public cloud service providers (CSPs), coupled with proficiency in Python and SQL.
This role focuses on building and maintaining robust data architectures, optimizing data platforms, and ensuring data quality and governance.
Key Responsibilities:
- Design, build, and maintain data platforms using GCP services.
- Design, implement, and optimize SQL for ETL/ELT data workflows.
- Write efficient and maintainable Python code for data processing and automation.
- Collaborate with public sector organizations to understand data requirements and deliver scalable solutions.
- Implement data quality checks and monitoring processes to ensure data integrity (an illustrative sketch follows this list).
- Troubleshoot and resolve data-related issues in a timely manner.
- Stay up to date with the latest GCP technologies and best practices.
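For context on the kind of data quality check this role involves, the snippet below is a minimal, purely illustrative sketch in Python using the BigQuery client library. The project, table, column, and 1% null-rate threshold are hypothetical examples chosen for the sketch, not Daston systems.

```python
# Illustrative sketch only: a simple null-rate check on a hypothetical table.
from google.cloud import bigquery

client = bigquery.Client()  # relies on application-default credentials

QUERY = """
SELECT
  COUNT(*) AS total_rows,
  COUNTIF(customer_id IS NULL) AS null_customer_ids
FROM `example-project.analytics.orders`
WHERE order_date = CURRENT_DATE()
"""

# Run the query and pull the single summary row.
row = next(iter(client.query(QUERY).result()))
null_rate = (row.null_customer_ids / row.total_rows) if row.total_rows else 0.0

# Fail the pipeline step if the null rate exceeds the (hypothetical) 1% threshold.
if null_rate > 0.01:
    raise ValueError(f"customer_id null rate {null_rate:.2%} exceeds threshold")
print(f"Quality check passed: {row.total_rows} rows, null rate {null_rate:.2%}")
```

In practice, a check like this would typically run as a step in an orchestrated pipeline or CI/CD workflow rather than as a standalone script.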
Required Skills and Experience:
- Strong understanding of Google Cloud Platform (GCP) services (e.g., BigQuery, Dataflow, Dataproc, Cloud Storage, Cloud Run).
- Proficiency in the Python programming language.
- Advanced SQL skills for data manipulation and analysis.
- Experience in designing and implementing data pipelines.
- Knowledge of data lake and data warehousing concepts.
- Knowledge of data modeling and database design principles.
- Ability to work independently and as part of a team.
- Excellent problem-solving and communication skills.
Preferred Qualifications:
- GCP certifications (e.g., Professional Data Engineer).
- Experience with data visualization tools.
- Experience with Infrastructure as Code and serverless technologies is a plus.
- Familiarity with CI/CD pipelines for data engineering workflows.

Education:
Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field. Relevant experience is acceptable in lieu of a degree.