Job Description
We are looking for candidates who have a broad set of technical skills across these areas. You will:
Work in a collaborative environment that leverages pair programming
Work on a small agile team to deliver curated data products
Work effectively with fellow data engineers, product owners, data champions and other technical experts
Demonstrate technical knowledge and communication skills with the ability to advocate for well-designed solutions
Develop exceptional analytical data products using both streaming and batch ingestion patterns on Google Cloud Platform, applying sound data warehousing principles
Be the Subject Matter Expert in Data Engineering, with a focus on GCP-native services and other well-integrated third-party technologies
Skills Required:
Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment
Experience implementing methods to automate all parts of the pipeline, minimizing manual effort in development and production
Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple sources to build analytical domains and reusable data products
Experience working with architects to evaluate and productionize data pipelines for data ingestion, curation, and consumption
Experience working with stakeholders to frame business problems as technical data requirements, identifying and implementing technical solutions while ensuring, in collaboration with product management, that key business drivers are captured
Experience Required:
5+ years of SQL development experience
5+ years of analytics/data product development experience
3+ years of Google Cloud experience with solutions designed and implemented at production scale
Experience working with GCP-native (or equivalent) services such as BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Build, etc.
Experience migrating Teradata to GCP
Experience working with Airflow for scheduling and orchestration of data pipelines
Experience working with Terraform to provision Infrastructure as Code
2+ years of professional development experience in Java or Python
Education Required:
Bachelor's degree in computer science or a related scientific field
Education Preferred:
GCP Professional Data Engineer certification
Master's degree in computer science or related field
2+ years mentoring engineers
In-depth software engineering knowledge
Company Description
Founded in 1998 and headquartered in Farmington Hills, MI, Kyyba has a global presence delivering high-quality resources and top-notch recruiting services, enabling businesses to effectively respond to organizational changes and technological advances.
At Kyyba, the overall well-being of our employees and their families is important to us. We are proud of our work culture, which embodies our core values of value, passion, excellence, empowerment, and happiness, creating a vibrant and productive atmosphere. We empower our employees with the resources, incentives, and flexibility they need to support a healthy, balanced, and fulfilling career, offering valuable benefits and a balanced compensation structure combined with career development.
Kyyba is an Equal Opportunity Employer.
Kyyba does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status, or any other basis protected by applicable law.