Data Engineer
• Be a critical senior member of a data engineering team focused on creating distributed analysis capabilities around a large variety of datasets
• Take pride in software craftsmanship, apply a deep knowledge of algorithms and data structures to continuously improve and innovate
• Work with other top-level talent solving a wide range of complex and unique challenges that have real world impact
• Explore relevant technology stacks to find the best fit for each dataset
• Pursue opportunities to present our work at relevant technical conferences
o Google Cloud Next 2019
o GraphConnect 2015
o Google Cloud Blog
• Apply your talent to the projects that matter; strength of ideas trumps position on an org chart
You should have:
• At least 7 years' experience in software engineering
• At least 2 years' experience with Go
• Proven experience (2 years) building and maintaining data-intensive APIs using a RESTful approach
• Experience with stream processing using Apache Kafka
• Comfort with unit testing and test-driven development methodologies
• Familiarity with creating and maintaining containerized application deployments with a platform like Docker
• A proven ability to build and maintain cloud-based infrastructure on a major cloud provider like AWS, Azure, or Google Cloud Platform
• Experience with data modeling for large-scale databases, either relational or NoSQL
Bonus points for:
• Experience with protocol buffers and gRPC
• Experience with Google Cloud Platform, Apache Beam and/or Google Cloud Dataflow, and Google Kubernetes Engine or Kubernetes