
GCP CloudOps Engineer

Company:
neteffects
Location:
Maryland Heights, MO, 63043
Posted:
April 15, 2024

Description:

W2 candidates only, please; we cannot use C2C for this position.

Very long-term contract position (starts as 12+ months, but will most likely be a few years in duration)

Rate: $90-95/hr. W2 with benefits/PTO

GCP CloudOps Engineer (with Data intensive environment experience)

Team:

Focus is mainly on building data pipelines and solutions for environmental data sets

Will be the only person in this role on the team (other data engineers and data stewards are on the sub-team). There are peers in other teams doing the same role they will be able to speak with, but the expectation is to complete this team's needs independently.

Role:

Mainly work on deploying and maintaining cloud infrastructure.

Adding observability to their pipelines & services

Identify opportunities for workflow optimization & cloud cost effectiveness/savings.

Implementing the CI/CD pipelines for their data pipelines

Cloud: GCP preferred, will be expanding footprint in GCP for a bunch of projects.

GCP is their preference; AWS is acceptable. Given two profiles equal in all other skills, they will select GCP proficiency over AWS.

Experience with multi-cloud preferable (GCP/AWS preferred)

IaC - Terraform
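As a brief, hedged illustration of the Terraform-on-GCP skill set (the project ID, bucket name, and region below are hypothetical placeholders, not the client's actual environment), a minimal configuration might look like:

```hcl
# Minimal sketch: a GCS bucket for pipeline artifacts.
# All names and values are illustrative placeholders.
provider "google" {
  project = "example-project"   # hypothetical project ID
  region  = "us-central1"
}

resource "google_storage_bucket" "pipeline_artifacts" {
  name                        = "example-pipeline-artifacts"
  location                    = "US"
  uniform_bucket_level_access = true
}
```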

Programming – Golang and/or Python – used for automation.

Golang preferred (working with a lot of Google resources; it is the native language for writing automation for much of what they are working on)

Python also preferred.

Bash – write command-line scripts, etc.

Containerization: apps deployed in Docker, and Kubernetes (orchestration)

Not a software dev role, but need to be able to develop or implement automation.

Monitoring and adding observability to their services is a critical function – proficiency in tools such as Grafana, Prometheus, etc. is needed.

Build the baseline, then identify the SLIs, and build the SLOs/SLAs to commit to customers.
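The SLI-to-SLO step above is mostly arithmetic; a minimal Go sketch (all numbers are illustrative, not the client's actual targets) of computing an availability SLI and error-budget consumption:

```go
package main

import "fmt"

func main() {
	// Illustrative numbers: total and failed requests over a window.
	total, failed := 1_000_000.0, 450.0

	// Availability SLI, in percent.
	sli := (total - failed) / total * 100

	// Example SLO target, in percent (hypothetical).
	slo := 99.9

	// Error budget: how many failures the SLO permits in this window,
	// and what fraction of that budget has been consumed.
	budget := (100 - slo) / 100 * total
	consumed := failed / budget * 100

	fmt.Printf("SLI: %.4f%%\n", sli)
	fmt.Printf("Error budget consumed: %.1f%%\n", consumed)
}
```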

Security principles – building secure and compliant solutions.

DATA Background:

Extensive cloud infrastructure management in a data-intensive environment (because they are a data engineering team).

THIS MEANS:

Work experience handling terabytes or petabytes of data; the magnitude of the data matters. The solution will be different for processing big data as opposed to dealing with megabytes or gigabytes of data.

Modern data processing technologies, data pipelines, or workflows involved. Meaning: data coming from various different sources, such as data collected by IoT, with data streaming, data ingestion, and data pipelining involved.

SCALE MATTERS & PERFORMANCE MATTERS – there are different solutions for huge data and for working with modern data processing technologies.
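The posting doesn't name a specific streaming stack, but the ingest-then-transform pattern it describes can be sketched with Go channels – a toy stand-in for real streaming infrastructure such as Kafka or Pub/Sub (record formats and stage names are hypothetical):

```go
package main

import (
	"fmt"
	"strings"
)

// ingest emits raw records as if they were arriving from an IoT source.
func ingest(records []string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for _, r := range records {
			out <- r
		}
	}()
	return out
}

// transform normalizes each record; a stand-in for a pipeline stage.
func transform(in <-chan string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for r := range in {
			out <- strings.ToUpper(strings.TrimSpace(r))
		}
	}()
	return out
}

func main() {
	raw := []string{" sensor-a:21.5 ", "sensor-b:19.8"}
	for rec := range transform(ingest(raw)) {
		fmt.Println(rec)
	}
}
```

At real scale each stage would be a separate service reading from and writing to a durable log, but the staged, decoupled structure is the same.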

Background in a data environment & familiar with Data Engineering concepts.

A lot of data products in the CS Warehouse are built on top of Google BigQuery.

A lot of the data pipelines or services will have input/output connections with the data warehouse, which is where a basic understanding of SQL is required.

SQL: be able to manipulate data or build automation scripts using SQL.

Know how SQL and Kafka work; understand the concepts.

7 years of experience

Strong Communication: ability to articulate technical concepts, explain your choice of technology, and give the reasons behind technical decisions.

Desirable:

Domain: Geospatial tools, or working with geospatial data or environmental or Ag data

Machine learning – exploring some projects around building analytics solutions and feature engineering to support the modeling teams.

Data preprocessing – e.g., feature engineering, preparing data.
