
Data Engineer

Company: Cohesive Technologies
Location: Dallas, TX
Posted: December 13, 2023

Description:

Roles & Responsibilities:

• Apply data and systems engineering principles to develop code spanning the data lifecycle (ingest, transform, consume) end to end, from source to consumption, for operational and analytical workloads, minimizing complexity and maximizing business value.

• Work as part of an agile scrum team to deliver business value.

• Participate in design sessions to understand customers' functional needs.

• Work with the solution architect and development team to build quick prototypes leveraging existing or new architecture.

• Provide the end-to-end flow for a data process and map technical solutions to that process.

• Develop and deploy code in continuous development pipelines, leveraging off-the-shelf and open-source components of Enterprise Data Warehouse, ETL, and Data Management processes, adhering to the solution architecture.

• Perform software analysis, code analysis, requirements analysis, release analysis and deployment.

Experience & Qualifications:

• Hands-on development experience in distributed, analytical, cloud-based, and/or open-source technologies.

• 10+ years of professional experience building software for data ingestion/data movement ETL pipelines for operational and/or analytical systems.

• Hands-on programming experience in Python, Java, and Scala.

• Strong implementation skills in Redshift, Kafka, and real-time streaming solutions.

• Expertise coding and implementing data pipelines in cloud-based data infrastructure, analytical databases, and NoSQL databases (e.g., AWS, Snowflake, MongoDB, Postgres).

• Experience leveraging build and deploy tools (e.g., GitHub, Gradle, Maven, Jenkins).

• Ability to travel up to 10% of the time.

• Bachelor's degree or higher.

• Experience implementing software leveraging flow-based pipelines such as NiFi or Airflow, and streaming services such as Kafka.

• Experience building data pipeline frameworks.
