
Data Systems Engineer – ELK / Kafka / Linux

Company:
MARKS IT Solutions
Location:
Alpharetta, GA
Posted:
February 10, 2026

Description:

Job Role: Data Systems Engineer – ELK / Kafka / Linux

Team: Real-Time Operations Intelligence (RTOI) – Enterprise Computing

Location: Hybrid – Alpharetta, GA or Menlo Park, CA (3 days onsite per week)

Experience Level: 7–15 years

Education: Bachelor’s Degree preferred (not required)

Industry Background (a plus): Financial services / Banking / Investment banking

The Real-Time Operations Intelligence (RTOI) team streams terabytes of data daily to support enterprise-scale operational and business intelligence platforms. The team builds and supports large-scale, real-time ETL and streaming pipelines using Kafka, ELK (Elasticsearch), Snowflake, Hadoop, and Linux-based job frameworks.

This role is ideal for a hands-on Data Systems Engineer who is equally comfortable with application development, data engineering, Linux-based deployment, and production support. The engineer will work across the full development lifecycle and support hundreds of internal customers relying on real-time data systems.

Responsibilities

(Including but not limited to)

• Design, develop, deploy, and support real-time data pipelines using Kafka and ELK (Elasticsearch); a sketch of such a pipeline follows this list.

• Build and maintain large-scale ETL and streaming frameworks running on Linux platforms.

• Develop and run applications directly on Linux, including debugging CPU, memory, and performance issues.

• Support and monitor pipelines running across large-scale Kafka clusters, ensuring high availability and scalability.

• Troubleshoot and resolve production issues; ensure jobs are up and running for hundreds of internal users.

• Work with data storage and indexing in Elasticsearch, understanding how data is written, stored, and queried.

• Participate in the full software development lifecycle: requirements, design, implementation, testing, deployment, and support.

• Collaborate closely with cross-functional teams and communicate technical concepts clearly.

• Continuously learn new tools and technologies and contribute hands-on in a fast-paced environment.
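
For illustration, a minimal sketch of the kind of Kafka-to-Elasticsearch pipeline described in the first bullet, assuming the kafka-python and elasticsearch (8.x) client libraries; the topic, broker, consumer group, and index names are all hypothetical:

    # Minimal sketch: consume JSON events from a Kafka topic and index each
    # one into Elasticsearch. Library choices and all names are illustrative.
    import json

    from kafka import KafkaConsumer
    from elasticsearch import Elasticsearch

    consumer = KafkaConsumer(
        "ops-events",                              # hypothetical topic
        bootstrap_servers=["localhost:9092"],      # hypothetical broker
        group_id="rtoi-indexer",                   # hypothetical group
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )
    es = Elasticsearch("http://localhost:9200")

    for message in consumer:
        # message.value is the deserialized JSON event; index it as-is.
        es.index(index="ops-events", document=message.value)

A production pipeline would add batching (e.g., bulk indexing), error handling, and offset management, but the consume-transform-index loop above is the core shape of the work.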

Required Qualifications

• Strong, hands-on experience working on the Linux platform (development, deployment, debugging).

• 7+ years of overall professional experience in software and/or data engineering.

• Strong application development experience with:

  • Python (primary)

  • Ruby or Shell scripting (secondary)

• Experience building and maintaining Kafka-based data pipelines.

• Hands-on experience with ELK (Elasticsearch) for data ingestion, storage, and observability.

• Ability to understand and debug application behavior related to CPU, memory, and system performance (a small /proc-based example follows this list).

• Experience working in distributed systems environments, with an understanding of scalability and trade-offs.

• Strong communication skills, team collaboration, curiosity, and a willingness to “get their hands dirty.”
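
As a small illustration of the Linux-level debugging called for above, a minimal sketch that reads a process's resident memory and thread count straight from the /proc filesystem, using only the Python standard library (the script and field choices are illustrative, not a tool named in this posting):

    # Minimal sketch: report a process's resident memory and thread count by
    # parsing /proc/<pid>/status on Linux (no third-party tools).
    import sys

    def proc_status(pid: int) -> dict:
        """Parse /proc/<pid>/status into a {field: value} dict."""
        fields = {}
        with open(f"/proc/{pid}/status") as f:
            for line in f:
                key, _, value = line.partition(":")
                fields[key] = value.strip()
        return fields

    if __name__ == "__main__":
        status = proc_status(int(sys.argv[1]))
        # VmRSS is the resident set size; Threads is the current thread count.
        print("RSS:    ", status.get("VmRSS", "n/a"))
        print("Threads:", status.get("Threads", "n/a"))

Invoked as, for example, python proc_status.py 1234.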

Preferred Qualifications

• Experience with the Snowflake cloud data platform.

• Experience with Spark or large-scale data processing frameworks.

• Strong data analysis background.

• Experience with Flink.

• ELK / Elasticsearch certification (Observability or Data Analysis).

• Experience with cloud platforms (AWS or similar).

• Experience supporting mission-critical, real-time systems.

Technical Environment

Languages: Python, Ruby, Shell (Java, C/C++, or Go a plus)

Streaming & Data: Kafka, Elasticsearch (ELK), Snowflake, Hadoop

Platforms: Linux (on-prem and cloud)

Databases: SQL-based systems

Focus Areas: Real-time streaming, observability, scalability, and operational support

Interview Process

• Technical Screening (1 hour) – Focus on Linux experience and hands-on technical background

• Onsite Technical Panel – With senior team members (Ying-Yi & Yenni)

Additional Notes

• This is not a narrow or cookie-cutter data engineering role.

• Candidates must be both data engineers and application developers, not tooling-only profiles.

• The role includes development, deployment, and production support.

• The team works directly in Linux environments; deep Linux knowledge is critical.
