ITTConnect is seeking a senior Confluent Kafka/Python Engineer to work on-site for a client in the Columbus, OH area. This is a contract position with a client that is a global leader in consulting, digital transformation, technology, and engineering services, with over 300,000 team members in nearly 50 countries. The end client is in the Utilities sector.
Job location: Columbus, OH or Nashville, TN (on-site work at either location)
We are seeking a Senior Kafka / Python Developer to support real-time data streaming initiatives. This role requires deep hands-on experience with the Confluent Kafka platform along with strong Python development skills to build microservices, connectors, and streaming workflows.
The ideal candidate is not just familiar with Kafka but has actively built and operated event-driven systems at scale.
Key Responsibilities:
Design and implement real-time data streaming solutions using Confluent Kafka
Develop Kafka producers/consumers, connectors, and streaming pipelines
Build Python-based microservices supporting event-driven workflows
Work with Kafka Streams / ksqlDB / Confluent components as needed
Design and implement event-driven architectures across distributed systems
Ensure reliability, scalability, and performance of streaming pipelines
Troubleshoot data flow, lag, and message processing issues
Collaborate with cross-functional teams integrating Kafka into enterprise systems
Requirements:
Strong hands-on experience with Confluent Kafka (not just open-source Kafka exposure)
Experience building producers, consumers, and streaming pipelines
Strong Python development experience (microservices, data processing)
Solid understanding of event-driven architecture
Experience working with real-time or near real-time data systems
Ability to debug and optimize Kafka performance (lag, throughput, partitions)
Preferred Qualifications (Nice-to-Have):
Experience with Kafka Streams, ksqlDB, or Schema Registry
Experience building or managing Kafka connectors (source/sink)
Exposure to cloud environments (Azure/AWS)
Experience integrating Kafka with APIs, data platforms, or microservices ecosystems
Familiarity with CI/CD and containerization (Docker/Kubernetes)
Familiarity with Utilities companies