I have used Kafka Streams in Java, but I could not find a similar API in Python. Does Apache Kafka support stream processing in Python?

- There is https://github.com/wintoncode/winton-kafka-streams -- this is not part of Apache Kafka. I don't know how stable it is and if it's suitable for production yet. – Matthias J. Sax Aug 19 '18 at 17:43
- And there is also https://github.com/robinhood/faust – miguno Aug 20 '18 at 07:05
3 Answers
Kafka Streams is only available as a JVM library, but there are a few comparable Python implementations of it:
- robinhood/faust (not maintained as of 2020, but was forked; a minimal sketch follows this list)
- wintoncode/winton-kafka-streams (appears not to be maintained)
- fluvii (see discussion)
- bytewax
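For illustration, here is a rough sketch using the faust-streaming fork's API; the app name, broker address, topic name, and the Order record are placeholder assumptions, not anything prescribed by the library:

```python
import faust

# Placeholder app/broker/topic names for illustration only
app = faust.App("orders-app", broker="kafka://localhost:9092")

class Order(faust.Record):
    account_id: str
    amount: float

orders_topic = app.topic("orders", value_type=Order)

@app.agent(orders_topic)
async def process_order(orders):
    # Stream-process each Order record as it arrives
    async for order in orders:
        print(f"Order for {order.account_id}: {order.amount}")

if __name__ == "__main__":
    app.main()  # equivalent to running the module with `faust ... worker`
```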
In theory, you could try playing with Jython or Py4j to work with the JVM implementation, but that would probably require more work than it is worth.
Outside of those options, you can also try Apache Beam, Flink, or Spark, but each requires an external cluster scheduler to scale out (and also requires a Java installation).
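For example, Apache Beam's Python SDK can read from Kafka through a cross-language transform (which starts a Java expansion service behind the scenes). A rough sketch, with placeholder broker and topic names:

```python
import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

# Placeholder broker/topic names; a JVM is still needed for the
# expansion service that backs ReadFromKafka.
options = PipelineOptions()
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromKafka" >> ReadFromKafka(
            consumer_config={"bootstrap.servers": "localhost:9092"},
            topics=["orders"],
        )
        | "TakeValue" >> beam.Map(lambda kv: kv[1])  # records arrive as (key, value)
        | "Print" >> beam.Map(print)
    )
```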
If you are okay with HTTP methods, then running a ksqlDB instance (again, requiring Java for that server) and invoking its REST interface from Python with the built-in SQL functions can work. However, building your own functions there will require writing JVM-compiled code, last I checked.
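A rough sketch of calling the ksqlDB REST API from Python with `requests`; the server address, stream name, and topic are placeholder assumptions:

```python
import requests

# Placeholder ksqlDB address; the server itself still runs on the JVM
KSQLDB_URL = "http://localhost:8088"
HEADERS = {"Content-Type": "application/vnd.ksql.v1+json"}

# Define a stream over an existing Kafka topic via the /ksql endpoint
statement = {
    "ksql": "CREATE STREAM orders (account_id VARCHAR, amount DOUBLE) "
            "WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');",
    "streamsProperties": {},
}
requests.post(f"{KSQLDB_URL}/ksql", headers=HEADERS, json=statement).raise_for_status()

# Run a push query against the stream via the /query endpoint (chunked response)
query = {
    "ksql": "SELECT account_id, amount FROM orders EMIT CHANGES LIMIT 5;",
    "streamsProperties": {"ksql.streams.auto.offset.reset": "earliest"},
}
with requests.post(f"{KSQLDB_URL}/query", headers=HEADERS, json=query, stream=True) as resp:
    for line in resp.iter_lines():
        if line:
            print(line.decode("utf-8"))
```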
If none of those options are suitable, then you're stuck with the basic consumer/producer methods.
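For completeness, the basic approach with the kafka-python client looks like the sketch below (confluent-kafka offers an equivalent API); broker and topic names are placeholders:

```python
from kafka import KafkaConsumer, KafkaProducer

# Placeholder broker/topic names for illustration
producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("orders", key=b"account-1", value=b'{"amount": 9.99}')
producer.flush()

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    group_id="my-python-app",
    auto_offset_reset="earliest",
)
for message in consumer:
    # "Stream processing" by hand: transform each record as it arrives
    print(message.key, message.value)
```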

- Is there any example or tutorials to use https://docs.confluent.io/current/ksql/docs/tutorials/index.html#ksql-tutorials with faust streaming? – Mahamutha M Apr 08 '19 at 06:47
- KSQL is implemented in Java, so I'm not sure I understand the question – OneCricketeer Apr 08 '19 at 22:31
- @circket_007, KSQL is not available in python. This is what you mean. Am I right? – Mahamutha M Apr 09 '19 at 04:09
- @Maha KSQL server has a REST API, so you can submit queries from any language – OneCricketeer Apr 11 '19 at 00:58
- btw: here is the direct link to the forked project: https://github.com/faust-streaming/faust – coproc Nov 04 '22 at 11:25
If you are using Apache Spark, you can use Kafka as the producer and Spark Structured Streaming as the consumer, with no need to rely on third-party libraries like Faust.
To consume Kafka data streams in Spark, use the Structured Streaming + Kafka Integration Guide.
Keep in mind that you will have to add the spark-sql-kafka package when using spark-submit:
spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1 StructuredStreaming.py
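For illustration, a minimal StructuredStreaming.py along those lines might look like the following sketch (broker and topic names are placeholders):

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("StructuredStreaming").getOrCreate()

# Placeholder broker/topic names; records arrive as binary key/value columns
df = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "orders")
    .option("startingOffsets", "earliest")
    .load()
)

values = df.select(col("key").cast("string"), col("value").cast("string"))

# Write the decoded stream to the console for demonstration
query = (
    values.writeStream
    .format("console")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```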
This solution has been tested with Spark 3.0.1 and Kafka 2.7.0 with PySpark.
This resource can also be useful.

Previously there was no Kafka Streams-style Python API, but now there is the kstreams Python library: https://pypi.org/project/kstreams/ (a minimal example follows the feature list below).
Features:
- Produce events
- Consume events with Streams
- Prometheus metrics and custom monitoring
- TestClient
- Custom Serialization and Deserialization
- Easy to integrate with any async framework. Not tied to any library!
- Yield events from streams
- Store (Kafka Streams pattern)
- Stream Join
- Windowing
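A rough sketch based on the documented kstreams API; the engine title, topic name, and payload are placeholder assumptions:

```python
import asyncio
from kstreams import create_engine

# Placeholder engine title and topic name; kstreams wraps aiokafka under the hood
stream_engine = create_engine(title="orders-engine")

@stream_engine.stream("orders")
async def consume(stream):
    # Iterate over incoming ConsumerRecords from the "orders" topic
    async for cr in stream:
        print(f"Consumed: key={cr.key}, value={cr.value}")

async def main():
    await stream_engine.start()
    # Produce a test event to the same topic
    await stream_engine.send("orders", value=b'{"amount": 9.99}')
    await asyncio.sleep(5)  # let the stream consume for a few seconds
    await stream_engine.stop()

if __name__ == "__main__":
    asyncio.run(main())
```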

- Those last three features are not implemented, according to the docs – OneCricketeer Jan 22 '23 at 14:34