I am looking for the right library to use from PySpark to fetch a topic's schema from the Kafka Schema Registry and decode the data. The examples I can find are written in Scala; does anyone know what code/library I should use to do the same thing in PySpark?
2 Answers
You can use the `requests` package to send requests to the Schema Registry REST API and fetch the schema for your topic. If you are only listening to a few specific topics, you can also cache their schemas on the Spark side and reuse them.
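A minimal sketch of that approach, assuming a Confluent-style Schema Registry. The registry URL, topic name, and bootstrap servers are placeholders, and `from_avro` requires the spark-avro package on the classpath:

```python
import requests
from pyspark.sql import SparkSession
from pyspark.sql.functions import expr
from pyspark.sql.avro.functions import from_avro  # needs org.apache.spark:spark-avro on the classpath

SCHEMA_REGISTRY_URL = "http://schema-registry:8081"  # placeholder
TOPIC = "my-topic"                                   # placeholder

# Fetch (and effectively cache) the latest Avro schema for the topic's value subject.
resp = requests.get(f"{SCHEMA_REGISTRY_URL}/subjects/{TOPIC}-value/versions/latest")
resp.raise_for_status()
value_schema = resp.json()["schema"]

spark = SparkSession.builder.appName("schema-registry-demo").getOrCreate()

df = (spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
      .option("subscribe", TOPIC)
      .load())

# Confluent-serialized messages prepend a 5-byte header (magic byte + schema id),
# so strip it before handing the payload to from_avro.
decoded = df.select(
    from_avro(expr("substring(value, 6, length(value) - 5)"), value_schema).alias("record")
)
```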

Hossein Torabi
PySpark can import and use any JVM Spark class, so any Scala or Java example you find should, in principle, just work; see the sketch below.
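For example, here is a minimal sketch of the Py4J pattern for calling a Scala/Java class from PySpark. The package coordinates, class name, and registry URL are hypothetical; substitute whichever library you actually put on the Spark classpath:

```python
from pyspark.sql import SparkSession, DataFrame

spark = (SparkSession.builder
         .appName("jvm-interop-demo")
         # hypothetical coordinates for the JAR containing your Scala helper
         .config("spark.jars.packages", "com.example:avro-helpers:1.0.0")
         .getOrCreate())

jvm = spark._jvm  # Py4J gateway into the JVM running the Spark driver

# Instantiate and call the Scala/Java class just as a Scala example would,
# passing the Java-side DataFrame (df._jdf) where the Scala API expects a Dataset[Row].
converter = jvm.com.example.avro.SchemaConverter("http://schema-registry:8081")  # hypothetical class
# jdf = converter.decode(df._jdf)       # returns a Java DataFrame
# decoded = DataFrame(jdf, spark)       # wrap it back into a PySpark DataFrame
```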

OneCricketeer