I came across the Docker image for Spark below. The image also ships with connectors for several popular cloud services. An example of how to use one of the built-in connectors (say, Azure Data Lake Storage Gen2) from a PySpark application would be a great help.
Link to the Docker Hub image: https://hub.docker.com/r/datamechanics/spark
I looked into the example provided below, but it didn't help much in understanding how to use the connectors that come bundled with the default image: https://github.com/datamechanics/examples/blob/main/pyspark-example/main.py
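For reference, this is roughly the kind of usage I was expecting. It is only a sketch: the storage account name, container name, file path, and access key below are placeholders, and it assumes the image's bundled hadoop-azure (ABFS) connector picks up the standard `fs.azure.account.key.<account>.dfs.core.windows.net` property for shared-key authentication.

```python
def adls_account_key_property(account: str) -> str:
    # Hadoop property under which the ABFS driver looks up the
    # storage account access key (account name is a placeholder).
    return f"fs.azure.account.key.{account}.dfs.core.windows.net"


def abfss_uri(container: str, account: str, path: str) -> str:
    # Build an abfss:// URI pointing at a file in ADLS Gen2.
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path}"


if __name__ == "__main__":
    # Needs the hadoop-azure jars that the image is said to include.
    from pyspark.sql import SparkSession

    account = "mystorageaccount"  # hypothetical storage account
    spark = (
        SparkSession.builder
        .appName("adls-gen2-read")
        # Spark forwards "spark.hadoop."-prefixed settings to Hadoop.
        .config(
            "spark.hadoop." + adls_account_key_property(account),
            "<storage-account-access-key>",  # placeholder secret
        )
        .getOrCreate()
    )

    df = spark.read.csv(
        abfss_uri("mycontainer", account, "input/data.csv"),
        header=True,
    )
    df.show()
```

Is this the right pattern for this image, or does it expect the credentials and connector configuration to be supplied some other way?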