I have a server without internet access where I would like to use Delta Lake, so the usual way of enabling Delta Lake in a Spark session (which fetches the packages from Maven) does not work:

from pyspark.sql import SparkSession
spark = SparkSession \
    .builder \
    .appName("...") \
    .master("...") \
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension") \
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog") \
    .getOrCreate()
Where should I copy the Delta Lake GitHub repository on the server? And how can I point the Spark session to the right libraries?
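For context, this is the kind of offline configuration I have in mind: downloading the Delta Lake jars on a machine that does have internet access, copying them to the server, and pointing Spark at them via `spark.jars`. The directory `/opt/jars` and the exact jar file names and versions below are placeholders I made up, and I am not sure this is the right approach:

```python
from pyspark.sql import SparkSession

# Sketch only: jar paths and versions are hypothetical placeholders.
# The jars would be downloaded elsewhere and copied to the server.
spark = SparkSession \
    .builder \
    .appName("...") \
    .master("...") \
    .config("spark.jars",
            "/opt/jars/delta-core_2.12-2.4.0.jar,"  # hypothetical path
            "/opt/jars/delta-storage-2.4.0.jar") \   # hypothetical path
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension") \
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog") \
    .getOrCreate()
```

Is setting `spark.jars` to local file paths like this enough, or does something else need to be installed on the server?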