I am using PySpark from a notebook and I do not handle the creation of the SparkSession myself. I need to load a jar containing some functions I would like to use while processing my RDDs. This is easy to do with --jars, but in my particular case I cannot pass that option. Is there a way to access the Scala SparkContext and call its addJar method? I tried going through the JavaGateway (sparksession._jvm...) but have not been successful so far. A rough sketch of what I am attempting is below. Any ideas?
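
This is roughly the kind of call I am trying to make; I am assuming here that the notebook exposes the pre-created session as `spark`, and the jar path is just a placeholder for illustration:

    # Sketch of what I am attempting: reuse the session the notebook
    # already created, then reach the underlying Scala SparkContext
    # through the Py4J gateway and ask it to ship the jar.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # picks up the existing session

    # _jsc is the JavaSparkContext; sc() returns the Scala SparkContext,
    # which exposes addJar. The path below is a placeholder.
    spark.sparkContext._jsc.sc().addJar("/path/to/my-functions.jar")

I am not sure whether going through _jsc like this is supported, or whether the jar added this way will actually be visible to the code running on the executors.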
Thanks,
Guillaume