I would like to install packages at the point where I create a SparkSession in a standalone Python script (a .py file) that will be uploaded to and run on Databricks.
Something like this:
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
# %pip install package-name   <- notebook magic; not valid Python in a .py file
I am not using a notebook, so how can I configure this SparkSession so that it downloads and installs external packages for me? I cannot install packages on the cluster itself, since I do not have the rights to do that.
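For context, here is a minimal sketch of the kind of workaround I am considering: shelling out to pip from the script itself, before the package is imported, so it lands in the driver's Python environment. Here package-name / package_name are hypothetical placeholders, and I am assuming the script is even allowed to run pip at runtime:

import subprocess
import sys

# Install the dependency into the driver's Python environment at runtime.
# "package-name" is a hypothetical placeholder for the real package.
subprocess.check_call([sys.executable, "-m", "pip", "install", "package-name"])

import package_name  # hypothetical placeholder; import only after the install finishes

from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()

My worry with this approach is that it presumably installs the package only on the driver and not on the executors, which is part of why I am asking whether there is a supported way to do this through the SparkSession configuration itself.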