
I installed PySpark 3.2.0 with pip install pyspark inside a conda environment named pyspark. I cannot find spark-defaults.conf. I have been looking for it under ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark, since that is my understanding of what SPARK_HOME should be.
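For reference, the package location above can be confirmed from Python itself (a minimal check, assuming the pyspark conda environment is active):

    # Print the directory where pip installed the pyspark package.
    # This is the site-packages/pyspark path referred to above.
    import os
    import pyspark

    print(os.path.dirname(pyspark.__file__))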

  1. Where can I find spark-defaults.conf? I want to modify it.
  2. Am I right in setting SPARK_HOME to the installation location of pyspark, ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark?
figs_and_nuts

1 Answer


2. The SPARK_HOME environment variable is set correctly; it should point to the installation directory of the pyspark package.

1. A pip installation does not include a conf directory, so $SPARK_HOME/conf needs to be created manually; then copy the configuration file templates into it and modify each file as needed (a sketch of this step follows below).
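A minimal sketch of step 1, assuming a pip-installed PySpark in the active environment. Note that the pip package does not bundle the *.template files (those ship with the full Apache Spark distribution), so this example simply writes a spark-defaults.conf directly; the spark.driver.memory value is only an illustration:

    # Sketch: create $SPARK_HOME/conf and write a spark-defaults.conf into it.
    # spark_home here is the pip-installed pyspark package directory.
    import os
    import pyspark

    spark_home = os.path.dirname(pyspark.__file__)   # .../site-packages/pyspark
    conf_dir = os.path.join(spark_home, "conf")
    os.makedirs(conf_dir, exist_ok=True)             # create conf/ if it is missing

    defaults = os.path.join(conf_dir, "spark-defaults.conf")
    with open(defaults, "w") as f:
        # Illustrative setting only; put your own properties here.
        f.write("spark.driver.memory 2g\n")

    print("Wrote", defaults)

Spark also honours the SPARK_CONF_DIR environment variable, so the configuration directory can live outside site-packages if you prefer.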

过过招