I've installed Spark inside a folder in my home directory and added that location to my .bash_profile. From the terminal I can run `pyspark` or `spark-shell` after `source ~/.bash_profile`. But for sparklyr, the default Spark location is inside the user folder. Is there a way to permanently change the default location, or to set up a path variable, without having to configure it every time I start a new R session?

When I try to connect by declaring the location where Spark is installed, I get the following error message:
sc <- spark_connect(master = "local", spark_home = "~/server/spark/")
Error: Java 11 is only supported for Spark 3.0.0+
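From what I can tell, the per-session workaround is to point the session at a Java 8 JDK before connecting. A rough sketch of what that looks like (this assumes a Java 8 JDK is installed, that sparklyr picks up a JAVA_HOME set from within R, and uses macOS's /usr/libexec/java_home utility to locate it):

    library(sparklyr)

    # Point JAVA_HOME at a Java 8 JDK before connecting (assumes a Java 8
    # JDK is installed; /usr/libexec/java_home -v 1.8 prints its location).
    Sys.setenv(JAVA_HOME = system("/usr/libexec/java_home -v 1.8", intern = TRUE))

    sc <- spark_connect(master = "local", spark_home = "~/server/spark/")

But doing this at the top of every script is exactly the kind of per-session configuration I'd like to avoid.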
Is there a way to permanently configure JAVA_HOME for sparklyr as well? I haven't found anything about this in the documentation.
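What I had in mind is something along the lines of setting both variables once in ~/.Rprofile (or ~/.Renviron) so that every new session picks them up, roughly like the sketch below, but I don't know whether this is the recommended approach for sparklyr (the Java path is just a placeholder):

    # ~/.Rprofile (sketch): set SPARK_HOME and JAVA_HOME once for all sessions.
    # The Java path below is a placeholder for wherever the JDK actually lives.
    Sys.setenv(
      SPARK_HOME = path.expand("~/server/spark"),
      JAVA_HOME  = "/path/to/jdk8/Contents/Home"
    )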
Thanks!
I'm using macOS Catalina 10.15.4, RStudio 1.2.5033, and Spark 2.4.5.