https://issues.apache.org/jira/browse/SPARK-832
Is this issue actually resolved? How can you include PYTHONPATH for Spark workers in Spark 2.2? I am looking for a way that does not involve a global change (i.e., you are not allowed to touch .profile or .bashrc, or distribute a change to every node in the cluster).
SPARK_HOME is where Spark lives. The docs seem to indicate that this is also where conf/* files are picked up. Is this correct?
UPDATE: I think the answer is here: Pyspark append executor environment variable
That answer really was not well linked from the search results for PYTHONPATH, so I have attempted to scatter breadcrumbs everywhere to help others find it.
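For anyone else landing here, this is a minimal sketch of the approach that answer points at, assuming `spark.executorEnv.PYTHONPATH` is honored by your cluster manager; the path `/opt/my_libs` is just a placeholder for wherever your modules actually live on the workers:

```python
from pyspark import SparkConf
from pyspark.sql import SparkSession

# Placeholder path to the extra Python modules the executors need;
# replace with whatever is actually present on your worker nodes.
extra_pythonpath = "/opt/my_libs"

conf = SparkConf()
# spark.executorEnv.<NAME> sets an environment variable in each executor
# process, so nothing in .profile/.bashrc on the workers has to change.
conf = conf.set("spark.executorEnv.PYTHONPATH", extra_pythonpath)

spark = (
    SparkSession.builder
    .appName("pythonpath-example")
    .config(conf=conf)
    .getOrCreate()
)
```

As far as I can tell, the same setting can also be passed as `--conf spark.executorEnv.PYTHONPATH=/opt/my_libs` to spark-submit, or put in `$SPARK_HOME/conf/spark-defaults.conf`, which ties back to the SPARK_HOME question above. On YARN you may additionally need `spark.yarn.appMasterEnv.PYTHONPATH` for the application master.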