I am using Spark 1.6.0 in local mode. I have created an IPython PySpark profile so that the PySpark kernel starts in Jupyter Notebook. All of this works correctly.
I want to use the spark-csv package inside the Jupyter notebook. I tried editing the file `~/.ipython/profile_pyspark/startup/00-pyspark-setup.py` and adding `--packages com.databricks:spark-csv_2.11:1.4.0` after `pyspark-shell` in the submit arguments.
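After the edit, the relevant part of my startup script looks roughly like this (a sketch assuming the usual `00-pyspark-setup.py` layout for Spark 1.6; the py4j version and paths are specific to my install):

```python
import os
import sys

spark_home = os.environ.get('SPARK_HOME', '')
sys.path.insert(0, os.path.join(spark_home, 'python'))
sys.path.insert(0, os.path.join(spark_home, 'python/lib/py4j-0.9-src.zip'))

# The edit described above: the spark-csv coordinates appended after "pyspark-shell".
os.environ['PYSPARK_SUBMIT_ARGS'] = (
    'pyspark-shell --packages com.databricks:spark-csv_2.11:1.4.0'
)

# Start the PySpark shell inside the kernel (Python 2, as used with Spark 1.6).
execfile(os.path.join(spark_home, 'python/pyspark/shell.py'))
```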
This did not work; I still get this error message:
```
Py4JJavaError: An error occurred while calling o22.load.
: java.lang.ClassNotFoundException: Failed to find data source: com.databricks.spark.csv. Please find packages at http://spark-packages.org
```
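The error comes from the notebook cell where I try to load a CSV file, which looks roughly like this (the file name is just a placeholder; this is the standard spark-csv read pattern for Spark 1.6):

```python
# This is the call that raises the Py4JJavaError above.
# 'cars.csv' is a placeholder path.
df = sqlContext.read \
    .format('com.databricks.spark.csv') \
    .option('header', 'true') \
    .load('cars.csv')
```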
I have also tried [this solution][2] and many others, but none of them worked.
Do you have any suggestions?