I've been struggling with this issue for four days. I've looked at several webpages dealing with the same problem, including here on Stack Overflow, but without finding a solution.
I installed Spark 2.3.0, Scala 2.12.5, and Hadoop 2.7.1 (for the winutils master binaries), then set up the corresponding environment variables. I installed findspark and launch PySpark from a Jupyter Notebook. The issue is that when I run:
import pyspark
sc = pyspark.SparkContext('local')
I get the following error:
java gateway process exited before sending the driver its port number
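For completeness, the full notebook cell is essentially the following (findspark is initialized first, as mentioned above):

import findspark
findspark.init()  # picks up SPARK_HOME and adds pyspark to sys.path
import pyspark

sc = pyspark.SparkContext('local')  # this is the line that raises the gateway error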
I should mention that I'm using Java 1.8.0 and that I set the following in my environment variables:
PYSPARK_SUBMIT_ARGS="--master local[2] pyspark-shell"
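For reference, the other variables are set roughly as follows (the paths below are placeholders standing in for my actual install locations):

JAVA_HOME=C:\Program Files\Java\jdk1.8.0
SPARK_HOME=C:\spark\spark-2.3.0-bin-hadoop2.7
HADOOP_HOME=C:\hadoop    (its bin folder contains winutils.exe)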
If you have any idea how I can solve this issue, I would be grateful. Thank you!