I am using a Python script that sets up a PySpark environment in a Jupyter notebook. The kernel is Azure ML 3.6.
# Locate the SPARK_HOME environment variable
import findspark
findspark.init()
import pyspark
import os
# These reference the JARs mentioned on the Snowflake documentation page
os.environ['PYSPARK_SUBMIT_ARGS'] = '--packages net.snowflake:snowflake-jdbc:3.6.24,net.snowflake:spark-snowflake_2.11:2.4.12-spark_2.3,com.microsoft.ml.spark:mmlspark_2.11:0.18.0 pyspark-shell'
from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext, SparkSession
# Use the SparkSession API for the Dataset and DataFrame APIs
spark = (
    SparkSession.builder
    .master('local')
    .appName('test')
    .config('spark.driver.memory', '6G')
    .config('spark.driver.maxResultSize', '4G')
    .config('spark.num.executors', '8')
    .config('spark.executor.cores', '8')
    .config('spark.executor.memory', '14G')
    .config('spark.worker.instances', '2')
    .getOrCreate()
)
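For reference, the `PYSPARK_SUBMIT_ARGS` value above can equivalently be assembled from a list of Maven coordinates, which makes the dependency list easier to maintain (this is just a sketch producing the same string as the one-liner above):

```python
# Build the --packages string from the same Maven coordinates used above
packages = [
    "net.snowflake:snowflake-jdbc:3.6.24",
    "net.snowflake:spark-snowflake_2.11:2.4.12-spark_2.3",
    "com.microsoft.ml.spark:mmlspark_2.11:0.18.0",
]
# PYSPARK_SUBMIT_ARGS must end with 'pyspark-shell' for pyspark to pick it up
submit_args = "--packages " + ",".join(packages) + " pyspark-shell"
print(submit_args)
```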
When I run this, I receive the following error:

Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM

I don't understand why: this code was working perfectly yesterday, but today it fails with this error.
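One thing I checked is whether the pip-installed `pyspark` package still matches the Spark installation that `findspark` points at, since this Py4JError is commonly associated with a major.minor version mismatch between the Python side and the JVM side (e.g. pyspark 2.4.x against a Spark 2.3 install). A small helper I used to compare the two version strings (`versions_compatible` is my own hypothetical helper, not a Spark API):

```python
def versions_compatible(pyspark_version: str, spark_build_version: str) -> bool:
    """Return True if the major.minor components match, e.g. '2.3.2' vs '2.3.0'."""
    return pyspark_version.split(".")[:2] == spark_build_version.split(".")[:2]

# Example: a pip-installed pyspark 2.4.x against a Spark 2.3 installation
print(versions_compatible("2.4.4", "2.3.2"))  # mismatch -> False
print(versions_compatible("2.3.2", "2.3.0"))  # match -> True
```

In the notebook, `pyspark.__version__` gives the Python-package side; the Spark build version comes from the installation under `SPARK_HOME`.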