
I'm trying to fix it with this code, but it's not working for me. I'm only reading a JSON file of 2.7 MB.

from pyspark import SparkConf
from pyspark.sql import SparkSession

# spark.cores.max takes a core count, not a memory size;
# the overhead and result-size keys are spark.executor.memoryOverhead
# and spark.driver.maxResultSize (no "s").
conf = SparkConf().set("spark.cores.max", "16") \
   .set("spark.driver.memory", "16g") \
   .set("spark.executor.memory", "16g") \
   .set("spark.executor.memoryOverhead", "16g") \
   .set("spark.driver.maxResultSize", "0")

spark = SparkSession.builder\
      .master("local[1]")\
      .appName("name")\
      .config(conf=conf)\
      .getOrCreate()
  • Does this answer your question? [How to deal with "java.lang.OutOfMemoryError: Java heap space" error?](https://stackoverflow.com/questions/37335/how-to-deal-with-java-lang-outofmemoryerror-java-heap-space-error) – seenukarthi Jul 12 '21 at 11:10
  • This link is helpful if someone has the same problem: [link](https://stackoverflow.com/questions/48726208/how-do-i-read-a-large-json-array-file-in-pyspark) – bou aya Jul 13 '21 at 11:16
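For reference, the fix from the linked question about reading a large JSON array file is usually not memory settings at all: `spark.read.json()` expects one JSON object per line by default, so a file containing a single multi-line JSON array must be read with `multiLine` enabled. A minimal sketch, assuming the file lives at the hypothetical path `data.json`:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .master("local[*]") \
    .appName("read-json-sketch") \
    .getOrCreate()

# multiLine tells Spark to parse the whole file as one JSON document
# (e.g. a top-level array) instead of one record per line.
# "data.json" is a hypothetical stand-in for the 2.7 MB file.
df = spark.read.option("multiLine", "true").json("data.json")
df.printSchema()
```

Note that in `local` mode the driver and executor share one JVM, so `spark.driver.memory` is the setting that actually matters, and it must be set before the JVM starts (e.g. via `spark-submit --driver-memory 16g`), not on an already-running session.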
