
I am running a built-in sample example that comes as part of the Spark installation, on Hadoop 2.7 + Spark with JDK 8. However, it gives me the following error:

    Exception in thread "main" java.lang.OutOfMemoryError: Cannot allocate new DoublePointer(10000000): totalBytes = 363M, physicalBytes = 911M
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Caused by: java.lang.OutOfMemoryError: Physical memory usage is too high: physicalBytes = 911M > maxPhysicalBytes = 911M
        at org.bytedeco.javacpp.Pointer.deallocator(Pointer.java:572)
        at org.bytedeco.javacpp.Pointer.init(Pointer.java:121)

I also followed a related SO question and made the configuration changes it suggested.

In addition, I referred to these JIRA issues: YARN-4714, HADOOP-11090.

Are there any known issues with running Spark on JDK 8?

Below are the versions of the software I am running in my simple cluster:

jdk-8u131-linux-x64
scala-2.12.2
spark-2.1.1-bin-without-hadoop
hadoop-2.7.0

One thing to note: when I run the program with JDK 7 it works fine, but it fails with JDK 8.

Has anyone encountered this problem, and if so, what is the fix? Are Hadoop, Spark, and Scala not yet compatible with JDK 8?

Can anyone please help me?

CuriousMind

2 Answers


The OOM error you are receiving is an indication that there is not enough memory for the JVM. As you mentioned, it worked fine with JDK 7; JDK 8 requires more memory than JDK 7. Please check the JDK 8 memory requirements here: https://dzone.com/articles/java-8-permgen-metaspace
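
For reference, here is a minimal sketch of passing larger memory settings on the spark-submit command line. The example class, jar path, and sizes are assumptions; adjust them to the sample you are actually running and to the memory your nodes have.

    # Hypothetical sizes: raise driver/executor heap and cap Metaspace growth on JDK 8
    ./bin/spark-submit \
      --class org.apache.spark.examples.SparkPi \
      --master yarn \
      --driver-memory 2g \
      --executor-memory 2g \
      --conf "spark.driver.extraJavaOptions=-XX:MaxMetaspaceSize=256m" \
      examples/jars/spark-examples_2.11-2.1.1.jar 100

The same values can also be set once in conf/spark-defaults.conf (spark.driver.memory, spark.executor.memory) instead of being repeated on every submit.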

Fairoz

Spark has not yet been released for Scala 2.12. I don't know whether it solves the original problem, but you should switch to Scala 2.11 anyway.
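
If you build your job yourself, a minimal sbt sketch pinning it to Scala 2.11 would look like the following; the 2.11.x patch version is only an assumption.

    // build.sbt: Spark 2.1.1 artifacts are published for Scala 2.11, not 2.12,
    // so %% resolves spark-core_2.11 only when scalaVersion is a 2.11.x release.
    scalaVersion := "2.11.8"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1" % "provided"

For the pre-built examples, a Spark 2.1.1 package built against Scala 2.11 (the default since Spark 2.0, per the downloads note quoted in the comments below) is sufficient.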

simpadjo
  • Where is this mentioned? Can you please share the link? – CuriousMind May 09 '17 at 08:59
  • Spark Maven repo: https://mvnrepository.com/artifact/org.apache.spark There are no artifacts for Scala 2.12. Related ticket in the Spark JIRA: https://issues.apache.org/jira/browse/SPARK-14220 – simpadjo May 09 '17 at 09:14
  • You can refer to the Note section at https://spark.apache.org/downloads.html: "Note: Starting version 2.0, Spark is built with Scala 2.11 by default. Scala 2.10 users should download the Spark source package and build with Scala 2.10 support." – Bhavesh May 09 '17 at 09:27