When I create a HiveContext in Spark local mode from IDEA (Spark version 1.6.0), the program throws an exception:
Caused by: java.lang.OutOfMemoryError: PermGen space
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
I noticed that this is a Java OOM in PermGen space, so I suspect the PermGen space is too small. Searching Google and Stack Overflow suggested increasing the PermGen space, so I tried to raise it as follows:
val conf = new SparkConf().setMaster("local[4]").setAppName("TF")
conf.set("spark.driver.memory", "4g")
conf.set("spark.executor.memory", "4g")
// Try to raise PermGen on the executor and driver JVMs
conf.set("spark.executor.extraJavaOptions", "-XX:MaxPermSize=2048m -XX:PermSize=512m")
conf.set("spark.driver.extraJavaOptions", "-XX:PermSize=512m -XX:+PrintGCDetails")
val sc = new SparkContext(conf)
val hc = new HiveContext(sc)
This does not seem to work: the parameters never take effect and the error remains. The official Spark documentation says Spark properties can be set through SparkConf, so I used SparkConf to increase the PermGen space, but it has no effect.
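To confirm whether the flags ever reach the driver JVM, one can print the arguments the current JVM was actually started with (a minimal sketch; `ShowJvmFlags` is just a hypothetical object name, not part of my application):

```scala
import java.lang.management.ManagementFactory
import scala.collection.JavaConverters._

object ShowJvmFlags {
  def main(args: Array[String]): Unit = {
    // The flags the current JVM was launched with. In local mode the driver
    // (and the executors) run inside this same JVM, so -XX:MaxPermSize must
    // appear here to have any effect.
    val jvmArgs = ManagementFactory.getRuntimeMXBean.getInputArguments.asScala
    jvmArgs.foreach(println)
    println(s"MaxPermSize set: ${jvmArgs.exists(_.startsWith("-XX:MaxPermSize"))}")
  }
}
```

If `-XX:MaxPermSize` is missing from that list, setting `spark.driver.extraJavaOptions` in code arrived too late: in local mode the driver JVM is the one already running the program, so its JVM options can only be set at launch time, for example in IDEA's "VM options" run-configuration field or via `spark-submit --driver-java-options`.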
How can I increase the PermGen space in Spark so that it actually takes effect? Has anyone else hit this problem, and how did you fix it?