Hi, two questions, if I may:
- What are possible reasons for running out of heap space in Scala or Java?
- Are there best practices to avoid it?
When you start a Java process, it picks a default maximum heap size, which is the most heap memory that process can use. You can pass a command-line argument to increase it, but depending on your application's memory usage, it still may not be enough. Memory leaks are another problem that can eat up your memory (objects are never released for garbage collection), possibly causing out-of-memory issues. The right values for the initial and maximum heap sizes depend on your application and its memory usage; test and profile your application to judge them.
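For example, a quick way to see the limits your process actually got is the standard Runtime API; a minimal Scala sketch (the object name is just for illustration):

object HeapInfo {
  def main(args: Array[String]): Unit = {
    val rt = Runtime.getRuntime
    val mb = 1024L * 1024L
    println(s"max heap (-Xmx):       ${rt.maxMemory / mb} MB")   // upper bound the heap may grow to
    println(s"committed heap:        ${rt.totalMemory / mb} MB") // currently reserved from the OS
    println(s"free within committed: ${rt.freeMemory / mb} MB")
  }
}

If you launch it with, say, -Xmx512m, the first line should report roughly 512 MB.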
You have to be aware of your heap memory in the first place. On the other hand, it is possible that your application has memory leaks. One way you can help the JVM's garbage collector is to null out references you no longer need. Take a look at the question Understanding Java Memory Management.
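As a rough illustration of what nulling out references means when a long-lived object holds on to something large (class and field names here are made up for the example):

class ReportHolder {
  // A long-lived object holding a large buffer (hypothetical example).
  private var lastReport: Array[Byte] = Array.ofDim[Byte](50 * 1024 * 1024)

  def clear(): Unit = {
    // Dropping the only reference makes the 50 MB array eligible for collection;
    // as long as this field points at it, the GC cannot reclaim it.
    lastReport = null
  }
}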
You can set the maximum heap size by going to Control Panel > Java > Java tab > View > Java Runtime Environment and assigning a runtime parameter as required, e.g. -Xmx4000m.
1) Hitting the maximum amount of heap memory (the sketch after this list shows how to inspect the limits in effect). Solutions:
-Xmx // maximum heap size parameter
-d32 // 32-bit VM (smaller references, lower footprint)
2) Exhausting the heap's permanent generation space. Solution:
-XX:MaxPermSize
3) Memory leaks. Solutions:
jVisualVM (../java/bin/jvisualvm)
FindBugs or a similar tool
4) System.gc only suggests that the garbage collector run.
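A minimal Scala sketch (assuming Scala 2.13 or newer for scala.jdk.CollectionConverters) of inspecting the JVM's memory pools, including the permanent generation or, on Java 8+, Metaspace, and the maximums configured for them:

import java.lang.management.ManagementFactory
import scala.jdk.CollectionConverters._

object MemoryPools {
  def main(args: Array[String]): Unit = {
    val mb = 1024L * 1024L
    // One MemoryPoolMXBean per pool: eden, survivor, old gen, perm gen / metaspace, ...
    for (pool <- ManagementFactory.getMemoryPoolMXBeans.asScala) {
      val max = pool.getUsage.getMax // -1 means no limit was configured
      val limit = if (max < 0) "unbounded" else s"${max / mb} MB"
      println(f"${pool.getName}%-30s max = $limit")
    }
  }
}

On a pre-Java-8 JVM you should see a pool named something like "PS Perm Gen"; on Java 8 and later it is Metaspace instead.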
It is possible to handle an OutOfMemoryError by monitoring memory consumption via Runtime.freeMemory, Runtime.maxMemory and Runtime.totalMemory, calling System.gc and, at the same time, switching your application into a mode that is less demanding on memory (for example, swapping data out to disk or a database). A key decision (in my use case) was implementing a function that executes System.gc no more than once per interval of time, per thread, when memory was low.
Of course, the operating system can swap on its own, but that is slower, and in production you usually have more than one VM on a single machine.
Defining objects within the scope where they are used greatly reduces the need for explicit memory management and simplifies your code.
Rule of thumb - local objects are usually a better idea than global ones.
class AddThem {
    private int sumGlobal = 0; // usually a bad idea for long-lived processes

    public int addThem(int a, int b) {
        int sumLocal = a + b; // goes out of scope as soon as the method returns
        return sumLocal;
    }
}
This may be oversimplified, but if sumGlobal were a collection of objects, those objects would remain in memory until the instance of AddThem is garbage collected. sumLocal suffers a different fate, which is better in such a situation.
There's not much about this question that is specific to Scala. It's pretty much just a JVM question. It's quite common for a non-trivial program (Scala or Java) to exceed the default JVM heap limit.
What is specific to Scala is the fact that every function literal and every by-name parameter is compiled to a separate class. It's easy to be unaware of when you're creating function literals in Scala, since the language makes them so convenient to use, as functional programming encourages.
So, in large Scala programs this can sometimes lead to exhaustion of the Perm Gen space where classes are loaded (as opposed to the "ordinary" heap spaces from which instances of user-defined classes are allocated). The JVM has a separate option for setting the Perm Gen size; in the "stock" Oracle JVM it's -XX:MaxPermSize=###u, where u is a units character: k for kilo-, m for mega- and g for gigabytes.
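As a rough illustration of how function literals and by-name parameters turn into extra classes (a sketch assuming Scala 2.11 or earlier; since Scala 2.12 most function literals compile to Java 8 lambdas, and since Java 8 Perm Gen has been replaced by Metaspace):

object Literals {
  // On Scala 2.11 and earlier, each of these function literals becomes its own
  // synthetic class (something like Literals$$anonfun$1, Literals$$anonfun$2),
  // whose metadata lives in Perm Gen on pre-Java-8 JVMs.
  val inc: Int => Int   = x => x + 1
  val twice: Int => Int = x => x * 2

  // A by-name parameter: every call site wraps its argument in a thunk,
  // which is again a generated function class.
  def logLazily(message: => String): Unit =
    if (sys.props.get("debug").isDefined) println(message)
}

Compiling this and listing the output directory on those older versions shows the extra Literals$$anonfun$... class files alongside Literals.class.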