4

Hi, two questions, if I may:

  1. What are possible causes of running out of heap space in Scala or Java?
  2. Are there best practices to avoid it?
monica
  • The most likely cause of running out of heap space is using too much heap space. The best solution is to use less heap space. A common solution is to allow more heap space. i.e. common sense will get you a long way to a solution. – Peter Lawrey Jan 22 '13 at 09:53
  • As the original question is already answered by a few people, I think you should ask a new question for the new queries/edits you've added. It's cleaner that way. – Swapnil Jan 22 '13 at 19:09

6 Answers

6

When you start a Java process, the JVM sets a default maximum heap size, the most heap memory that process may use. You can pass a command-line argument to increase it, but depending on your application's memory usage it still may not be enough. Memory leaks are another problem that can eat up your memory (objects stay reachable and are never released for garbage collection), possibly causing out-of-memory errors. The right initial and maximum heap sizes depend on your application and its memory usage; you can judge them by testing and profiling the application for memory usage.
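As a quick way to see what limits your process actually got, here is a minimal sketch (the class name `HeapInfo` is just for illustration; run with e.g. `java -Xmx512m HeapInfo` to see `-Xmx` take effect):

```java
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long max = rt.maxMemory();     // upper bound set by -Xmx (or the JVM default)
        long total = rt.totalMemory(); // heap currently reserved from the OS
        long free = rt.freeMemory();   // unused portion of the reserved heap
        System.out.println("max:   " + max / (1024 * 1024) + " MB");
        System.out.println("total: " + total / (1024 * 1024) + " MB");
        System.out.println("free:  " + free / (1024 * 1024) + " MB");
    }
}
```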

Swapnil
  • `./eclipse -vmargs -Xss64m -Xms1400m -Xmx4096m` is what I pass when launching Eclipse. 64M! I'm wondering what kinds of practices make a program use less memory, if there is anything we can do? – monica Jan 22 '13 at 07:49
  • @monica there are many things you can do to reduce a program's memory usage; helping the garbage collector is one of them, but you should never put large data into one or many variables, because it will eat up your memory. Try to process your data directly and dispose of it when finished. – goravine Jan 22 '13 at 07:52
  • Do you use these options to run `eclipse`? This seems to be for the `eclipse` process (which could be just another Java process). And 64M is 64MB, which is not very high if you're wondering about it. – Swapnil Jan 22 '13 at 07:52
  • The `-Xss` option consumes virtual memory if it is not used. i.e. for a 64-bit JVM the cost of making it too large is notional. – Peter Lawrey Jan 22 '13 at 09:55
2

You have to be aware of your heap memory in the first place. It is also possible that your application has memory leaks. One way you can help the JVM's garbage collector is to nullify references you no longer need. Have a look at this question: Understanding Java Memory Management
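A small sketch of the nullify-references advice (the `Cache` class and its ~10 MB payload are hypothetical, just to show the pattern):

```java
import java.util.ArrayList;
import java.util.List;

// A long-lived field that keeps a large structure reachable is a common
// leak pattern; clearing the reference makes the memory eligible for GC.
public class Cache {
    private List<byte[]> chunks = new ArrayList<>();

    public void load() {
        for (int i = 0; i < 10; i++) {
            chunks.add(new byte[1024 * 1024]); // ~10 MB retained by this object
        }
    }

    // Without this, the arrays live as long as the Cache instance does.
    public void clear() {
        chunks = null; // drop the reference so the GC can reclaim the arrays
    }

    public boolean isLoaded() {
        return chunks != null && !chunks.isEmpty();
    }
}
```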

Michael 'Maik' Ardan
1

You can also set the maximum heap size through the Java Control Panel: Control Panel > Java > Java tab > View > Java Runtime Environment.

Assign a runtime parameter as required, e.g. -Xmx4000m

Neelam Singh
1

1) Insufficient maximum heap size. Solutions:

-Xmx    // raises the maximum heap size
-d32    // runs a 32-bit VM (smaller references, lower footprint)

2) Exhausted permanent-generation space. Solution:

-XX:MaxPermSize

3) Memory leaks. Solutions:

  • Code reviews
  • Profiling using jvisualvm (../java/bin/jvisualvm)
  • FindBugs tool or similar

4) System.gc only suggests that the garbage collector run; it is a hint, not a command.

It is possible to head off an OutOfMemoryError by monitoring memory consumption via Runtime.freeMemory, Runtime.maxMemory and Runtime.totalMemory, and by calling System.gc while switching the application into a mode that demands less memory (for example, swapping data to disk or a database). A key decision (in my use case) was implementing a function that calls System.gc at most once per time interval, per thread, when memory was low.

Of course the operating system could simply swap by itself, but that is slower, and in production you usually run more than one VM on a machine.
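The throttling idea above could be sketched like this (the `GcThrottle` class, interval, and low-memory threshold are assumptions for illustration, not the author's actual code):

```java
// Only suggest a GC when free heap headroom is low, and at most
// once per interval, so repeated checks don't hammer the collector.
public class GcThrottle {
    private final long minIntervalMillis;
    private final double lowMemoryFraction; // e.g. 0.10 = less than 10% headroom
    private long lastGcMillis = 0;

    public GcThrottle(long minIntervalMillis, double lowMemoryFraction) {
        this.minIntervalMillis = minIntervalMillis;
        this.lowMemoryFraction = lowMemoryFraction;
    }

    // Returns true if a GC was actually suggested on this call.
    public synchronized boolean maybeGc() {
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory();
        long headroom = rt.maxMemory() - used;
        boolean low = headroom < rt.maxMemory() * lowMemoryFraction;
        long now = System.currentTimeMillis();
        if (low && now - lastGcMillis >= minIntervalMillis) {
            lastGcMillis = now;
            System.gc(); // only a hint; the JVM may ignore it
            return true;
        }
        return false;
    }
}
```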

idonnie
1

Defining objects within the scope where they are used greatly reduces the need for manual memory management and simplifies your code.

Rule of thumb: local objects are usually a better idea than global ones.

class AddThem {
    private int sumGlobal = 0; // usually a bad idea for long-lived processes

    public int addThem(int a, int b) {
        int sumLocal = a + b; // unreachable, hence collectible, once the method returns
        return sumLocal;
    }
}

This may be oversimplified, but if sumGlobal were a collection of objects, those objects would remain in memory until the AddThem instance itself is garbage collected. sumLocal suffers a different fate, becoming unreachable as soon as the method returns, which is better in such a situation.

korefn
1

There's not much about this question that is specific to Scala. It's pretty much just a JVM question. It's quite common for a non-trivial program (Scala or Java) to exceed the default JVM heap limit.

What is specific to Scala is the fact that every function literal and every by-name parameter is compiled to a separate class. It's pretty easy to be unaware of when you're creating function literals in Scala, since the language makes them so easy to use, as is the emphasis in functional programming.

So, in large Scala programs this can sometimes lead to an exhaustion of the Perm-Gen space where classes are loaded (as opposed to the "ordinary" heap spaces from which instances of user-defined classes are allocated). The JVM has separate options for setting the Perm Gen size. In the "stock" Oracle JVM it's -XX:MaxPermSize=###u (where u is a units character, k for kilo-, m for mega- and g for giga-bytes).
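One way to watch the class-loading growth this answer describes is the platform's ClassLoadingMXBean (this monitoring snippet is an addition for illustration, not part of the answer):

```java
import java.lang.management.ClassLoadingMXBean;
import java.lang.management.ManagementFactory;

public class ClassCount {
    public static void main(String[] args) {
        ClassLoadingMXBean bean = ManagementFactory.getClassLoadingMXBean();
        // Classes currently loaded; grows as function literals etc. are first used.
        System.out.println("loaded:   " + bean.getLoadedClassCount());
        // Total ever loaded since VM start, including any since unloaded.
        System.out.println("total:    " + bean.getTotalLoadedClassCount());
        System.out.println("unloaded: " + bean.getUnloadedClassCount());
    }
}
```

If the loaded-class count climbs steadily and Perm Gen fills up, raising -XX:MaxPermSize as described above is the usual remedy.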

Randall Schulz