I understand that one of the problems with recursion is stack depth, which can cause trouble with very deep recursion. I am also aware that recursion can be replaced by explicit stack usage to avoid such issues.
On the other hand, recursion allows for succinct and clear (and beautiful, I would say) code, and as far as I recall it is the main way algorithms are presented in the standard textbooks.
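For concreteness, this is the kind of explicit-stack rewrite I mean: an iterative DFS where an ArrayDeque stands in for the call stack, so the reachable depth is bounded by heap rather than thread stack size. (The Map-based adjacency list here is just for illustration, not from any particular codebase.)

import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class IterativeDfs {
    // Iterative DFS: the explicit Deque replaces the call stack.
    static void dfs(Map<Integer, List<Integer>> adj, int start) {
        Deque<Integer> stack = new ArrayDeque<>();
        Set<Integer> visited = new HashSet<>();
        stack.push(start);
        while (!stack.isEmpty()) {
            int v = stack.pop();
            if (!visited.add(v)) {
                continue; // already visited
            }
            System.out.println(v); // minimal "processing" per vertex
            for (int w : adj.getOrDefault(v, List.of())) {
                if (!visited.contains(w)) {
                    stack.push(w);
                }
            }
        }
    }
}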
My question is more practical than theoretical:
I am running the following program to figure out how far I can go before I get a java.lang.StackOverflowError.
My reasoning was that if, for example, I had a small graph with a few thousand vertices and minimal processing per visited vertex, then a recursive vs. an iterative approach might not matter that much:
public static void foo(int i) {
    if (i % 5000 == 0) {
        System.out.println(i); // report progress every 5000 frames
    }
    foo(i + 1); // no base case: recurse until the stack blows
}

public static void main(String[] args) {
    foo(0);
}
The output was:
Exception in thread "main" java.lang.StackOverflowError
5000
10000
So from that, my interpretation is that the program would blow up after processing just over 10,000 vertices in a DFS, which sounds small.
Googling around, I found that the default stack size can be configured when running a program, but I wanted to know: what is the industry practice on this?
Is it to set an explicit thread stack size for heavy server programs, so that the example above wouldn't be an issue in practice? Or is it to avoid writing recursion altogether? Or is my example not really representative of a problem?
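For reference, the two knobs I found so far: the -Xss JVM option (e.g. java -Xss8m Main), and the Thread constructor that takes an explicit stackSize, which lets you run just the deep recursion on a bigger-stack thread. Per the Thread javadoc, the stackSize value is only a suggestion that the VM is free to round up or ignore; the 64 MB figure below is an arbitrary choice for the sketch.

public static void main(String[] args) throws InterruptedException {
    // Run the recursion on a dedicated thread with a ~64 MB stack;
    // the null ThreadGroup means "inherit the current thread's group".
    Thread t = new Thread(null, () -> foo(0), "deep-recursion", 64L * 1024 * 1024);
    t.start();
    t.join();
}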
Update
I do understand that the code sample above is an infinite recursion. Its only purpose was to see how far the recursion would go before it breaks.
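For what it's worth, here is a variant of the same probe that catches the StackOverflowError and reports the exact depth reached. Catching the error should generally be safe for a toy measurement like this, since the stack has already unwound by the time the handler runs.

public class DepthProbe {
    private static int depth = 0;

    private static void probe() {
        depth++;
        probe(); // recurse until the stack is exhausted
    }

    public static void main(String[] args) {
        try {
            probe();
        } catch (StackOverflowError e) {
            System.out.println("Overflowed at depth " + depth);
        }
    }
}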