
A recent question on SO concerning "Why does allocating a large element on the stack not fail in this specific case?" and a series of other questions concerning "large arrays on the stack" or "stack size limits" made me search for related limits documented in the standard.

I know that the C standard does not specify a "stack" and that it therefore does not define any limits for such a stack. But I wondered up to which SIZE_X in void foo() { char anArray[SIZE_X]; ... } the standard guarantees that the program works, and what happens if a program exceeds this SIZE_X.

I found the following passage, but I'm not sure whether it actually guarantees a specific supported size for objects with automatic storage duration (cf. this online C11 standard draft):

5.2.4.1 Translation limits

(1) The implementation shall be able to translate and execute at least one program that contains at least one instance of every one of the following limits:

...

65535 bytes in an object (in a hosted environment only)

Does this mean that an implementation must support a value up to 65535 for SIZE_X in a function like void foo() { char anArray[SIZE_X]; ... } and that any value larger than 65535 for SIZE_X is undefined behaviour?

For the heap, a call to malloc that returns NULL lets me detect an attempt to request a "too large" object. But how can I control the behaviour of a program that requests a too-large object with automatic storage duration, especially if such a maximum size is not documented, e.g. in some limits.h? So is it possible to write a portable function checkLimits() that acts as an "entry barrier" like:

#include <stdio.h>

int main() {
   if(! checkLimits()) {
      printf("program execution for sure not supported in this environment.\n");
      return 1;
   } else {
      printf("might work. wish you good luck!\n");
   }
   ...
}
Stephan Lechner
    Wouldn't it be a stack overflow? – zerkms Nov 09 '17 at 21:38
  • Why wouldn't it be undefined behavior? – Zimano Nov 09 '17 at 21:39
  • Related: [Why does C not define minimum size for an array?](https://stackoverflow.com/q/14695254/1275169) – P.P Nov 09 '17 at 21:42
  • For a local variable it is *assumed* there is enough memory available. If it is on the same stack as the call stack, there is no warning mechanism. When it fails, there may not even be enough stack left to report it. – Weather Vane Nov 09 '17 at 21:42
    I do not know what, if anything, the standard text requires but for the record the smallest actual space I have encountered for a compiler _claiming_ to be C99-compliant is just shy of 128 bytes. – doynax Nov 09 '17 at 21:42
  • 1. Whatever it says is about *bytes*, and `sizeof int` is usually not `1`. 2. It’s not *undefined* behavior larger than 65535; it’s *implementation-defined* behavior how big you can go. A lot of implementations say you can get much larger than that. – Daniel H Nov 09 '17 at 21:43
  • @Daniel H: thanks, type `int` is misleading concerning `SIZE_X` and `65535`. Changed type to `char`. But can I write some code like `int main() { if(! checkLimits()) { printf("program not supported in this environment."); return 1;}`? – Stephan Lechner Nov 09 '17 at 21:53
    The issue is obviously stack overflow, but the standard doesn't even mention the word stack. Perhaps we should demand from compiler authors that stack-overflowing programs, like the one in the linked question, should not crash because they're perfectly defined. – Petr Skocik Nov 09 '17 at 21:55
    The problem with a `checkLimits` method like that is that, in a lot of cases, it would be limited by the amount of memory available. In an environment with more than one program running, some other program can allocate memory between when you call `checkLimits` and when you actually call `foo`. – Daniel H Nov 09 '17 at 22:00
  • @zerkms: yes, obviously. But the standard does not define a stack overflow and particularly not any of its consequences, does it? Can I test or even just "catch" such a situation? – Stephan Lechner Nov 09 '17 at 22:14
  • @Zimano: probably yes, but which part of the program is UB, and which part of the standard defines that it is? – Stephan Lechner Nov 09 '17 at 22:16

1 Answer


Technically, an implementation only needs to translate and execute one program with a 65,535-byte object (and the other things listed) in order to conform to the standard. It could fail on all others.

To know that larger programs work, you must rely on details of your specific implementation. Most implementations provide for more stack space than 64 KiB, although it may be undocumented. There may be linker switches for adjusting the stack space allowed.

E.g., for the ld linker on current macOS, the default is 8 MiB, and the -stack_size switch can be used to set more or less (for the main thread).

I would say that since the C standard says that environmental limits such as stack space may constrain the implementation, anything other than the fact that one particular sample program must work is technically undefined behavior.

Eric Postpischil
    "Most implementations provide for more stack space than 64 KiB" --> There are 100s of millions of embedded processors each year using C with various degrees of limited resources, so I have doubts about "most". – chux - Reinstate Monica Nov 09 '17 at 22:26
    @chux: If you want to be technical, I expect none of those is a C implementation since none of them documents all the things the C standard requires an implementation to document. I believe the number of conforming C implementation is zero, and 51% of those have large stacks. – Eric Postpischil Nov 09 '17 at 22:35
    Certainly agree with the 51% of 0 part. – chux - Reinstate Monica Nov 09 '17 at 23:33