I was reading up on array types in Ada and found it interesting that, unlike in C++, the language allows their size to be unknown at compile time. I was not sure how they are implemented, though, so I wrote a small test:
with Ada.Text_IO;      use Ada.Text_IO;
with Ada.Command_Line; use Ada.Command_Line;

procedure Main is
   type Data is array (1 .. 131_072) of Integer;
   type Vector is array (Positive range <>) of Data;

   Sequence : Vector (1 .. Argument_Count);
begin
   for I in Sequence'Range loop
      Sequence (I) := (others => I);
      Put (Integer'Image (Sequence (I)(1)));
   end loop;
end Main;
I then tried to replicate this code in C using variable length arrays:
#include <stdio.h>

struct data {
    int x[131072];
};

int main(int argc, char **argv) {
    (void)argv;
    struct data sequence[argc - 1];

    for (int i = 0; i < argc - 1; i++) {
        for (int j = 0; j < 131072; j++)
            sequence[i].x[j] = i;
        printf("%i ", sequence[i].x[1]);
    }
}
I compiled the Ada version with gnatmake -gnato -fstack-check -gnat2012 -gnata -O3 main.adb -o main and the C version with gcc -O3 -Wall -Werror -Wextra -pedantic cmain.c -o cmain. After running the programs, both failed when given 16 or more arguments; the difference was that cmain simply segfaulted, while main raised "STORAGE_ERROR : stack overflow or erroneous memory access".
Since both VLAs and unconstrained arrays seem (at least on the surface) to be implemented in a similar manner, and the former are widely considered unsafe to use in nearly all circumstances, is it safe to use the latter?