
I understand that VLAs are part of C99 but not part of C++ (including C++11). My question, then, is why and how g++ compiles code containing VLAs. I wrote a test case below:

test.c

#include "stdio.h"

int main(int argc, char *argv[]) {
    int i;
    int array[argc];

    for(i = 0; i < argc; i++)
        array[i] = i;

    for (i = 0; i < argc; i++)
        printf("%d ", array[i]);
    puts("");

    return 0;
}

I also copied it into a file called test.cpp. The following all compile:

$ gcc test.c -o test                 // expected
$ g++ test.cpp -o test               // not expected
$ g++ -x c++ test.cpp -o test        // really not expected
$ g++ test.cpp -o test -std=c++11    // really not expected

I have read the SO questions Variable length arrays in C++? and What is the difference between g++ and gcc?, but I cannot find an answer as to why g++ accepts this or how it does so. This seems like extremely unpredictable behavior, since the standard does not allow VLAs. Does g++ recognize this as C code and compile it with cc1? Is there a way to force g++ to stick to the standard?

Additionally, I can break it with the following:

test2.cpp

#include "stdio.h"
#include <vector>

int main(int argc, char *argv[]) {
    int i;
    int array[argc];

    for(i = 0; i < argc; i++)
        array[i] = i;

    for (i = 0; i < argc; i++)
        printf("%d ", array[i]);
    puts("");

    std::vector<decltype(array)> temp;

    return 0;
}

Which then fails to compile:

$ g++ test2.cpp -o test2 -std=c++11
test2.cpp: In function ‘int main(int, char**)’:
test2.cpp:16:32: error: ‘int [(((sizetype)(((ssizetype)argc) + -1)) + 1)]’ is a variably modified type
     std::vector<decltype(array)> temp;
                                ^
test2.cpp:16:32: error:   trying to instantiate ‘template<class> class std::allocator’
test2.cpp:16:32: error: template argument 2 is invalid
test2.cpp:16:38: error: invalid type in declaration before ‘;’ token
     std::vector<decltype(array)> temp;

This error appears to mean that I am still allowed to declare VLAs; I just can't pass one to decltype (or use it as a template argument). That makes everything seem even stranger. I can understand the obvious answers of "it's non-standard, so just don't do it" or "if you're using C++, use std::vector", but those feel like end-game answers and don't quite get to the root of the issue.
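For reference, a standard-conforming sketch of the std::vector alternative mentioned above might look like this (test3.cpp is a hypothetical file name, not part of my original test case). With std::vector, decltype(array) is an ordinary type, so the template instantiation is fine:

test3.cpp

#include <stdio.h>
#include <vector>

int main(int argc, char *argv[]) {
    // Run-time-sized std::vector instead of a VLA.
    std::vector<int> array(argc);

    for (int i = 0; i < argc; i++)
        array[i] = i;

    for (int i = 0; i < argc; i++)
        printf("%d ", array[i]);
    puts("");

    // array is a std::vector<int>, so this template argument is valid.
    std::vector<decltype(array)> temp;

    return 0;
}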

I cannot compile on Visual Studio to compare, as I do not currently have access to it.

System information in case that is the difference:

$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 14.04.1 LTS
Release:    14.04
Codename:   trusty

$ g++ -v
Using built-in specs.
COLLECT_GCC=g++
COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-linux-gnu/4.8/lto-wrapper
Target: x86_64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu 4.8.2-19ubuntu1' --with-bugurl=file:///usr/share/doc/gcc-4.8/README.Bugs --enable-languages=c,c++,java,go,d,fortran,objc,obj-c++ --prefix=/usr --program-suffix=-4.8 --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --with-gxx-include-dir=/usr/include/c++/4.8 --libdir=/usr/lib --enable-nls --with-sysroot=/ --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-gnu-unique-object --disable-libmudflap --enable-plugin --with-system-zlib --disable-browser-plugin --enable-java-awt=gtk --enable-gtk-cairo --with-java-home=/usr/lib/jvm/java-1.5.0-gcj-4.8-amd64/jre --enable-java-home --with-jvm-root-dir=/usr/lib/jvm/java-1.5.0-gcj-4.8-amd64 --with-jvm-jar-dir=/usr/lib/jvm-exports/java-1.5.0-gcj-4.8-amd64 --with-arch-directory=amd64 --with-ecj-jar=/usr/share/java/eclipse-ecj.jar --enable-objc-gc --enable-multiarch --disable-werror --with-arch-32=i686 --with-abi=m64 --with-multilib-list=m32,m64,mx32 --with-tune=generic --enable-checking=release --build=x86_64-linux-gnu --host=x86_64-linux-gnu --target=x86_64-linux-gnu
Thread model: posix
gcc version 4.8.2 (Ubuntu 4.8.2-19ubuntu1) 
  • Compile with `-pedantic` and you'll get a warning. By default, GCC allows them. – chris Feb 09 '15 at 23:12
  • Compilers typically implement a whole bunch of extensions, you have to tell it not to if you don't want them. And even then you get some anyway; the standard usually specifies a minimum set of behaviour. – M.M Feb 09 '15 at 23:43
  • Are you asking how GCC implements its VLA extension? Or? – Lightness Races in Orbit Feb 10 '15 at 00:09
  • @LightnessRacesinOrbit he's asking why g++ does not reject the code containing VLA – M.M Feb 10 '15 at 00:52
  • @Matt McNabb That's what I was looking for, thank you! Standards are minimums and this functionality is due to an extension. – shiveagit Feb 10 '15 at 07:38

1 Answer


VLAs are a gcc (and clang) compiler extension in C++ mode, documented in the GCC manual. These extensions are enabled by default.

You should see a warning if you compile with -pedantic.
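For example, with the test case above (the flags are standard GCC options; the exact diagnostic text and source location are approximate and vary between GCC versions):

$ g++ -pedantic test.cpp -o test
test.cpp: In function ‘int main(int, char**)’:
test.cpp:5:20: warning: ISO C++ forbids variable length array ‘array’ [-Wvla]

If you want the code rejected rather than merely flagged, -Werror=vla turns that specific warning into an error, and -pedantic-errors turns everything -pedantic would warn about into errors:

$ g++ -Werror=vla test.cpp -o test
$ g++ -pedantic-errors test.cpp -o test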

Funnily enough, you will not get one with gcc 4.9 and -std=c++14 or -std=c++1y. As the GCC developers explained, the -Wvla warning was removed in C++14 mode because VLAs were at one point part of the C++14 draft. They did not actually make it into the final version of C++14.

clang still warns in C++14 mode, as it should.

Edit: gcc 5.1 warns in C++14 mode too, so that got updated correctly.

Drew Dormann
Baum mit Augen