
In Xcode 6, what is the "Compiler Default" setting for the C++ language dialect?

I am using a new C++ feature, std::max({a, b, c}).

If I use "Compiler Default", it fails to compile.

When I change it to "C++11" or "GNU++11", it compiles fine.

I am wondering: is the compiler default C++98?

Adam Lee

1 Answer


I ran the code below and it printed: GNU - C++98.

#include <iostream>

int main()
{
    // GNU mode: strict c++* modes define __STRICT_ANSI__, gnu++* modes do not
#ifndef __STRICT_ANSI__
    std::cout << "GNU - ";
#endif

    // ISO C++ standard version
#if __cplusplus == 199711L
    std::cout << "C++98" << std::endl;
#elif __cplusplus == 201103L
    std::cout << "C++11" << std::endl;
#elif __cplusplus > 201103L
    std::cout << "C++14" << std::endl;
#endif
}

Macros chosen

  1. __cplusplus - From the gcc online documentation:

    Depending on the language standard selected, the value of the macro is 199711L, as mandated by the 1998 C++ standard; 201103L, per the 2011 C++ standard; an unspecified value strictly larger than 201103L for the experimental languages enabled by -std=c++1y and -std=gnu++1y.

  2. __STRICT_ANSI__ - From the clang user manual:

    Differences between all c* and gnu* modes: c* modes define __STRICT_ANSI__.

As a side note, using __STRICT_ANSI__ to differentiate the GNU dialects can also be seen in this SO answer:

$ g++ -E -dM -std=c++11 -x c++ /dev/null >b
$ g++ -E -dM -std=gnu++11 -x c++ /dev/null >a
$ diff -u a b
--- a  2014-12-19 12:27:11.000000000 +0530
+++ b  2014-12-19 12:27:05.000000000 +0530
@@ -144,6 +144,7 @@
 #define __STDC_UTF_16__ 1
 #define __STDC_UTF_32__ 1
 #define __STDC__ 1
+#define __STRICT_ANSI__ 1
 #define __UINTMAX_TYPE__ long unsigned int
 #define __USER_LABEL_PREFIX__ _
 #define __VERSION__ "4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.54)"
kiranpradeep