
I considered the C++11-based enum bitset introduced here (http://stackoverflow.com/a/31906371/1255016). I came up with the following sample program:

#include <bitset>
#include <type_traits>
#include <limits>

template <typename TENUM>
class FlagSet {

private:
  using TUNDER = typename std::underlying_type<TENUM>::type;
  // sized to numeric_limits<TUNDER>::max() bits -- huge for a 32-bit underlying type
  std::bitset<std::numeric_limits<TUNDER>::max()> m_flags;

public:
  FlagSet() = default;

  FlagSet(const FlagSet& other) = default;
};

enum class Test
{
  FIRST,
  SECOND
};


int main(int argc, char *argv[])
{
  FlagSet<Test> testFlags;
  return 0;
}

The program compiles just fine using clang++ (clang version 3.8.1 (tags/RELEASE_381/final)) via clang++ -std=c++11 -o main main.cc. However, if I use g++ (g++ (GCC) 6.2.1 20160830) via g++ -std=c++11 -o main main.cc instead, the compiler eventually exhausts system memory. Is this an issue with g++ or is this code somehow not compliant with the standard?

hfhc2
  • `std::bitset<std::numeric_limits<TUNDER>::max()>` That's one big bitset. – Borgleader Sep 13 '16 at 12:27
  • I wonder if g++ and clang use a different underlying type. – NathanOliver Sep 13 '16 at 12:28
  • @NathanOliver AFAIK the default underlying type for an enum class is a 32-bit int? (Or whatever the default is, the standard specifies one, because you can forward declare an enum class, as opposed to regular enums, which cannot be.) – Borgleader Sep 13 '16 at 12:30
  • @Borgleader AFAIK you are correct. I just wonder if that could be the difference. Unless g++ just can't handle a `std::bitset<std::numeric_limits<int>::max()>`. – NathanOliver Sep 13 '16 at 12:31
  • Which looks like it is the case. Coliru gives a timeout if main is just `std::bitset<std::numeric_limits<int>::max()> foo;`. – NathanOliver Sep 13 '16 at 12:33
  • Apparently both clang++ and g++ report that the maximum of the underlying type is equal to 2147483647, i.e. the underlying type is a signed 32-bit int. – hfhc2 Sep 13 '16 at 12:35
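
For reference, a minimal check (not part of the original exchange) that reproduces what the last comment reports; it assumes the same `Test` enum as in the question:

#include <iostream>
#include <limits>
#include <type_traits>

enum class Test { FIRST, SECOND };

int main()
{
  using TUNDER = std::underlying_type<Test>::type;
  // On common ABIs this prints 4 and 2147483647: a signed 32-bit int.
  std::cout << sizeof(TUNDER) << '\n'
            << std::numeric_limits<TUNDER>::max() << '\n';
  return 0;
}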

1 Answer


std::bitset<std::numeric_limits<TUNDER>::max()> is 256 MiB in size, assuming a 32-bit int: 2^31 - 1 bits is just under 2^28 bytes, i.e. 256 MiB. It's great that clang successfully compiles it, but it's not particularly surprising that gcc runs out of memory.

If you're intending to use the enumerators as bitset indices you'll have to pass the largest enumerator in as a separate template parameter; there is as yet no way to find the range of an enumeration (see Max and min values in a C++ enum).

Example:

#include <bitset>
#include <cstddef>

template <typename TENUM, TENUM MAX>
class FlagSet {

private:
  // A scoped enumerator has no implicit conversion, so cast to get the bit count.
  std::bitset<static_cast<std::size_t>(MAX) + 1> m_flags;

public:
  FlagSet() = default;

  FlagSet(const FlagSet& other) = default;
};

enum class Test
{
  FIRST,
  SECOND,
  MAX = SECOND
};

FlagSet<Test, Test::MAX> testFlags;
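
For completeness, a sketch (not part of the original answer) of how the enumerators might then be used as indices; the `set` and `test` members are hypothetical additions:

#include <bitset>
#include <cstddef>

template <typename TENUM, TENUM MAX>
class FlagSet {
public:
  FlagSet() = default;

  // Hypothetical convenience members: the enumerator itself is the bit index.
  FlagSet& set(TENUM flag)
  {
    m_flags.set(static_cast<std::size_t>(flag));
    return *this;
  }

  bool test(TENUM flag) const
  {
    return m_flags.test(static_cast<std::size_t>(flag));
  }

private:
  std::bitset<static_cast<std::size_t>(MAX) + 1> m_flags;
};

enum class Test { FIRST, SECOND, MAX = SECOND };

int main()
{
  FlagSet<Test, Test::MAX> flags;
  flags.set(Test::FIRST);
  return flags.test(Test::FIRST) ? 0 : 1;  // exits 0 if the bit round-trips
}
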
ecatmur
  • Or maybe OP expected max() to return the highest value in the enum, that was my guess. Either way, using the max value of the underlying type is overzealous, to say the least. – Borgleader Sep 13 '16 at 12:36
  • Well, I was just blindly copying from here: http://stackoverflow.com/a/31906371/1255016 I guess the answer should be edited... – hfhc2 Sep 13 '16 at 12:37
  • @hfhc2 ah... that answer is using a non-`class` enum, so it'll have underlying type `char`; so while the space is still wasteful (128 or 256 bits, most likely) it won't crash the compiler. – ecatmur Sep 13 '16 at 12:40
  • @hfhc2 No, the answer is fine in a pre-C++11 context; enum (not enum class) is defined to be as small as possible, which means an enum would use char as its underlying type, whereas enum class is defined to be a 32-bit int by default. Which means you're getting a bitset that is *much* larger by using enum class. – Borgleader Sep 13 '16 at 12:40
  • @Borgleader: Interesting, the things one does not know about C++ :) I take it that the `digits` approach is portable and minimal? – hfhc2 Sep 13 '16 at 12:49
  • @hfhc2 `digits` would only be correct if the enumeration were a flags enumeration (`A = 1, B = 2, C = 4` etc.; see the sketch after these comments). I don't think that's what you have, so you'll have to supply the maximum value and use that as the size of the bitset. – ecatmur Sep 13 '16 at 12:52
  • Well, that is a shame. I would have like to avoid having to specify the values of the enum entries. – hfhc2 Sep 13 '16 at 13:01
  • @Borgleader can you cite that enum was as small as possible in C++03? – Yakk - Adam Nevraumont Sep 13 '16 at 13:19
  • @Yakk I think @Borgleader means that in C++03 `enum class` did not exist so one would always use non-`class` enum. – ecatmur Sep 13 '16 at 13:28
  • @ecatmur Ok, then a citation that an `enum` is "as small as possible" even in C++11. I am aware of no such guarantee. This could be because it does not exist, or because I am simply unaware of it. Hence a request for citation! – Yakk - Adam Nevraumont Sep 13 '16 at 13:32
  • @Yakk It's possible that it isn't guaranteed, but I think that was the rationale, i.e. allow the enum to be stored in a smaller type if it can be. I'm not a language lawyer, sorry. – Borgleader Sep 13 '16 at 13:34
  • @Yakk sorry, you're totally right; the sole guarantee ([dcl.enum]/7) is that it should be no larger than `int` if it fits in `int`; the actual size is implementation-defined. Indeed it looks like both the x86 and Windows ABI use `int` for non-fixed enums that fit in `int`. – ecatmur Sep 13 '16 at 13:51
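
To illustrate the `digits` point from the comments, a sketch (not from the thread; the `Flags` enum is the illustrative `A = 1, B = 2, C = 4` example mentioned above): for a flags-style enumeration whose enumerators are distinct powers of two, `std::numeric_limits<int>::digits` bounds the number of usable bit positions, so it is a reasonable bitset size.

#include <bitset>
#include <limits>

enum Flags { A = 1, B = 2, C = 4 };  // unscoped flags enum: each value is one bit

int main()
{
  // digits is 31 for a signed 32-bit int: one slot per possible bit position.
  std::bitset<std::numeric_limits<int>::digits> flags;
  flags.set(0);  // the position of A's bit
  flags.set(2);  // the position of C's bit
  return flags.count() == 2 ? 0 : 1;
}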