To check a combined value against an enum value, I run the following code (behavior noted in the comments):
// 001 | 010 = 011.
Enum1 firstAndSecond1 = Enum1.First | Enum1.Second;
// 011 | 000 = 011, so the comparison is true - but I expect false!
Console.WriteLine("None: {0}", (firstAndSecond1 | Enum1.None) == firstAndSecond1);
// 0010 | 0100 = 0110.
Enum2 firstAndSecond2 = Enum2.First | Enum2.Second;
// 0110 | 0001 = 0111, so the comparison is false.
Console.WriteLine("None: {0}", (firstAndSecond2 | Enum2.None) == firstAndSecond2);
[Flags]
public enum Enum1
{
    None   = 0b_000,
    First  = 0b_001,
    Second = 0b_010,
    Third  = 0b_100,
    All    = 0b_111
}
[Flags]
public enum Enum2
{
    None   = 0b_0001,
    First  = 0b_0010,
    Second = 0b_0100,
    Third  = 0b_1000,
    All    = 0b_1111
}
The problem is that if I assign the None value as 0, the check is incorrect: it returns true. It only works if the None value is set to 1 << 0.

Should the None value always be set to a power of two, or am I missing something? For example, looking at this topic: What does the [Flags] Enum Attribute mean in C#?, the None values start from 0 in all the examples - this is what confuses me.