I am developing my own string class that has both a small-string optimization and an internal flag recording whether the string is ASCII, UTF-8, WTF-8 or a byte string. The constructor
String(const char* );
can be used to construct either an ASCII or a UTF-8 string. It should only be used with literals, such as:
const String last_name = "Fayard";
const String first_name = "François";
The constructor needs both to compute the length of the string and to check whether it is ASCII or UTF-8. Therefore, I wrote these functions so that they can be evaluated at compile time.
inline constexpr il::int_t size(const char* s) {
return (*s == '\0') ? 0 : (size(s + 1) + 1);
}
inline constexpr bool isAscii(const char* s) {
return (*s == '\0')
? true
: (((static_cast<unsigned char>(*s) & 0x80_uchar) ==
0x00_uchar) && isAscii(s + 1));
}
The constructor is written like this and is available in the headers so it can be inlined.
String(const char* data) {
  const il::int_t n = size(data);
  const bool ascii = isAscii(data);
  if (n <= max_small_string) {
    ...
  } else {
    data_ = static_cast<char*>(std::malloc(static_cast<std::size_t>(n) + 1));
    ...
  }
}
But I can't manage to get the functions size and isAscii to be evaluated at compile time (I tried and checked the generated assembly with gcc 4.8.5, clang 4.0.1 and icpc 17.0.4). Is there a way to do that?
PS: The solution needs to be C++11 only and must compile with gcc 4.8.5 and Visual Studio 2015.