Consider:
int test(int 🌷)
{
    auto 🌵 = 5;
    🌵 += 🌷;
    return 🌵;
}
Is there a way to get g++ to compile this program? I've tried

g++ -fextended-identifiers -finput-charset=utf-8 -c utf8-test.cpp

and I get back errors like this:
utf8-test.cpp:1:9: error: stray ‘\360’ in program
    1 | int test(int 🌷)
      |              ^
It works great with clang.
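By comparison, a plain invocation along these lines (the -std flag is just a reasonable default, not something the errors above depend on) compiles the file without complaint:

clang++ -std=c++17 -c utf8-test.cpp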
Note that I'm not generally endorsing the practice of using emoji in identifiers; it was just a limiting case for testing purposes. That said, I have a class it would be really nice to name ℤpField (with a nice type alias to ZPField to save everyone the pain of figuring out how to type it).
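Roughly what I have in mind (a sketch only; the template parameter is invented for illustration, and \U00002124 is the UCN spelling of ℤ, U+2124, which pre-C++23 dialects accept as an identifier character):

// Sketch: the class name is ℤpField, written with a UCN.
template <int p>
class \U00002124pField { /* arithmetic mod p */ };

template <int p>
using ZPField = \U00002124pField<p>;  // ASCII-typable alias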
Edit: The TL;DR answer appears to be that gcc will likely support this in gcc 10, but it doesn't right now, and this is a known issue. It's also not clear that such support is required by the standard; it's possible that you're required to spell Unicode characters in identifiers as \UXXXXXXXX universal character name (UCN) escapes rather than writing them as raw UTF-8.
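For example, spelling the same emoji as UCNs (🌷 is U+1F337, 🌵 is U+1F335) should, as I understand it, get the original program past current g++ with -fextended-identifiers:

int test(int \U0001F337)        // the same identifier as 🌷
{
    auto \U0001F335 = 5;        // 🌵
    \U0001F335 += \U0001F337;
    return \U0001F335;
}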