
I have a C++ application that I am porting to Mac OS X (specifically, 10.6). The app makes heavy use of the C++ standard library and Boost. I recently observed some breakage in the app that I'm having difficulty understanding.

Basically, the boost filesystem library throws a runtime exception when the program runs. With a bit of debugging and googling, I've reduced the offending call to the following minimal program:

#include <locale>

int main(int argc, char *argv[]) {
    std::locale::global(std::locale(""));
    return 0;
}

This program fails when I compile it with g++ and run the resulting binary in an environment where LANG=en_US.UTF-8 is set (which, on my machine, is part of the default bash session when I open a new console window). Clearing the environment variable (with `unset LANG`) allows the program to run without issue, but I'm surprised I'm seeing this breakage in the default configuration.

My questions are:

  1. Is this expected behavior for this code on Mac OS X 10.6?
  2. What would a proper workaround be? I can't really rewrite the call, because the version of the Boost libraries we are using executes this statement internally as part of the filesystem library.

For completeness, I should point out that the program from which this code was distilled crashes when launched via the `open` command (or from the Finder), but not when Xcode runs the program in Debug mode.

Edit: The error given by the above code on 10.6.1 is:

$ ./locale 
terminate called after throwing an instance of 'std::runtime_error'
  what():  locale::facet::_S_create_c_locale name not valid
Abort trap
asked by fixermark; edited by Peter Hosey
  • Can you give a short piece of code that exhibits the breakage you see, rather than a generic sample which cannot show it? (This is sometimes called a test case.) It might be as simple as including a boost.filesystem call into your current example. –  Nov 16 '09 at 21:54
  • When you say "throws a runtime exception", what exactly are you seeing? – quark Nov 16 '09 at 21:56
  • 2
    The code that he posted does exhibit the problem, at least on my machine. I will edit his question with the output I get. – Brian Campbell Nov 16 '09 at 22:08
  • 1
    Yes, I've also noticed that standard (as in `std::`) C++ locale support seems completely broken on Mac OS X . `std::locale("")` should select a default locale but fails to provide a working locale even if the user's environment is set to something that works with C's `setlocale` . Not worth an answer, but worth a comment. – CB Bailey Nov 16 '09 at 22:57
  • As noted in my answer, this isn't just 10.6. It's true on 10.4 too. – quark Nov 16 '09 at 23:02

6 Answers


Ok I don't have an answer for you, but I have some clues:

  • This isn't limited to OS X 10.6. I get the same result on a 10.4 machine.
  • I looked at the GCC source for libstdc++ and hunted around for _S_create_c_locale. What I found is on line 143 of config/locale/generic/c_locale.cc. The comment there says "Currently, the generic model only supports the "C" locale." That's not promising. In fact, if I set LANG=C the runtime error goes away, but any other value of LANG I try causes the same error, regardless of what arguments I give to the locale constructor (I tried locale::classic(), "C", "", and the default). This is true as far back as GCC 4.0.
  • That same page has a reference to libstdc++ mailing list discussion on this topic. I don't know how fruitful it is: I only followed it a little way down, and it gets very technical very fast.

None of this tells you why the default locale on 10.6 wouldn't work with std::locale, but it does suggest a workaround, which is to set LANG=C before running the program.
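If changing the environment isn't an option, the exception can also be absorbed in code. This is not from the thread, just a minimal sketch assuming a fallback to the classic "C" locale is acceptable for your application (the helper name is hypothetical):

```cpp
#include <locale>
#include <stdexcept>
#include <string>

// Hypothetical helper: try the user's preferred locale first, and fall
// back to the classic "C" locale if the C++ runtime rejects it, as
// libstdc++'s generic locale model does on Mac OS X.
std::string init_global_locale() {
    try {
        std::locale::global(std::locale(""));        // may throw std::runtime_error
    } catch (const std::runtime_error&) {
        std::locale::global(std::locale::classic()); // always available
    }
    return std::locale().name();                     // name of the locale in effect
}
```

Calling this once at startup, before any Boost.Filesystem use, would at least avoid the uncaught-exception abort, at the cost of running in the "C" locale on affected systems.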

quark
  • Thank you for the excellent gumshoe work. I already had a workaround I'm using, as described in the original question (setting `LANG=`). I can execute the workaround, but I'm still curious why the default configuration seems broken. – fixermark Nov 18 '09 at 18:38
  • So the answer seems to be that libstdc++ just does not support locales other than "C" on Mac OS X. I tried the same test with libc++ instead, and it does work. (However, I notice that while you can construct a locale that uses a non-UTF-8 encoding, the underlying xlocale support doesn't actually support converting to or from such locales. It seems that only UTF-8 and possibly other Unicode encodings are supported.) – bames53 Nov 11 '11 at 15:18

I have encountered this problem very recently on Ubuntu 14.04 LTS and on a Raspberry Pi running the latest Raspbian Wheezy.

It has nothing to do with OS X, rather with a combination of G++ and Boost (at least up to V1.55) and the default locale settings on certain platforms. There are Boost bug tickets sort of related to this issue, see ticket #4688 and ticket #5928.

My "solution" was first to do some extra locale setup, as suggested by this AskUbuntu posting:

sudo locale-gen en_US en_US.UTF-8
sudo dpkg-reconfigure locales

But then, I also had to make sure that the environment variable LC_ALL is set to the value of LANG (it is advisable to put this in your .profile):

export LC_ALL=$LANG

In my case I use the locale en_US.UTF-8.

Final remark: the OP said "This program fails when I run this through g++". I understand that this thread was started in 2009, but today there is absolutely no need to use GCC or G++ on the Mac; the much better LLVM/Clang compiler suite is available from Apple free of charge (see the Xcode home page).

András Aszódi

The situation is still the same. But some functionality may be gained by

setlocale( LC_ALL, "" );

In my testing, this gets you UTF-8 encoding on wide iostreams, but not money formatting.

locale::global( locale( "" ) );

should be equivalent, but it still crashes when subsequently executed in the very same program.
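A minimal sketch of this C-level approach, assuming the caller checks the return value (unlike std::locale(""), setlocale reports failure by returning NULL rather than throwing):

```cpp
#include <clocale>

// Sketch: select the user's environment locale at the C level, falling
// back to "C" if the native locale name is not recognized.
const char* init_c_locale() {
    const char* name = std::setlocale(LC_ALL, "");   // "" means: use the environment
    if (name == nullptr) {
        name = std::setlocale(LC_ALL, "C");          // guaranteed to succeed
    }
    return name;  // name of the locale actually selected
}
```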

Potatoswatter

I had the same problem. I checked LANG and LC_MESSAGES, and they are not set when you launch the application through the Finder, so the following lines saved the day:

#include <cstdlib>

unsetenv("LANG");
unsetenv("LC_MESSAGES");
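Before resorting to unsetenv, it can help to see what the process actually inherited. A small diagnostic sketch (the helper name is hypothetical) that reports the variables std::locale("") consults, since a Finder-launched program may see none of them:

```cpp
#include <cstdlib>
#include <initializer_list>
#include <string>

// Report the locale-related environment variables as one string,
// marking each as set or unset.
std::string describe_locale_env() {
    std::string out;
    for (const char* v : {"LC_ALL", "LC_MESSAGES", "LANG"}) {
        const char* val = std::getenv(v);
        out += v;
        out += '=';
        out += val ? val : "(unset)";
        out += '\n';
    }
    return out;
}
```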
teki

The _S_create_c_locale exception seems to indicate some sort of misconfiguration: check that whatever your LC_ALL or LANG environment variable is set to, exists in the output of locale -a.

$ env LC_ALL=xx_YY ./test
terminate called after throwing an instance of 'std::runtime_error'
  what():  locale::facet::_S_create_c_locale name not valid
Aborted
$ env LC_ALL=C ./test
$ echo $?
0

But since you're on OS X, I'm not really sure how locale information is supposed to be handled.

ephemient
  • `locale -a` lists the locale `en_US.UTF-8` as you might expect, so unfortunately that's not enough. – quark Nov 16 '09 at 23:10

Quoting the accepted answer:

It has nothing to do with OS X

I encountered this issue on macOS Big Sur using an outdated macOS utility, VMware's ovftool. None of the LANG/LC_ALL workarounds above fixed it; updating the tool was the only way to make the error go away.

In my specific case, the error occurred using ovftool 4.1.0, and the error went away using ovftool 4.4.3.

tresf