"String processing functions" is overly generic and as such misleading.
There is a limited set of libraries that do locale-aware conversions.
Only one (to the best of my knowledge) is specified in terms of IOStreams: Boost Lexical Cast. As such it would be subject to the underlying standard library's implementation of locale, and it's possible that it uses locks.
To be more complete:
Boost Locale is - obviously - locale aware but needn't use the global locale, and as such might be using the more modern thread-safe locale functions from libc. To change locale parameters in a thread-safe fashion, POSIX defined the newlocale(), uselocale() and freelocale() functions.
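For illustration, here is a minimal sketch of those POSIX calls (assuming a glibc/BSD-style libc and that the "de_DE.UTF-8" locale is installed); the switched locale only affects the calling thread:

```cpp
// Sketch: per-thread locale via POSIX newlocale()/uselocale()/freelocale().
// Assumes "de_DE.UTF-8" is installed; only this thread's locale is switched.
#include <locale.h>
#include <cstdio>

int main() {
    locale_t de = newlocale(LC_NUMERIC_MASK, "de_DE.UTF-8", (locale_t)0);
    if (de == (locale_t)0) {
        perror("newlocale");
        return 1;
    }

    locale_t previous = uselocale(de);  // switch the calling thread only
    printf("%.2f\n", 1234.5);           // formatted with the thread's locale
    uselocale(previous);                // restore whatever was active before

    freelocale(de);                     // release the locale object
    return 0;
}
```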
Boost Convert 2.0 has the option to use Lexical Cast or a stream-based converter. When you do, the situation is again dictated by the standard library implementation in use.
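A sketch of what that looks like (assuming the boost::cnv::cstream and boost::cnv::lexical_cast converters from the Boost Convert headers); whichever backend you pick, the locale behaviour is that of the underlying standard library:

```cpp
// Sketch: Boost Convert with a stream-based backend (boost::cnv::cstream).
// Assumes Boost.Convert 2.0 headers; swap in <boost/convert/lexical_cast.hpp>
// and boost::cnv::lexical_cast to use the Lexical Cast backend instead.
#include <boost/convert.hpp>
#include <boost/convert/stream.hpp>
#include <iostream>

int main() {
    boost::cnv::cstream cnv;  // stream-based converter

    // Returns boost::optional<int>; empty on failure instead of throwing.
    int i = boost::convert<int>("123", cnv).value_or(-1);
    std::cout << i << "\n";
}
```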
A subset of the String Algorithms library is locale aware, e.g. for case-insensitive comparison or case conversion. I'd say that if you use them with the global locale, the standard library's quality of implementation is again the deciding factor.
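For example (assuming the usual boost::algorithm overloads that take a std::locale, and that "en_US.UTF-8" exists on the system), you can pass your own locale instance explicitly and bypass the global one:

```cpp
// Sketch: Boost String Algorithms with an explicit locale argument, so the
// global locale is never consulted. Assumes "en_US.UTF-8" is installed.
#include <boost/algorithm/string.hpp>
#include <iostream>
#include <locale>
#include <string>

int main() {
    std::locale loc("en_US.UTF-8");  // a locale instance of our own

    bool same = boost::algorithm::iequals("Hello", "HELLO", loc);
    std::string upper = boost::algorithm::to_upper_copy(std::string("hello"), loc);

    std::cout << std::boolalpha << same << " " << upper << "\n";
}
```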
In the light of the above:
Think carefully about why you are using locale-aware parsing. Sometimes you don't need it - e.g. because there is no variance (parsing integers) or the locale itself is fixed (use e.g. Boost Spirit Qi or Boost Convert, or indeed std::stoi and friends, or C++17 from_chars, which is practically guaranteed to be the best performance money can buy).
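A minimal sketch of the C++17 route (no locale involved at all, no allocation, no exceptions):

```cpp
// Sketch: locale-independent integer parsing with std::from_chars (C++17).
#include <charconv>
#include <string_view>
#include <system_error>
#include <iostream>

int main() {
    std::string_view input = "12345";

    int value = 0;
    auto [ptr, ec] = std::from_chars(input.data(), input.data() + input.size(), value);

    if (ec == std::errc())
        std::cout << "parsed: " << value << "\n";
    else
        std::cout << "parse failed\n";
}
```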
When you require locale awareness, consider the options. Consider using a thread-private locale instance (so it comes with its own facets imbued) and measure your performance: measure more, worry less.
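A sketch of that last suggestion (assumption: the "de_DE.UTF-8" locale name is available): each thread keeps its own std::locale instance and imbues it into its own stream, so no global locale state is shared between threads:

```cpp
// Sketch: thread-private std::locale imbued into a per-call stream, so parsing
// never touches the global locale. Assumes "de_DE.UTF-8" is installed.
#include <sstream>
#include <locale>
#include <string>
#include <thread>
#include <cstdio>

double parse_de(std::string const& text) {
    thread_local std::locale loc("de_DE.UTF-8");  // one locale object per thread

    std::istringstream iss(text);
    iss.imbue(loc);  // facets come from the thread-private locale

    double value = 0;
    iss >> value;
    return value;
}

int main() {
    std::thread a([] { std::printf("%f\n", parse_de("1.234,5")); });
    std::thread b([] { std::printf("%f\n", parse_de("2,5")); });
    a.join();
    b.join();
}
```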