I can't tell the effective difference between these two ways of converting a JSStringRef to a std::string. The first one seems to work robustly. The second one works with the resize adjustment, but has issues with incomplete multibyte strings, and when I remove the resize(bytesWritten - 1) it doesn't work correctly at all. I would love to know why these behave differently (a small test sketch follows the second snippet). Thanks!
First:
size_t maxBytes = JSStringGetMaximumUTF8CStringSize(str);
std::vector<char> buffer(maxBytes);
// Converts to UTF-8 and null-terminates the output.
JSStringGetUTF8CString(str, buffer.data(), maxBytes);
// The const char* constructor stops copying at the first '\0'.
return std::string(buffer.data());
Second:
std::string result;
size_t maxBytes = JSStringGetMaximumUTF8CStringSize(str);
result.resize(maxBytes);
size_t bytesWritten = JSStringGetUTF8CString(str, &result[0], maxBytes);
// JSStringGetUTF8CString's return value counts the null terminator it
// writes, so we resize to `bytesWritten - 1` to drop the terminator and
// the unused tail, leaving `result` with the correct length.
result.resize(bytesWritten - 1);
return result;
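
For context, here is a minimal, self-contained sketch of what I'm observing. It assumes the Apple SDK header layout (<JavaScriptCore/JavaScriptCore.h>) and an arbitrary test string of my own; without the final resize, the second result keeps the full maxBytes length, including the embedded terminator and zero padding:

#include <JavaScriptCore/JavaScriptCore.h> // assuming the Apple SDK header path
#include <iostream>
#include <string>
#include <vector>

int main() {
    // Hypothetical test input: "héllo" contains a two-byte UTF-8 sequence.
    JSStringRef str = JSStringCreateWithUTF8CString("h\xC3\xA9llo");
    size_t maxBytes = JSStringGetMaximumUTF8CStringSize(str);

    // First approach: the const char* constructor trims at the '\0'.
    std::vector<char> buffer(maxBytes);
    JSStringGetUTF8CString(str, buffer.data(), maxBytes);
    std::string first(buffer.data());

    // Second approach, before the final resize: size() is still maxBytes,
    // with an embedded '\0' and zero padding after the real content.
    std::string second;
    second.resize(maxBytes);
    size_t bytesWritten = JSStringGetUTF8CString(str, &second[0], maxBytes);

    std::cout << first.size() << " vs " << second.size() << '\n'; // 6 vs maxBytes
    second.resize(bytesWritten - 1); // drop the terminator; lengths now match
    std::cout << (first == second) << '\n'; // prints 1
    JSStringRelease(str);
}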