
So my teacher challenged me to put a bird into a C++ Win32 program. I found the Unicode code point U+1F426, but how can I put it into the program? Any thoughts? Thanks

1 Answer


Insofar as you're using an output medium that supports printing it, you can do this easily via UTF-16 string literals: L"\U0001f426", although embedding the character directly as L"🐦" may work as well:

#include <iostream>
#include <io.h>
#include <fcntl.h>

int wmain(int argc, wchar_t* argv[])
{
    // Switch stdout to UTF-16 mode so wcout can emit the surrogate pair intact.
    _setmode(_fileno(stdout), _O_U16TEXT);
    std::wcout << L"🐦" << std::endl; // U+1F426 BIRD; equivalently L"\U0001f426"
}
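The `_setmode(_fileno(stdout), _O_U16TEXT)` call (which is why `<io.h>` and `<fcntl.h>` are included) puts stdout into UTF-16 translation mode, so `wcout` writes the surrogate pair intact instead of mangling it through the console code page. One caveat: once the stream is in this mode, mixing in narrow output such as `printf` or `std::cout` triggers a CRT assertion. Assuming MSVC, something like `cl /EHsc bird.cpp` should build it.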
Mgetz
  • Console output is unlikely to work; I'd suggest displaying to a window instead. – Mark Ransom Apr 23 '15 at 14:43
  • @MarkRansom I linked to a question that shows how to display Unicode correctly on the console. But you are correct in the sense that you'll need to use a font that supports that character. Sadly my Win32 UI skills are lacking. – Mgetz Apr 23 '15 at 14:44
  • Sorry I didn't notice the `_setmode` call the first time I saw this answer, that changes everything. You're still unlikely to have the character in your console font, but at least it's not completely unfeasible. – Mark Ransom Apr 23 '15 at 15:04
  • On systems where `wchar_t` is 2 bytes in size, you might need to use `L"\uD83D\uDC26"` instead of `L"\U0001f426"`, depending on your compiler. – Remy Lebeau May 05 '15 at 20:46
  • @RemyLebeau Aye, but this is Windows, and it supports the use of the character directly, according to MSDN. – Mgetz May 06 '15 at 01:50
  • @Mgetz: Again, it depends on whether the compiler you are using supports the `\U` (uppercase U) syntax for defining a UTF-32 codepoint value in a UTF-16 wide literal. Otherwise, use the `\u` (lowercase u) syntax to define the individual UTF-16 codeunits for the codepoint. – Remy Lebeau May 06 '15 at 02:04
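As a rough illustration of the window route suggested in the comments, a bare `MessageBoxW` call is enough to check whether the glyph renders at all. This is a minimal sketch, assuming a font with the glyph (e.g. Segoe UI Emoji) is installed; it uses the surrogate-pair spelling so it compiles even on compilers that don't accept `\U` in wide literals:

#include <windows.h>

// Build with e.g.: cl bird.cpp user32.lib
int WINAPI wWinMain(HINSTANCE, HINSTANCE, PWSTR, int)
{
    // L"\uD83D\uDC26" is the UTF-16 surrogate pair for U+1F426 BIRD;
    // compilers that accept \U in wide literals take L"\U0001F426" too.
    MessageBoxW(nullptr, L"\uD83D\uDC26", L"U+1F426", MB_OK);
    return 0;
}

The message box uses the system UI font, so on Windows 8.1 and later font fallback will usually pull the glyph from Segoe UI Emoji; on older systems it may still show up as a placeholder box.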