I have a simple SendChar function:

void SendChar(const char mC)
{
    INPUT ip={0}; KEYBDINPUT kb={0};
    char mK=mC;

    kb.wScan=mK;
    kb.dwFlags=KEYEVENTF_UNICODE;
    ip.type=INPUT_KEYBOARD;
    ip.ki=kb;
    SendInput(1,&ip,sizeof(INPUT));
}

It runs fine with normal keys, but when I try to send a Unicode character, for example 'á' (0xE1 in the Unicode table), it sends the wrong character ('£'):

SendChar(0xE1);
SendChar('á');

But it succeeds with this:

void SendChar()
{
    INPUT ip={0}; KEYBDINPUT kb={0};

    kb.wScan=0xE1;
    kb.dwFlags=KEYEVENTF_UNICODE;
    ip.type=INPUT_KEYBOARD;
    ip.ki=kb;
    SendInput(1,&ip,sizeof(INPUT));
}

Please help me understand what is wrong with my first function.

  • You are trying to send UTF-8 but MS-Windows is UTF-16. So all you are sending is extended ASCII not Unicode. – Richard Critten Mar 17 '17 at 13:49
  • Thanks Richard, Can you show me how to send unicode key in my app? – HuynhAT Mar 17 '17 at 13:53
    change `void SendChar(const char mC)` to `void SendChar(const wchar_t mC)` and pass it a UTF-16 Unicode character. Try `SendChar(L'£');` – Richard Critten Mar 17 '17 at 13:55
  • Thank you very much, It worked! – HuynhAT Mar 17 '17 at 13:59
  • Note that Unicode characters outside of the BMP do not fit in a single `wchar_t`. You should change `SendChar()` to accept a `wchar_t*` or `std::wstring` string instead, calling `SendInput()` with an array of `INPUT`s for each `wchar_t` (not counting the null terminator). See http://stackoverflow.com/a/38625599/65863. – Remy Lebeau Mar 18 '17 at 01:29
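
A minimal sketch of the `wchar_t` fix suggested in the comments (assumption: on this compiler `char` is signed, so passing 0xE1 yields -31, which converts to 0xFFE1 when assigned to the `WORD` `wScan` field, i.e. FULLWIDTH POUND SIGN, which would explain the '£'-like character). Taking a UTF-16 code unit avoids that sign-extension:

#include <windows.h>

// Sketch only, not the asker's final code: accept a UTF-16 code unit so
// characters like L'á' (U+00E1) are neither truncated nor sign-extended.
void SendChar(const wchar_t mC)
{
    INPUT ip = {0};
    ip.type = INPUT_KEYBOARD;
    ip.ki.wScan = mC;                   // UTF-16 code unit to inject
    ip.ki.dwFlags = KEYEVENTF_UNICODE;  // treat wScan as a character, not a scan code
    SendInput(1, &ip, sizeof(INPUT));
}

// Usage: SendChar(L'á');  or  SendChar(0x00E1);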
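
And a sketch of the `std::wstring` variant Remy Lebeau describes, building one `INPUT` per UTF-16 code unit so that surrogate pairs for characters outside the BMP go out as two consecutive events (`SendString` is a hypothetical name, not from the thread):

#include <windows.h>
#include <string>
#include <vector>

void SendString(const std::wstring& text)
{
    std::vector<INPUT> inputs(text.size());
    for (size_t i = 0; i < text.size(); ++i)
    {
        INPUT ip = {0};
        ip.type = INPUT_KEYBOARD;
        ip.ki.wScan = text[i];              // one event per UTF-16 code unit
        ip.ki.dwFlags = KEYEVENTF_UNICODE;
        inputs[i] = ip;
    }
    SendInput(static_cast<UINT>(inputs.size()), inputs.data(), sizeof(INPUT));
}

// Usage: SendString(L"á€");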

0 Answers