
I've been forcing the usage of chcp 65001 in Command Prompt and Windows PowerShell for some time now, but judging by Q&A posts on SO and several other communities it seems like a dangerous and inefficient solution. Does Microsoft provide an improved / complete alternative to chcp 65001 that can be saved permanently without manual alteration of the Registry? And if there isn't, is there a publicly announced timeline or agenda to support UTF-8 in the Windows CLI in the future?

Personally, I've been using chcp 949 for Korean character support, but the weird display of the backslash (\) and the incorrect/incomprehensible output in several applications (like Neovim), as well as the lack of support for non-Korean characters under code page 949, have become more of a problem lately.

Paul Kim
  • Interesting, thanks! (The highest-voted cautionary comments are 8 years old though, I doubt that they still apply.) – Tomalak Jul 21 '19 at 11:03
  • @Tomalak, prior to Windows 8, `WriteFile` to the console returns the number of decoded UTF-16 code points written, which can cause problems with buffered writers that expect this to be the number of UTF-8 bytes written, as it should be. And for `ReadFile` from the console, even in Windows 10, you'll be limited to 7-bit ASCII if the input codepage is set to UTF-8, due to buggy assumptions in the console host, conhost.exe. In Windows 10, it returns non-ASCII characters as null ("\0") in the buffer. In older versions, the read succeeds with 0 bytes read, which looks like EOF. – Eryk Sun Jul 21 '19 at 13:31
  • Modern Windows programs should be using the Unicode console functions, `WriteConsoleW` and `ReadConsoleW`. Then the only limits are the console's inherent limits with Unicode, i.e. limited to the basic multilingual plane; no support for complex scripts and combining codes; and no support for font fallback if the selected font doesn't have a glyph for a character. Ultimately Microsoft may update the classic console host to remove these limits by switching to a DirectWrite-based implementation, but for now their (and open-source contributors') efforts are focused on the new Windows terminal. – Eryk Sun Jul 21 '19 at 13:42
  • Related. • [What encoding to get Å Ä Ö to work](https://superuser.com/q/675369) • [Using UTF-8 Encoding (CHCP 65001) in Command Prompt](https://stackoverflow.com/q/57131654) • [How to use unicode characters in Windows command line](https://stackoverflow.com/q/388490) • [chcp 65001 and a .bat file](https://stackoverflow.com/q/32182619) • [Making Swedish characters show properly in Windows Command Prompt](https://stackoverflow.com/q/2660264) – Henke Jan 31 '23 at 15:54

4 Answers


Note:

  • This answer shows how to switch the character encoding in the Windows console to
    (BOM-less) UTF-8 (code page 65001), so that shells such as cmd.exe and PowerShell properly encode and decode characters (text) when communicating with external (console) programs with full Unicode support, and in cmd.exe also for file I/O.[1]

  • If, by contrast, your concern is about the separate aspect of the limitations of Unicode character rendering in console windows, see the middle and bottom sections of this answer, where alternative console (terminal) applications are discussed too.


Does Microsoft provide an improved / complete alternative to chcp 65001 that can be saved permanently without manual alteration of the Registry?

As of (at least) Windows 10, version 1903, you have the option to set the system locale (language for non-Unicode programs) to UTF-8, but the feature is still in beta as of this writing and has far-reaching consequences.

To activate it:

  • Run intl.cpl (which opens the regional settings in Control Panel)
  • On the Administrative tab, click Change system locale... and check Beta: Use Unicode UTF-8 for worldwide language support, as shown in the screenshot below.

[Screenshot: Control Panel > Region > Administrative]

  • This sets both the system's active OEM code page and its ANSI code page to 65001, the UTF-8 code page, which therefore (a) makes all future console windows, which use the OEM code page, default to UTF-8 (as if chcp 65001 had been executed in a cmd.exe window) and (b) also makes legacy, non-Unicode GUI-subsystem applications, which (among others) use the ANSI code page, use UTF-8.

    • Caveats:

      • If you're using Windows PowerShell, this will also make Get-Content and Set-Content and other contexts where Windows PowerShell defaults to the system's active ANSI code page, notably reading source code from BOM-less files, default to UTF-8 (which PowerShell Core (v6+) always does). This means that, in the absence of an -Encoding argument, BOM-less files that are ANSI-encoded (which is historically common) will then be misread - see the sketch after this list for how to read such files explicitly - and files created with Set-Content will be UTF-8-encoded rather than ANSI-encoded.

        • Similarly, legacy (non-Unicode) non-console applications will then misinterpret ANSI-encoded files.
      • Pick a TT (TrueType) font, but even TrueType fonts usually support only a subset of all Unicode characters, so you may have to experiment with specific fonts to see if all the characters you care about are represented - see this answer for details, which also discusses alternative console (terminal) applications that have better Unicode rendering support.

      • As eryksun points out, legacy console applications that do not "speak" UTF-8 will be limited to ASCII-only input and will produce incorrect output when trying to output characters outside the (7-bit) ASCII range. (In the obsolescent Windows 7 and below, programs may even crash).
        If running legacy console applications is important to you, see eryksun's recommendations in the comments.

  • However, for Windows PowerShell, that is not enough:

    • You must additionally set the $OutputEncoding preference variable to UTF-8 as well: $OutputEncoding = [System.Text.UTF8Encoding]::new()[2]; it's simplest to add that command to your $PROFILE (current user only) or $PROFILE.AllUsersCurrentHost (all users) file.
    • Fortunately, this is no longer necessary in PowerShell Core, which internally consistently defaults to BOM-less UTF-8.
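
If you make the switch but still need to read legacy BOM-less ANSI files correctly (see the caveat above), you can bypass the defaults and decode explicitly via .NET. A minimal sketch, assuming a Windows-1252-encoded file at the hypothetical path C:\data\legacy.txt:

    # Read the raw bytes and decode them with an explicitly chosen legacy
    # code page (Windows-1252 here), independently of the system locale.
    $bytes = [System.IO.File]::ReadAllBytes('C:\data\legacy.txt')
    $text  = [System.Text.Encoding]::GetEncoding(1252).GetString($bytes)
    $text

The same GetEncoding() call accepts other code-page numbers (e.g., 949 for Korean), so this works for any legacy encoding you know the file to be in.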

If setting the system locale to UTF-8 is not an option in your environment, use startup commands instead:

Note: The caveat re legacy console applications mentioned above equally applies here. If running legacy console applications is important to you, see eryksun's recommendations in the comments.

  • For PowerShell (both editions), add the following line to your $PROFILE (current user only) or $PROFILE.AllUsersCurrentHost (all users) file, which is the equivalent of chcp 65001, supplemented with setting preference variable $OutputEncoding to instruct PowerShell to send data to external programs via the pipeline in UTF-8:

    $OutputEncoding = [console]::InputEncoding = [console]::OutputEncoding = New-Object System.Text.UTF8Encoding

    • Note that running chcp 65001 from inside a PowerShell session is not effective, because .NET caches the console's output encoding on startup and is unaware of later changes made with chcp; additionally, as stated, Windows PowerShell requires $OutputEncoding to be set - see this answer for details.
  • For example, here's a quick-and-dirty approach to add this line to $PROFILE programmatically:

    '$OutputEncoding = [console]::InputEncoding = [console]::OutputEncoding = New-Object System.Text.UTF8Encoding' + [Environment]::NewLine + (Get-Content -Raw $PROFILE -ErrorAction SilentlyContinue) | Set-Content -Encoding utf8 $PROFILE
  • For cmd.exe, define an auto-run command via the registry, in value AutoRun of key HKEY_CURRENT_USER\Software\Microsoft\Command Processor (current user only) or HKEY_LOCAL_MACHINE\Software\Microsoft\Command Processor (all users):

    • For instance, you can use PowerShell to create this value for you:
    # Auto-execute `chcp 65001` whenever the current user opens a `cmd.exe` console
    # window (including when running a batch file):
    Set-ItemProperty 'HKCU:\Software\Microsoft\Command Processor' AutoRun 'chcp 65001 >NUL'
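
Either way, you can verify that a new session actually ended up with UTF-8; a minimal check (run in a fresh PowerShell session; both calls should report 65001 if the setup took effect):

    # Query the code pages the current console session is using.
    [console]::InputEncoding.CodePage    # expect 65001
    [console]::OutputEncoding.CodePage   # expect 65001

To undo the cmd.exe auto-run command later, remove the registry value again, e.g. with Remove-ItemProperty 'HKCU:\Software\Microsoft\Command Processor' AutoRun.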

Optional reading: Why the Windows PowerShell ISE is a poor choice:

While the ISE does have better Unicode rendering support than the console, it is generally a poor choice:

  • First and foremost, the ISE is obsolescent: it doesn't support PowerShell (Core) 7+, where all future development will go, and it isn't cross-platform, unlike the new premier IDE for both PowerShell editions, Visual Studio Code, which already speaks UTF-8 by default for PowerShell Core and can be configured to do so for Windows PowerShell.

  • The ISE is generally an environment for developing scripts, not for running them in production (if you're writing scripts (also) for others, you should assume that they'll be run in the console); notably, with respect to running code, the ISE's behavior is not the same as that of a regular console:

    • Poor support for running external programs, not only due to its lack of support for interactive ones (see next point), but also with respect to:

      • character encoding: the ISE mistakenly assumes that external programs use the ANSI code page by default, when in reality it is the OEM code page. E.g., by default this simple command, which tries to pass a string echoed from cmd.exe straight through, malfunctions (see below for a fix):
        cmd /c echo hü | Write-Output

      • Inappropriate rendering of stderr output as PowerShell errors: see this answer.

    • The ISE dot-sources script-file invocations instead of running them in a child scope (the latter is what happens in a regular console window); that is, repeated invocations run in the very same scope. This can lead to subtle bugs, where definitions left behind by a previous run can affect subsequent ones.

  • As eryksun points out, the ISE doesn't support running interactive external console programs, namely those that require user input:

The problem is that it hides the console and redirects the process output (but not input) to a pipe. Most console applications switch to full buffering when a file is a pipe. Also, interactive applications require reading from stdin, which isn't possible from a hidden console window. (It can be unhidden via ShowWindow, but a separate window for input is clunky.)

  • If you're willing to live with that limitation, switching the active code page to 65001 (UTF-8) for proper communication with external programs requires an awkward workaround, sketched in code below:

    • You must first force creation of the hidden console window by running any external program from the built-in console, e.g., chcp - you'll see a console window flash briefly.

    • Only then can you set [console]::OutputEncoding (and $OutputEncoding) to UTF-8, as shown above (if the hidden console hasn't been created yet, you'll get a handle is invalid error).
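
Putting these two steps together, a minimal sketch of the workaround (for the ISE only; a regular console doesn't need it):

    # Step 1: Run any external program to force creation of the hidden
    # console; a console window may flash briefly.
    chcp.com > $null
    # Step 2: Now that a console handle exists, the encodings can be set.
    $OutputEncoding = [console]::OutputEncoding = New-Object System.Text.UTF8Encoding

With that in place, the cmd /c echo hü | Write-Output example above should pass the string through correctly.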


[1] In PowerShell, if you never call external programs, you needn't worry about the system locale (active code pages): PowerShell-native commands and .NET calls always communicate via UTF-16 strings (native .NET strings) and on file I/O apply default encodings that are independent of the system locale. Similarly, because the Unicode versions of the Windows API functions are used to print to and read from the console, non-ASCII characters always print correctly (within the rendering limitations of the console).
In cmd.exe, by contrast, the system locale matters for file I/O (with < and > redirections, but notably including what encoding to assume for batch-file source code), not just for communicating with external programs in-memory (such as when reading program output in a for /f loop).
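
To illustrate the for /f case, a minimal batch-file sketch (myUtf8Tool.exe stands in for any program that emits UTF-8):

    @echo off
    rem Switch this cmd.exe session to UTF-8 so that for /f decodes the
    rem captured output as UTF-8 rather than as the OEM code page.
    chcp 65001 >NUL
    for /f "delims=" %%L in ('myUtf8Tool.exe') do echo %%L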

[2] In PowerShell v4-, where the static ::new() method isn't available, use $OutputEncoding = (New-Object System.Text.UTF8Encoding).psobject.BaseObject. See GitHub issue #5763 for why the .psobject.BaseObject part is needed.

mklement0
  • Setting the console's input codepage to UTF-8 limits legacy programs that read via `ReadFile` to 7-bit ASCII input. (Output will be broken prior to Windows 8, but Windows 7 is approaching EOL anyway.) If you set the system locale to UTF-8, I suggest setting the "CodePage" value for "HKEY_CURRENT_USER\Console\%SystemRoot%_system32_cmd.exe" (and other window titles of interest) to a legacy OEM codepage, so that legacy non-Unicode console applications will continue to work properly for your locale. Do not use `chcp.com 65001`, except temporarily in batch scripts, such as for `for /f` loops. – Eryk Sun Jul 21 '19 at 15:15
  • PowerShell and CMD use the console's Unicode API, so these settings for the console codepage are only in regards to the input and output console codepage the shell sets when running an external console application, not anything internal to the shell such as cmdlets, except to the extent that the shell uses the input and output encoding settings when working with text in files and pipes. I'm not sure how these settings are used in PowerShell with regard to that, but CMD uses the console output codepage when decoding batch scripts and reading piped output from a program in a `for /f` loop. – Eryk Sun Jul 21 '19 at 15:22
  • Thanks, @eryksun, good background information, as usual. However, if you know the limitations and run (primarily) UTF-8-aware programs, setting `chcp.com 65001` (or equivalent) globally is a viable option, but I've added caveats to the answer, which point to your comments. – mklement0 Jul 21 '19 at 17:20
  • @eryksun: As for PowerShell: stdout output _from_ external programs is decoded according to `[console]::OutputEncoding`, and text sent _to_ external programs via a pipe is encoded based on preference variable `$OutputEncoding`. Re _files_: Windows PowerShell: reading defaults to ANSI, unless there is a BOM; writing defaults to UTF-16LE with `>` / `Out-File` and ANSI with `Set-Content`; fortunately, PowerShell _Core_ now uses BOM-less UTF-8 consistently in all these scenarios. – mklement0 Jul 21 '19 at 17:21
  • There is an unpleasant surprise with [VS](https://visualstudio.microsoft.com/) (my version is 16.7.2) after changing the system locale: all files containing UTF-8 are **forcibly corrupted** by replacing UTF-8 characters with Unicode placeholders. A big red error message preceded every file opened. – dyomas Sep 16 '20 at 08:37
  • @dyomas, that happened after changing the _system locale_ to UTF-8 (code page `65001`)? Your symptom sounds like your files were ANSI-encoded, and the switch then caused Visual Studio to interpret them as UTF-8-encoded. – mklement0 Sep 16 '20 at 15:18
  • No! These were files saved exactly as «_Unicode (UTF-8 without signature) - Codepage 65001_», containing a lot of comments in Russian and some UTF-8-specific symbols, such as arrows (→, ←), math operators (≡, ≥, ≤), and so on… – dyomas Sep 16 '20 at 20:07
  • @dyomas, unless you simply wanted to warn others, I encourage you to ask a _new_ question to get to the bottom of this. – mklement0 Sep 16 '20 at 22:19
  • I reported a [problem](https://developercommunity.visualstudio.com/content/problem/1187663/utf-8-encoded-files-corrupted-if-system-locale-swi.html) to MS; confirmation received. Waiting… – dyomas Sep 18 '20 at 15:22
  • Did I get that right? It's the year 2021 AD, and the largest operating system in the world does not handle text files as `UTF-8` by default (but as Latin-1 / ISO-8859-1 or a similar 8-bit encoding from the 1990s)? – Frank N Apr 26 '21 at 08:09
  • Changing the system locale to UTF-8 is still marked Beta for a reason, possibly causing errors that are hard to resolve. For example, it might [remove](https://stackoverflow.com/a/42480249/3528522) VBA code in Excel if it contains non-ASCII characters. – Enno Oct 07 '21 at 08:28
  • For the `cmd` shell, wrapping the command into a [batch file that temporarily switches the code page to UTF-8](https://superuser.com/a/1523968) is more prudent; if reverting back fails, the code page of the `cmd` shell session that ran the batch file will stay UTF-8. – Enno Oct 07 '21 at 08:59
  • I've officially given up on this; I just can't get the regular PS5.1 running in the default console to output UTF-8, even after following all the steps (while this did work in the past on previous machines). Time for me to leave this sorry excuse of a console behind and do everything in Windows Terminal. – bluuf Jan 26 '22 at 12:51
  • @bluuf, moving to Windows Terminal is probably a good idea in general, but I'm surprised it stopped working for you. Did you change to UTF-8 system-wide or did you use the `$OutputEncoding = [console]::InputEncoding = [console]::OutputEncoding = New-Object System.Text.UTF8Encoding` technique? Is the problem just a _display_ problem (characters unsupported by the selected font), or true misinterpretation of data? – mklement0 Jan 26 '22 at 14:57
  • I've changed the encoding, turned the experimental feature back on (I had it turned off because of a bug in Outlook with invite responses), and tried different fonts (even the same fonts I use in Windows Terminal). Any character above 127 will be shown as a square in the default console. Luckily Windows Terminal works fine with nerdfonts and glyphs. – bluuf Jan 26 '22 at 21:51

You can put the command chcp 65001 in your PowerShell profile, which will run it automatically when you open PowerShell. However, this won't do anything for cmd.exe.

Microsoft is currently working on an improved terminal that will have full Unicode support. It is open source, and if you're using Windows 10 Version 1903 or later, you can already download a preview version.

Alternatively, you can use a third-party terminal emulator such as Terminus.

jfhr
  • Unfortunately, running `chcp 65001` from _inside_ a PowerShell session is _not_ effective, because .NET caches the console's output encoding on startup; additionally, Windows PowerShell (but not PowerShell _Core_) requires `$OutputEncoding` to be set. – mklement0 Jul 21 '19 at 14:30
  • As of today (I have no idea when it was changed), `chcp 65001` works in cmd.exe. I installed Windows Terminal 1.10.2714.0 on Windows 10 Home 20H2 and the experience is identical to Windows PowerShell (5.1) and cmd.exe (for my purposes of simply outputting UTF-8 characters). Interestingly, *PowerShell **Core*** 7.1.5 is completely broken. A fresh install claims to be using code page 65001 according to properties, but behaves as if it is using 437. `chcp` reports 437, and if `chcp 65001` is run it reports 437 but this has no actual effect on the encoding. – WD40 Oct 24 '21 at 15:47
  • @WD40, `chcp 65001` always worked _in `cmd.exe`_, but not when called from _inside PowerShell_. PowerShell (Core) to this day still defaults to the OEM code page, such as `437`; only the `$OutputEncoding` preference variable is set to UTF-8, which controls the encoding to use for data sent _to_ external programs. To get full UTF-8 support, you need to use `$OutputEncoding = [console]::InputEncoding = [console]::OutputEncoding = New-Object System.Text.UTF8Encoding` All of this covered in the accepted answer. – mklement0 Jan 26 '22 at 14:53
  • @mklement0 thank you for the clarification. But from my reading and understanding of this answer, it makes this answer completely false. `chcp 65001` **does** something for cmd.exe, and ...might do something for PowerShell (if placed in the profile). I know nothing about PowerShell. – WD40 Apr 30 '22 at 19:25
  • @WD40, yes, `chcp 65001` is and was always effective when called from `cmd.exe`, and also when called from PowerShell _if you then call `cmd.exe` from PowerShell_. However, it does _not_ work for _PowerShell itself_ (and its internal commands, aka cmdlets), because .NET, which PowerShell is built on, caches encodings and therefore doesn't pick up on the changed code pages. The `$OutputEncoding = ...` command prevents this problem: it tells PowerShell to use the UTF-8 code page and also updates the console's code pages, that is, it _also_ acts like `chcp 65001`. – mklement0 Apr 30 '22 at 19:31

Running some commands (chcp or whatever) automatically whenever Command Prompt starts can be done by editing the registry. It's the right way, as it's documented in CMD /?:

If /D was NOT specified on the command line, then when CMD.EXE starts, it looks for the following REG_SZ/REG_EXPAND_SZ registry variables, and if either or both are present, they are executed first.

HKEY_LOCAL_MACHINE\Software\Microsoft\Command Processor\AutoRun

    and/or

HKEY_CURRENT_USER\Software\Microsoft\Command Processor\AutoRun

Now it's 2023 and there's good news. With Windows Terminal, editing the registry or creating an additional batch file is not needed. In Windows Terminal, go to Settings > Profiles, locate Command Prompt, and change the Command line from %SystemRoot%\System32\cmd.exe (the default) to %SystemRoot%\System32\cmd.exe /K "chcp 65001". It's simple. (The equivalent settings.json entry is sketched below.)
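
For reference, the same change expressed directly in Windows Terminal's settings.json; a sketch of just the relevant profile property (the profile's name, guid, and other properties stay as they are):

    "commandline": "%SystemRoot%\\System32\\cmd.exe /K \"chcp 65001\""

The /K switch keeps the cmd.exe session open after chcp 65001 has run, so every tab opened with this profile starts in UTF-8.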

Sangbok Lee

The PowerShell ISE displays Korean perfectly fine. Here's a sample text file encoded in UTF-8 that works:

PS C:\Users\js> cat .\korean.txt

The Korean language (South Korean: 한국어/韓國語 Hangugeo; North 
Korean: 조선말/朝鮮말 Chosŏnmal) is an East Asian language
spoken by about 77 million people.[3]

Since the ISE comes with every version of Windows 10, I do not consider it obsolete. I disagree with whoever deleted my original answer.

The ISE has some limitations, but some scripting can be done with external commands:

echo 'list volume' | diskpart # as admin
cmd /c echo hi

EDIT:

If you have Windows 10 1903, you can download Windows Terminal from the Microsoft Store (https://devblogs.microsoft.com/commandline/introducing-windows-terminal/), and Korean text works in there. PowerShell 5 would need the text file to be UTF-8 with BOM or UTF-16.

EDIT2:

It seems like the ideal setups are Windows Terminal + PowerShell 7 or VS Code + PowerShell 7, for both pasting characters and output.

EDIT3:

Even in the EDIT2 situations, some Unicode characters cannot be pasted, like ⇆ (U+21C6), or Unicode spaces. Only PS7 on macOS would work.

js2010
  • The ISE is of course a powerful tool, but some actions can't be accomplished by the ISE alone. For example, I use Neovim with PowerShell in the terminal, which isn't an available option with the ISE. – Paul Kim Jul 22 '19 at 00:10
  • ISE is an environment for running PowerShell scripts. It doesn't support interactive console applications (e.g. diskpart.exe, the python.exe shell). The problem is that it hides the console and redirects the process output (but not input) to a pipe. Most console applications switch to full buffering when a file is a pipe. Also, interactive applications require reading from stdin, which isn't possible from a hidden console window. (It can be unhidden via `ShowWindow`, but a separate window for input is clunky.) – Eryk Sun Jul 22 '19 at 01:24
  • js2010: A moderator deleted your answer, and my guess as to why is that it may have been flagged as a low-quality answer, given that it provided no explanation. I'll repost the comment that was deleted along with your answer, but to add to @eryksun's point, building on their comment on my answer: If you confine your activities to PowerShell-native commands only, you _never_ need to worry about code pages - neither in the console nor in the ISE. The code page matters when you talk to _external (console) applications_, and for that the ISE is an even poorer choice than the console. – mklement0 Jul 23 '19 at 02:26
  • The obsolescent PowerShell ISE gives you the worst of both worlds: It uses the active ANSI code page, so it neither works with legacy console applications that use the OEM code page, nor does it work with UTF-8 programs. And, seemingly, the default encoding cannot even be changed (`chcp` calls are quietly ignored, and assigning to `[console]::OutputEncoding` yields a "handle is invalid" error). For an IDE-like experience that already speaks UTF-8 by default for PowerShell Core and can be configured to do so for Windows PowerShell, use [Visual Studio Code](https://code.visualstudio.com/). – mklement0 Jul 23 '19 at 02:27
  • @mklement0, if we're only concerned with using PowerShell and non-interactive console programs, then PowerShell ISE does provide better Unicode support than the console, including support for non-BMP characters (e.g. most [emojis](https://en.wikipedia.org/wiki/Emoji)), complex scripts (e.g. character sequences that use the [zero-width joiner](https://en.wikipedia.org/wiki/Zero-width_joiner) character), and font fallback. For me, in Windows 10, chcp.com does work properly in ISE. It sets the input and output codepages of the hidden console that console applications inherit by default. – Eryk Sun Jul 23 '19 at 03:48
  • Some console applications may use the console input/output codepage when writing to stdout even if it's a pipe (as PowerShell ISE uses for stdout). It's worth running `chcp.com 65001` in that particular case when getting output from a non-interactive command. Other console applications ignore the console if stdout is a pipe. For example, Python will always use the system ANSI codepage in this case, unless we force it to use UTF-8 via the `PYTHONIOENCODING` environment variable or UTF-8 mode (new in 3.7) via the `PYTHONUTF8` environment variable or `-X utf8` command-line option. – Eryk Sun Jul 23 '19 at 03:58
  • @eryksun: It's good to know that the ISE has better Unicode _rendering_ support, but that is a moot point, because it neither correctly renders nor correctly captures UTF-8 output _from external programs_, and `chcp` cannot help that: what matters for correct _PowerShell_ behavior is that `[console]::OutputEncoding` match the output encoding of the external program, and because `[console]::OutputEncoding` _caches_ the encoding (active code page) _at startup time_ and only recognizes later changes _via that property_, running `chcp` in-session is ineffective. – mklement0 Jul 23 '19 at 10:47
  • @eryksun: Now that we know that a true PowerShell `conhost.exe`-based session _can_ be made to properly redirect / capture UTF-8 output from external programs (by setting `[console]::OutputEncoding` to UTF-8, as detailed in my answer (which fails in the ISE)), which console-based alternative would you recommend for better _rendering_? You've mentioned ConEmu in the past; will the new Windows Terminal be on par with the ISE's rendering abilities? What, in a nutshell, makes the ISE better at rendering Unicode, and will the same technology underlie Windows Terminal? – mklement0 Jul 23 '19 at 11:05
  • @mklement0, PowerShell inserts itself as a middleman in redirection to pipes and files, so the encoding it uses to decode output from programs is important. But isn't that a matter of changing the `OutputEncoding` variable? I'd find it odd if it were a function of `[console]::OutputEncoding`. Anyway, trying to set the latter will fail at first because powershell_ise.exe doesn't initially have a console. It calls `AllocConsole` to get a console and hides the window just before it runs an external console application. Afterwards we can set `[console]::OutputEncoding`. – Eryk Sun Jul 23 '19 at 11:17
  • @mklement0, ConEmu or the new Windows Terminal are both good choices. In Windows 10 I'm pretty sure both are taking advantage of the new pseudoconsole capability, but ConEmu works in older versions of Windows as well. The difference in Unicode handling between conhost.exe and modern programs is because conhost.exe is based on the classic Windows GDI API, and newer programs use the DirectWrite API. – Eryk Sun Jul 23 '19 at 11:20
  • @eryksun: No, `$OutputEncoding` - perhaps surprisingly - only applies to data sent _to_ external programs; data decoded _from_ [the stdout stream of] external programs is always interpreted based on `[console]::OutputEncoding`. Thanks for the tips re what console to use. Good point re being able to set `[console]::OutputEncoding` after the hidden console happens to have been allocated; to call that workaround awkward would be an understatement, however. Allow me a tangent, while I have your attention: I've never understood when `[console]::InputEncoding` ever comes into play. – mklement0 Jul 23 '19 at 12:03