
I have a VC++ project in Visual Studio 2008.

It is defining the symbols for Unicode on the compiler command line (/D "_UNICODE" /D "UNICODE"), even though I do not have these symbols turned on in the preprocessor section for the project.

(Screenshot: the project's Preprocessor settings, with no UNICODE definitions visible.)

As a result I am compiling against the Unicode versions of all the Win32 library functions, as opposed to the ANSI ones. For example in WinBase.h, there is:

#ifdef UNICODE
#define CreateFile  CreateFileW
#else
#define CreateFile  CreateFileA
#endif // !UNICODE

Where is Unicode being turned on in the VC++ project, and how can I turn it off?

Glorfindel
Cheeso
  • Now for the real question: Why - for heaven's sake - would you ever hope to gain anything from disabling UNICODE support? This has got to be the very first time I ever saw anybody asking for help with entering failure mode. – IInspectable Oct 20 '14 at 13:40
  • If you have legacy code to support? We have a bunch of libraries which use char/TCHAR interchangeably, for example, from 15 years ago. – Mr. Boy Oct 16 '15 at 10:34
  • @IInspectable: Unicode support is not the same as moving `char` to `wchar_t`, which is Windows' crappy way of doing "Unicode". Give me UTF-8 any day over some botched UTF-16 implementation. – Thomas Eding Sep 15 '16 at 20:55
  • @ThomasEding: On Windows, `wchar_t` is synonymous with the Unicode/UTF-16LE encoding. Crappy or not, it is the native character encoding in Windows, exposed through the Windows API. If you wish to interface with it, you had better learn to appreciate it. Incidentally, .NET strings use UTF-16 encoding as well. So does NTFS. So do Java strings. – IInspectable Sep 15 '16 at 22:24
  • I'll use a C++ UTF-8 library when working with C++, thanks. Now my code works on Linux too. The wonders! – Thomas Eding Sep 17 '16 at 00:17
  • You can use a C++ UTF-8 library all you want, but you'll still have to convert those UTF-8 strings into UTF-16 strings in order to interface with the platform's native API. And so defining `UNICODE` helps to ensure that you do not goof up and accidentally pass a UTF-8 string (typed as `char`) to an API function that is expecting an ANSI string (also typed as `char`, very different from UTF-8). Note that, contrary to the expectations of some programmers, UTF-8 is *not* a valid ANSI code page on Windows. @thomas – Cody Gray - on strike Oct 02 '16 at 05:28
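The conversion the last comment describes can be sketched in portable code. On Windows the real tool for this is `MultiByteToWideChar`; purely as an illustration, here is a toy UTF-8 to UTF-16 decoder, limited to BMP code points and assuming well-formed input (not production code):

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Toy UTF-8 -> UTF-16 decoder, BMP code points only (1- to 3-byte
// sequences), assuming well-formed input. Real Windows code would
// call MultiByteToWideChar(CP_UTF8, ...) instead.
std::vector<uint16_t> Utf8ToUtf16(const std::string& in) {
    std::vector<uint16_t> out;
    for (std::size_t i = 0; i < in.size();) {
        unsigned char b = static_cast<unsigned char>(in[i]);
        uint32_t cp;
        if (b < 0x80) {            // 1-byte sequence: ASCII
            cp = b;
            i += 1;
        } else if (b < 0xE0) {     // 2-byte sequence
            cp = (b & 0x1F) << 6 | (in[i + 1] & 0x3F);
            i += 2;
        } else {                   // 3-byte sequence
            cp = (b & 0x0F) << 12 | (in[i + 1] & 0x3F) << 6 | (in[i + 2] & 0x3F);
            i += 3;
        }
        // BMP code points map to a single UTF-16 code unit.
        out.push_back(static_cast<uint16_t>(cp));
    }
    return out;
}
```

For example, the two-byte UTF-8 sequence `0xC3 0xA9` ("é") decodes to the single UTF-16 unit `0x00E9` - the same character, but a different in-memory representation, which is why a `char*` holding UTF-8 must never be handed to an `...A` API function directly.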

6 Answers


Have you tried: Project Properties - General - Project Defaults - Character Set?

See answers in this question for the differences between "Use Multi-Byte Character Set" and "Not Set" options: About the "Character set" option in visual studio 2010

Nemanja Boric
  • Thank you very much. This helped me a lot with a project I found and cannot compile. – LogoS Dec 30 '15 at 15:53
  • Thanks again, it solved my problem. UNICODE by default is not a nice move, though! – Haseeb Mir Apr 07 '18 at 23:24
  • In Visual Studio 2019: `Project Properties > Advanced > Character Set`. Set to `Not Set` or `Use Multi-Byte Character Set`. – AlainD Mar 20 '20 at 16:08

Burgos has the right answer. Just to clarify, the Character Set should be changed to "Not Set".

Ellis Miller
  • No, it should not. *"Not Set"* amounts to inheriting that setting, from *some* default. In other words: The code may still be compiled with Unicode support. This is like answering the question *"How do I make sure this light is off?"* with *"Do NOT touch the light switch!"* (which, indeed, produces the desired result in 50% of the cases). – IInspectable Jul 17 '22 at 20:29

From VS2019: Project Properties - Advanced - Advanced Properties - Character Set. (Screenshot: the Character Set drop-down on the Advanced page.)

Also, if `_UNICODE;UNICODE` appears among the preprocessor definitions, remove it: Project Properties - C/C++ - Preprocessor - Preprocessor Definitions. (Screenshot: the Preprocessor Definitions field.)

Juan Rojas

Project Properties -> Configuration Properties -> General -> Character Set

Ahmed

For whatever reason, I noticed that setting the character set to Unicode for "All Configurations" did not actually apply to all configurations.

Picture: Setting the configuration in the IDE

To confirm this, open the .vcxproj and check that the correct token appears in all 4 locations. In this photo I am using Unicode, so the string I am looking for is "Unicode". You likely want it to say "MultiByte" instead.

Picture: Confirming changes in configuration file
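For reference, the token this answer searches for lives in the per-configuration `<PropertyGroup>` elements of the .vcxproj. A sketch of what one such element typically looks like (the Condition values vary with your project's configurations and platforms):

```xml
<!-- One PropertyGroup like this exists per configuration/platform pair;
     check every one of them. -->
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="Configuration">
  <!-- Valid values: Unicode, MultiByte, NotSet -->
  <CharacterSet>MultiByte</CharacterSet>
</PropertyGroup>
```

If the IDE only updated some of these elements, editing the file by hand is a quick way to make them consistent.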

KANJICODER

You can go to Project Properties --> Configuration Properties --> General --> Project Defaults and there change the "Character Set" from "Unicode" to "Not Set".

  • Any reason you stop by and copy a 6-year-old answer without contributing anything of value? Without attributing the source of information, of course. – IInspectable Sep 15 '16 at 22:27