
I am trying to create a simple message box in C in Visual Studio 2012, but I am getting the following error messages:

argument of type "const char *" is incompatible with parameter of type "LPCWSTR"

error LNK2019: unresolved external symbol _main referenced in function _tmainCRTStartup

Here is the source code:

#include <Windows.h>

int _stdcall WinMain(HINSTANCE hinstance, HINSTANCE hPrevinstance, LPSTR lpszCmdline, int nCmdShow)
{
    MessageBox(0, "Hello", "Title", 0);

    return 0;
}

Please help.

Thanks and regards

Adam Rosenfield
  • Check your project's Character Set setting (Project Properties, Configuration Properties, General, Character Set). It's probably set to "Use Unicode" instead of "Use Multi-Byte". – TypeIA Feb 17 '14 at 16:56
  • You are running your code on a Unicode operating system. You should use Unicode strings, like L"Hello". You can turn the clock back to 1991, but there isn't much point in using C if you do that. – Hans Passant Feb 17 '14 at 17:04
  • @HansPassant or switch to multibyte. The Windows API is [designed to allow you to use either](http://msdn.microsoft.com/en-us/library/windows/desktop/dd374089%28v=vs.85%29.aspx). Note that "multibyte" strings can still encode Unicode code points, so choosing this option is not necessarily "turning the clock back to 1991." Perfectly correct, globalized, Unicode-aware applications can be written using multibyte (rather than wide character) strings. – TypeIA Feb 17 '14 at 17:05
  • No, the WinAPI is designed to support old projects, ones that got started in the previous century. Creating a new one and *intentionally* making the code slow by forcing the compatibility functions to convert the strings makes very little sense. Might as well use a scripting language. – Hans Passant Feb 17 '14 at 17:07
  • @HansPassant No. The world runs on UTF-8; use it as your internal representation for easy compatibility with the world, or perhaps use UCS-4 if you need serious character-level text processing. The half-arsed 16-bit encoding from the previous millennium is an interesting curiosity that should stay in the museum of ancient computing. Windows OS internal reliance on this encoding is a bug, not a feature. A minuscule speed improvement is not a valid reason to admit such a bug to your own code. – n. m. could be an AI Feb 17 '14 at 17:54
  • @n.1 The world *actually* runs on UTF-16: Windows, .NET, Java, ... As your *internal* representation, choose whatever suits your needs. If you're calling into the Win32 API a lot, UTF-16 is the natural choice (and if you're interacting with the filesystem you do not even have a choice). As your *external* representation, UTF-8 is a fine choice: it's byte-order agnostic and self-synchronizing, features that matter when exchanging data. Just don't let the external representation dictate your internal encoding, blindly following the *"UTF-8 Everywhere"* mantra. – IInspectable Jun 21 '22 at 07:20
  • @IInspectable Maybe your world, not mine. Do you have a Windows .NET UTF-16 Java device in your pocket? I don't. Maybe yours talks to Windows .NET UTF-16 Java servers to do its job? Mine doesn't (most of the time). I have a Windows laptop, all I use it for is to ssh to a Linux server. UTF-16 might have seemed a panacea back when Win32 and Java APIs were created but it is surely a hindrance now. It's the worst of both worlds, not compact *and* not a single code unit per character encoding. Use it when you have to, and not a single byte more. – n. m. could be an AI Jun 21 '22 at 07:37

4 Answers


To compile this code in Visual C++, you need to use the multi-byte (ANSI) versions of the WinAPI functions instead of the wide-character ones.

Set the Project -> Properties -> General -> Character Set option to "Use Multi-Byte Character Set".

I found it here: https://stackoverflow.com/a/33001454/5646315

Jasurbek Nabijonov

To make your code compile in both modes, enclose the string literals in _T() and use the TCHAR equivalents:

#include <tchar.h>
#include <windows.h>

int WINAPI _tWinMain(HINSTANCE hinstance, HINSTANCE hPrevinstance, LPTSTR lpszCmdLine, int nCmdShow)
{
    MessageBox(0,_T("Hello"),_T("Title"),0);
    return 0;
}
cup
  • Through your suggestion, the "argument of type const char*" error got removed –  Feb 17 '14 at 18:49
  • but it is still showing the "unresolved external symbol _main referenced in function _tmainCRTStartup" error –  Feb 17 '14 at 18:50
  • Corrected the winmain part. If you get Visual Studio to create the dummy program for you, that is what it will give you. – cup Feb 17 '14 at 18:58
  • `_T` is the wrong macro. It should be `TEXT` (see [TEXT vs. _TEXT vs. _T, and UNICODE vs. _UNICODE](https://devblogs.microsoft.com/oldnewthing/20040212-00/?p=40643)). – IInspectable Jun 18 '22 at 08:54
  • @IInspectable - It may be the wrong macro according to your article but why don't you just try the above program, replacing _T with TEXT and try compiling to prove your point. If it builds, then you've proved your point. – cup Jun 19 '22 at 22:46
  • @IInspectable - note that the article was written in 2004. – cup Jun 19 '22 at 23:02
  • Yes, 2004. So roughly a decade after those macros were invented. And none of that changed ever since. Now, to prove *my* point, put the following two preprocessor directives at the top of your snippet: `#define _MBCS` and `#define UNICODE`. Compile, and watch the errors you set out to fix reappear. – IInspectable Jun 20 '22 at 04:56
  • I don't know which compiler you are using: I don't get those errors on VS2010 and VS2019. VS either pre-defines _MBCS or UNICODE. If you **#define** it again, you get a warning. Your original point is that _T is the wrong one to use. Changing it to TEXT doesn't solve anything: it just generates more errors. – cup Jun 21 '22 at 05:20
  • There are two sets of preprocessor symbols: Those with and those without leading underscores. The former control the generic-text mappings of the Win32 API headers, the latter of the CRT. `MBCS`/`UNICODE` decide whether e.g. `MessageBox` expands to `MessageBoxA` or `MessageBoxW`, and `_MBCS`/`_UNICODE` whether `_tcslen` expands to `strlen` or `wcslen`. The arguments have to match: When calling into the Win32 API, use the `TEXT` macro, and for calls into the CRT use the `_TEXT`/`_T` macros. If replacing `_T` with the `TEXT` macro produces errors then there's an issue in your configuration. – IInspectable Jun 21 '22 at 06:54
  • While that apparently confuses you, that's just another reason to forget about those macros altogether. The real solution here is: `MessageBoxW(0, L"Hello", L"Title", 0)`. Unless you're writing code that still needs to compile for Win9x. – IInspectable Jun 21 '22 at 06:58
  • Doesn't that depend on whether you wish to work in MBCS or Unicode? It can be #defined but this is set in the General section of the project. – cup Jun 21 '22 at 20:17
  • That doesn't change the fact, that there are **two** sets of preprocessor macros. The definition of `TEXT` is based off of `UNICODE`/`MBCS`. The definition of `_TEXT`/`_T` is based off of `_UNICODE`/`_MBCS`. Since the expansion of `MessageBox` is also based on `UNICODE`/`MBCS`, you'll need to use the matching string literal conversion macro: `TEXT`. Or, as I noted above, don't use those macros at all. Just call the Unicode version, and pass wide-character string literals. That's the right choice anyway, regardless of whether you understand generic-text mappings. – IInspectable Jun 22 '22 at 07:04
  • If you firmly believe that is the right solution, why not just post that as an answer. Note that if you select Unicode character set in General, both UNICODE and _UNICODE are defined. If you select MBCS in General, only _MBCS is defined. – cup Jun 23 '22 at 05:40
  • Note that if you compile on the command line then none of your project settings are even considered. All things considered, your proposed solution works by coincidence (if it does), not by virtue of being correct. A statement which you'll find challenging to refute. – IInspectable Jun 23 '22 at 07:38

I recently ran into this issue, did some research, and thought I would document some of what I found here.

To start, when calling MessageBox(...), you are really just calling a macro (for backwards compatibility reasons) that is calling either MessageBoxA(...) for ANSI encoding or MessageBoxW(...) for Unicode encoding.

So if you are going to pass in an ANSI string with the default compiler setup in Visual Studio, you can call MessageBoxA(...) instead:

#include <Windows.h>

int _stdcall WinMain(HINSTANCE hinstance, HINSTANCE hPrevinstance, LPSTR lpszCmdline, int nCmdShow)
{
    MessageBoxA(0, "Hello", "Title", 0);

    return 0;
}

Full documentation for MessageBox(...) is located here: https://msdn.microsoft.com/en-us/library/windows/desktop/ms645505(v=vs.85).aspx

And to expand on what @cup said in their answer, you could use the _T() macro and continue to use MessageBox():

#include <tchar.h>
#include <Windows.h>

int _stdcall WinMain(HINSTANCE hinstance, HINSTANCE hPrevinstance, LPSTR lpszCmdline, int nCmdShow)
{
    MessageBox(0, _T("Hello"), _T("Title"), 0);

    return 0;
}

The _T() macro makes the string "character set neutral". You can use this to set up all strings as Unicode by defining the symbol _UNICODE before you build (documentation).

Hope this information will help you and anyone else encountering this issue.

mkchandler

Whatever tutorial you were following was wrong: you need to make the string literals wide-character (Unicode) strings by prefixing them with L.

Try this:

#include <Windows.h>

int _stdcall WinMain(HINSTANCE hinstance, HINSTANCE hPrevinstance, LPSTR lpszCmdline, int nCmdShow)
{
    MessageBox(0, L"Hello", L"Title", 0);
    return 0;
}
Arulkumar
GameDev99
  • Check out the new guide from Microsoft: https://learn.microsoft.com/en-us/windows/win32/learnwin32/learn-to-program-for-windows – GameDev99 May 10 '21 at 12:28