I'm currently going through learncpp.com's C++ tutorials and I'm seeing that their variable naming trend has them naming int variables with an "n" prefix (i.e. int nValue) and "ch" prefixes for char variables (i.e. char chOperation). Is this something that is commonplace in the industry that I should form as a habit now?
-
You're talking about a derivation of Hungarian notation of variable names, and honestly it doesn't really matter as long as you're consistent and the code is *readable*. – WhozCraig Aug 17 '13 at 02:11
-
Actually the `n` and `ch` prefixes are _Systems_ Hungarian, not _Apps_ Hungarian. There are plenty of people around, including many legendary programmers and gurus, who argue very strongly against Systems Hungarian, even if the code is "consistent." Even people who say Hungarian Notation is okay today pretty much say it's the Apps, *not* the Systems, Hungarian that is acceptable. Details on the [Wikipedia page for Hungarian Notation](http://en.wikipedia.org/wiki/Hungarian_notation). – Ray Toal Aug 17 '13 at 02:19
-
I think this is a good read: http://herbsutter.com/2008/07/15/hungarian-notation-is-clearly-goodbad/ – Fred Larson Aug 17 '13 at 02:19
-
@WhozCraig I'd argue that HN prevents things from being readable. See the Stanford page linked in my answer for a great argument about this. – Aug 17 '13 at 02:27
-
@FredLarson I like the quote "prefixes tend to turn into lies as variable types morph during maintenance." from that linked article. – Aug 17 '13 at 02:30
-
Also known as anti-Hungarian notation. Proper Hungarian notation is useful even for systems programming. – DanielKO Aug 17 '13 at 06:51
4 Answers
Is this something that is commonplace in the industry?
This practice was common in some parts of Microsoft twenty or thirty years ago, due to a misunderstanding of a somewhat more useful convention used by other parts of the company (that of tagging variables to indicate their purpose, which, in a weakly typed language, can help avoid various kinds of category error). Neither convention serves any useful purpose in a strongly typed language like C++: the type system can catch such errors automatically and more reliably.
It became widely used by others, long after Microsoft (presumably) realised that it was pointless and advised against its use, presumably in the belief that emulating Microsoft's habits might also emulate their success. It's still occasionally seen today, by people who develop habits and never question their usefulness, and by companies who prioritise style guides above software.
Personally, I find it useful as a warning that the code is likely to contain worse horrors.
I should form as a habit now?
It only serves to make the code harder to read, and misleading if you forget to update the tags when you change a variable's type. You should develop a habit of writing clear, readable code, not of sprinkling it with mysterious runes.
Disclaimer: the brief comments about Microsoft are intended to give historical context and are not intended to be an authoritative account of Microsoft's policy decisions; specifically the phrase "[Microsoft] realised [it] was pointless" is intended to mean "[some people at Microsoft] realised [the topic under discussion, using redundant type tags in modern C++ in most contexts] was pointless" not (as a commenter appears to have read) "[the entirety of Microsoft] realised [all use of variable tagging] was pointless". All opinions are my own, and may be based on imperfect knowledge.

-
You're not exactly providing much reference to the "Microsoft realised that it was pointless" argument. While I wouldn't necessarily suggest using some sort of hungarian notation with C++ it certainly helps a **lot** when dealing with a C interface (like the Windows API). I for one value a disambiguated formal size parameter that indicates whether it is the size in bytes (`cb`) or the number of characters (`cch`). – IInspectable Aug 17 '13 at 02:55
-
@IInspectable: I didn't think a reference would be necessary, but [here is one](http://msdn.microsoft.com/en-us/library/ms229045.aspx). To quote (with their emphasis): "**DO NOT** use Hungarian notation." – Mike Seymour Aug 17 '13 at 02:57
-
Uhm.... that is .NET, about as strongly typed as C++. The Microsoft you were talking about is the Microsoft from twenty or thirty years ago, that allegedly realized that it is a mistake to embed semantic information in parameter names for a C interface. I was thinking of that sort of reference. – IInspectable Aug 17 '13 at 03:03
-
@IInspectable: I never said they stopped using it twenty or thirty years ago. I just said they stopped using it, and advised people not to use it, at some point (in the early 2000s, if memory serves). The exact timeline is irrelevant to the point I'm trying to make that (a) I don't like it and (b) others (including today's Microsoft) also don't. I'm not trying to argue about whether or not it should have been used in the 1980s, because none of us are going to write code in the 1980s. And the question is about writing C++, not C. – Mike Seymour Aug 17 '13 at 03:08
-
Apparently, Microsoft [continued](http://msdn.microsoft.com/en-us/library/windows/desktop/dd744776.aspx) to use semantic encoding up to and including Windows 7/Server 2008. Again, I'm merely asking for a reference to back your statement. And for a reference where they advise against its use - in the context you constructed: Windows API, C. – IInspectable Aug 17 '13 at 03:26
-
@IInspectable: The question is about writing C++ in 2013 (and later). My mentioning of Microsoft advising people not to use any kind of tagging (at least in C++ for .NET) is only a minor sidenote in my answer; and whether or not everyone at Microsoft followed that advice has absolutely nothing to do with anything. Could we stop this tedious, off-topic argument please? – Mike Seymour Aug 17 '13 at 03:32
-
let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/35637/discussion-between-iinspectable-and-mike-seymour) – IInspectable Aug 17 '13 at 03:33
Yes, they are common (esp. in Windows-related projects)
But different projects may use different coding styles. So if you're working with an existing project, then the best is to stick to the style it already follows.
The naming style you mentioned is known as the Hungarian style, which is typically used in Windows-related projects. In the Hungarian style, variables are written in camel case (e.g., CamelCase) and prefixed by their scope and type:
[scope prefix]_[variable type][actual variable name in camel-cased style]
For example:
m_nMemberInteger
is an integer (according to its prefix `n`); in addition, it's a member variable (according to its prefix `m_`) of some structure / class. You can find the complete list of scope and type prefixes used in the Hungarian style on the [Wikipedia page for Hungarian notation](http://en.wikipedia.org/wiki/Hungarian_notation).
However, in Linux-based projects, you will usually find people using different coding styles (e.g., the Google C++ coding style), which use only lower case and the underscore `_` to name their variables.

This looks similar to Hungarian notation. Such things are sometimes used, especially in certain fields of programming. Personally, I think it makes code look messy. In C++ you should think more about what the object means rather than what its underlying type may happen to be. And modern editors easily let you look up the type of variables, so it is somewhat obsolete. I can understand why it was used when editors weren't so helpful.

As mentioned in the other comments, this is known as "Hungarian Notation" and is used to make the type of a variable obvious. While it's perhaps arguable whether it's worth the trouble to label the type, another common convention (especially in C++) is to use prefixes to indicate information about a variable's usage. This is especially useful for references and member variables. For instance, one might have a function that looks like
void MyClass::myMethod(const int& iInput, int& oOutput, int& ioInputAndOutput)
{
    // 'i' marks an input, 'o' an output, 'io' an in/out parameter,
    // and 'm' (in mMemberData) a member variable.
    oOutput = ioInputAndOutput + mMemberData + iInput;
    ioInputAndOutput *= 2;
}
As also mentioned above, the important thing is consistency, which will prevent more bugs than any particular convention. On collaborative projects, it's usually worth it to conform to the existing convention.
