68

I am a first year computer science student and my professor said #define is banned in the industry standards along with #if, #ifdef, #else, and a few other preprocessor directives. He used the word "banned" because of unexpected behaviour.

Is this accurate? If so why?

Are there, in fact, any standards which prohibit the use of these directives?

Jonathan Leffler
psrag anvesh
  • 62
    First I've heard of it. No; `#define` and so on are widely used. Sometimes too widely used, but definitely used. There are places where the C standard mandates the use of macros — you can't avoid those easily. You could check MISRA C standards; they tend to proscribe things, but I'm pretty sure that even they recognize that `#define` et al are sometimes needed. – Jonathan Leffler Dec 28 '15 at 15:44
  • 18
    A better rule of thumb is "only use #define when nothing else will do." – August Karlstrom Dec 28 '15 at 16:04
  • 21
    Maybe ask, banned by whom, in what industries, what are some publications related to this supposed ban, etc? – Random832 Dec 28 '15 at 16:32
  • 1
    I'm sure there is at least one standard, somewhere, which bans #define. – user253751 Dec 29 '15 at 00:21
  • 35
    Your professor isn't just wrong, they're *comically* wrong. To actually say this with a straight face, they'd have to have zero experience and almost-zero knowledge of C. This person has no business teaching. – Alex Celeste Dec 29 '15 at 05:53
  • 4
    @psraganvesh Is your question perhaps incorrectly tagged? Do you mean C++? Because in C++ there are more ways that can be used to "get around" doing a #define. He's still wrong though. – pipe Dec 29 '15 at 06:37
  • 1
    Funny quote about macro's in the [Google Style Guide](https://google.github.io/styleguide/cppguide.html): "_You're not really going to define a macro, are you? If you do, they're like this: MY_MACRO_THAT_SCARES_SMALL_CHILDREN._" – Danny_ds Dec 29 '15 at 19:50
  • Your professor is not wrong. Except for include guards, the use of preprocessor directives is widely banned in high-reliability computing. – David Hammen Dec 29 '15 at 20:11
  • Relevant? https://twitter.com/ftrain/status/671394222218592256 – iKlsR Dec 30 '15 at 00:34
  • 2
    Maybe English is not your first language? The choice of words here is causing some to say your professor is right while others say he's comically wrong. There is no organization that defines "industry standards" with the authority to "ban" macros or any other language feature. However, industry *best practices*--which is to say, opinions and suggestions from smart and experienced professionals--generally *discourage* the use of preprocessor macros unless truly necessary. – Kip Jan 03 '16 at 19:59
  • @Kip there are trade bodies that define operating standards that corporations must agree to in order to maintain a certification code. "ISO-9001" is one of these, and there are many more, and by other certifying bodies. Whether they go so deep as to define a code convention like this I cannot say, but I can speak for them sometimes having detailed measures of practice. I think there is a worthwhile question here. – New Alexandria Jan 10 '16 at 03:23

13 Answers

141

First I've heard of it.

No; #define and so on are widely used. Sometimes too widely used, but definitely used. There are places where the C standard mandates the use of macros — you can't avoid those easily. For example, §7.5 Errors <errno.h> says:

The macros are

     EDOM
     EILSEQ
     ERANGE

which expand to integer constant expressions with type int, distinct positive values, and which are suitable for use in #if preprocessing directives; …

Given this, it is clear that not all industry standards prohibit the use of the C preprocessor macro directives. However, there are 'best practices' or 'coding guidelines' standards from various organizations that prescribe limits on the use of the C preprocessor, though none ban its use completely — it is an innate part of C and cannot be wholly avoided. Often, these standards are for people working in safety-critical areas.

One standard you could check is the MISRA C (2012) standard; that tends to proscribe things, but even that recognizes that #define et al are sometimes needed (section 8.20, rules 20.1 through 20.14, cover the C preprocessor).

The NASA GSFC (Goddard Space Flight Center) C Coding Standards simply say:

Macros should be used only when necessary. Overuse of macros can make code harder to read and maintain because the code no longer reads or behaves like standard C.

The discussion after that introductory statement illustrates the acceptable use of function macros.

The CERT C Coding Standard has a number of guidelines about the use of the preprocessor, and implies that you should minimize the use of the preprocessor, but does not ban its use.

Stroustrup would like to make the preprocessor irrelevant in C++, but that hasn't happened yet. As Peter notes, some C++ standards, such as the JSF AV C++ Coding Standards (Joint Strike Fighter, Air Vehicle) from circa 2005, dictate minimal use of the C preprocessor. Essentially, the JSF AV C++ rules restrict it to #include and the #ifndef XYZ_H / #define XYZ_H / … / #endif dance that prevents multiple inclusions of a single header. C++ has some options that are not available in C — notably, better support for typed constants that can then be used in places where C does not allow them to be used. See also static const vs #define vs enum for a discussion of the issues there.
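
To illustrate that last point with a small sketch (the names are arbitrary): in C, a const-qualified object is not an integer constant expression, so it cannot be used where one is required, which is one reason #define and enum survive even in otherwise macro-averse C code, while C++ accepts the typed constant:

#define BUF_SIZE_MACRO 64        /* classic C approach */
enum { BUF_SIZE_ENUM = 64 };     /* macro-free alternative; still a constant expression */
static const int buf_size = 64;  /* typed and scoped, but not a constant expression in C */

static char a[BUF_SIZE_MACRO];   /* OK in C and C++ */
static char b[BUF_SIZE_ENUM];    /* OK in C and C++ */
/* static char c[buf_size]; */   /* error in C; fine in C++ */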

It is a good idea to minimize the use of the preprocessor — it is often abused at least as much as it is used (see the Boost preprocessor 'library' for illustrations of how far you can go with the C preprocessor).

Summary

The preprocessor is an integral part of C, and #define, #if, etc. cannot be wholly avoided. The statement by the professor in the question is not generally valid: "#define is banned in the industry standards along with #if, #ifdef, #else, and a few other macros" is an over-statement at best, though it might be supportable with explicit reference to specific industry standards (but the standards in question do not include ISO/IEC 9899:2011 — the C standard).


Note that David Hammen has provided information about one specific C coding standard — the JPL C Coding Standard — that prohibits a lot of things that many people use in C, including limiting the use of the C preprocessor (and limiting the use of dynamic memory allocation, and prohibiting recursion — read it to see why, and decide whether those reasons are relevant to you).

Jonathan Leffler
  • 3
    A few of the MISRA rules require/advise certain limitations on the use of macros, but they're fairly common-sensical. I was surprised by their permissiveness, actually. Even `#define MAX(a,b) ((a>b)?(a):(b))` is okay, despite its potential perils. – Sneftel Dec 28 '15 at 15:57
  • 1
    @Sneftel: Thanks. I've not read all the MISRA rules (it's on my "To Do" list, but not at a high priority), but it was one possible source of "industry standard" that does place limits on what you can do compared with what the standard allows. – Jonathan Leffler Dec 28 '15 at 16:01
  • 1
    There are a number of guidelines or standards for safety-critical or mission-critical development that forbid usage of function macros, in both C and C++. – Peter Dec 28 '15 at 16:04
  • 1
    @Peter: can you give a concrete example? MISRA does not per what Sneftel says, and my minimal scanning. I just went to check the NASA GSFC (Goddard Space Flight Center) [C Coding Standard](https://stackoverflow.com/questions/256277) guidelines, and it says: _Macros should be used only when necessary. Overuse of macros can make code harder to read and maintain because the code no longer reads or behaves like standard C._ but continues with an example of a function macro. – Jonathan Leffler Dec 28 '15 at 16:09
  • 5
    Read **7.5 Errors** in the C Standard. It *requires* the use of macros. Literally. – Andrew Henle Dec 28 '15 at 16:42
  • 1
    @AndrewHenle: Great minds think alike — I was just finishing up an edit and included exactly §7.5 in the text as an example of where macros are explicitly required. – Jonathan Leffler Dec 28 '15 at 16:45
  • 1
    @JonathanLeffler - Keep reading. ;) The standard says that not using the `errno` macro results in UB. – Andrew Henle Dec 28 '15 at 16:46
  • 1
    Jonathon - Most, like you say, don't ban outright. A C++ example, though. JSF++ AV Rule 29 states "The `#define` pre-processor directive shall not be used to create inline macros. Inline functions shall be used instead.". AV Rule 30 says "The `#define` pre-processor directive shall not be used to define constant values. Instead, the const qualifier shall be applied to variable declarations to specify constant values." AV Rule 31 states "The `#define` pre-processor directive will only be used as part of the technique to prevent multiple inclusions of the same header file.". – Peter Dec 28 '15 at 23:01
  • 1
    I do remember reading some comparable requirements in other documents for C, but don't have those at hand. I don't recall any standard or guideline that forbids standard library/headers from using macros [probably too hard to get vendor compliance, or would force developers to tailor their development environment which would be a maintenance nightmare] but a number actively discourage using such macros (e.g. forbidding use of `offsetof()` and similar macros). – Peter Dec 28 '15 at 23:05
  • 1
    @Peter: thanks for the feedback. Please see my update. Using `offsetof()` is occasionally, but only very occasionally, justifiable. – Jonathan Leffler Dec 28 '15 at 23:22
  • 1
    In organizations that do have such rules, they don't forbid the use of macros already defined in the standard (so the discussion on errno and offsetof is moot), and they don't forbid the use of include guards. You may run into such rules if you write software that can kill, can result in many millions of dollars of damage, or threaten national security. If you're writing a video game or a database, it doesn't matter. – David Hammen Dec 29 '15 at 20:32
33

No, use of macros is not banned.

In fact, use of #include guards in header files is one common technique that is often mandatory and encouraged by accepted coding guidelines. Some folks claim that #pragma once is an alternative to that, but the problem is that #pragma once - by definition, since pragmas are a hook provided by the standard for compiler-specific extensions - is non-standard, even if it is supported by a number of compilers.
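
For concreteness, here is a minimal sketch of the two approaches (the header name, guard macro and function are invented for illustration):

/* example.h: portable include guard, plain standard C */
#ifndef EXAMPLE_H
#define EXAMPLE_H

void example_function(void);

#endif /* EXAMPLE_H */

/* example.h: #pragma once variant, shorter and widely supported, but non-standard */
#pragma once

void example_function(void);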

That said, there are a number of industry guidelines and encouraged practices that actively discourage all usage of macros other than #include guards because of the problems macros introduce (not respecting scope, etc). In C++ development, use of macros is frowned upon even more strongly than in C development.

Discouraging use of something is not the same as banning it, since it is still possible to legitimately use it - for example, by documenting a justification.

Peter
30

Some coding standards may discourage or even forbid the use of #define to create function-like macros that take arguments, like

#define SQR(x) ((x)*(x))

because a) such macros are not type-safe, and b) somebody will inevitably write SQR(x++), which is bad juju.
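
To make the hazard concrete, here is a minimal sketch (next_value is a made-up function with a visible side effect). Because the macro pastes its argument into the expansion twice, the side effect happens twice; with SQR(x++), the two unsequenced modifications of x would even be undefined behaviour:

#include <stdio.h>

#define SQR(x) ((x)*(x))

static int counter = 0;

static int next_value(void)
{
    return ++counter;               /* side effect: advances the counter */
}

int main(void)
{
    int r = SQR(next_value());      /* expands to ((next_value())*(next_value())) */
    printf("r = %d, counter = %d\n", r, counter);   /* counter ends up at 2, not 1 */
    return 0;
}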

Some standards may discourage or ban the use of #ifdefs for conditional compilation. For example, the following code uses conditional compilation to properly print out a size_t value. For C99 and later, you use the %zu conversion specifier; for C89 and earlier, you use %lu and cast the value to unsigned long:

#if __STDC_VERSION__ >= 199901L
#  define SIZE_T_CAST
#  define SIZE_T_FMT "%zu"
#else
#  define SIZE_T_CAST (unsigned long)
#  define SIZE_T_FMT "%lu"
#endif
...
printf( "sizeof foo = " SIZE_T_FMT "\n", SIZE_T_CAST sizeof foo );

Some standards may mandate that instead of doing this, you implement the module twice, once for C89 and earlier, once for C99 and later:

/* C89 version */
printf( "sizeof foo = %lu\n", (unsigned long) sizeof foo );

/* C99 version */
printf( "sizeof foo = %zu\n", sizeof foo );

and then let Make (or Ant, or whatever build tool you're using) deal with compiling and linking the correct version. For this example that would be ridiculous overkill, but I've seen code that was an untraceable rat's nest of #ifdefs that should have had that conditional code factored out into separate files.

However, I am not aware of any company or industry group that has banned the use of preprocessor statements outright.

Peter Mortensen
John Bode
16

Macros can not be "banned". The statement is nonsense. Literally.

For example, section 7.5 Errors <errno.h> of the C Standard requires the use of macros:

1 The header <errno.h> defines several macros, all relating to the reporting of error conditions.

2 The macros are

EDOM
EILSEQ
ERANGE

which expand to integer constant expressions with type int, distinct positive values, and which are suitable for use in #if preprocessing directives; and

errno

which expands to a modifiable lvalue that has type int and thread local storage duration, the value of which is set to a positive error number by several library functions. If a macro definition is suppressed in order to access an actual object, or a program defines an identifier with the name errno, the behavior is undefined.

So, not only are macros a required part of C, in some cases not using them results in undefined behavior.
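
As a minimal, well-defined illustration of those required macros in ordinary code (nothing here is project-specific):

#include <errno.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    errno = 0;                              /* errno is a macro, per §7.5 */
    double d = strtod("1e999999", NULL);    /* deliberately out of range */
    if (errno == ERANGE)                    /* ERANGE is a macro, per §7.5 */
        printf("strtod reported ERANGE (result %g)\n", d);
    return 0;
}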

Jonathan Leffler
Andrew Henle
  • 1
    *not using them results in undefined behavior* that does not seem exactly what the text says (or I'm misreading/misinterpreting)? It says there's UB when either "a macro definition is suppressed" or "an identifier named errno is defined" and I don't see how you draw the link from that to "not using macros". On the contrary, both those actions (suppressing/defining) would be typical things you do when using macros, not when not using them? – stijn Dec 28 '15 at 17:20
  • 1
    @stijn - How would you access `errno` without using the macro without invoking UB? The standard clearly says `errno` is a macro, and it clearly states that suppressing the `errno` macro or otherwise defining your own `errno` results in UB. And how can you write robust code in general without using `errno`? [Some system calls](http://man7.org/linux/man-pages/man2/msgop.2.html) can still be interrupted and return a failure with `errno` set to `EINTR` even if all your signal flags have `SA_RESTART` set. Ergo, if you use `errno` you can't code without macros unless you invoke UB. – Andrew Henle Dec 28 '15 at 17:58
  • 1
    yes that's all true, and I'm not advocating one should not macros, and definitely not one shouldn't use errno, rather just nitpicking, but your comment imo still has no one-on-one proof that not using macros in one's own code somehow results in UB. (note your last sentence "*if* you use errno" so firstly if you don't all is fine (well, sort of:) and secondly even if you do use errno you are in fact already using macros and you'd still have to use more macros to invoke the UB, so again that's not *not* using them – stijn Dec 28 '15 at 19:20
  • @stijn My answer states *So, not only are macros a required part of C, in some cases not using them results in undefined behavior.* Section 7.5 clearly makes macros a *required* part of the C language. And if you try to use `errno` without using the *required* macro, you will invoke UB. Clear? – Andrew Henle Dec 28 '15 at 22:21
  • 2
    @Andew : I believe stijn's point is that there's a distinction between "not using errno" and "using errno without using macros", and therefore it isn't "not using them" that causes the UB, but rather using them incorrectly. The nitpick is with the English, not the C. – Ray Dec 28 '15 at 22:58
  • @Ray - If you use `errno` without using macros, you invoke UB: *If a macro definition is suppressed in order to access an actual object ... the behavior is undefined.* It has nothing to do with using macros incorrectly - if you *don't* use macros but do use `errno` somehow, you've invoked UB. – Andrew Henle Dec 28 '15 at 23:06
  • This answer is so off-base I have to downvote it. I've worked in industries that ban the use of preprocessor directives. They do not ban the use of `errno`, or macros defined per the standard, or protecting against multiple inclusion via an include guard. That would be ludicrous. What they do ban is the use of `#if` and `#define` in any context other than that of an include guard. – David Hammen Dec 29 '15 at 20:44
  • @DavidHammen *This answer is so off-base I have to downvote it. I've worked in industries that ban the use of preprocessor directives. They do not ban the use of `errno`, or macros defined per the standard, or protecting against multiple inclusion via an include guard. That would be ludicrous. What they do ban is the use of `#if` and `#define` in any context other than that of an include guard.* So, you've worked where they ban macros, **except when they don't**. [Cool story](https://pbs.twimg.com/profile_images/1593202305/cool-story-bro-500x499.jpg). – Andrew Henle Dec 29 '15 at 22:26
  • @AndrewHenle - The exceptions are spelled out, very carefully. Besides, the question was about using `#if` and `#define` in user code, rather than the use of preprocessor directives in system headers. The discussion about `errno` is irrelevant. – David Hammen Dec 29 '15 at 23:07
  • @DavidHammen The question is *I am a first year computer science student and my professor said `#define` is banned in the industry standards along with `#if`, `#ifdef`, `#else`, and a few other preprocessor directives...* There is **NO** mention of user code. `errno` is proof that statement is nonsense because the C Standard requires `errno` to be a macro - a `#define`! - and it therefore follows macros can not be banned. The fact that your *exceptions are spelled out, very carefully* is a *de facto* admission on your part that they are not banned. That's like saying speed limits ban driving. – Andrew Henle Dec 29 '15 at 23:19
15

No, #define is not banned. Misuse of #define, however, may be frowned upon.

For instance, you may use

#define DEBUG

in your code so that later on, you can designate parts of your code for conditional compilation using #ifdef DEBUG, for debug purposes only. I don't think anyone in his right mind would want to ban something like this. Macros defined using #define are also used extensively in portable programs, to enable/disable compilation of platform-specific code.
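
A minimal sketch of that pattern (the message is arbitrary; in practice the macro is usually driven from the build, e.g. with -DDEBUG, rather than written into the source):

#include <stdio.h>

#define DEBUG                       /* remove (or don't define) for release builds */

int main(void)
{
#ifdef DEBUG
    fprintf(stderr, "debug build: extra diagnostics enabled\n");
#endif
    puts("normal program output");
    return 0;
}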

However, if you are using something like

#define PI 3.141592653589793

your teacher may rightfully point out that it is much better to declare PI as a constant with the appropriate type, e.g.,

const double PI = 3.141592653589793;

as it allows the compiler to do type checking when PI is used.

Similarly (as mentioned by John Bode above), the use of function-like macros may be disapproved of, especially in C++ where templates can be used. So instead of

#define SQ(X) ((X)*(X))

consider using

double SQ(double X) { return X * X; }

or, in C++, better yet,

template <typename T> T SQ(T X) { return X * X; }

Once again, the idea is that by using the facilities of the language instead of the preprocessor, you allow the compiler to type check and also (possibly) generate better code.

Once you have enough coding experience, you'll know exactly when it is appropriate to use #define. Until then, I think it is a good idea for your teacher to impose certain rules and coding standards, but preferably they themselves should know, and be able to explain, the reasons. A blanket ban on #define is nonsensical.

Viktor Toth
12

That's completely false; macros are heavily used in C. Beginners often use them badly, but that's not a reason to ban them from the industry. A classic bad usage is #define successor(n) n + 1. If you expect 2 * successor(9) to give 20, then you're wrong, because that expression is translated as 2 * 9 + 1, i.e. 19, not 20. Use parentheses to get the expected result.
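
A short sketch of the broken and fixed versions side by side (the _bad suffix is just for illustration), following the usual convention of parenthesizing both the arguments and the whole expansion:

#define successor_bad(n)  n + 1        /* 2 * successor_bad(9) expands to 2 * 9 + 1 == 19 */
#define successor(n)      ((n) + 1)    /* 2 * successor(9) expands to 2 * ((9) + 1) == 20 */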

mikedu95
  • 2
    i'm not sure the example clearly justifies the false usage of a beginners, – psrag anvesh Dec 28 '15 at 16:00
  • 1
    True, everyone ought to know that macros need to expand to single token equivalents, which pretty much mandates parenthesēs around the expansion… *and* the arguments: `#define successor(n) ((n) + 1)` – mirabilos Dec 29 '15 at 15:23
11

No. It is not banned. And, truth be told, it is impossible to write non-trivial multi-platform code without it.
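
As a rough sketch of what that typically looks like (the function name is invented; _WIN32, __APPLE__ and __linux__ are macros predefined by the respective toolchains):

const char *platform_name(void)
{
#if defined(_WIN32)
    return "Windows";
#elif defined(__APPLE__)
    return "macOS";
#elif defined(__linux__)
    return "Linux";
#else
    return "unknown";
#endif
}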

luis.espinal
  • 2
    Multi-platform and cross-compiler, I'd say :) – Andrea Corbellini Dec 28 '15 at 16:13
  • Oh really? How do you create a threading framework that operates over POSIX threads and/or over the Windows API without relying on 3rd-party shims (a-la cygwin or mingw)? We are not just talking about same-os, multiple hardware, but multiple OSs or compiler vendors. Ever tried to support a system targeted for, say, Linux and Integrity-OS, which both require different compilers? Neg-rep all you want. Facts on the dirty ground won't change regardless. – luis.espinal Dec 28 '15 at 16:19
  • 8
    I 100% agree with you. I was just adding that the preprocessor is also useful when writing cross-compiler code. – Andrea Corbellini Dec 28 '15 at 16:23
8

No, your professor is wrong, or you misheard something.

#define is a preprocessor directive, and preprocessor macros are needed for conditional compilation and for conventions that aren't simply built into the C language. For example, a relatively recent C standard, namely C99, added support for booleans. But it isn't supported natively by the language; it is provided via preprocessor #defines. See this reference to stdbool.h.
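
A simplified sketch of what a C99 <stdbool.h> essentially boils down to (real implementations may differ in detail; the guard name here is invented):

/* simplified <stdbool.h>, C99 */
#ifndef STDBOOL_SKETCH_H
#define STDBOOL_SKETCH_H

#define bool  _Bool
#define true  1
#define false 0
#define __bool_true_false_are_defined 1

#endif /* STDBOOL_SKETCH_H */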

Superlokkus
  • 2
    he said twice after me asking him for the second time, "Banned in the industry due to undefined behaviour ", because i haven't heard that anywhere else – psrag anvesh Dec 28 '15 at 15:56
  • undefined behavior? :| – Andrea Corbellini Dec 28 '15 at 15:56
  • @AndreaCorbellini yup he said that not explaining what the undefined behaviour actually is – psrag anvesh Dec 28 '15 at 16:02
    @psraganvesh Undefined behavior is caused by misuse of the C language, not simply by using its built-in tools and concepts. Maybe he meant "because common usage often leads to undefined behaviour", but there are scenarios in C, like I said, where you can't circumvent the use of the preprocessor, and IMHO proper use often leads to better code. – Superlokkus Dec 28 '15 at 16:02
  • 3
    @psraganvesh Undefined behavior is rather caused by for example `*NULL` or `char foo[2]; foo[2];`, all programming errors. To better know what undefined behavior is, often shortend to UB, I recommend this: http://c2.com/cgi/wiki?UndefinedBehavior – Superlokkus Dec 28 '15 at 16:07
  • 2
    @psraganvesh *Banning* macros would result in undefined behavior per the C Standard. See my answer regarding `errno` - the standard specifically states that not using the `errno` macro results in undefined behavior., – Andrew Henle Dec 28 '15 at 16:42
  • 1
    @psraganvesh If your teacher actually said that #define, #if, etc, leads to undefined behavior, he should *not* be teaching a class on C. Chances are that he has never written any code in C, and cheated himself to this job. I would consider discussing this with someone higher up. – pipe Dec 29 '15 at 06:41
6

Macros are used pretty heavily in GNU-land C, and without conditional preprocessor commands there'd be no way to properly handle multiple inclusions of the same source files, so that makes them seem like essential language features to me.

Maybe your class is actually on C++, which, despite many people's failure to do so, should be distinguished from C, as it is a different language; I can't speak for macros there. Or maybe the professor meant that he's banning them in his class. Anyhow, I'm sure the SO community would be interested in hearing which standard he's talking about, since I'm pretty sure all C standards support the use of macros.

Jonathan Leffler
erik258
  • 3
    Lol how can i not know which class i am listening to he specifically said it twice after me asking him for the second time – psrag anvesh Dec 28 '15 at 15:53
  • 2
    Macros are required in C++ as well. For example, there's no other way (that I know of) to put in header guards (`#ifndef HEADER_H #define HEADER_H ... #endif`) – R_Kapp Dec 28 '15 at 15:55
  • 1
    @R_Kapp you've never heard of `#pragma once`? Definitely look it up — it has yet to make it into the Standard but it's supported by every vendor (GCC, Clang, EDG, MSVC, Intel, Green Hills, ARM,...) and greatly reduces the chance of making silly typos like `#ifndef FOO_H #define FOOH` or `#ifdef FOO_H #define FOO_H` or using the same macro in two different .h files. (It also improves compile times, and eliminates holy wars over the spacing and indentation of `#ifndef`-style include guards.) Conditional compilation has its uses, for sure, but "poor man's `#pragma once`" is *not* one of them. – Quuxplusone Dec 28 '15 at 21:59
  • 2
    @Quuxplusone #pragma once will almost certainly not make it into the C standard by the very virtue of being a pragma. Pragmas are reserved for implementation defined behaviour so adding it to the standard would break every compiler that used it in a way not compliant to this new supposed standard. Therefore I would say it is best avoided if possible and that this is a very valid use for #ifndef – Vality Dec 29 '15 at 06:34
  • @Vality [Standard pragmas](http://www-01.ibm.com/support/knowledgecenter/#!/SSGH2K_13.1.2/com.ibm.xlc131.aix.doc/language_ref/std_pragmas.html) are actually a thing. :) Sure, `#pragma once` might never make it into ISO *C++* because [modules](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2014/n4047.pdf) will probably make .h files pretty much irrelevant in the long term, but that wouldn't affect the ISO *C* committee as far as I know. They could standardize it tomorrow, if someone wrote a proposal. – Quuxplusone Dec 29 '15 at 08:45
  • 2
    There is more to getting something into the standard than someone writing a proposal. The point of pragmas - in both the C and C++ standards - is to provide a hook for implementation defined behaviour, and implementations are free to ignore any usage of `#pragma`. It is therefore simply fantasy to suggest that any pragma will ever be standardised - the notion is completely at odds with what pragmas are. The fact that several vendors misrepresent some of their vendor-specific features as standard does not make it so. – Peter Dec 29 '15 at 12:18
6

Contrary to all of the answers to date, the use of preprocessor directives is oftentimes banned in high-reliability computing. There are two exceptions to this, the use of which is mandated in such organizations: the #include directive, and the use of an include guard in a header file. These kinds of bans are more likely in C++ than in C.

Here's but one example: "16.1.1 Use the preprocessor only for implementing include guards, and including header files with include guards."

Another example, this time for C rather than C++: the JPL Institutional Coding Standard for the C Programming Language. This C coding standard doesn't go quite so far as banning the use of the preprocessor completely, but it comes close. Specifically, it says

Rule 20 (preprocessor use) Use of the C preprocessor shall be limited to file inclusion and simple macros. [Power of Ten Rule 8].


I'm neither condoning nor decrying those standards. But to say they don't exist is ludicrous.

Peter Mortensen
David Hammen
  • And the question is tagged with C. – jxh Dec 29 '15 at 22:43
  • @jxh -- And I've worked in organizations where such bans were in place, and with regard to C rather than C++. I didn't like it personally, but they do exist. – David Hammen Dec 29 '15 at 23:01
  • Yes, the consequence is that it removes a standardized code generation mechanism, and results in hacks using scripting languages to generate the correct source code rather than conditional compilation. – jxh Dec 29 '15 at 23:34
  • 6
    That there is at least one environment in which macros are banned is not at all the same as saying they are banned across the entire industry. So the existing answers are neither wrong nor "ludicrous". – Lightness Races in Orbit Dec 30 '15 at 00:09
  • 2
    @LightnessRacesinOrbit -- I didn't say that they are banned across the entire industry. The top-rated (and IMNHO wrongly accepted) answer says "First I've heard of it," along with several other answers that say something similar. What we have here is a professor who possibly only knows of one industry where use of the preprocessor is strongly limited, a student who most likely mis-interpreted what the professor said, and a bunch of answers that over-interpreted that mis-interpreted question. This isn't just an XY problem. It's an XYZW problem. – David Hammen Dec 31 '15 at 18:38
  • The question is about the entire industry. – Lightness Races in Orbit Dec 31 '15 at 18:49
  • @LightnessRacesinOrbit - That's your interpretation of a question by a first year student. Oftentimes we have to liberally interpret student questions at this site because students are even more clueless than are professors who teach them. – David Hammen Dec 31 '15 at 19:03
  • @DavidHammen: That's really saying something – Lightness Races in Orbit Dec 31 '15 at 21:15
2

If you want your C code to interoperate with C++ code, you will want to declare your externally visible symbols, such as function declarations, inside an extern "C" block so that they get C linkage. This is often done using conditional compilation:

#ifdef __cplusplus
extern "C" {
#endif

/* C header file body */

#ifdef __cplusplus
}
#endif
jxh
  • But it could be done with `#include "c++/header.h"` in the C++ code, where the file `"c++/header.h"` contains: `extern "C" { / #include "c/header.h" / }` (with solo slashes indicating newlines — and the choice of sub-directory names is largely arbitrary, of course), which would avoid conditional compilation, at the cost of an extra file. I agree, what you show is what is most often used (I use it myself). But it isn't actually necessary if you set things up correctly. – Jonathan Leffler Jan 01 '16 at 17:12
  • Conditional compilation can be replaced with a makefile that picks different source files based on the conditions. It doesn't make it a better solution, it just complies with the "no `#if`" requirement. Two header files, with one just including the other, just creates questions. – jxh Jan 01 '16 at 18:53
1

Look at any header file and you will see something like this:

#ifndef FILE_NAME_H
#define FILE_NAME_H
// Exported functions, structs, defines, etc. go here
#endif /* FILE_NAME_H */

These defines are not only allowed but critical, because a header is typically included from many files and can even be pulled in more than once in a single translation unit. Without the guard, everything between the #ifndef and the #endif would be redefined on every inclusion, which at best fails to compile and at worst leaves you scratching your head later, wondering why your code doesn't work the way you want it to.

The compiler also provides predefined macros, as seen here with GCC, that let you test for things like the version of the compiler, which is very useful. I'm currently working on a project that needs to compile with avr-gcc, but we also have a testing environment that we run our code through. To keep the AVR-specific files and registers from preventing our test code from building, we do something like this:

#ifdef __AVR__
//avr specific code here
#endif

With this in the production code, the complementary test code can be compiled without avr-gcc, and the code above is only compiled when building with avr-gcc.

Dom
  • 1
    In organizations that ban the use of the processor, those include guards don't count as part of that ban. (In fact, the use of an include guard is mandated in such organizations.) – David Hammen Dec 29 '15 at 20:22
1

If you had just mentioned #define, I would have thought maybe he was alluding to its use for enumerations, which are better done with enum to avoid stupid errors such as assigning the same numerical value twice.
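
For example (illustrative names and values only), nothing stops two #define'd constants from silently colliding, whereas an enum hands out distinct values for you:

/* error-prone: the duplicated value compiles without complaint */
#define COLOR_RED    0
#define COLOR_GREEN  1
#define COLOR_BLUE   1   /* oops: same value as COLOR_GREEN */

/* safer: consecutive values assigned by the compiler */
enum color { RED, GREEN, BLUE };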

Note that even for this situation, it is sometimes better to use #defines than enums, for instance if you rely on numerical values exchanged with other systems and the actual values must stay the same even if you add/delete constants (for compatibility).

However, adding that #if, #ifdef, etc. should not be used either is just weird. Of course, they should probably not be abused, but in real life there are dozens of reasons to use them.

What he may have meant is that (where appropriate) you should not hardcode behaviour in the source (which would require re-compilation to get a different behaviour), but rather use some form of run-time configuration instead.
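
One hypothetical reading of that suggestion, purely for illustration (the environment variable name is made up): instead of baking the choice in at compile time with #ifdef, consult a setting at run time:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* compile-time switch: changing it means recompiling
       #ifdef VERBOSE ... #endif */

    /* run-time switch: behaviour chosen when the program starts */
    if (getenv("MYAPP_VERBOSE") != NULL)
        fprintf(stderr, "verbose diagnostics enabled\n");

    puts("normal output");
    return 0;
}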

That's the only interpretation I could think of that would make sense.

Peter Mortensen
jcaron