I always use "and" and "or", but at college I learned that we should use && and ||...
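As a minimal illustration (my own sketch, not part of the original question): in standard C++ `and` and `or` are alternative tokens for `&&` and `||`, so the two conditions below are token-for-token identical.

```cpp
#include <iostream>

int main() {
    int a = 3, b = 0;

    // `and` / `or` are alternative tokens: the lexer treats them
    // exactly like `&&` / `||`.
    if (a > 0 and b == 0)
        std::cout << "word form\n";

    if (a > 0 && b == 0)
        std::cout << "symbol form\n";
}
```

Note that some compilers only accept the word forms in a conforming mode (for example, MSVC rejects them in its default mode unless `/permissive-` or `/Za` is used), which is part of what the comments below are getting at.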
- I have had instances where `and` and `or` were not recognized by the compiler. However, `&&` and `||` have **always** been recognized by compilers (especially when you have old code that must be compiled with old compilers). – Thomas Matthews Jun 25 '21 at 17:34
- Not all compilers support `and` and `or` correctly. Personally I regard such code as idiosyncratic, but that might fall away with the advent of Python. I also find that `or` doesn't really distinguish `||` and `|` adequately? – Bathsheba Jun 25 '21 at 17:35
- Searching for `&&` and `||` gives a lot fewer false positives. – Ben Voigt Jun 25 '21 at 17:37
- There is no difference between them. It's customary to use `&&` and `||` though, so I recommend you get used to it. – HolyBlackCat Jun 25 '21 at 17:37
- @Bathsheba Neither does `and`, btw - all those alternative spellings fail to convey the nature of the operation, whether it is bitwise or logical. I tend to think of them as a bad influence (I doubt there are really people nowadays using code pages which do not support these symbols). – SergeyA Jun 25 '21 at 17:44
- Everyone knows `&&`, all compilers understand it, and it's shorter than `and`. Other alternative tokens like `not_eq` are clumsy - you would end up mixing styles inconsistently. – Aykhan Hagverdili Jun 25 '21 at 18:31
1 Answer
This is related to character encoding: source code sometimes had to be written in a 7-bit character set such as ISO 646:1983, whose national variants lack characters like `&` and `|`. The alternative tokens exist so that such code can still be written, and they remain in the language for compatibility with code that already uses them.
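To make the character-set point concrete, here is a small sketch (my addition, not part of the original answer) that avoids `&`, `|`, `!`, braces and brackets entirely, using the alternative tokens and digraphs that exist for exactly this reason; a conforming C++ compiler accepts it as-is.

```cpp
#include <iostream>

// Alternative tokens and digraphs stand in for &, |, !, { }, [ ] -
// characters that some ISO 646 national variants do not provide.
int main() <%
    int flags<:2:> = <% 0x3, 0x5 %>;

    int both   = flags<:0:> bitand flags<:1:>;  // bitwise AND, i.e. &
    int either = flags<:0:> bitor  flags<:1:>;  // bitwise OR,  i.e. |

    // `and`, `not` and `not_eq` stand in for &&, ! and !=
    if (both not_eq 0 and not (either == 0))
        std::cout << "alternative tokens and digraphs only\n";
%>
```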

Stelf
- I like this answer. While it does not directly answer the question "why", it empowers the reader to make their own projections. – SergeyA Jun 25 '21 at 17:46
- If your source character set doesn't have `&`, all your references and pointers are in for a world of hurt. – Ben Voigt Jun 25 '21 at 17:48
- @SergeyA: You mean `bitand`? But while it may make the lexer happy, it's semantically incorrect and going to cause mental anguish for anyone working with the code. – Ben Voigt Jun 25 '21 at 18:23
- @BenVoigt sure, `bitand`. I am not saying it is a welcome practice to write code in such a manner, I am just saying that references and pointers would survive. – SergeyA Jun 25 '21 at 18:27
- @BenVoigt `#define ref bitand` and then `int ref i = a;` (I am just joking) – Aykhan Hagverdili Jun 25 '21 at 18:34
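As an aside on the `bitand` tangent above, a tiny sketch of my own: because alternative tokens are plain spelling variants of the operators, `bitand` can even appear where `&` declares a reference, which is exactly why the comments call it semantically misleading.

```cpp
int main() {
    int a = 42;

    // `bitand` is just another spelling of `&`, so this declares `r`
    // as a reference to `a` - legal, but confusing to read.
    int bitand r = a;
    r = 7;                      // writes through the reference

    return a == 7 ? 0 : 1;      // returns 0: `a` was modified
}
```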