1

Can an int be unsigned by default, like char, and can someone name such a platform if it exists?

I mean that char can be unsigned or signed depending on the platform. Does the same apply to int, long, and short?
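For reference, here is a minimal C++ sketch (the printing is purely illustrative) of how the platform's choice for plain char can be observed via CHAR_MIN from <climits>:

```cpp
#include <climits>
#include <iostream>

int main() {
    // CHAR_MIN is 0 when plain char is unsigned and negative when it is
    // signed, so it reveals the implementation's choice.
    std::cout << "plain char is "
              << (CHAR_MIN < 0 ? "signed" : "unsigned")
              << " on this platform\n";
}
```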

StoryTeller - Unslander Monica
Sabrina
  • 6
    Since when is `char` `unsigned` by default? – πάντα ῥεῖ Feb 14 '17 at 17:10
  • 1
    Where'd you get that idea ? Oo – Cakeisalie5 Feb 14 '17 at 17:10
  • 8
    Neither `int` nor `char` are `unsigned` by default. `int` is always signed unless otherwise specified. `char`'s signedness is implementation defined. – François Andrieux Feb 14 '17 at 17:11
  • 2
    `char` is unsigned in ARM platform. [http://blog.cdleary.com/2012/11/arm-chars-are-unsigned-by-default/](http://blog.cdleary.com/2012/11/arm-chars-are-unsigned-by-default/) – Rotem Feb 14 '17 at 17:11
  • 1
    What did you do to find out yourself? What about the documentation is not clear? You ask a lot of nonsense questions, how about first consulting the standard before asking? – too honest for this site Feb 14 '17 at 17:11
  • The standard is 600 pages and I did not find that on google. – Sabrina Feb 14 '17 at 17:12
  • 1
    [This](http://port70.net/~nsz/c/c11/n1570.html) is the last C11 draft prior to publication. All in a nice single page. You can search that. – StoryTeller - Unslander Monica Feb 14 '17 at 17:13
  • 5
    @Sabrina: Yes, every developer *should* dig into their handy reference manuals and language standards **before** going online to ask questions. I have no less than 5 language reference manuals on my desk *right now* (C, C++, Perl, Python, and SQL). I have a Unix programming reference, a Linux desktop reference, a UML guide, a copy of Design Patterns, and both the C and C++ online standards bookmarked. If, after checking a reference, you don't understand something, then *by all means* ask a question here, but *at least* do your own research first. No less is asked of the rest of us. – John Bode Feb 14 '17 at 17:32
  • Heck I up-voted. At least it's something we can suggest as a duplicate for any future question of the sort. – StoryTeller - Unslander Monica Feb 14 '17 at 18:34
  • @FrançoisAndrieux "`int` is always signed unless otherwise specified." has an exception. With bit-fields, a plain `int` may be treated as `unsigned int`. With bit-fields, better to use `signed int` and `unsigned int` rather than `int`. – chux - Reinstate Monica Feb 14 '17 at 19:21
  • 1
    It is curious that a question formed well enough to elicit good (and high scoring) answers from people of high rep on this site can at the same time be so easily down-voted by others for such reasons as "look it up yourself". Questions can be judged, at least in part, by the quality of the responses they have attracted. This question, although terse, addresses subtleties that many new programmers would benefit from. (+1) – ryyker Feb 14 '17 at 20:17
  • @ryyker no problem I get used to that. – Sabrina Feb 14 '17 at 22:54
  • @πάνταῥεῖ ARM implementations usually use `unsigned` char by default [Any compiler which takes 'char' as 'unsigned' ?](https://stackoverflow.com/q/3728045/995714), [Does anyone know why "char" is unsigned on ARM/gcc?](https://news.ycombinator.com/item?id=18269886), [Why unsigned types are more efficient in arm cpu?](https://stackoverflow.com/q/3093669/995714), [ARM chars are unsigned by default](https://web.archive.org/web/20121202022150/https://blog.cdleary.com/2012/11/arm-chars-are-unsigned-by-default/) – phuclv Sep 22 '21 at 06:56

3 Answers

10

No, int is always signed. Unlike char, which may behave as either signed char or unsigned char depending on the platform, int is always a synonym for signed int in both C and C++, regardless of the platform.

Reference: C99, 6.2.5/4:

There are five standard signed integer types, designated as signed char, short int, int, long int, and long long int.

Reference: C++11, 3.9.1/2:

There are five standard signed integer types: signed char, short int, int, long int, and long long int.
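To restate the quoted clauses as compile-time checks, here is a minimal C++11 sketch (illustrative only; these assertions hold on every conforming implementation):

```cpp
#include <type_traits>

// Plain int, short, long, and long long name the same types as their
// explicitly "signed" spellings, so none of these assertions can fire.
static_assert(std::is_same<int, signed int>::value,
              "int is signed int");
static_assert(std::is_same<short, signed short int>::value,
              "short is signed short int");
static_assert(std::is_same<long, signed long int>::value,
              "long is signed long int");
static_assert(std::is_same<long long, signed long long int>::value,
              "long long is signed long long int");

int main() {}
```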

Sergey Kalinichenko
  • Why is there no `short`? – Sabrina Feb 14 '17 at 17:28
  • "`int` is always signed." deserves a note about _bit-fields_: "if the actual type specifier used is `int` or a typedef-name defined as `int`, then it is implementation-defined whether the bit-field is signed or unsigned." – chux - Reinstate Monica Feb 14 '17 at 19:27
  • 2
    @chux Thank you very much for the comment! You are absolutely right, bit fields are governed by a separate set of rules. I wasn't sure if I wanted to open this can of worms in the context of this question, because it sounds like OP wants to find out a very specific thing unrelated to bit fields. – Sergey Kalinichenko Feb 14 '17 at 19:38
7

char is peculiar: there are three flavors, signed char, unsigned char, and char. A plain char can be signed or unsigned, depending on the implementation, but in C++ it's a different type from the other two. All the other integer types have just two flavors, signed and unsigned; you can say unsigned int, signed int, and just plain int, but plain int is signed, and it's just a different name for signed int. In the C++ standard, that's clause 3.9.1, [basic.fundamental].
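A minimal C++11 sketch of that distinction (assuming nothing beyond <type_traits>; the assertions hold on every conforming implementation):

```cpp
#include <type_traits>

// Plain char is a distinct type from both signed char and unsigned char,
// so it never compares equal to either of them as a type.
static_assert(!std::is_same<char, signed char>::value,
              "char is not the same type as signed char");
static_assert(!std::is_same<char, unsigned char>::value,
              "char is not the same type as unsigned char");

// Its signedness, however, must match one of the other two; which one it
// matches is implementation-defined.
static_assert(std::is_signed<char>::value || std::is_unsigned<char>::value,
              "plain char behaves as either signed char or unsigned char");

int main() {}
```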

Pete Becker
6

As per the C11 standard, section "5.2.4.2.1 Sizes of integer types":

The values given below shall be replaced by constant expressions suitable for use in #if preprocessing directives. Moreover, except for CHAR_BIT and MB_LEN_MAX, the following shall be replaced by expressions that have the same type as would an expression that is an object of the corresponding type converted according to the integer promotions. Their implementation-defined values shall be equal or greater in magnitude (absolute value) to those shown, with the same sign.
[...]
minimum value for an object of type int
    INT_MIN    -32767
maximum value for an object of type int
    INT_MAX    +32767

So, as you can see, the limits of int must be at least as large in magnitude as the values shown, with the same signs. In particular, INT_MIN is required to be negative, so int is necessarily a signed type.
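A minimal sketch that turns those guarantees into compile-time checks (C++11 here for consistency with the other snippets; in C the same macros come from <limits.h>):

```cpp
#include <climits>

// The standard only constrains the magnitudes: INT_MIN <= -32767 and
// INT_MAX >= +32767. A typical 32-bit int widens this to
// [-2147483648, 2147483647], but that is not required.
static_assert(INT_MIN <= -32767, "INT_MIN has at least the required magnitude");
static_assert(INT_MAX >= +32767, "INT_MAX has at least the required magnitude");

// Because INT_MIN must be negative, int is necessarily a signed type.
static_assert(INT_MIN < 0, "int can represent negative values");

int main() {}
```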

Eugene Sh.
  • You skipped a very important part: `Their implementation-defined values` – Sabrina Feb 14 '17 at 17:20
  • 2
    @Sabrina No, I haven't. It says their implementation-defined values are bounded by the emphasized text. So regardless of the implementation, they must cover at least the range [-32767, 32767]; an implementation can only widen this range. – Eugene Sh. Feb 14 '17 at 17:21
  • I have seen INT_MIN defined as -32768 in most places. Is yours actually -32767? – ryyker Feb 14 '17 at 19:32
  • 1
    @ryyker There is a linked standard document draft. `-32767` is good for the possible sign-and-magnitude representation. – Eugene Sh. Feb 14 '17 at 19:35
  • You might want to mention that reading the entire of 5.2.4.2.1 makes it clear that `char` is a special case, and all other integer types are required to be signed unless `unsigned` is used. – zwol Feb 16 '17 at 14:02