
I wanted to put some small integers into an array and decided to use int8_t:

#include <cstdint>
#include <iostream>

int main() {
    int n = 3;
    int8_t arr[n];
    for (int i = 0; i < n; ++i) {
        std::cin >> arr[i];
    }
}

Input is 9 -20 14 and n = 3.

But instead of 9, -20, 14, the array ends up containing 9, -, 2.

Why does int8_t act like a char?

P.S. int8_t is typedef'd in sys/types.h this way:

# define __intN_t(N, MODE) \
  typedef int int##N##_t __attribute__ ((__mode__ (MODE)))

# ifndef __int8_t_defined
#  define __int8_t_defined
__intN_t (8, __QI__);
# endif
  • Because it is a `typedef` to `signed char` in this case – Slava Jan 09 '17 at 15:28
  • Actually I think it should be `signed char` (on almost all platforms), which is a distinct type from plain `char` (even if `char` is signed on many platforms, in which case the two are identical in that sense). – hyde Jan 09 '17 at 15:28
  • @hyde `std::ostream` handles all three the same way, I believe – Slava Jan 09 '17 at 15:29
  • @hyde - the same thing with signed char – grepcake Jan 09 '17 at 15:31
  • 4
    Don't spam tags! C++ is not C is not C++. And provide a [mcve]. You might want to use the C++ header. – too honest for this site Jan 09 '17 at 15:31
  • @Slava - I don't see `char` in `typedef` above – grepcake Jan 09 '17 at 15:33
  • @A.Yurchenko the standard defines only those integral types (char, short, int, long and long long); others are aliases. Which way it is done in system headers is irrelevant. – Slava Jan 09 '17 at 15:37
  • @Slava: Actually, they don't have to be. `int8_t` can be an alias of `char`, but it is not *required* to be. It can be a compiler-defined type. – Nicol Bolas Jan 09 '17 at 15:40
  • 1
    Is that the *only* (potential) definition of `int8_t`? Double check the preprocessor conditionals. – eerorika Jan 09 '17 at 15:40
  • @Slava - you were right. Should I just delete the question, or will you answer it? – grepcake Jan 09 '17 at 15:43
  • @NicolBolas does it mean if I define a function overrides that accept `char`, `short`, `int`, `long` and `long long` they may fail to accept `int8_t`? – Slava Jan 09 '17 at 16:05
  • @Slava: Yes, that's correct. Then again, those overloads would probably fail with `unsigned short` too. Or at least potentially do the wrong thing. – Nicol Bolas Jan 09 '17 at 16:12
  • @NicolBolas so there is no way to have override that would handle that type properly? Unsigned is different story. Sounds incorrect, can you point to standard? – Slava Jan 09 '17 at 16:13
  • @Slava: First, `virtual` functions are "overridden". The word you're looking for is "overload". Second, if you want an overload to handle all integer types, you should use the biggest available integer type (`std::intmax_t`) and allow implicit conversions to upscale the integer to that. Or use a template function. You cannot assume that overloading on all of the fundamental integer types is sufficient to capture all integers. – Nicol Bolas Jan 09 '17 at 16:20
  • @NicolBolas "You cannot assume that overloading on all of the fundamental integer types is sufficient to capture all integers." I would like to see that in the standard. Is there already such a question on SO, or should I create one? – Slava Jan 09 '17 at 16:22
  • @Slava: You've got that backwards. You won't find it in the standard because the standard doesn't provide a guarantee on this. That is, it never says that the fundamental integer types are the *only* integer types a compiler supports. So there is nothing that can be pointed to in the standard; this conclusion is based on the *lack* of such a statement in the standard. – Nicol Bolas Jan 09 '17 at 16:24
  • 1
  • @NicolBolas I created a new question http://stackoverflow.com/questions/41552514/is-overloading-on-all-of-the-fundamental-integer-types-is-sufficient-to-capture let's continue there – Slava Jan 09 '17 at 16:30

0 Answers