
I was wondering what the difference is between `uint32_t` and `uint32`, and when I looked in the header files I found this:

types.h:

    /** @brief 32-bit unsigned integer. */
    typedef unsigned int uint32;
stdint.h:

    typedef unsigned   uint32_t;

This only leads to more questions: What is the difference between

    unsigned varName;

and

    unsigned int varName;

?

I am using MinGW.
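For reference, a minimal check of the definitions above (the `uint32` typedef is repeated locally so the file is self-contained); since the types are identical here, a compiler with warnings enabled should accept the pointer assignment without complaint:

    #include <stdint.h>
    #include <stdio.h>

    typedef unsigned int uint32;   /* same definition as in types.h */

    int main(void)
    {
        /* If uint32 and uint32_t were distinct types, initializing p from &a
           would draw an "incompatible pointer types" diagnostic. */
        uint32    a = 123;
        uint32_t *p = &a;

        printf("sizeof(unsigned) = %u, sizeof(unsigned int) = %u, *p = %u\n",
               (unsigned)sizeof(unsigned), (unsigned)sizeof(unsigned int), (unsigned)*p);
        return 0;
    }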

  • They're the same. However the type `uint32` (and the header `<types.h>` or the file `"types.h"`) is not defined by the C99 Standard. If you want to use one of those types, use `uint32_t` and include the header `<stdint.h>`. Also `unsigned` and `unsigned int` are the same. – pmg Aug 02 '12 at 21:34
  • So `uint32` and `<types.h>` are not part of the standard, but `uint32_t` is? – user1507133 Aug 02 '12 at 21:40
  • @user1507133: Yes. Basically, there's no such thing as `uint32` in either C or C++. – AnT stands with Russia Aug 02 '12 at 21:42

4 Answers


unsigned and unsigned int are synonymous, much like unsigned short [int] and unsigned long [int].

uint32_t is a type that's (optionally) defined by the C standard. uint32 is just a name you made up, although it happens to be defined as the same thing.
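To see that `unsigned` and `unsigned int` name one and the same type (not just two compatible ones), here is a small sketch using C11's `_Generic`, which selects on the type of its controlling expression:

    #include <stdio.h>

    int main(void)
    {
        unsigned x = 0;

        /* x has type unsigned int, so the first association is selected.
           Listing "unsigned" and "unsigned int" as two separate associations
           would even be rejected as a duplicate, because they are the same type. */
        puts(_Generic(x, unsigned int: "x is unsigned int",
                         default:      "x is something else"));
        return 0;
    }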

Kerrek SB
  • A bit of necromancy here, but comparing `unsigned [int]` to `[unsigned] (short|long) [int]` may give new programmers the false impression that `short` and `long` are unsigned by default. I'd suggest `unsigned (short|long) [int]` instead. – bcrist Aug 18 '14 at 04:49

There is no difference.

In your case, `unsigned int` = `uint32` = `uint32_t` = `unsigned`; and `unsigned int` = `unsigned` always.
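A quick way to see the whole chain on one line (a sketch; the `uint32` typedef mirrors the one from the question's types.h):

    #include <stdint.h>
    #include <stdio.h>

    typedef unsigned int uint32;   /* as in the question's types.h */

    int main(void)
    {
        /* All four names refer to the same 4-byte unsigned type on this setup. */
        printf("%u %u %u %u\n",
               (unsigned)sizeof(unsigned int),
               (unsigned)sizeof(unsigned),
               (unsigned)sizeof(uint32),
               (unsigned)sizeof(uint32_t));
        return 0;
    }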

RiaD

unsigned and unsigned int are synonymous for historical reasons; they both mean "unsigned integer of the most natural size for the CPU architecture/platform", which is often (but by no means always) 32 bits on modern platforms.

<stdint.h> is a standard header in C99 that is supposed to give type definitions for integers of particular sizes, with the uint32_t naming convention.

The <types.h> that you're looking at appears to be non-standard and presumably belongs to some framework your project is using. Its uint32 typedef is compatible with uint32_t. Whether you should use one or the other in your code is a question for your manager.
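For illustration, a short program using the `<stdint.h>` naming convention next to plain `unsigned` (the exact-width names shown are the standard ones; whether `unsigned` is 32 bits depends on the platform):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* <stdint.h> uses one naming scheme for all exact-width types:
           uint8_t, uint16_t, uint32_t, uint64_t (and the signed intN_t). */
        uint16_t small   = 60000;
        uint32_t exact   = 4000000000u;   /* always exactly 32 bits */
        unsigned natural = 12345u;        /* "natural" size: often, but not always, 32 bits */

        printf("%u %lu %u\n", (unsigned)small, (unsigned long)exact, natural);
        return 0;
    }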

Russell Borogove
  • " is a standard header in C99". What about C++98 and C++11? – user1507133 Aug 02 '12 at 21:41
  • `<cstdint>` is, I think, the equivalent C++ standard header. – Russell Borogove Aug 02 '12 at 21:46
  • @user1507133 - C++11 has `<cstdint>`, which brings these types into the `std` namespace (e.g. `std::uint32_t`). C++98 does not have this header because it predates C99. POSIX provides `inttypes.h` with the same types if that is good enough. – Nemo Aug 02 '12 at 21:47
  • According to the C Standard, `<inttypes.h>` includes `<stdint.h>`. So it "copies" every definition and provides a few extra ones (notably scanf and printf specifiers: `printf("%" PRIu32 "\n", x);`). – pmg Aug 02 '12 at 22:00
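To expand on pmg's comment, a minimal sketch of the `<inttypes.h>` format macros (these are standard C99 names):

    #include <inttypes.h>   /* pulls in <stdint.h> and adds PRIu32 and friends */
    #include <stdio.h>

    int main(void)
    {
        uint32_t x = 123456789u;

        /* PRIu32 expands to the right conversion specifier for uint32_t,
           whatever underlying type the implementation chose for it. */
        printf("x = %" PRIu32 "\n", x);
        return 0;
    }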

There is absolutely no difference between unsigned and unsigned int.

Whether that type is a good match for uint32_t is implementation-dependent though; an int could be "shorter" than 32 bits.
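If the code depends on that match, it can be checked at compile time; a minimal sketch assuming a C11 compiler:

    #include <stdint.h>
    #include <limits.h>

    /* On an implementation where int is only 16 bits wide, this assertion
       fires and plain unsigned would be the wrong choice for 32-bit data. */
    _Static_assert(UINT_MAX == UINT32_MAX,
                   "unsigned int does not match uint32_t on this platform");

    int main(void)
    {
        return 0;
    }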

Mat