Within some limits, the compiler author is free to choose the sizes of the standard C variable types (char, short, int, long, long long). Naturally char is going to be a byte for that architecture (on most architectures with C compilers that is 8 bits). And naturally a smaller type cannot be bigger than a larger one: long cannot be smaller than int, for example. But by 1999 we had already seen the x86 16 to 32 bit transition, where for a number of tools int changed from 16 to 32 bits while long stayed 32. Later the 32 to 64 bit x86 transition happened, and depending on the tool there were types available to help.
The problem existed long before this, and the solution was not to fix the sizes of the types; those remain, within the rules, up to the compiler authors. Instead, the compiler authors craft a stdint.h file that matches the tool and target (stdint.h is specific to a tool and target at a minimum, and can depend on the version of the tool, build options for that tool, etc), so that, for example, uint32_t is always 32 bits. Some authors will map that to an int, others to a long, etc in their stdint.h. The C language variable types remain limited to char, short, int, etc per the language (uint32_t is not a variable type itself; it is mapped onto a variable type through stdint.h). This solution/workaround was a way to keep us all from going crazy and keep the language alive.
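For illustration, here is a minimal sketch of the kind of typedefs a tool's stdint.h might contain for a target where int is 32 bits; the actual underlying types are whatever the compiler author chose for that tool and target:

    /* hypothetical stdint.h fragment for a tool where int is 32 bits */
    typedef unsigned char      uint8_t;
    typedef unsigned short     uint16_t;
    typedef unsigned int       uint32_t;  /* another tool might use unsigned long here */
    typedef unsigned long long uint64_t;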
Authors will often match int to the general purpose register width, for example making int 16 bits if the GPRs are 16 bits, 32 bits if they are 32, and so on, but they have some freedom here.
Yes, this specifically means there is no reason to assume that any two tools for a particular target (the computer you are reading this on, for example) use the same definitions for int and long in particular. If you want to write code for this platform that ports across the tools that support it, use the stdint.h types and not int, long, etc. And most certainly if you are crossing platforms (an msp430 mcu, an arm mcu, an arm linux machine, an x86 based machine), the types, even for the same "toolchain" (gnu gcc and binutils, for example), do not have the same definitions for int and long, etc. char and short tend to be 8 and 16 bits; int and long tend to vary the most, sometimes the same size as each other and sometimes different, but the point is: do not assume.
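As a sketch (the function and its packing scheme are made up for illustration), code written against the stdint.h types behaves the same no matter what a given tool decided int and long are:

    #include <stdint.h>

    /* pack three 8 bit channels into one 32 bit value; the shifts
       work out the same whether this tool's int is 16, 32, or 64
       bits, because the widths are pinned down by stdint.h */
    uint32_t pack_rgb(uint8_t r, uint8_t g, uint8_t b)
    {
        return ((uint32_t)r << 16) | ((uint32_t)g << 8) | (uint32_t)b;
    }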
It is trivial to detect the sizes for a given compiler version/target/command line options, or just go the stdint.h route to minimize problems later.
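For example (assuming a hosted environment where printf is available), a few lines are enough to see what a particular compiler/target/options combination chose:

    #include <stdio.h>

    int main(void)
    {
        /* the output depends entirely on the toolchain and target */
        printf("char      %u\n", (unsigned)sizeof(char));
        printf("short     %u\n", (unsigned)sizeof(short));
        printf("int       %u\n", (unsigned)sizeof(int));
        printf("long      %u\n", (unsigned)sizeof(long));
        printf("long long %u\n", (unsigned)sizeof(long long));
        return 0;
    }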