The new Apple Swift documentation says:
Int
In most cases, you don’t need to pick a specific size of integer to use in your code. Swift provides an additional integer type, Int, which has the same size as the current platform’s native word size:
- On a 32-bit platform, Int is the same size as Int32.
- On a 64-bit platform, Int is the same size as Int64.

Unless you need to work with a specific size of integer, always use Int for integer values in your code. This aids code consistency and interoperability. Even on 32-bit platforms, Int can store any value between -2,147,483,648 and 2,147,483,647, and is large enough for many integer ranges.
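For reference, the platform dependence is easy to observe directly. Here's a minimal sketch (MemoryLayout requires a recent Swift toolchain; the commented output assumes a 64-bit machine):

```swift
// Print the storage size of Int on the current platform.
print(MemoryLayout<Int>.size)   // 8 on a 64-bit platform, 4 on a 32-bit one
print(Int.min, Int.max)         // bounds vary with the platform word size
print(Int32.min, Int32.max)     // always -2147483648 2147483647
```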
I understand that when calling APIs that are declared in terms of Int, you should use Int.
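Part of what drives this is that Swift performs no implicit integer conversions, so mixing widths forces explicit casts everywhere. A small illustration (the variable names are made up for the example):

```swift
let values = [1, 2, 3]
let offset: Int32 = 10

// values.count is an Int; Swift will not implicitly widen or narrow,
// so combining it with an Int32 requires an explicit conversion:
let adjusted = values.count + Int(offset)
print(adjusted)  // 13
```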
But for my own code, I've always been strict about using exact-width types in C via the stdint.h header, my thinking being that it reduces ambiguity. However, the folks at Apple are pretty smart, and I'm wondering if I'm missing something, since this is not what they recommend.
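For what it's worth, Swift does keep fixed-width types (Int8, Int16, Int32, Int64, and their unsigned counterparts) that correspond to the stdint.h typedefs, so that strictness is still available where exact width matters. A hedged sketch, using a made-up file-format magic number:

```swift
// Fixed-width types mirror stdint.h: Int32 ~ int32_t, UInt32 ~ uint32_t, etc.
let magic: UInt32 = 0xCAFE_BABE  // hypothetical file-format magic number

// When exact width and byte order matter (binary formats, wire protocols),
// a fixed-width type plus an explicit endianness keeps the layout unambiguous:
withUnsafeBytes(of: magic.bigEndian) { rawBytes in
    print(Array(rawBytes))  // [202, 254, 186, 190]
}
```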