3

If I'm correct, Objective-C's int type length depends on the platform word length (i.e. it's 64-bit when running in a 64-bit environment and 32-bit when running in a 32-bit environment).

However, when bridging an Objective-C interface to Swift, this type is bridged as Int32, which is explicitly 32 bits in size.

Is this an error, or does int in Objective-C always occupy 32 bits? If it's not an error, why does it get bridged as Int32?

Example of the Objective-C interface:

-(int)numberOfItems;

Which gets bridged into:

func numberOfItems() -> Int32

When I change the method signature to use NSInteger, it gets bridged correctly:

-(NSInteger)numberOfItems;
func numberOfItems() -> Int

Is there any danger if I change int in Objective-C to NSInteger, or do these types behave in exactly the same manner in Objective-C?
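
For illustration, here is a minimal Swift sketch (not part of the bridged interface above) of why the width difference can matter on a 64-bit platform:

// A value that fits a 64-bit Int (what NSInteger bridges to) does not
// necessarily fit an Int32 (what int bridges to).
let big: Int = 3_000_000_000                  // fine in a 64-bit Int
let wrapped = Int32(truncatingIfNeeded: big)  // silently wraps the bits
print(wrapped)                                // -1294967296
// Int32(big) would instead trap at runtime with an overflow error.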

Richard Topchii
  • Objective-C doesn't conventionally use `int` or `unsigned int`, it uses `NSInteger` and `NSUInteger`, so go the second way. Also `NSUInteger` is the correct type as you cannot have a negative number of items :). – trojanfoe Oct 10 '19 at 12:56
  • @trojanfoe thanks for your comment. Could you please elaborate on "Objective-C doesn't conventionally use int or unsigned int" - are these types coming from C? I've named the methods arbitrarily, of course in this context it's better to use unsigned integer. – Richard Topchii Oct 10 '19 at 13:29
  • @trojanfoe You should avoid using `NSUInteger` here. While it is not possible to have a negative number of items, unsigned math is very error-prone, and bridging to `UInt` is very inconvenient in Swift. It was a mistake in ObjC that `count` returned `NSUInteger`, and that mistake was fixed in Swift by returning `Int`. – Rob Napier Oct 10 '19 at 13:30
  • Well there are no rules prohibiting the use of any valid type, however if you look at how Apple do things they always use the `NS` types instead of the bare compiler types. – trojanfoe Oct 10 '19 at 13:31
  • @RobNapier agree with you, I've actually had an error caused by not noticing that the type returned was `NSUInteger`: https://stackoverflow.com/questions/35265279/evaluation-nsintegers-inside-if-statement-objective-c – Richard Topchii Oct 10 '19 at 13:34
  • @RobNapier Why was it a mistake to use `NSUInteger` for `count`? It's never caused me any problems and neither has using `size_t` under C++. – trojanfoe Oct 10 '19 at 13:39
  • See the linked question from Richard above for a common bug caused by unsigned math. The most common form is `if (n < some_unsigned - 1)` when you forget that `some_unsigned` might be 0, and so `some_unsigned - 1` is UINT_MAX. See the note for `UInt` in the Swift Programming Guide: https://docs.swift.org/swift-book/LanguageGuide/TheBasics.html "Use UInt only when you specifically need an unsigned integer type with the same size as the platform’s native word size. If this isn’t the case, Int is preferred, even when the values to be stored are known to be nonnegative." (See the sketch after these comments.) – Rob Napier Oct 10 '19 at 13:46
  • The C "int" and "long" types can be implementation- and machine-defined sizes. "int" is 32 bits on both 32-bit and 64-bit Apple platforms. "long" is 32 bits on 32-bit platforms, and 64 bits on 64-bit platforms. "long long" is 64-bits on both. C now has explicit int32_t and int64_t types which do not change. ObjC now typically uses NSInteger for their data types, so they have control over the size regardless of system. They define that as equivalent to "long" (and NSUInteger is "unsigned long"), so it stays the same as the pointer size. – Carl Lindberg Oct 10 '19 at 14:22
  • @RobNapier Well if that's the Swift philosophy then fair enough. I don't see it as a major source of bugs though, and I find using the correct signed/unsigned type helps to self-document methods. – trojanfoe Oct 10 '19 at 14:24
  • `NSInteger` depends on the platform word length. `Int` depends on compilation settings. By default Xcode sets 32-bit. – Cy-4AH Oct 10 '19 at 15:26
  • @trojanfoe Note that this is also the position of Google's C++ style guide; it's not a Swift-specific concern, and predates Swift. See "On Unsigned Integers" for more. https://google.github.io/styleguide/cppguide.html#Integer_Types For a much more in-depth discussion, see Scott Meyers (1995): https://www.aristeia.com/Papers/C++ReportColumns/sep95.pdf (BTW, I'm not saying your position is wrong or unfounded. It's certainly defensible. But "avoid unsigned for numbers" has been a recommendation from many corners for many years and I agree with it.) – Rob Napier Oct 10 '19 at 15:43
  • @RobNapier That's interesting, thank you. – trojanfoe Oct 10 '19 at 15:49
  • @RobNapier - "The most common form is `if (n < some_unsigned - 1)` when you forget that `some_unsigned` might be `0`, and so `some_unsigned - 1` is `UINT_MAX` - except that in Swift it is an error due to strong typing resulting in underflow. Most would agree that having a boolean type is better than C's use of integers for the purpose; that `enum { Spades, Hearts, Clubs, Diamonds }` is better than using the integers 1..4 for the same purpose; etc. Is not the discouragement of unsigned types a left over from weaker typing? – CRD Oct 11 '19 at 10:02
  • @CRD Most likely you're correct, it's just a residue from weaker-typed languages. – Richard Topchii Oct 11 '19 at 10:10
  • @CRD This isn't the correct way to think of strong types. The compiler is not able to reason about unsigned math at compile time and prove that the operation is valid. "Crashes at runtime" is the opposite of what strong types offer. This is completely different than a Sum type like enum, where the compiler can prove at compile time that each operation is valid. (There are many other holes in the type system, such as the fact that integers can overflow, but none of these are evidence of strong typing; they're counter-evidence, because the compiler accepted faulty code.) – Rob Napier Oct 11 '19 at 12:31
  • @RobNapier - I will disagree, enforcement of the basic properties in the "type as sets" model is "stronger typing". In C the pred/succ rules of (un)signed integers were not enforced (unlike say in Ada), "succ(max)" & "pred(min)" wrapped. Swift enforces these rules by aborting, other languages use a catchable error. The "unsigned - 1" concern is just the "pred(min)" case, it wasn't enforced in Obj-C, it is in Swift – so Swift has "stronger" typing here, even if some of us might prefer it wasn't aborting! Also not all "enum" rules can be enforced at compile time, e.g. mapping to the domain. – CRD Oct 11 '19 at 17:20
  • @RobNapier - But we've wandered off the original question, we should wait for one like "Should I use unsigned types" to debate views on type theory! – CRD Oct 11 '19 at 17:22
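
To make the unsigned-underflow pitfall from these comments concrete, a minimal Swift sketch (the variable names are illustrative):

// In C, `some_unsigned - 1` wraps to UINT_MAX when some_unsigned == 0,
// so `if (n < some_unsigned - 1)` is almost always true.
// Swift traps instead of silently wrapping:
let someUnsigned: UInt = 0
// let underflowed = someUnsigned - 1  // runtime trap: arithmetic overflow
let wrapped = someUnsigned &- 1        // wrapping must be opted into with &-
print(wrapped)                         // 18446744073709551615 (UInt.max)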

2 Answers

4

Objective-C's int (or, to be correct, C's int) must be at least 16 bits in size, so storing it in 32 bits is not invalid. In Apple's implementation it is in fact always 32 bits.

If I'm correct, Objective-C's int type length depends on the platform word length (i.e. it's 64-bit when running in a 64-bit environment and 32-bit when running in a 32-bit environment).

That would be NSInteger, not int. You may be confusing them because Swift's Int is the equivalent type for Objective-C's NSInteger.

You can even see it in the docs:

typealias NSInteger = Int
user28434'mstep
  • Thank you, however when generating Core Data classes the type is created with a specific number of bits (as the Core Data model doesn't allow selecting a platform-dependent option), e.g. `int32_t`. Does it mean that Core Data uses C types, not Objective-C ones? – Richard Topchii Oct 10 '19 at 13:31
  • That is correct. Core Data fixes the length of its integers to a specific bitwidth because they are persistently stored. Otherwise if you loaded a Core Data database on a different architecture, it would be corrupted. – Rob Napier Oct 10 '19 at 13:34
  • @RichardTopchiy Core Data needs to specify bit width as precisely as possible, so it uses the `intN_t` types, because those types are required to be exactly N bits wide (instead of "at least that size wide" or "depends on the platform"). – user28434'mstep Oct 10 '19 at 13:35
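
A minimal Swift sketch of the fixed-width point from these comments (sizes are in bytes):

// Fixed-width integer types have the same size on every architecture,
// which is what a persistent on-disk format like Core Data's needs.
print(MemoryLayout<Int16>.size) // always 2
print(MemoryLayout<Int32>.size) // always 4
print(MemoryLayout<Int64>.size) // always 8
print(MemoryLayout<Int>.size)   // 4 on 32-bit targets, 8 on 64-bit targets
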
3

Swift is not bridging from generic (Objective-)C but from Apple (Objective-)C. In Standard C the size of int has only a minimum bound, not an exact size. However, the Apple (& GNU) compilers follow specific models for data type sizes; from Apple's 64-Bit Transition Guide:

OS X uses two data models: ILP32 (in which integers, long integers, and pointers are 32-bit quantities) and LP64 (in which integers are 32-bit quantities, and long integers and pointers are 64-bit quantities). Other types are equivalent to their 32-bit counterparts (except for size_t and a few others that are defined based on the size of long integers or pointers).

So in both ILP32 (used for 32-bit apps) and LP64 (used for 64-bit apps) the size of int is 32 bits. The equivalent integer type in Swift is Int32.

Regarding NSInteger the same reference states:

In addition to these changes to the base data types, various layers of OS X have other data types that change size or underlying type in a 64-bit environment. The most notable of these changes is that NSInteger and NSUInteger (Cocoa data types) are 64-bit in a 64-bit environment and 32-bit in a 32-bit environment.

Swift's equivalent is Int.
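
A minimal Swift sketch checking these equivalences (the sizes in the comments assume a 64-bit Apple platform):

import Foundation

// CInt is Swift's typealias for C's int; NSInteger is a typealias for Int.
let fromC: CInt = 42
let asInt32: Int32 = fromC          // compiles without conversion: CInt is Int32
let fromCocoa: NSInteger = 42
let asInt: Int = fromCocoa          // compiles without conversion: NSInteger is Int
print(MemoryLayout<CInt>.size)      // 4 — int is 32-bit in both ILP32 and LP64
print(MemoryLayout<NSInteger>.size) // 8 on 64-bit, 4 on 32-bit
print(asInt32, asInt)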

CRD