Constructing an NSNumber with the value UInt8.max yields different results in Xcode 8 than it does with the open-source build of Swift 3.0 on Linux.
Xcode 8:

import Foundation
print(NSNumber(value: UInt8.max)) // 255 (__NSCFNumber = Int16(255))
Swift 3.0 on Linux:

import Foundation
print(NSNumber(value: UInt8.max)) // -1
Is this intentional? I would expect the value to equal 255 on both Swift distributions.
As Martin writes in the comments, this appears to be a bug in the Swift interpreter, tracked as SR-90.
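Until that is resolved, one possible workaround is to widen the value to Int before constructing the NSNumber. This is only a sketch based on the assumption that the problem is limited to how the 8-bit payload is stored; the Linux output shown in the comments is what I would expect, not something verified on every toolchain.

import Foundation

// Assumed workaround sketch: widen the UInt8 to Int before wrapping it in
// NSNumber, so the stored value no longer depends on how an 8-bit payload
// is interpreted.
let n = NSNumber(value: Int(UInt8.max))
print(n)          // expected: 255 on both platforms
print(n.intValue) // 255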