
Will the following code work on 32-bit platforms?

class Account {
  private static let UserIDKey = "AccountUserIDKey"
  class var userID: Int64? {
    get { Int64(UserDefaults.standard.integer(forKey: UserIDKey)) }
    set { UserDefaults.standard.set(newValue, forKey: UserIDKey) }
  }
}

That is, on 32-bit platforms, will it work if I store an Int value greater than Int32.max into UserDefaults and then retrieve it?

I'm wondering because I only see the UserDefaults instance method integer(forKey:). Why doesn't UserDefaults have an instance method like int64(forKey:)?

ma11hew28

1 Answer


You can save a number of simple value types in the user defaults (a short usage sketch follows below):

  • Booleans with Bool
  • Integers with Int
  • Floats with Float
  • Doubles with Double
  • Strings with String
  • Binary data with Data
  • Dates with Date
  • URLs with URL
  • Collections with Array and Dictionary

    Internally the UserDefaults class can only store NSData, NSString, NSNumber, NSDate, NSArray and NSDictionary classes.

    These are object types that can be saved in a property list.
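The convenience setters and typed getters cover most of these directly. Here is a small illustration (the key names are made up for the example):

import Foundation

let defaults = UserDefaults.standard

defaults.set(true, forKey: "isOnboarded")        // Bool
defaults.set(42, forKey: "launchCount")          // Int
defaults.set(3.14, forKey: "lastZoom")           // Double
defaults.set("en", forKey: "preferredLanguage")  // String
defaults.set(Date(), forKey: "lastOpened")       // Date

let isOnboarded = defaults.bool(forKey: "isOnboarded")
let launchCount = defaults.integer(forKey: "launchCount")
let lastZoom    = defaults.double(forKey: "lastZoom")
let language    = defaults.string(forKey: "preferredLanguage")
let lastOpened  = defaults.object(forKey: "lastOpened") as? Date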

An NSNumber is an NSObject that can contain the original C-style numeric types, which even include the C/Objective-C style BOOL type (which stores YES or NO). Thankfully for us, these are also bridged to Swift, so in a Swift program an NSNumber can automatically accept the following Swift types (see the sketch after this list):

  • UInt
  • Int
  • Float
  • Double
  • Bool
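This bridging also covers the fixed-width integer types, which is what matters for the question: wrapping an Int64 in an NSNumber preserves the full 64-bit value regardless of the platform's word size. A minimal sketch:

import Foundation

let big: Int64 = 9_000_000_000          // larger than Int32.max
let boxed = NSNumber(value: big)        // explicit boxing into an NSNumber
let unboxed = boxed.int64Value          // the full value comes back out

print(unboxed == big)                   // true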

Dmitry Popov's answer on Quora:

Remember how you were taught in school to add numbers like 38 and 54. You work with separate digits: you take the rightmost digits, 8 and 4, add them, and get the digit 2 in the answer with 1 carried because of an overflow. You take the second digits, 3 and 5, add them to get 8, and add the carried 1 to get 9, so the whole answer becomes 92. A 32-bit processor does exactly the same, but instead of decimal digits it has 32-bit integers. The algorithm is the same: work on the two 32-bit parts of a 64-bit number separately, and deal with overflows by carrying from the lower word to the higher word. This takes several 32-bit instructions, and the compiler is responsible for encoding them properly.
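To make the carrying concrete, here is a minimal Swift sketch (the add64 function and its variable names are purely illustrative, not what a compiler actually emits) that composes a 64-bit addition out of 32-bit operations:

// Split each operand into a 32-bit low word and high word, add the low words,
// and carry any overflow into the sum of the high words.
func add64(_ a: Int64, _ b: Int64) -> Int64 {
    let lowA  = UInt32(truncatingIfNeeded: a)
    let highA = Int32(truncatingIfNeeded: a >> 32)
    let lowB  = UInt32(truncatingIfNeeded: b)
    let highB = Int32(truncatingIfNeeded: b >> 32)

    // Add the low words; the overflow flag is the carry.
    let (lowSum, carry) = lowA.addingReportingOverflow(lowB)

    // Add the high words plus the carry (wrapping, like the hardware would).
    let highSum = highA &+ highB &+ (carry ? 1 : 0)

    // Reassemble the 64-bit result from the two 32-bit halves.
    return (Int64(highSum) << 32) | Int64(lowSum)
}

print(add64(38, 54))                  // 92
print(add64(Int64(UInt32.max), 1))    // 4294967296: the carry moves into the high word

The same idea applies to the other integer operations, which is why Int64 arithmetic still works on 32-bit CPUs; it just costs a few extra instructions.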


Save Int64 to UserDefaults

import Foundation

let defaults = UserDefaults.standard

// Define a 64-bit integer with `Int64`
let myInteger: Int64 = 1_000_000_000_000_000

// Save the integer to `UserDefaults` (it is bridged to an NSNumber)
defaults.set(myInteger, forKey: "myIntegerKey")

// Get the saved value back from `UserDefaults`
if let savedInteger = defaults.object(forKey: "myIntegerKey") as? Int64 {
    print("my 64-bit integer is:", savedInteger)
}

Save a 64-bit Int to UserDefaults by converting it to a String

let myInteger: Int64 = 1_000_000_000_000_000
let myString = String(myInteger)

// Save the string to `UserDefaults`
UserDefaults.standard.set(myString, forKey: "myIntegerKey")

// Get the value back from `UserDefaults` and convert it to Int64 again
if let savedString = UserDefaults.standard.string(forKey: "myIntegerKey"),
   let savedInteger = Int64(savedString) {
    print("my 64-bit integer is:", savedInteger)
}


emrcftci
    Is this an answer to "Does UserDefaults.integer work on 32-bit platforms?" and "Why doesn't UserDefaults have an instance method like int64(forKey:)?"? – Willeke Mar 07 '20 at 09:00
  • Thank you, but as @Willeke pointed out, you did not answer my questions. – ma11hew28 Mar 09 '20 at 20:56