
I have this Int number that I have to transmit over the network.

I am using this class extension.

extension Data {

  init<T>(from value: T) {
    self = Swift.withUnsafeBytes(of: value) { Data($0) }
  }

  func to<T>(type: T.Type) -> T? where T: ExpressibleByIntegerLiteral {
    var value: T = 0
    guard count >= MemoryLayout.size(ofValue: value) else { return nil }
    _ = Swift.withUnsafeMutableBytes(of: &value) { copyBytes(to: $0) }
    return value
  }
}

I encode and send it using this from an iOS device:

  let number = button.number
  let buttonNumberData = Data(from: number)

  do {
    try session.send(buttonNumberData, toPeers: session.connectedPeers, with: .reliable)
  } catch let error as NSError {
    // error handling omitted
  }

If I do a po buttonNumberData at this time, I get:

▿ 4 bytes
  - count : 4
  ▿ pointer : 0x002e9940
    - pointerValue : 3053888
  ▿ bytes : 4 elements
    - 0 : 1
    - 1 : 0
    - 2 : 0
    - 3 : 0

If I decode the data using this,

let buttonNumber = buttonNumberData.to(type: Int.self)

I have a valid number.

Then this Data is transmitted, reaches a macOS computer, and is decoded using the same command:

let buttonNumber = buttonNumberData.to(type: Int.self)

The problem is that buttonNumber is always nil, but if I do a po buttonNumberData at this point, I get this:

▿ 4 bytes
  - count : 4
  ▿ pointer : 0x00007ffeefbfd428
    - pointerValue : 140732920747048
  ▿ bytes : 4 elements
    - 0 : 1
    - 1 : 0
    - 2 : 0
    - 3 : 0

what appears to be valid data.

It appears to be some problem with the Data extension related to how macOS works.

Any ideas?

Duck
  • The "unsafe" part of the name strikes again – Alexander Aug 24 '19 at 13:01
  • what do you mean? – Duck Aug 24 '19 at 14:49
  • People constantly use these unsafe constructs without carefully considering precisely what is unsafe about them. In this case, you learned that one of the "unsafe" things about these is that you're making your code explicitly depend on the machine's raw binary layout, and trying to use it portably between devices – Alexander Aug 24 '19 at 14:53
  • ah I see. In my case I forgot about that the generic Int definition would correspond to different number of bits on different platforms... Thanks. – Duck Aug 24 '19 at 14:59
  • You also didn't consider the endianness of the integer representation, which is also architecture dependant. – Alexander Aug 24 '19 at 15:15

1 Answer


Int is a platform-dependent type in Swift: it can be a 32-bit or a 64-bit integer.

Apparently you encode the integer on a 32-bit device and decode it on a 64-bit device. The decoding method returns nil if the amount of data is less than the size of an integer.

Use Int32 (or Int64) instead of Int for an integer type which is 4 (or 8) bytes, independently of the platform. This works on all (current) iOS and macOS devices, because they all use the same (little-endian) byte order.
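As a minimal sketch, reusing the extension from the question with a fixed-width type (the value 7 is just an example):

    import Foundation

    extension Data {

      init<T>(from value: T) {
        self = Swift.withUnsafeBytes(of: value) { Data($0) }
      }

      func to<T>(type: T.Type) -> T? where T: ExpressibleByIntegerLiteral {
        var value: T = 0
        guard count >= MemoryLayout.size(ofValue: value) else { return nil }
        _ = Swift.withUnsafeMutableBytes(of: &value) { copyBytes(to: $0) }
        return value
      }
    }

    // Int32 is 4 bytes on every platform, so the sender and the
    // receiver agree on the size of the payload.
    let sent = Data(from: Int32(7))          // always 4 bytes
    let received = sent.to(type: Int32.self) // no longer nil on macOS

With Int, the iOS device produced 4 bytes but the 64-bit receiver expected 8, so the count guard in to(type:) returned nil.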

For a completely platform independent representation, convert the data to a well-defined byte order, as demonstrated here.

Martin R