
I was reading an interesting interview with Chris Lattner, author of LLVM and Swift, and noticed a very curious claim:

> Other things are that Apple does periodically add new instructions [11:00] to its CPUs. One example of this historically was the hilariously named “Swift” chip that it launched, which was the first designed in-house 32-bit ARM chip. This was the iPhone 5, if I recall.
>
> In this chip, they added an integer-divide instruction. All of the chips before that didn’t have the ability to integer-divide in hardware: you had to actually open-code it, and there was a library function to do that. [11:30] That, and a few other instructions they added, were a pretty big deal and used pervasively.

Now that is surprising. As I understand it, integer divide is almost never needed. The cases I've seen where it could be used fall into a few categories:

  • Dividing by a power of two. Shift right instead.

  • Dividing by an integer constant. Multiply by a precomputed fixed-point reciprocal (plus a shift) instead. (It's counterintuitive but true that this gives exact results in all cases; see the sketch after this list.)

  • Fixed point as an approximation of real number arithmetic. Use floating point instead.

  • Fixed point, for multiplayer games that run peer-to-peer across computers with different CPU architectures and need each computer to agree on the results down to the last bit. Okay, but as I understand it, multiplayer games on iPhone don't use that kind of peer-to-peer design.

  • Rendering 3D textures. Do that on the GPU instead.
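
To make the first two bullets concrete, here is a minimal sketch in C. A divisor of 10 is just an example; 0xCCCCCCCD with a shift of 35 is the standard "magic number" pair for unsigned 32-bit division by 10, and compilers such as GCC and Clang generate this kind of code automatically whenever the divisor is a compile-time constant.

```c
#include <assert.h>
#include <stdint.h>

/* Dividing by a power of two: x / 8 becomes a shift. */
static uint32_t div8(uint32_t x) {
    return x >> 3;                      /* same result as x / 8 for unsigned x */
}

/* Dividing by a non-power-of-two constant: multiply by a precomputed
 * fixed-point reciprocal, then shift. 0xCCCCCCCD is ceil(2^35 / 10),
 * so (x * 0xCCCCCCCD) >> 35 equals x / 10 for every 32-bit unsigned x. */
static uint32_t div10(uint32_t x) {
    return (uint32_t)(((uint64_t)x * 0xCCCCCCCDu) >> 35);
}

int main(void) {
    for (uint32_t x = 0; x < 1000000; x++) {
        assert(div8(x)  == x / 8);
        assert(div10(x) == x / 10);
    }
    return 0;
}
```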

After floating point became available in hardware, I've never seen a workload that needed to do integer divide with significant frequency.

What am I missing? What was integer divide used for on Apple devices, so frequently that it was considered worth adding as a CPU instruction?

rwallace
  • 31,405
  • 40
  • 123
  • 242
  • 1
    Real-world code written by people naive about performance does carelessly use `/` with runtime-variable divisors. Even if the value was only set to some power of 2, sometimes the compiler doesn't know that at the site that uses it. A programmer more careful about optimization could have avoided that, but it should be obvious that in computing history, software expands to fill available memory space and CPU cycles as developers would rather get more written faster once CPUs can handle the performance cost. – Peter Cordes Feb 26 '22 at 01:29
  • 1
    Near duplicate of [What is integer division heavily used for?](https://stackoverflow.com/q/67527288) which only has answers in comments. (It's also about Apple, though, about *modern* Apple having fast integer division in M1, like Ice Lake.) Oh, that was also your question! – Peter Cordes Feb 26 '22 at 01:30
  • @PeterCordes Yep! This one is adjacent, but approaches the topic from a hopefully sufficiently different perspective, based on a new data point regarding the usage of integer division in iPhone apps. Your first comment here is actually the most plausible explanation I've seen: yes, the divisor was not an arbitrary variable, but the compiler doesn't know that. – rwallace Feb 26 '22 at 16:13
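
The pattern in the first comment above can be sketched in C; the function and its caller here are hypothetical, invented only to illustrate the pattern, not taken from any real app. On 32-bit ARM cores without hardware divide (everything before the Swift core in the iPhone 5), each `/` or `%` by a runtime-variable divisor compiles to a call to an ARM EABI runtime helper such as `__aeabi_uidivmod`.

```c
#include <stdint.h>

/* Hypothetical image-addressing helper: in practice every caller passes a
 * power-of-two stride, but that fact is only visible in the callers.
 * Unless the call is inlined and the constant propagated, the compiler has
 * to treat 'stride' as an arbitrary value, so it emits a real unsigned
 * division and remainder -- on pre-Swift 32-bit ARM, a call to the
 * runtime helper __aeabi_uidivmod rather than a single instruction. */
static void index_to_xy(uint32_t index, uint32_t stride,
                        uint32_t *x, uint32_t *y) {
    *y = index / stride;
    *x = index % stride;
}

int main(void) {
    uint32_t x, y;
    index_to_xy(1000, 256, &x, &y);     /* caller's stride is 2^8, but the
                                           callee cannot assume that */
    return (int)(x + y);                /* x = 232, y = 3 */
}
```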

0 Answers