I was reading an interesting interview with Chris Lattner, creator of LLVM and the Swift programming language, and noticed a very curious claim:
Other things are that Apple does periodically add new instructions [11:00] to its CPUs. One example of this historically was the hilariously named “Swift” chip that it launched which was the first designed in-house 32-bit ARM chip. This was the iPhone 5, if I recall.
In this chip, they added an integer-divide instruction. All of the chips before that didn’t have the ability to integer-divide in hardware: you had to actually open-code it, and there was a library function to do that. [11:30] That, and a few other instructions they added, were a pretty big deal and used pervasively.
Now that is surprising. As I understand it, integer divide is almost never needed. The cases I've seen where it could be used fall into a few categories:
Dividing by a power of two. Shift right instead (first sketch after this list).
Dividing by an integer constant. Multiply by a fixed-point reciprocal instead. (It's counterintuitive but true that this works in all cases; the second sketch below shows how.)
Fixed point as an approximation of real-number arithmetic. Use floating point instead (third sketch below).
Fixed point, for multiplayer games that run peer-to-peer across computers with different CPU architectures and need each computer to agree on the results down to the last bit. Okay, but as I understand it, multiplayer games on iPhone don't use that kind of peer-to-peer design.
Texture mapping in software 3D rendering. Do that on the GPU instead.
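To make the power-of-two case concrete, here is a minimal C sketch (the function names are mine; it assumes an arithmetic right shift for signed values, which ARM and every mainstream compiler provide, since C leaves that implementation-defined):

```c
#include <stdint.h>

/* Dividing an unsigned value by 2^k is exactly a right shift. */
static inline uint32_t udiv_pow2(uint32_t x, unsigned k) {
    return x >> k;
}

/* For signed values, a bare arithmetic shift rounds toward negative
 * infinity, while C's '/' rounds toward zero, so add a bias of
 * 2^k - 1 to negative inputs first. Compilers emit exactly this. */
static inline int32_t sdiv_pow2(int32_t x, unsigned k) {
    int32_t bias = (x < 0) ? (int32_t)((1u << k) - 1) : 0;
    return (x + bias) >> k;
}
```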
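The constant-divisor trick deserves a worked instance, because it does sound impossible. A sketch for dividing by 10: pick m = ceil(2^35 / 10) = 0xCCCCCCCD; then (x * m) >> 35 equals x / 10 for every 32-bit x. This is the same magic constant compilers emit, and Granlund and Montgomery's paper "Division by Invariant Integers using Multiplication" (PLDI 1994) shows how to derive such constants and why the result is exact:

```c
#include <assert.h>
#include <stdint.h>

/* Unsigned division by the constant 10 without a divide instruction:
 * one 32x32->64 multiply (a single umull on ARM) plus a shift. */
static inline uint32_t div10(uint32_t x) {
    return (uint32_t)(((uint64_t)x * 0xCCCCCCCDu) >> 35);
}

int main(void) {
    /* Spot-check against real division across the 32-bit range. */
    for (uint64_t x = 0; x <= UINT32_MAX; x += 9973)
        assert(div10((uint32_t)x) == (uint32_t)x / 10);
    assert(div10(UINT32_MAX) == UINT32_MAX / 10);
    return 0;
}
```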
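The fixed-point category is the one that genuinely needs the instruction: there the divisor is a runtime value, so neither of the tricks above applies. A minimal sketch in Q16.16 (the type and names are mine; assumes b != 0):

```c
#include <stdint.h>

typedef int32_t q16_16;  /* Q16.16 fixed point: value = raw / 65536.0 */

/* a/b in Q16.16 is (a * 2^16) / b: a 64-by-32-bit integer division
 * with a variable divisor. Without a hardware divider, ARM EABI
 * compilers lower the '/' below to a runtime call such as
 * __aeabi_ldivmod (the kind of library routine Lattner mentions). */
static inline q16_16 q_div(q16_16 a, q16_16 b) {
    return (q16_16)(((int64_t)a * 65536) / b);
}

/* With hardware floating point, the same operation is a single
 * floating-point divide instruction. */
static inline float f_div(float a, float b) { return a / b; }
```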
Since floating point became available in hardware, I've never seen a workload that needed to do integer divide with significant frequency.
What am I missing? What was integer divide used for on Apple devices, so frequently that it was considered worth adding as a CPU instruction?