
Actually the initial question was: "What exactly is kCGBitmapByteOrderDefault?" Sometimes CGImageGetByteOrderInfo(CGImageRef image) returns kCGBitmapByteOrderDefault as a result, but that only tells me that it is equal to the endian order of the host machine/device.

Then the question automatically becomes: "What is the endian order of the host machine/device?"

The Apple docs list several constants that could be the exact answer:

kCGImageByteOrder16Big
kCGImageByteOrder16Little
kCGImageByteOrder32Big
kCGImageByteOrder32Little

So, I need a solution that returns one of the values from the list above.
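
For what it's worth, here is a minimal sketch of the kind of helper I have in mind, assuming what I actually need is the host byte order (hostByteOrder32 is my own name, not an Apple API):

```swift
import CoreGraphics

// My own helper (not an Apple API): detect the host byte order by checking
// how a known 32-bit value is laid out in memory, then map the result to
// one of the CGImageByteOrderInfo constants listed above.
func hostByteOrder32() -> CGImageByteOrderInfo {
    let probe: UInt32 = 0x0000_00FF
    let lowByteFirst = withUnsafeBytes(of: probe) { $0[0] == 0xFF }
    return lowByteFirst ? .order32Little : .order32Big
}
```

On current iOS and Mac hardware (Intel and ARM are both little-endian) this returns .order32Little.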

Kibernetik
  • Hi Kibernetik, first a stupid question: Do you want to know what endianness in computing is, or do you want to know what the default endian order on an iOS device is? – MacUserT May 14 '22 at 11:44
  • I want to know what endian order the kCGBitmapByteOrderDefault value stands for, on an iOS or Mac device. – Kibernetik May 14 '22 at 11:47
  • Hi Kibernetik, this question has already been answered here https://stackoverflow.com/questions/58614504/is-ios-guaranteed-to-be-little-endian. Does that help you? – MacUserT May 14 '22 at 11:51
  • Thank you. But kCGBitmapByteOrderDefault = kCGImageByteOrder32Little is not a correct answer. – Kibernetik May 14 '22 at 11:54
  • Does this answer your question? [Difference between Big Endian and little Endian Byte order](https://stackoverflow.com/questions/701624/difference-between-big-endian-and-little-endian-byte-order) – Magnas May 14 '22 at 12:34
  • Intel and ARM are both little endian by default. – matt May 14 '22 at 13:44
  • Yes, you are right. The question may be misleading actually. – Kibernetik May 15 '22 at 15:32

1 Answer


You said:

Sometimes CGImageGetByteOrderInfo(CGImageRef image) returns kCGBitmapByteOrderDefault as a result, but that only tells me that it is equal to the endian order of the host machine/device.

It is not the endian order of the machine. It is the default endianness of CoreGraphics. See What does kCGBitmapByteOrderDefault actually mean? on the Apple Developer Forums. They point out that, while it is not documented, kCGBitmapByteOrderDefault appears to employ big-endian order.

One can confirm this by creating a kCGBitmapByteOrderDefault context and another with kCGBitmapByteOrder32Big, and comparing how they render a particular color. On my little-endian machine, the results with kCGBitmapByteOrderDefault were the same as those with kCGBitmapByteOrder32Big, not kCGBitmapByteOrder32Little. This is consistent with the claims made in that forum discussion.
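
The experiment was along these lines (a reconstruction for illustration, not my verbatim test code; a 1×1 context is enough):

```swift
import CoreGraphics

// Render pure green into a 1×1 RGBA context with the given byte order
// and return the four raw bytes of that pixel. (Green is deliberately
// asymmetric: its RGBA bytes read differently forward and backward.)
func renderGreenPixel(byteOrder: CGBitmapInfo) -> [UInt8] {
    var pixel = [UInt8](repeating: 0, count: 4)
    pixel.withUnsafeMutableBytes { buffer in
        let context = CGContext(data: buffer.baseAddress,
                                width: 1,
                                height: 1,
                                bitsPerComponent: 8,
                                bytesPerRow: 4,
                                space: CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue | byteOrder.rawValue)!
        context.setFillColor(red: 0, green: 1, blue: 0, alpha: 1)
        context.fill(CGRect(x: 0, y: 0, width: 1, height: 1))
    }
    return pixel
}

// kCGBitmapByteOrderDefault is raw value 0, i.e. the empty option set.
let defaultBytes = renderGreenPixel(byteOrder: [])
let bigEndianBytes = renderGreenPixel(byteOrder: .byteOrder32Big)
print(defaultBytes, bigEndianBytes) // on my machine, both printed [0, 255, 0, 255], i.e. R, G, B, A
```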

FWIW, PNG and JPG use “network byte order” (i.e., big-endian). Also, there is an intuitive appeal to having the bytes of, say, an RGBA image appear in the order red, green, blue, and then alpha, rather than the other way around. If CoreGraphics wanted to pick one standard to use regardless of hardware, big-endian is not an unreasonable choice. I just wish they documented it.


That having been said, when I need to access or manipulate a pixel buffer for an image, I always render the image into a buffer with an explicit endianness, as outlined in Apple’s Technical Q&A QA1509. Personally, though, I let CoreGraphics manage the data provider for me, rather than using the manual malloc/free approach suggested by that old Technical Q&A. See this permutation of QA1509’s code.
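
In broad strokes, that pattern looks something like this sketch (rgbaPixelData is an illustrative name, and the linked code is more general):

```swift
import CoreGraphics

// Draw the image into a context whose pixel format we dictate explicitly.
// Passing nil for `data` lets CoreGraphics allocate and own the buffer,
// avoiding the manual malloc/free of the original QA1509 listing.
func rgbaPixelData(of image: CGImage) -> [UInt8]? {
    guard let context = CGContext(data: nil,       // CoreGraphics manages the buffer
                                  width: image.width,
                                  height: image.height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: 0,  // 0 = let CoreGraphics choose (rows may be padded)
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
                                      | CGBitmapInfo.byteOrder32Big.rawValue)
    else { return nil }

    context.draw(image, in: CGRect(x: 0, y: 0, width: image.width, height: image.height))

    // With an explicit big-endian, alpha-last format, each pixel's bytes
    // are R, G, B, A regardless of the host's endianness.
    guard let data = context.data else { return nil }
    return Array(UnsafeRawBufferPointer(start: data, count: context.bytesPerRow * image.height))
}
```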

Rob