
I'm trying to create an IOSurface in Swift and then create a CIImage from it. The IOSurfaceRef looks alright, but CIImage returns nil:

    let textureImageWidth = 1024
    let textureImageHeight = 1024

    let macPixelFormatString = "ARGB"
    var macPixelFormat: UInt32 = 0
    for c in macPixelFormatString.utf8 {
        macPixelFormat *= 256
        macPixelFormat += UInt32(c)
    }

    let ioSurface = IOSurfaceCreate([kIOSurfaceWidth: textureImageWidth,
                     kIOSurfaceHeight: textureImageHeight,
                     kIOSurfaceBytesPerElement: 4,
                     kIOSurfaceBytesPerRow: textureImageWidth * 4,
                     kIOSurfaceAllocSize: textureImageWidth * textureImageHeight * 4,
                     kIOSurfacePixelFormat: macPixelFormat] as CFDictionary)!

    IOSurfaceLock(ioSurface, IOSurfaceLockOptions.readOnly, nil)
    let test = CIImage(ioSurface: ioSurface)
    IOSurfaceUnlock(ioSurface, IOSurfaceLockOptions.readOnly, nil)

Any suggestions please? I'm guessing that the CIImage initialiser needs a bit more metadata to do its job but I've no idea what.


1 Answer


I had the pixel format byte order the wrong way around. It should be:

    for c in macPixelFormatString.utf8.reversed()

Leaving the question up as I don't think there are any other examples of using IOSurfaceCreate in Swift on Stack Overflow.
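
For completeness, here is a minimal sketch of the snippet with the fix applied (same 1024×1024 surface as in the question; only the byte order of the FourCC loop changes):

    import CoreImage
    import IOSurface

    let textureImageWidth = 1024
    let textureImageHeight = 1024

    // Reversing the UTF-8 bytes of "ARGB" builds the FourCC 'BGRA'
    // (the same value as kCVPixelFormatType_32BGRA).
    let macPixelFormatString = "ARGB"
    var macPixelFormat: UInt32 = 0
    for c in macPixelFormatString.utf8.reversed() {
        macPixelFormat *= 256
        macPixelFormat += UInt32(c)
    }

    let ioSurface = IOSurfaceCreate([kIOSurfaceWidth: textureImageWidth,
                                     kIOSurfaceHeight: textureImageHeight,
                                     kIOSurfaceBytesPerElement: 4,
                                     kIOSurfaceBytesPerRow: textureImageWidth * 4,
                                     kIOSurfaceAllocSize: textureImageWidth * textureImageHeight * 4,
                                     kIOSurfacePixelFormat: macPixelFormat] as CFDictionary)!

    // With the 'BGRA' pixel format the initialiser no longer returns nil.
    let image = CIImage(ioSurface: ioSurface)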

Thanks for returning to answer your own question! Since you are using a constant ARGB (or BGRA), note that you can just use the constant `kCVPixelFormatType_32BGRA` to avoid constructing the pixel format UInt32 manually. As you discovered by reversing the string, `kCVPixelFormatType_32BGRA` is a lot more widely supported than `kCVPixelFormatType_32ARGB`. – Liam Don Apr 15 '20 at 19:58
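
A quick sketch of what that comment suggests, using the CoreVideo constant rather than building the FourCC by hand (assumes the same 1024×1024 surface as in the question):

    import CoreImage
    import CoreVideo
    import IOSurface

    // kCVPixelFormatType_32BGRA is already the 'BGRA' FourCC as a UInt32,
    // so there is no string manipulation to get wrong.
    let ioSurface = IOSurfaceCreate([kIOSurfaceWidth: 1024,
                                     kIOSurfaceHeight: 1024,
                                     kIOSurfaceBytesPerElement: 4,
                                     kIOSurfaceBytesPerRow: 1024 * 4,
                                     kIOSurfaceAllocSize: 1024 * 1024 * 4,
                                     kIOSurfacePixelFormat: kCVPixelFormatType_32BGRA] as CFDictionary)!

    let image = CIImage(ioSurface: ioSurface)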