
Are there any major differences between the Metal Shading Language on iOS and macOS? I'm trying to port my Metal-based CIFilter kernels from iOS, and the results look completely different.

David

1 Answer


No, there shouldn't be any difference between the platforms at the language level.

One difference I can think of is that macOS supports images with 32 bits per channel ("full" float), whereas on iOS you can "only" use 16-bit half-float images.
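
If you ever need matching precision on both platforms, a rough (untested) sketch would be to explicitly limit the rendering context's working format to half float; the placeholder image below is only there to keep the snippet self-contained:

import CoreImage
import Foundation

// Ask the context to do its intermediate processing in 16-bit half float,
// which is the maximum precision available on iOS anyway.
// (The format value is passed wrapped in an NSNumber here.)
let halfFloat = NSNumber(value: CIFormat.RGBAh.rawValue)
let context = CIContext(options: [.workingFormat: halfFloat])

// Placeholder input, just to keep the snippet self-contained.
let image = CIImage(color: .red).cropped(to: CGRect(x: 0, y: 0, width: 100, height: 100))

// Anything rendered through this context is now processed in half float.
let rendered = context.createCGImage(image, from: image.extent)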

Another difference that just came to mind is the default coordinate space of input samplers. On iOS, the sampler space uses relative coordinates ([0...1]), whereas on macOS it uses absolute coordinates ([0...width]). You should be able to unify that behavior by explicitly setting the sampler matrix like this (on macOS):

// see documentation for `kCISamplerAffineMatrix`
// scale by 1/width and 1/height so the kernel gets relative ([0...1]) coordinates on macOS, as it does on iOS
let scaleMatrix = [1.0 / inputImage.extent.width, 0, 0, 1.0 / inputImage.extent.height, 0, 0]
let inputSampler = CISampler(image: inputImage, options: [kCISamplerAffineMatrix: scaleMatrix])
kernel.apply(extent: domainOfDefinition, roiCallback: roiCallback, arguments: [inputSampler, ...])
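
For context, this is roughly how that sampler setup could slot into a CIFilter subclass (a rough sketch; the `pixelate` function name and the `default.metallib` resource are placeholders for your own setup):

import CoreImage
import Foundation

final class PixelateFilter: CIFilter {
    @objc dynamic var inputImage: CIImage?

    // Load the kernel once from the compiled Metal library.
    // Library and function names are placeholders for this sketch.
    private static let kernel: CIKernel = {
        let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
        let data = try! Data(contentsOf: url)
        return try! CIKernel(functionName: "pixelate", fromMetalLibraryData: data)
    }()

    override var outputImage: CIImage? {
        guard let inputImage = inputImage else { return nil }

        // Scale the sampler so the kernel sees relative ([0...1]) coordinates
        // on macOS as well, matching the iOS behavior.
        let scaleMatrix = [1.0 / inputImage.extent.width, 0, 0,
                           1.0 / inputImage.extent.height, 0, 0]
        let sampler = CISampler(image: inputImage,
                                options: [kCISamplerAffineMatrix: scaleMatrix])

        // For simplicity, the ROI callback assumes the kernel only reads
        // the pixels it writes.
        return PixelateFilter.kernel.apply(extent: inputImage.extent,
                                           roiCallback: { _, rect in rect },
                                           arguments: [sampler])
    }
}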
Frank Rupprecht
  • Hmmm... And is there any way to convert a full float CIImage to a half float one? – David Dec 06 '19 at 15:41
  • What would you need that for? – Frank Rupprecht Dec 06 '19 at 15:48
  • I'm trying to figure out a way to make existing Metal kernels produce the same result as on iOS without rewriting them. I thought this could help. – David Dec 06 '19 at 16:02
  • Here is an example of a kernel for a filter that does pixelation: https://gist.github.com/Davvie/b481f7ab8bde7b15a014e05217b1fb7d – David Dec 06 '19 at 16:03
  • The image gets pixelated with `paramIntensity = 20` on iOS, for instance, but it almost doesn't on the Mac – David Dec 06 '19 at 16:07
  • You should be able to use that on iOS without a problem. It's just the _maximum_ precision of pixel values that might differ on the platforms. But for most (color) operations 8 or 16 bits are more than enough. – Frank Rupprecht Dec 06 '19 at 16:07
  • I see, I guess the issue is somewhere else then. Thanks anyway. – David Dec 06 '19 at 16:21
  • Ah, I think I read something about `src.coord()` being different by default on the platforms: in iOS you get relative coordinates (0...1), whereas in macOS those are absolute coordinates (0...width). You should be able to adjust for that by changing the scaling of the `CISampler` you pass to the kernel using the `kCISamplerAffineMatrix` parameter. – Frank Rupprecht Dec 07 '19 at 10:34
  • It looks like that's exactly why everything falls apart. Applying a transform doesn't seem to do the trick though (the output image becomes 1px) – David Dec 08 '19 at 22:09
  • This is odd. The size of the output should only be determined by the `extent` parameter in the kernel call. Let me extend my answer... – Frank Rupprecht Dec 09 '19 at 07:33