
I am new to the Apple DriverKit SDK and I am not clear about how to register my device driver so that it becomes available as a camera in the OS. Do I have to register a streaming function in the Start function of the IOService? I searched all over the internet for an answer but could not find one.

I need to read data from a custom USB Camera and then make it available via a custom driver.

Can any of you guys help me?


1 Answer


Support for cameras and video capture devices is not implemented through special I/O Kit classes in macOS (and therefore not in DriverKit either), but entirely in user space, via the Core Media I/O framework. Depending on the type of device, a DriverKit component may still be necessary (e.g. for PCI/Thunderbolt devices, which are not accessible directly from user space, or for USB devices where the camera functionality is not cleanly isolated to a USB interface descriptor). Such a dext would expose an entirely custom API, which can then be used from the user-space Core Media I/O based driver.
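For illustration, here is a minimal Swift sketch of how the user-space driver might talk to such a dext through an IOKit user client. The service class name MyCameraDriver, the connection type, and the method selector are all hypothetical; they would have to match whatever custom API your dext actually exposes:

```swift
import Foundation
import IOKit

// Hypothetical IOService class name that the dext publishes in its Info.plist.
let matching = IOServiceMatching("MyCameraDriver")

// Find the first matching service instance registered by the dext.
let service = IOServiceGetMatchingService(kIOMainPortDefault, matching)
guard service != 0 else {
    fatalError("dext not found or not loaded")
}

// Open a user client connection (connection type 0 is an assumption;
// the dext defines which types it accepts).
var connection: io_connect_t = 0
guard IOServiceOpen(service, mach_task_self_, 0, &connection) == KERN_SUCCESS else {
    fatalError("could not open user client connection")
}

// Call a hypothetical external method (selector 0) exposed by the dext,
// e.g. to query how many frames are currently buffered.
var output = [UInt64](repeating: 0, count: 1)
var outputCount = UInt32(output.count)
let result = IOConnectCallScalarMethod(connection, 0, nil, 0, &output, &outputCount)
print("selector 0 returned \(result), value \(output[0])")

IOServiceClose(connection)
IOObjectRelease(service)
```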

From macOS 13 (Ventura) onwards, the Core Media I/O extensions API should be used to implement such a driver, as it runs in its own standalone process and can be used from all apps that use Core Media.
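To give a rough idea of what that looks like, below is a heavily trimmed Swift sketch of a Core Media I/O extension's entry point and provider source, loosely following the structure of Xcode's camera extension template. A real extension would also implement device and stream sources that actually deliver the video frames read from the USB device (or from the dext); the class name and manufacturer string are placeholders:

```swift
import Foundation
import CoreMediaIO

// Minimal provider source; a real extension would also create
// CMIOExtensionDevice/CMIOExtensionStream objects here and feed them frames.
class SampleProviderSource: NSObject, CMIOExtensionProviderSource {

    private(set) var provider: CMIOExtensionProvider!

    init(clientQueue: DispatchQueue?) {
        super.init()
        provider = CMIOExtensionProvider(source: self, clientQueue: clientQueue)
    }

    // Called when a client app (e.g. FaceTime) connects to the provider.
    func connect(to client: CMIOExtensionClient) throws {}

    func disconnect(from client: CMIOExtensionClient) {}

    var availableProperties: Set<CMIOExtensionProperty> {
        [.providerManufacturer]
    }

    func providerProperties(forProperties properties: Set<CMIOExtensionProperty>) throws -> CMIOExtensionProviderProperties {
        let providerProperties = CMIOExtensionProviderProperties(dictionary: [:])
        if properties.contains(.providerManufacturer) {
            providerProperties.manufacturer = "Example Manufacturer" // placeholder
        }
        return providerProperties
    }

    func setProviderProperties(_ providerProperties: CMIOExtensionProviderProperties) throws {
        // Handle settable properties here if the provider exposes any.
    }
}

// main.swift of the extension target: start the service and keep running.
let providerSource = SampleProviderSource(clientQueue: nil)
CMIOExtensionProvider.startService(provider: providerSource.provider)
CFRunLoopRun()
```

The extension itself is packaged inside a host app and activated via an OSSystemExtensionRequest; once activated, the virtual camera shows up in any app that captures video through Core Media.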

Prior to that (macOS 12 and earlier), only the so-called Device Abstraction Layer (DAL) plug-in API existed. This involved writing a dynamic library in a bundle, which would be loaded on demand by any application wishing to use the device in question. Unfortunately, this raised code-signing issues: applications built with the hardened runtime and library validation can only load libraries signed by Apple or by the same team that signed the application itself, so such apps cannot load third-party camera drivers. This includes all of Apple's own apps, such as FaceTime.
