I am building a real-time video processor for iOS in React Native. I would like to use the RNCamera component just to show a live video preview, while at the same time processing frames with AVFoundation in a native module.
In a Swift native module, I am capturing video by starting a capture session (in the module's constructor) and implementing the captureOutput delegate method:
    self.captureSession.startRunning()

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)
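For context, here is a trimmed-down sketch of the module (the class name FrameProcessor is a placeholder; my actual code also has authorization checks, canAddInput/canAddOutput guards, and the React Native bridge boilerplate):

    import AVFoundation

    @objc(FrameProcessor)
    class FrameProcessor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
      private let captureSession = AVCaptureSession()
      private let videoOutput = AVCaptureVideoDataOutput()
      private let sessionQueue = DispatchQueue(label: "frame.processor.session")

      override init() {
        super.init()
        // Simplified: the real module checks camera authorization and
        // verifies canAddInput/canAddOutput before adding.
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device) else { return }
        captureSession.addInput(input)
        videoOutput.setSampleBufferDelegate(self, queue: sessionQueue)
        captureSession.addOutput(videoOutput)
        captureSession.startRunning()
      }

      // Called once per frame while the session is running.
      func captureOutput(_ output: AVCaptureOutput,
                         didOutput sampleBuffer: CMSampleBuffer,
                         from connection: AVCaptureConnection) {
        // Per-frame processing happens here.
      }
    }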
In my React code, I am just rendering an RNCamera component:
    <RNCamera>...</RNCamera>
Right away, I'm noticing that as soon as the RNCamera kicks in, the captureOutput callback is no longer called.
Question
I would like to know whether it is possible to run my own capture session alongside the RNCamera preview. Am I missing something simple? If so, how would I do that? If not, what are the limiting factors?