
I am building a real-time video processor for iOS with React Native. I would like to use the RNCamera component just to show a live video view, while at the same time processing frames with AVFoundation in a native module.

In a Swift native module, I am capturing video by starting a capture session (in the constructor) and implementing the captureOutput delegate method:

self.captureSession.startRunning()

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)
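
For context, here is roughly how the module is wired up (a minimal sketch; the class name, queue label, and error handling are illustrative, not my exact code):

import AVFoundation

// Sketch of the native module's capture setup: one session, one
// video data output, frames delivered to captureOutput.
class FrameCapturer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let captureSession = AVCaptureSession()
    private let sampleQueue = DispatchQueue(label: "frame.capture.queue")

    override init() {
        super.init()
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              captureSession.canAddInput(input) else { return }
        captureSession.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: sampleQueue)
        if captureSession.canAddOutput(output) {
            captureSession.addOutput(output)
        }

        captureSession.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Per-frame processing happens here; this stops firing
        // once RNCamera mounts.
    }
}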

In my React code, I am just creating a RNCamera component:

<RNCamera>...</RNCamera>

I'm noticing that as soon as the RNCamera kicks in, the captureOutput callback is no longer called.

Question

I would like to know if it is somehow possible to achieve what I'm trying to do. Am I missing something simple? If so, how would I do that? If not, what are the limiting factors?

wheresmycookie

1 Answer


So basically, if you look at the RNCamera implementation here: https://github.com/react-native-community/react-native-camera/blob/dea33716aa00201136e16f8b3d2ecaf4bdc622f3/ios/RN/RNCamera.m#L57, you'll see that it initializes its own AVCaptureSession object and uses the same camera device as your Swift module. I think that causes your AVCaptureSession to be interrupted, which is why you no longer receive events via the captureOutput method.
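
If you want to verify this from the Swift side, a small sketch like the following (using the standard AVFoundation interruption notification; the handler body is illustrative) should log whether your session is being interrupted when RNCamera starts up:

import AVFoundation

// Observe interruption notifications on your own session to confirm
// it is being paused when RNCamera spins up its session.
NotificationCenter.default.addObserver(
    forName: .AVCaptureSessionWasInterrupted,
    object: captureSession,
    queue: .main
) { notification in
    if let reasonValue = notification.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
       let reason = AVCaptureSession.InterruptionReason(rawValue: reasonValue) {
        print("Capture session interrupted, reason: \(reason)")
    }
}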

However, you can still have multiple outputs/previews from a single session, as described in the source below (it requires you to fork the RNCamera lib and add your own custom implementation):

Multiple video output: AVCaptureSession with multiple previews (part 2, "Manually Rendering SampleBuffer", might be interesting for you)
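
The gist of that approach: in your fork, keep a single AVCaptureSession and attach both a preview layer (for the on-screen view) and a video data output (for your frame processing) to it, so the two never compete for the camera. A rough Swift sketch of the idea (function and queue names are illustrative):

import AVFoundation
import UIKit

// One session feeding two consumers: a preview layer for display and
// a data output that delivers CMSampleBuffers for processing.
func configureSharedSession(on session: AVCaptureSession,
                            previewIn view: UIView,
                            delegate: AVCaptureVideoDataOutputSampleBufferDelegate) {
    // Preview output: drives the live camera view.
    let previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.frame = view.bounds
    previewLayer.videoGravity = .resizeAspectFill
    view.layer.addSublayer(previewLayer)

    // Data output: delivers frames to the delegate's captureOutput.
    let dataOutput = AVCaptureVideoDataOutput()
    dataOutput.alwaysDiscardsLateVideoFrames = true
    dataOutput.setSampleBufferDelegate(delegate, queue: DispatchQueue(label: "processing.queue"))
    if session.canAddOutput(dataOutput) {
        session.addOutput(dataOutput)
    }
}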

Dang Hai Luong