
I found the gist below, which is simple and efficient: it uses func capturer(_ capturer: RTCVideoCapturer, didCapture frame: RTCVideoFrame) of RTCVideoCapturerDelegate. You receive an RTCVideoFrame and then convert it to a CVPixelBuffer to modify it.

https://gist.github.com/lyokato/d041f16b94c84753b5e877211874c6fc
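A minimal Swift sketch of that interception pattern, under stated assumptions: the class name FilterCapturerDelegate is hypothetical, and videoSource is assumed to be the RTCVideoSource your local track was created from (RTCVideoSource itself conforms to RTCVideoCapturerDelegate, so frames can be forwarded to it):

```swift
import WebRTC

// Sits between RTCCameraVideoCapturer and the RTCVideoSource so each frame
// can be modified before WebRTC encodes and sends it.
final class FilterCapturerDelegate: NSObject, RTCVideoCapturerDelegate {
    private let videoSource: RTCVideoSource

    init(videoSource: RTCVideoSource) {
        self.videoSource = videoSource
    }

    func capturer(_ capturer: RTCVideoCapturer, didCapture frame: RTCVideoFrame) {
        // Modify the frame here (e.g., apply a CIFilter to its pixel buffer),
        // then forward the (possibly replaced) frame on to the source.
        videoSource.capturer(capturer, didCapture: frame)
    }
}
```

You would then construct RTCCameraVideoCapturer with this delegate instead of passing the video source directly.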

However, I found that Chromium says the nativeHandle used to get the pixel buffer is no longer available (link below). I tried frame.buffer.pixelBuffer..., but looking at framework > Headers > RTCVideoFrameBuffer.h, I found that CVPixelBuffer is gone from there as well!

https://codereview.webrtc.org/2990253002

Is there any good way to convert RTCVideoFrame to CVPixelBuffer? Or is there a better way to modify the captured video from RTCCameraVideoCapturer?

The link below suggests modifying the SDK directly, but hopefully we can achieve this in Xcode.

How to modify (add filters to) the camera stream that WebRTC is sending to other peers/server

1 Answer


Can you specify what your expectation is? You can get the pixel buffer from an RTCVideoFrame easily, but I feel there may be a better solution: if you want to filter the video buffer that is sent to WebRTC, you should work with RTCVideoSource.

You can get the buffer as shown:

    RTCCVPixelBuffer *buffer = (RTCCVPixelBuffer *)frame.buffer;
    CVPixelBufferRef imageBuffer = buffer.pixelBuffer;

(with the latest SDK, and for the local camera video buffer only)
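Since the question uses Swift, the same cast in Swift looks like the following sketch. The conditional cast matters because it can fail: remote frames are typically backed by RTCI420Buffer rather than RTCCVPixelBuffer, which matches the "local video camera buffer only" caveat above (the helper name is an assumption):

```swift
import WebRTC

// Safely extract the CVPixelBuffer from a captured frame.
// Returns nil when the frame is not CVPixelBuffer-backed
// (e.g., an I420-backed remote frame).
func pixelBuffer(from frame: RTCVideoFrame) -> CVPixelBuffer? {
    guard let cvBuffer = frame.buffer as? RTCCVPixelBuffer else {
        return nil
    }
    return cvBuffer.pixelBuffer
}
```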

But in the sample I can see that the filter will not work for the remote stream.

I have attached a screenshot showing how you can check the preview as well.

Sumit Meena
  • Hi Sumit, thanks for your comment. As I described in my question, the nativeHandle function was removed 1.5 years ago. Please see the Chromium link. I built the SDK with the Chromium depot_tools, not with a pod, to adopt bitcode. – Giraff Wombat Feb 11 '19 at 00:26
  • I have edited my answer, please check it; I believe it will give you the image buffer you are looking for. – Sumit Meena Feb 11 '19 at 09:13
  • Hi Sumit. [RTCVideoframe](https://webrtc.googlesource.com/src/+/master/sdk/objc/base/RTCVideoFrame.h), [RTCCVPixelBuffer](https://webrtc.googlesource.com/src/+/master/sdk/objc/base/RTCVideoFrameBuffer.h), [RTCI420Buffer](https://webrtc.googlesource.com/src/+/master/sdk/objc/base/RTCI420Buffer.h) These are links to the most up-to-date (master) SDK. They show no property or function of the frame or buffer that gives access to CVPixelBuffer. I know it WAS available, but not anymore. So I ended up building a new capturer implementing RTCVideoCapturer. Thanks very much for your time and comments, anyway! – Giraff Wombat Feb 12 '19 at 08:03
  • Hi @GiraffWombat, it will work; you have to try it first. I just checked, and I am able to get to the CVPixelBuffer. – Sumit Meena Feb 13 '19 at 08:59
  • In my whole day of tackling this, your answer popped up several times. It took me a while to follow the code. In the end, I'm very happy to see it working for me. Thanks a lot. – Saran Nov 19 '19 at 15:34
  • Is there any way to capture audio from an ongoing stream? – Jayesh Mardiya Jan 13 '20 at 07:44
  • As far as I know, there is no direct way to get the remote audio stream; however, you can get it by customizing the SDK source code yourself. – Sumit Meena Jan 13 '20 at 08:03
  • I wish there was a way to upvote this answer more than I already did. Sharing this kind of deep information from a really experienced developer (I mean after 7 years of Xcode I only now see that (i) screen for the first time!) is absolutely saving me hours and hours and thousands of dollars. – Lucas van Dongen Jan 15 '20 at 10:54
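The custom-capturer approach the asker mentions in the comments (building a new capturer on top of RTCVideoCapturer) can be sketched roughly as below. All names besides the SDK types are assumptions; the pattern of calling delegate?.capturer(_:didCapture:) is the same one RTCCameraVideoCapturer uses internally:

```swift
import WebRTC
import CoreVideo

// A hypothetical capturer that pushes already-filtered pixel buffers into WebRTC.
// You would feed it frames from your own AVCaptureSession output after
// applying whatever filter you want.
final class FilteredVideoCapturer: RTCVideoCapturer {
    // Call this with each filtered CVPixelBuffer and its capture timestamp.
    func push(_ pixelBuffer: CVPixelBuffer, timeStampNs: Int64) {
        let rtcBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
        let frame = RTCVideoFrame(buffer: rtcBuffer,
                                  rotation: ._0,
                                  timeStampNs: timeStampNs)
        // The delegate is typically the RTCVideoSource the local track wraps.
        delegate?.capturer(self, didCapture: frame)
    }
}
```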