
Does someone know how to change WebRTC (https://cocoapods.org/pods/libjingle_peerconnection) video source?

I am working on a screen-sharing app. At the moment, I retrieve the rendered frames in real time as CVPixelBuffer objects. Does someone know how I could add my frames as the video source? Is it possible to set another video source instead of the camera device source? If yes, what format does the video have to be in, and how do I do it?

Thanks.


2 Answers

// Create a peer connection factory and obtain a video source from it.
let connectionFactory: RTCPeerConnectionFactory = RTCPeerConnectionFactory()
let videoSource: RTCVideoSource = connectionFactory.videoSource()
// Feed your own frames to the source through a capturer.
videoSource.capturer(videoCapturer, didCapture: videoFrame!)
    This answer would be a lot more helpful if you explained what it does and why it solves the problem. – divibisan Jun 04 '18 at 15:18
  • He alludes to using two instances, `RTCVideoSource` and `RTCVideoCapturer`. First you create `videoSource` from the factory, and then create the capturer with `RTCVideoCapturer(delegate: videoSource)`. Then you create `videoTrack` with `videoSource` and a `trackId` and add it to the `peer.localStream`. And then, if you want to push your own frame to WebRTC, you create the frame and call `videoSource.capturer(videoCapturer, didCapture: videoFrame!)` – Bws Sluk Jul 02 '18 at 13:29
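The steps described in the comment above can be sketched as follows. This is a minimal sketch, assuming the WebRTC iOS framework is linked; `peerConnection`, `videoFrame`, and the `"video0"`/`"stream0"` identifiers are placeholders, not names from the original answer.

```swift
import WebRTC

let factory = RTCPeerConnectionFactory()

// 1. Create the video source from the factory.
let videoSource = factory.videoSource()

// 2. Create a capturer whose delegate is the source.
let videoCapturer = RTCVideoCapturer(delegate: videoSource)

// 3. Create a track backed by the source and add it to the connection.
let videoTrack = factory.videoTrack(with: videoSource, trackId: "video0")
// peerConnection.add(videoTrack, streamIds: ["stream0"])

// 4. Whenever you have a frame of your own, hand it to the source:
// videoSource.capturer(videoCapturer, didCapture: videoFrame)
```

The key design point is that `RTCVideoSource` itself conforms to the capturer-delegate protocol, so pushing a frame is just calling the delegate method on the source directly.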

Mouni's answer is wrong. It leads to nothing, at least not at the time of this writing: there is simply nothing happening.

In fact, you would need to satisfy this delegate

- (void)capturer:(RTCVideoCapturer *)capturer didCaptureVideoFrame:(RTCVideoFrame *)frame;

(Note the difference from the Swift version: didCapture vs. didCaptureVideoFrame)

Since this delegate is, for unclear reasons, not available at the Swift level (the compiler says you have to use didCapture, since it was renamed from didCaptureVideoFrame with Swift 3), you have to put the code into an ObjC class. I copied this and this (which is part of this sample project) into my project, made my videoCapturer an instance of ARDExternalSampleCapturer

self.videoCapturer = ARDExternalSampleCapturer(delegate: videoSource)

and within the capture callback I'm calling it

let capturer = self.videoCapturer as? ARDExternalSampleCapturer
capturer?.didCapture(sampleBuffer)
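For reference, what a capturer like ARDExternalSampleCapturer does internally is roughly the following: wrap the sample buffer's CVPixelBuffer in an RTCVideoFrame and forward it to the delegate (the video source). This is a hedged sketch, not the sample project's actual code; the class name `SampleBufferCapturer` is my own.

```swift
import WebRTC
import CoreMedia

final class SampleBufferCapturer: RTCVideoCapturer {
    func didCapture(_ sampleBuffer: CMSampleBuffer) {
        // Ignore invalid or pixel-buffer-less samples.
        guard CMSampleBufferIsValid(sampleBuffer),
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Wrap the CVPixelBuffer in WebRTC's buffer type.
        let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)

        // WebRTC expects the timestamp in nanoseconds.
        let seconds = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
        let timeStampNs = Int64(seconds * Double(NSEC_PER_SEC))

        let frame = RTCVideoFrame(buffer: rtcPixelBuffer,
                                  rotation: ._0,
                                  timeStampNs: timeStampNs)

        // The delegate is the RTCVideoSource passed in at init time.
        delegate?.capturer(self, didCapture: frame)
    }
}
```

Because the delegate call happens from ObjC-compatible code here, the rename issue described above does not arise: the frame reaches the source either way.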
  • Unfortunately, none of the links is working. – Ankur Lahiry Jan 19 '22 at 09:28
  • Sorry about this, but those were all external projects, not mine – decades Jan 20 '22 at 10:05
  • yeah, I am stuck on this capture issue. I have a video frame ready to send, and applied the method `didCapture ` but didn't have any luck – Ankur Lahiry Jan 21 '22 at 11:02
  • Hi All, can anyone let me know how did you accomplish getting the frame itself? @AnkurLahiry – Dipak Jul 14 '22 at 20:53
  • @Dipak Please check this https://stackoverflow.com/a/62742537/8475638 – Ankur Lahiry Jul 15 '22 at 09:49
  • Hello, I added a didCaptureVideoFrame method to VideoCaptureController.h but it's not capturing frames. Where am I going wrong? ```- (void)capturer:(RTCVideoCapturer*)capturer didCaptureVideoFrame:(RTCVideoFrame*)frame { NSLog(@"[RTCVideoCapturer] didCaptureVideoFrame"); // create new frame [self.videoSource capturer:capturer didCaptureVideoFrame:fixedFrame]; }``` – user3838971 Nov 05 '22 at 17:11