
I'm trying to live-stream the contents of an iPhone's screen over Wi-Fi to a local server using a ReplayKit broadcast extension. How can I send CMSampleBuffers encoded with VideoToolbox to a server on the LAN and play the stream live there?

import ReplayKit
import VideoToolbox
import Network

class SampleHandler: RPBroadcastSampleHandler {

    let host: NWEndpoint.Host = "192.168.0.247"
    let port: NWEndpoint.Port = 9999
    var compressionSession: VTCompressionSession?
    var connection: NWConnection?

    override init() {
        super.init()
        connection = NWConnection(host: host, port: port, using: .udp)
        connection?.start(queue: .main)
    }

    override func broadcastStarted(withSetupInfo setupInfo: [String : NSObject]?) {
        // User has requested to start the broadcast. Setup info from the UI extension is optional.
        VTCompressionSessionCreate(allocator: nil,
                                   width: 1000,
                                   height: 2000,
                                   codecType: kCMVideoCodecType_HEVC,
                                   encoderSpecification: nil,
                                   imageBufferAttributes: nil,
                                   compressedDataAllocator: nil,
                                   outputCallback: nil,
                                   refcon: nil,
                                   compressionSessionOut: &compressionSession)
    }

    override func broadcastPaused() {
        // User has requested to pause the broadcast. Samples will stop being delivered.
    }

    override func broadcastResumed() {
        // User has requested to resume the broadcast. Samples delivery will resume.
    }

    override func broadcastFinished() {
        // User has requested to finish the broadcast.
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {

        switch sampleBufferType {
        case .video:
            guard let session = compressionSession,
                  let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

            VTCompressionSessionEncodeFrame(session,
                                            imageBuffer: imageBuffer,
                                            presentationTimeStamp: pts,
                                            duration: .invalid,
                                            frameProperties: nil,
                                            infoFlagsOut: nil) { status, flags, buffer in
                // What can I use to send the encoded data here?
            }
        case .audioApp:
            // Handle audio sample buffer for app audio
            break
        case .audioMic:
            // Handle audio sample buffer for mic audio
            break
        @unknown default:
            // Handle other sample buffer types
            fatalError("Unknown type of sample buffer")
        }
    }
}
  • Post your code for what you are currently trying so people can comment on it. Otherwise the question is way too broad. – creeperspeak Dec 10 '19 at 19:39
  • I added the code. It's basically the Xcode template with encoding of the buffer via VideoToolbox added. What can I use to upload the buffers to a PC on the local network so I can play the stream with OpenCV/VLC or any other suitable technology? I'm really stuck researching which compatible frameworks/libs I can use to send the buffers live and play them on the server side. – Adam Dec 11 '19 at 00:36
  • Did you solve this? I'm banging my head trying to solve it. – Roi Mulia May 22 '20 at 20:54
  • @RoiMulia Yes, finally. Sorry for the late response. This is where I found the answer: https://stackoverflow.com/questions/28396622/extracting-h264-from-cmblockbuffer – Adam Sep 09 '20 at 15:10

1 Answer


Found instructions for how to do it here: Extracting h264 from CMBlockBuffer (https://stackoverflow.com/questions/28396622/extracting-h264-from-cmblockbuffer)
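The key point from that answer is that VideoToolbox hands you length-prefixed (AVCC/HVCC) NAL units in a CMBlockBuffer, while a raw-stream player expects Annex B format. So each 4-byte length prefix has to be rewritten as a 0x00000001 start code, and the parameter sets (VPS/SPS/PPS for HEVC) have to be prepended on keyframes so a decoder can join mid-stream. A minimal sketch, adapted for HEVC since the question encodes with kCMVideoCodecType_HEVC (the linked answer targets H.264); the send(_:over:) helper name is mine, and error handling is mostly omitted:

import Foundation
import CoreMedia
import Network

let startCode = Data([0x00, 0x00, 0x00, 0x01])

func send(_ sampleBuffer: CMSampleBuffer, over connection: NWConnection) {
    var packet = Data()

    // A frame is a keyframe unless it is explicitly marked "not sync".
    let attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: false) as? [[CFString: Any]]
    let isKeyframe = !(attachments?.first?[kCMSampleAttachmentKey_NotSync] as? Bool ?? false)

    // On keyframes, prepend the VPS/SPS/PPS parameter sets, each with a start code.
    if isKeyframe, let format = CMSampleBufferGetFormatDescription(sampleBuffer) {
        var count = 0
        if CMVideoFormatDescriptionGetHEVCParameterSetAtIndex(format, parameterSetIndex: 0, parameterSetPointerOut: nil, parameterSetSizeOut: nil, parameterSetCountOut: &count, nalUnitHeaderLengthOut: nil) == noErr {
            for index in 0..<count {
                var pointer: UnsafePointer<UInt8>?
                var size = 0
                CMVideoFormatDescriptionGetHEVCParameterSetAtIndex(format, parameterSetIndex: index, parameterSetPointerOut: &pointer, parameterSetSizeOut: &size, parameterSetCountOut: nil, nalUnitHeaderLengthOut: nil)
                if let pointer = pointer {
                    packet.append(startCode)
                    packet.append(pointer, count: size)
                }
            }
        }
    }

    // Walk the length-prefixed NAL units and rewrite each 4-byte
    // big-endian length as an Annex B start code.
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return }
    var totalLength = 0
    var dataPointer: UnsafeMutablePointer<CChar>?
    guard CMBlockBufferGetDataPointer(blockBuffer, atOffset: 0, lengthAtOffsetOut: nil, totalLengthOut: &totalLength, dataPointerOut: &dataPointer) == kCMBlockBufferNoErr,
          let base = dataPointer else { return }

    var offset = 0
    while offset + 4 <= totalLength {
        var nalLength: UInt32 = 0
        memcpy(&nalLength, base + offset, 4)
        nalLength = CFSwapInt32BigToHost(nalLength)
        packet.append(startCode)
        packet.append(Data(bytes: base + offset + 4, count: Int(nalLength)))
        offset += 4 + Int(nalLength)
    }

    connection.send(content: packet, completion: .contentProcessed({ _ in }))
}

Call it from the encoder's output handler, e.g. if let buffer = buffer { self.send(buffer, over: self.connection!) }. Note that a single UDP datagram tops out around 64 KB and anything over the MTU gets IP-fragmented, so for anything beyond a quick test you would fragment keyframes across packets or switch the NWConnection to TCP.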

You can receive and play a stream like that with ffmpeg.
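For example, something along these lines works as a starting point, assuming the extension sends raw Annex B HEVC to UDP port 9999 as in the code above (use -f h264 instead if you encode with kCMVideoCodecType_H264):

ffplay -fflags nobuffer -f hevc -i udp://0.0.0.0:9999

The -fflags nobuffer option reduces player-side buffering latency, which matters for a live screen mirror.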
