
I'm trying to create a camera remote control app with an iPhone as the camera and an iPad as the remote control. The idea is to capture the iPhone's camera preview with AVCaptureVideoDataOutput and stream it over an OutputStream using the MultipeerConnectivity framework. The iPad then receives the data and displays it in a UIView by setting the layer's contents. This is what I've done so far:

(iPhone/Camera preview stream) The captureOutput(_:didOutput:from:) implementation from AVCaptureVideoDataOutputSampleBufferDelegate:

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    DispatchQueue.global(qos: .utility).async { [unowned self] in
        let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        
        if let imageBuffer {
            CVPixelBufferLockBaseAddress(imageBuffer, [])
            let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
            let bytesPerRow: size_t? = CVPixelBufferGetBytesPerRow(imageBuffer)
            let width: size_t? = CVPixelBufferGetWidth(imageBuffer)
            let height: size_t? = CVPixelBufferGetHeight(imageBuffer)
            
            let colorSpace = CGColorSpaceCreateDeviceRGB()
            let newContext = CGContext(data: baseAddress,
                                       width: width ?? 0,
                                       height: height ?? 0,
                                       bitsPerComponent: 8,
                                       bytesPerRow: bytesPerRow ?? 0,
                                       space: colorSpace,
                                       bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)
            
            if let newImage = newContext?.makeImage() {
                let image = UIImage(cgImage: newImage,
                                    scale: 0.2,
                                    orientation: .up)
                
                CVPixelBufferUnlockBaseAddress(imageBuffer, [])
                
                if let data = image.jpegData(compressionQuality: 0.2) {
                    let bytesWritten = data.withUnsafeBytes({
                        viewFinderStream?
                            .write($0.bindMemory(to: UInt8.self).baseAddress!, maxLength: data.count)
                    })
                }
            }
        }
    }
}

(iPad/Camera remote controller) Receiving the stream and showing it in the view. This is a method from the StreamDelegate protocol:

func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
    let inputStream = aStream as! InputStream

    switch eventCode {
    case .hasBytesAvailable:
        DispatchQueue.global(qos: .userInteractive).async { [unowned self] in
            var buffer = [UInt8](repeating: 0, count: 1024)
            let numberBytes = inputStream.read(&buffer, maxLength: 1024)
            let data = Data(referencing: NSData(bytes: &buffer, length: numberBytes))
            if let imageData = UIImage(data: data) {
                DispatchQueue.main.async {
                    previewCameraView.layer.contents = imageData.cgImage
                }
            }
        }
    case .hasSpaceAvailable:
        break
    default:
        break
    }
}

Unfortunately, although the iPad does receive the stream, it only shows a tiny bit of the video data, like this (notice the view on the right: there are a few pixels showing the camera preview data in the top-left corner of the view, and the rest is just gray):

The iPad screen

EDIT: I also get this warning in the console:

2023-02-02 20:24:44.487399+0700 MultipeerVideo-Assignment[31170:1065836] Warning! [0x15c023800] Decoding incomplete with error code -1. This is expected if the image has not been fully downloaded.

I'm also not sure whether this is normal, but the iPhone uses almost 100% of its CPU.

Debug Navigator for the iPhone

My question is: what did I do wrong that keeps the video stream from showing completely on the iPad? And is there any way to make the stream more efficient so that the iPhone's CPU doesn't work so hard? I'm still new to iOS programming, so I'm not sure how to solve this. If you need more code for clarity, please reach me in the comments.

cleanrun
  • For the high RAM consumption and CPU usage, I ended up setting the `AVCaptureSession`'s sessionPreset to `.hd1280x720`, which reduced them. The streaming problem is still happening, though. – cleanrun Jan 29 '23 at 10:14

1 Answer


I think the root of the issue is that the iPad reads the data from the stream using a 1024-byte buffer, so only the first kilobyte of each JPEG ever makes it into a UIImage. That's likely the tiny fragment you see in the preview.

Instead, you need some way to "know" the length of every frame so that you can read each one in full.

If you sent uncompressed data, you could first send the iPad the expected frame dimensions so that it could always read complete frames. However, you're sending compressed images (JPEGs), so you need to somehow tell the iPad the binary size of every "image".
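
For example, here is a minimal sketch of that idea (the send(_:over:)/readAvailableBytes(from:) helper names and the 64 KB chunk size are my own, and it assumes the receiving code lives next to previewCameraView as in your StreamDelegate method): prefix each JPEG with a 4-byte big-endian length on the iPhone, and on the iPad keep appending incoming bytes to a buffer until a whole frame has arrived.

// iPhone side: prefix every JPEG with a 4-byte big-endian length so the
// receiver knows exactly how many bytes belong to the frame.
func send(_ jpegData: Data, over stream: OutputStream) {
    var length = UInt32(jpegData.count).bigEndian
    var packet = Data(bytes: &length, count: 4)
    packet.append(jpegData)

    packet.withUnsafeBytes { (raw: UnsafeRawBufferPointer) -> Void in
        var pointer = raw.bindMemory(to: UInt8.self).baseAddress!
        var remaining = packet.count
        // OutputStream.write may accept fewer bytes than requested, so loop.
        while remaining > 0 {
            let written = stream.write(pointer, maxLength: remaining)
            if written <= 0 { return }   // error / closed stream; handle properly in real code
            pointer += written
            remaining -= written
        }
    }
}

// iPad side: a single read is not guaranteed to deliver a whole frame,
// so append whatever arrives to a buffer and peel off complete frames.
var receiveBuffer = Data()

func readAvailableBytes(from inputStream: InputStream) {
    var chunk = [UInt8](repeating: 0, count: 64 * 1024)
    while inputStream.hasBytesAvailable {
        let count = inputStream.read(&chunk, maxLength: chunk.count)
        if count <= 0 { break }
        receiveBuffer.append(chunk, count: count)
    }

    // A frame is complete once the 4-byte header plus `length` payload bytes are buffered.
    while receiveBuffer.count >= 4 {
        let length = receiveBuffer.prefix(4).reduce(0) { ($0 << 8) | Int($1) }
        guard receiveBuffer.count >= 4 + length else { break }
        let frameData = receiveBuffer.subdata(in: 4 ..< 4 + length)
        receiveBuffer.removeFirst(4 + length)
        if let image = UIImage(data: frameData) {
            DispatchQueue.main.async {
                previewCameraView.layer.contents = image.cgImage
            }
        }
    }
}

The important part is that a single .hasBytesAvailable event (or a single read call) is not guaranteed to deliver a whole frame, so the receiver has to buffer and reassemble the bytes before creating the UIImage.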

Sending full frames like this is still rather inefficient. I am not an expert in this area, but I would consider encoding the camera input into a video stream and sending that to the iPad. It should be possible to use hardware encoding, and the streaming nature of MP4 video should also help. Take this with a grain of salt, though, since I have only a vague idea of what I'm talking about.

You might want to look into:

  • VideoToolbox (VTCompressionSession) for hardware-accelerated H.264/HEVC encoding
  • AVAssetWriter / fragmented MP4 output if you would rather produce a streamable video file
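
As a very rough illustration of the hardware-encoding route (sketch only; makeEncoderSession and encode(_:with:) are hypothetical helpers, and status checks plus the packaging of the encoded NAL units are omitted), VideoToolbox's VTCompressionSession can compress the camera's pixel buffers directly, which should be far lighter on the CPU than rendering a CGImage and JPEG-compressing every frame:

import VideoToolbox

// Sketch: create one hardware H.264 encoder session up front...
func makeEncoderSession(width: Int32, height: Int32) -> VTCompressionSession? {
    var session: VTCompressionSession?
    let status = VTCompressionSessionCreate(allocator: kCFAllocatorDefault,
                                            width: width,
                                            height: height,
                                            codecType: kCMVideoCodecType_H264,
                                            encoderSpecification: nil,
                                            imageBufferAttributes: nil,
                                            compressedDataAllocator: nil,
                                            outputCallback: nil,
                                            refcon: nil,
                                            compressionSessionOut: &session)
    guard status == noErr, let session else { return nil }
    // Real-time mode suits a live preview stream.
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime, value: kCFBooleanTrue)
    return session
}

// ...then feed it the sample buffers from captureOutput(_:didOutput:from:)
// instead of drawing a CGImage and JPEG-compressing every frame on the CPU.
func encode(_ sampleBuffer: CMSampleBuffer, with session: VTCompressionSession) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    VTCompressionSessionEncodeFrame(session,
                                    imageBuffer: pixelBuffer,
                                    presentationTimeStamp: pts,
                                    duration: .invalid,
                                    frameProperties: nil,
                                    infoFlagsOut: nil) { _, _, encodedBuffer in
        // `encodedBuffer` contains the compressed frame; extract its data and
        // send it over the stream with the same length-prefix framing as above.
    }
}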

storoj