
I am trying to use the device camera as a light sensor, as described in this post. Unfortunately, the captureOutput function is never called on my AVCaptureVideoDataOutputSampleBufferDelegate. It may be relevant that I am attempting this inside a SwiftUI app; I have not seen this problem posted about or resolved in the context of a SwiftUI app.

class VideoStream: NSObject, ObservableObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    
    @Published var luminosityReading : Double = 0.0
    
    var session : AVCaptureSession!
        
    override init() {
        super.init()
        authorizeCapture()
    }

    func authorizeCapture() {
        // request camera permissions and call beginCapture()
        ...
    }

    func beginCapture() {
        print("beginCapture entered") // prints
        session = AVCaptureSession()
        session.beginConfiguration()
        let videoDevice = bestDevice() // func def omitted for readability
        print("Device: \(videoDevice)") // prints a valid device
        guard
            let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
            session.canAddInput(videoDeviceInput)
        else {
            print("Camera selection failed")
            return
        }
        
        let videoOutput = AVCaptureVideoDataOutput()
        guard
            session.canAddOutput(videoOutput)
        else {
            print("Error creating video output")
            return
        }
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "VideoQueue"))
        session.addOutput(videoOutput)
        session.sessionPreset = .medium
        session.commitConfiguration()
        session.startRunning()
     }

    // From: https://stackoverflow.com/questions/41921326/how-to-get-light-value-from-avfoundation/46842115#46842115
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

        print("captureOutput entered")  // never printed
        
        // Retrieving EXIF data of camera frame buffer
        let rawMetadata = CMCopyDictionaryOfAttachments(allocator: nil, target: sampleBuffer, attachmentMode: CMAttachmentMode(kCMAttachmentMode_ShouldPropagate))
        let metadata = CFDictionaryCreateMutableCopy(nil, 0, rawMetadata) as NSMutableDictionary
        let exifData = metadata.value(forKey: "{Exif}") as? NSMutableDictionary
        
        let FNumber : Double = exifData?["FNumber"] as! Double
        let ExposureTime : Double = exifData?["ExposureTime"] as! Double
        let ISOSpeedRatingsArray = exifData!["ISOSpeedRatings"] as? NSArray
        let ISOSpeedRatings : Double = ISOSpeedRatingsArray![0] as! Double
        let CalibrationConstant : Double = 50
        
        //Calculating the luminosity
        let luminosity : Double = (CalibrationConstant * FNumber * FNumber ) / ( ExposureTime * ISOSpeedRatings )
        luminosityReading = luminosity
    }
}
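As an aside, because setSampleBufferDelegate is given the background "VideoQueue", captureOutput runs off the main thread; once frames do arrive, the @Published assignment should hop back to the main queue. A minimal sketch of how that last line could be written:

```swift
// Sketch: @Published properties driving SwiftUI should be mutated on
// the main thread; this delegate callback runs on "VideoQueue".
DispatchQueue.main.async { [weak self] in
    self?.luminosityReading = luminosity
}
```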

Lastly, I instantiate VideoStream as a @StateObject in my ContentView and attempt to read the updated luminosityReading:

struct ContentView: View {
    @StateObject var videoStream = VideoStream()
    
    var body: some View {
        Text(String(format: "%.2f Lux", videoStream.luminosityReading))
            .padding()
    }
}

I have read and implemented the solutions described in these similar posts:

Using AVCaptureVideoDataOutputSampleBufferDelegate without a preview window

captureOutput not being called

captureOutput not being called from delegate

captureOutput not being called by AVCaptureAudioDataOutputSampleBufferDelegate

In Swift, adapted AVCaptureVideoDataOutputSampleBufferDelegate, but captureOutput never getting called

AVCaptureVideoDataOutput captureOutput not being called

Swift - captureOutput is not being executed

Why AVCaptureVideoDataOutputSampleBufferDelegate method is not called

Why captureOutput is never called?

func captureOutput is never called

captureOutput() function is never called swift4

Minimal Reproducible Example:

import SwiftUI
import AVKit

struct ContentView: View {
    @StateObject var videoStream = VideoStream()
    
    var body: some View {
        Text(String(format: "%.2f Lux", videoStream.luminosityReading))
    }
}

class VideoStream: NSObject, ObservableObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    
    @Published var luminosityReading : Double = 0.0
    
    var session : AVCaptureSession!
        
    override init() {
        super.init()
        authorizeCapture()
    }

    func authorizeCapture() {
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        case .authorized: // The user has previously granted access to the camera.
            beginCapture()
        case .notDetermined: // The user has not yet been asked for camera access.
            AVCaptureDevice.requestAccess(for: .video) { granted in
                if granted {
                    self.beginCapture()
                }
            }
            
        case .denied: // The user has previously denied access.
            return
            
        case .restricted: // The user can't grant access due to restrictions.
            return
        }
    }

    func beginCapture() {
        
        print("beginCapture entered")
        
        let testDevice = AVCaptureDevice.default(for: .video)
        print("Image Capture Device: \(testDevice)")
        guard
            let videoDeviceInput = try? AVCaptureDeviceInput(device: testDevice!),
            session.canAddInput(videoDeviceInput)
        else {
            print("Camera selection failed")
            return
        }
        
        let videoOutput = AVCaptureVideoDataOutput()
        guard
            session.canAddOutput(videoOutput)
        else {
            print("Error creating video output")
            return
        }
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "VideoQueue"))
        session.addOutput(videoOutput)
        
        session.sessionPreset = .medium
        session.commitConfiguration()
        session.startRunning()
    }
    
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

        print("captureOutput entered")  // never printed
        
        // light meter logic to update luminosityReading
    }
}
halfer
John Harrington
  • Without a [Minimal Reproducible Example](https://stackoverflow.com/help/minimal-reproducible-example) it is impossible to help you troubleshoot. – lorem ipsum Sep 05 '22 at 22:43
  • It should be included in the question, links break over time and people with the same issue wouldn't have access to the full picture – lorem ipsum Sep 05 '22 at 22:49
  • I understand, I will edit my question now – John Harrington Sep 05 '22 at 22:50
  • It is an issue of copy and paste... you are missing one very simple step/line. What is the purpose/intention behind those `guard`? That is likely why this question has been marked as duplicate before – lorem ipsum Sep 06 '22 at 00:22
  • The `guard` statements ensure that the input and output can be added to the `AVCaptureSession`, but of course you realize this, I am scratching my head trying to figure out the significance they have to my problem and what I might be missing... – John Harrington Sep 06 '22 at 00:28
  • `session.addInput(videoDeviceInput)` – John Harrington Sep 06 '22 at 00:33
  • Yup.... That is the right spot – lorem ipsum Sep 06 '22 at 00:34
  • Wow, I am not sure how long it would have taken me to spot that... a valuable lesson learned about copying and pasting from Apple Docs. If you would like to submit an answer, I will accept it. – John Harrington Sep 06 '22 at 00:35
  • 1
    I started rewriting the issue using `async await` and was actually displaying the session on my screen. It was blank so I started debugging from there. – lorem ipsum Sep 06 '22 at 00:37

1 Answer


You are missing the call that actually adds the input to the session:

if session.canAddInput(videoDeviceInput){
    session.addInput(videoDeviceInput)
}
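For context, here is a sketch of beginCapture() from the minimal example with the missing line restored. (The session creation and beginConfiguration() lines from the first listing are assumed here as well, since the minimal example never assigns session before using it.)

```swift
func beginCapture() {
    session = AVCaptureSession()
    session.beginConfiguration()

    guard let videoDevice = AVCaptureDevice.default(for: .video),
          let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
          session.canAddInput(videoDeviceInput)
    else {
        print("Camera selection failed")
        return
    }
    session.addInput(videoDeviceInput) // <- the missing line

    let videoOutput = AVCaptureVideoDataOutput()
    guard session.canAddOutput(videoOutput) else {
        print("Error creating video output")
        return
    }
    videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "VideoQueue"))
    session.addOutput(videoOutput)

    session.sessionPreset = .medium
    session.commitConfiguration()
    session.startRunning()
}
```

Checking canAddInput without ever calling addInput leaves the session with an output but no input, so the delegate has no frames to deliver and captureOutput is never called.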
lorem ipsum