I am working on an iOS app in Xamarin (C#) that tries to detect a phone flashlight turning on and off from about 10 feet away. I have looked around Xamarin's developer site, but nothing obvious explains how to check whether the live camera feed sees the light turning on and off.
I am very new to image processing, and my thought process as of now is the following:
Start recording a video, then go back through the frames and check each one to see whether a certain area of the frame is noticeably brighter (higher luminance) than the rest of the frame. If one concentrated area is much brighter, the flashlight is on; otherwise it is off.
I'm not sure whether this is possible without actually recording a video. Ideally I could just check the brightness from the live video feed.
Any guidance on which classes/APIs to use would be of great assistance. Thanks!
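For the on/off decision itself, my rough (completely untested) idea is a per-frame brightness value fed through two thresholds, so noise near a single cutoff doesn't flip the detected state back and forth. The class name and threshold values below are just placeholders:

```csharp
// Sketch: decide on/off from a stream of per-frame brightness values.
// Two thresholds (hysteresis) keep sensor noise from flickering the state.
public class FlashlightDetector
{
    const double OnThreshold = 180;   // average brightness 0..255; needs tuning
    const double OffThreshold = 140;

    public bool IsOn { get; private set; }

    // Call once per frame; returns true when the detected state changed.
    public bool Update(double frameBrightness)
    {
        bool wasOn = IsOn;
        if (!IsOn && frameBrightness > OnThreshold)
            IsOn = true;
        else if (IsOn && frameBrightness < OffThreshold)
            IsOn = false;
        return IsOn != wasOn;
    }
}
```

The actual threshold values would have to be tuned experimentally for the 10-foot distance and ambient lighting.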
UPDATE: Here is our code so far. We have the live camera stream set up, but we don't know which class to use, or what direction to go in, to start accessing the individual frames of the video and reading their brightness values.
using Foundation;
using System;
using UIKit;
using AVFoundation;
using CoreVideo;
using CoreGraphics;
using System.Threading.Tasks;
using CoreMedia;
using ImageIO;
namespace VisibleLightData
{
    public partial class ReceiveViewController : UIViewController
    {
        AVCaptureSession captureSession;
        AVCaptureDeviceInput captureDeviceInput;
        AVCaptureVideoPreviewLayer videoPreviewLayer;

        public ReceiveViewController(IntPtr handle) : base(handle)
        {
        }

        public override void ViewDidLoad()
        {
            base.ViewDidLoad();
            SetupLiveCameraStream();
        }

        partial void ShowMain(UIButton sender)
        {
            // Get a reference to the current storyboard
            UIStoryboard storyboard = this.Storyboard;

            // Create an instance of ViewController
            ViewController viewController = (ViewController)storyboard.InstantiateViewController("ViewController");

            // Display ViewController
            PresentViewController(viewController, true, null);
        }

        public void SetupLiveCameraStream()
        {
            captureSession = new AVCaptureSession();

            videoPreviewLayer = new AVCaptureVideoPreviewLayer(captureSession) { Frame = this.View.Frame };
            recLiveCameraStream.Layer.AddSublayer(videoPreviewLayer);

            var captureDevice = AVCaptureDevice.GetDefaultDevice(AVMediaTypes.Video);
            ConfigureCameraForDevice(captureDevice);
            captureDeviceInput = AVCaptureDeviceInput.FromDevice(captureDevice);
            captureSession.AddInput(captureDeviceInput);

            captureSession.StartRunning();
        }

        void ConfigureCameraForDevice(AVCaptureDevice device)
        {
            NSError error;

            // Apply each supported setting; they are not mutually exclusive
            if (device.IsFocusModeSupported(AVCaptureFocusMode.ContinuousAutoFocus))
            {
                device.LockForConfiguration(out error);
                device.FocusMode = AVCaptureFocusMode.ContinuousAutoFocus;
                device.UnlockForConfiguration();
            }
            if (device.IsExposureModeSupported(AVCaptureExposureMode.ContinuousAutoExposure))
            {
                device.LockForConfiguration(out error);
                device.ExposureMode = AVCaptureExposureMode.ContinuousAutoExposure;
                device.UnlockForConfiguration();
            }
            if (device.IsWhiteBalanceModeSupported(AVCaptureWhiteBalanceMode.ContinuousAutoWhiteBalance))
            {
                device.LockForConfiguration(out error);
                device.WhiteBalanceMode = AVCaptureWhiteBalanceMode.ContinuousAutoWhiteBalance;
                device.UnlockForConfiguration();
            }
        }
    }
}
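One direction we're considering (an untested sketch): add an `AVCaptureVideoDataOutput` to the session so each frame is delivered to a sample-buffer delegate, request 32BGRA pixels to make the math simple, and average the pixel values to get a per-frame brightness figure. The `FrameBrightnessDelegate` class and queue name are our own placeholders, binding names may differ slightly between Xamarin.iOS versions, and the pointer loop needs "Allow unsafe code" enabled in the build settings:

```csharp
using System;
using AVFoundation;
using CoreFoundation;
using CoreMedia;
using CoreVideo;

public class FrameBrightnessDelegate : AVCaptureVideoDataOutputSampleBufferDelegate
{
    public override void DidOutputSampleBuffer(AVCaptureOutput captureOutput,
        CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
    {
        using (var pixelBuffer = sampleBuffer.GetImageBuffer() as CVPixelBuffer)
        {
            pixelBuffer.Lock(CVPixelBufferLock.ReadOnly);

            int width = (int)pixelBuffer.Width;
            int height = (int)pixelBuffer.Height;
            int bytesPerRow = (int)pixelBuffer.BytesPerRow;
            IntPtr baseAddress = pixelBuffer.BaseAddress;

            // Frames arrive as 32BGRA; sample every 8th pixel and average B, G, R.
            long total = 0, samples = 0;
            unsafe
            {
                byte* row = (byte*)baseAddress;
                for (int y = 0; y < height; y += 8, row += bytesPerRow * 8)
                {
                    for (int x = 0; x < width; x += 8)
                    {
                        byte* px = row + x * 4;        // B, G, R, A
                        total += px[0] + px[1] + px[2];
                        samples += 3;
                    }
                }
            }
            double brightness = (double)total / samples;   // 0..255

            pixelBuffer.Unlock(CVPixelBufferLock.ReadOnly);

            // TODO: feed `brightness` into whatever on/off logic we settle on
            Console.WriteLine($"avg brightness: {brightness:F1}");
        }
        sampleBuffer.Dispose(); // release the frame promptly or capture stalls
    }
}
```

The output would then be wired into `SetupLiveCameraStream` before `StartRunning`, something like:

```csharp
var videoOutput = new AVCaptureVideoDataOutput
{
    AlwaysDiscardsLateVideoFrames = true,
    WeakVideoSettings = new CVPixelBufferAttributes
    {
        PixelFormatType = CVPixelFormatType.CV32BGRA
    }.Dictionary
};
videoOutput.SetSampleBufferDelegate(new FrameBrightnessDelegate(),
    new DispatchQueue("frameQueue"));
captureSession.AddOutput(videoOutput);
```

Is this roughly the right class to be using, or is there a simpler way to read brightness from the live feed?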