14

Is there a way to use the iPhone proximity sensor to detect whether the phone is in a room with no light?

This question seems to imply that this is not possible: Does iPhone allow Light sensors as input?

Apollo
  • There should already be an ambient light sensor in the iPhone I believe – user2277872 Mar 31 '14 at 03:28
  • @user2277872 There's an important warning at the top of the I/O Kit doc you provided: "I/O Kit is a low-level framework communicating with hardware or kernel services. Although it is a public framework, Apple discourages developers from using it, and any apps using it will be rejected from App Store." – Sirop4ik Jan 29 '17 at 12:36

6 Answers

30

Here's a much simpler way of using the camera to find out how bright a scene is. (Obviously, it only reads the data that can be "seen" in the camera's field of view, so it's not a true ambient light sensor...)

Using the AVFoundation framework, set up a video input and then, using the ImageIO framework, read the metadata that's coming in with each frame of the video feed (you can ignore the actual video data):

#import <ImageIO/ImageIO.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
      didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
             fromConnection:(AVCaptureConnection *)connection
{
  // Copy the attachments (metadata) that travel with this video frame.
  CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL,
    sampleBuffer, kCMAttachmentMode_ShouldPropagate);
  NSDictionary *metadata = [[NSMutableDictionary alloc]
    initWithDictionary:(__bridge NSDictionary*)metadataDict];
  CFRelease(metadataDict);

  // The EXIF sub-dictionary contains the BrightnessValue for the frame.
  NSDictionary *exifMetadata = [[metadata
    objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
  float brightnessValue = [[exifMetadata
    objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];
}

You now have the brightness value for the scene, updated (typically 15-30 times per second; you can configure this). Lower numbers are darker.
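For context, here's a minimal sketch (in Swift, to match the later answers) of the capture-session setup that feeds a delegate method like the one above; the class and queue names are illustrative and error handling is omitted:

import AVFoundation

final class BrightnessReader: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let sampleQueue = DispatchQueue(label: "brightness.samples")

    func start() throws {
        // Wire the default camera into the session as a video input.
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        // A video data output hands every frame (with its EXIF attachments) to the delegate.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: sampleQueue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Read kCGImagePropertyExifBrightnessValue from the sample buffer's attachments,
        // as in the code above.
    }
}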

Stan James
Wildaker
  • Do you know the possible range of the brightness level? I didn't find it on the internet... – Julian F. Weinert Jul 08 '14 at 15:57
  • Eyeballing the data, it looks like it's probably -10 to +15. The range I can actually *get* is about -8.5 to +13.5. – Wildaker Jul 11 '14 at 08:25
  • I actually get data between +14 and -7... So there is no documentation on this? – Julian F. Weinert Jul 11 '14 at 19:39
  • I just had another idea. This is EXIF data, so it is not an iOS-specific value. These values come from the APEX system. I read the Wiki article but didn't find absolute values like "from ... to ..." http://en.wikipedia.org/wiki/APEX_system – Julian F. Weinert Jul 11 '14 at 19:47
  • Just want to thank you and confirm that it also works on iOS 8 – AlexeyVMP Oct 04 '14 at 06:57
  • You guys are the best! Could somebody provide some sample code for this? E.g. how to set up the video feed? – Bergrebell Aug 17 '15 at 14:01
  • Could you please convert your answer to Swift 3? It would be very useful – Sirop4ik Jan 29 '17 at 12:38
  • A crash may happen occasionally: Thread 35 Crashed: 0 CoreFoundation 0x0000000183cf97f8 _CFBasicHashCreateCopy :592 (in CoreFoundation) 1 CoreFoundation 0x0000000183d1b328 _CFDictionaryCreateCopy :156 (in CoreFoundation) – RY_ Zheng May 17 '17 at 06:49
  • I doubt that the crash is related to this code, which I've been using (for different purposes) for five years on commercial products. – Wildaker May 18 '17 at 07:04
  • There are tens of millions of users using my commercial products every day. It happens more than twenty times a day, on average. This crash never happened before. – RY_ Zheng May 18 '17 at 07:27
  • CFDictionaryCreateCopy is in CFDictionary.c. I suspect some bad memory problem, but I'm unable to get more information. – RY_ Zheng May 18 '17 at 07:44
  • I can only claim hundreds of thousands of users, but they're all running (exactly) this code >15 times per second, and it doesn't lead to crashes. Without seeing your code, I can't really comment further... – Wildaker May 19 '17 at 08:08
9

Swift 4.2 version based upon Wildaker's code. Xcode 10 refused to allow it to be a Float, but Double works.

import AVFoundation
import ImageIO

func getBrightness(sampleBuffer: CMSampleBuffer) -> Double {
    // Copy the attachments (metadata) that travel with this video frame.
    let rawMetadata = CMCopyDictionaryOfAttachments(allocator: nil, target: sampleBuffer, attachmentMode: CMAttachmentMode(kCMAttachmentMode_ShouldPropagate))
    let metadata = CFDictionaryCreateMutableCopy(nil, 0, rawMetadata) as NSMutableDictionary
    // The "{Exif}" sub-dictionary holds the BrightnessValue for the frame.
    let exifData = metadata.value(forKey: "{Exif}") as? NSMutableDictionary
    let brightnessValue: Double = exifData?[kCGImagePropertyExifBrightnessValue as String] as! Double
    return brightnessValue
}
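For reference, a sketch of how one might call it from the video data output delegate (assuming a capture session wired up as in the first answer; the delegate method belongs to a class conforming to AVCaptureVideoDataOutputSampleBufferDelegate):

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // Lower values mean a darker scene; pick a threshold that suits your use case.
    let brightness = getBrightness(sampleBuffer: sampleBuffer)
    print("EXIF brightness: \(brightness)")
}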
StephenFeather
8

The proximity sensor is not what you should be looking for; the ambient light sensor is. Apparently that API is undocumented or not available at all to developers. An alternative way of detecting whether the iPhone is in a dark room is to use the camera and obtain the luminosity. Here's a good guide on how to do that:

https://www.transpire.com/insights/blog/obtaining-luminosity-ios-camera/
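The linked guide isn't reproduced here, but a common way to estimate scene luminosity from a camera frame is to average its pixels with Core Image. A rough sketch, assuming you already have a CVPixelBuffer from the capture feed (the function name and luma weights are illustrative, not taken from the guide):

import CoreImage
import CoreVideo

// Averages the frame's pixels down to a single 0...1 value; higher means a brighter scene.
func averageLuminance(of pixelBuffer: CVPixelBuffer, context: CIContext = CIContext()) -> CGFloat? {
    let image = CIImage(cvPixelBuffer: pixelBuffer)
    guard let filter = CIFilter(name: "CIAreaAverage",
                                parameters: [kCIInputImageKey: image,
                                             kCIInputExtentKey: CIVector(cgRect: image.extent)]),
          let averaged = filter.outputImage else { return nil }

    // Render the 1x1 averaged image and read back its RGBA components.
    var pixel = [UInt8](repeating: 0, count: 4)
    context.render(averaged,
                   toBitmap: &pixel,
                   rowBytes: 4,
                   bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                   format: .RGBA8,
                   colorSpace: CGColorSpaceCreateDeviceRGB())

    // Rec. 709 luma weights applied to the averaged colour.
    let r = CGFloat(pixel[0]) / 255, g = CGFloat(pixel[1]) / 255, b = CGFloat(pixel[2]) / 255
    return 0.2126 * r + 0.7152 * g + 0.0722 * b
}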

petrosmm
Rukshan
1

There is a much simpler solution for this if anyone needs it. Use the screen brightness to detect the light conditions:

0 - 0.3 (Dark)

0.4 - 1 (Bright)

Tweak as needed:

switch UIScreen.main.brightness {
case 0 ... 0.3:
    print("LOW LIGHT")
default:
    print("ENOUGH LIGHT")
}
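If you take this approach, you might also re-run the check whenever the brightness changes, e.g. by observing the screen's brightness notification (a sketch; as the comment below notes, the value only tracks ambient light when auto-brightness is enabled):

import UIKit

// Re-evaluate the light conditions every time the screen brightness changes.
let observer = NotificationCenter.default.addObserver(
    forName: UIScreen.brightnessDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    switch UIScreen.main.brightness {
    case 0 ... 0.3:
        print("LOW LIGHT")
    default:
        print("ENOUGH LIGHT")
    }
}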
Alessign
  • The brightness of the screen won't be consistent unless automatic brightness is enabled. This is a clever but very fragile way of accomplishing the desired outcome. – dcrow Aug 06 '20 at 13:08
0

Although it is possible to access the ambient light sensor data through the IOKit framework, Apple discourages developers from using it, and any apps using it will be rejected from the App Store.

But it is possible to approximately deduce the luminosity of the environment through the camera: implement a camera feed with the AVFoundation framework and process the metadata coming through with each camera frame. Refer to this answer on the question: How to get light value from AVFoundation

AnuradhaH
0

If you're doing some Augmented Reality work using ARKit, you can get the lightEstimate value on each frame of the video feed from the ARSession.

See documentation on this.
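A minimal sketch of reading it from an ARSessionDelegate (the class name is illustrative; light estimation is on by default in the session configuration):

import ARKit

final class LightEstimateReader: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // ambientIntensity is roughly in lumens; around 1000 corresponds to a well-lit scene.
        if let estimate = frame.lightEstimate {
            print("Ambient intensity: \(estimate.ambientIntensity)")
        }
    }
}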

Harry Bloom