
The scenario: I want my app to process (in the background if possible) images being seen by the iPhone camera.

e.g. the app is running, the user places the phone down on a piece of red cardboard, and I then want to display an alert view saying "Phone placed on red surface" (this is a simplified version of what I want to do, but it keeps the question direct).

Hope this makes sense. I know there are two separate concerns here.

  1. How to process images from the camera in the background of the app (if we can't do this, then we can initiate the process with, say, a button click if needed).
  2. Processing the image to determine what solid colour the phone is sitting on.

Any help/guidance would be greatly appreciated.

Thanks

Matt
  • Firstly, if you were to place it right next to the cardboard, that might make the image darker/totally dark, though this isn't relevant to what you're asking. I've got to dash now, but I'll do some testing later to see if I can find a solution. – prince Apr 24 '12 at 07:17
  • Yeah sorry James, it will be reading the colour off a backlit device. – Matt Apr 24 '12 at 10:35

2 Answers


Generic answers to your two questions:

  1. Background processing of the image can be triggered as a timer event. Say, for example, every 30 seconds, capture the image on the screen and do the processing in the background. If the processing is not compute/time intensive, this should work.
  2. It is technically possible to read the color of, say, one pixel programmatically (a minimal sketch of this follows the list). If you are sure that the entire image is just one color, you can try that approach: sample a few random points and read the color of the pixel at each one. But if the image (in your example, the red board) contains a picture or multiple colors, that will require more detailed image-processing techniques.
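For illustration only, here is a minimal sketch of the single-pixel approach, assuming you already have a UIImage of the captured frame. The helper name pixelColor(in:at:) is my own, not an SDK API; it draws the image into a 1x1 RGBA bitmap context offset so the pixel of interest lands on the context's only pixel.

```swift
import UIKit

// Sketch only: read the RGB of a single pixel. `point` is in pixel coordinates
// with the origin at the top-left of the image.
func pixelColor(in image: UIImage, at point: CGPoint) -> (red: CGFloat, green: CGFloat, blue: CGFloat)? {
    guard let cgImage = image.cgImage else { return nil }

    let colorSpace = CGColorSpaceCreateDeviceRGB()
    guard let context = CGContext(data: nil,
                                  width: 1,
                                  height: 1,
                                  bitsPerComponent: 8,
                                  bytesPerRow: 4,
                                  space: colorSpace,
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else {
        return nil
    }

    // Core Graphics uses a bottom-left origin, so flip the y coordinate before
    // shifting the image under the 1x1 context.
    let flippedY = CGFloat(cgImage.height - 1) - point.y
    context.translateBy(x: -point.x, y: -flippedY)
    context.draw(cgImage, in: CGRect(x: 0, y: 0,
                                     width: CGFloat(cgImage.width),
                                     height: CGFloat(cgImage.height)))

    guard let data = context.data else { return nil }
    let rgba = data.assumingMemoryBound(to: UInt8.self)
    return (CGFloat(rgba[0]) / 255.0,
            CGFloat(rgba[1]) / 255.0,
            CGFloat(rgba[2]) / 255.0)
}
```

For the timer side of point 1, something like a repeating Timer (e.g. Timer.scheduledTimer(withTimeInterval:repeats:)) could grab the latest frame and sample a few points each time it fires, so long as the work stays light.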

Hope this helps

zolio

1) Image Capture

There are two kinds of apps that continually take imagery from the camera: media capture apps (e.g. Camera, iMovie) and augmented reality (AR) apps.

Here's the iPhone SDK tutorial for media capture:

https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW3

Access the camera with iPhone SDK

Augmented Reality apps take continual pictures from the camera for processing/overlay. I suggest you look into some of the available AR kits and see how they get a continual stream from the camera and also analyze the pixels.

Starting a augmented reality (AR) app like Panasonic VIERA AR Setup Simulator

http://blog.bordertownlabs.com/post/157320598/customizing-the-iphone-camera-view-with
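As a rough sketch of what the continual-capture side looks like with AVFoundation's AVCaptureVideoDataOutput (written in Swift; the class name FrameGrabber is made up, and it assumes the app has camera permission and an NSCameraUsageDescription entry):

```swift
import AVFoundation

// Sketch only: a capture session that delivers every camera frame to a delegate
// callback, which is where the colour analysis from part 2 would run.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)

        let output = AVCaptureVideoDataOutput()
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                    kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: queue)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()
        session.startRunning()
    }

    // Called for every captured frame, off the main thread.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Hand `pixelBuffer` (a CVPixelBuffer) to the colour-analysis code here.
        _ = pixelBuffer
    }
}
```

Each frame arrives as a CVPixelBuffer on the background queue, so any per-frame analysis stays off the main thread.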

2) Image Processing

Image processing is a really big topic that's been addressed in multiple other places:

https://photo.stackexchange.com/questions/tagged/image-processing

https://dsp.stackexchange.com/questions/tagged/image-processing

https://mathematica.stackexchange.com/questions/tagged/image-processing

...but for starters, you'll need to use some heuristic analysis to determine what you're looking for. Sampling the captured pixels in a bunch of places (e.g. corners + middle) may help, as would generating a histogram of colour intensities: if there's lots of red but little or no blue and green, it's a red card.
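A minimal sketch of that heuristic, assuming a UIImage of the captured frame; the 8x8 sample size and the "twice the other channels" threshold are arbitrary choices of mine. Drawing the full frame into a tiny bitmap effectively samples and averages it, and the channel sums stand in for a very coarse histogram.

```swift
import UIKit

// Sketch only: decide whether a frame is dominated by red.
func looksRed(_ image: UIImage) -> Bool {
    guard let cgImage = image.cgImage else { return false }

    let width = 8, height = 8
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    guard let context = CGContext(data: nil,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: width * 4,
                                  space: colorSpace,
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else {
        return false
    }

    // Drawing the full frame into a tiny context samples and averages it for us.
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    guard let data = context.data else { return false }
    let pixels = data.assumingMemoryBound(to: UInt8.self)

    // Sum each channel across all sampled pixels: a very coarse "histogram".
    var red = 0, green = 0, blue = 0
    for i in stride(from: 0, to: width * height * 4, by: 4) {
        red   += Int(pixels[i])
        green += Int(pixels[i + 1])
        blue  += Int(pixels[i + 2])
    }

    // "Lots of red but little or no blue and green" -> call it a red card.
    return red > green * 2 && red > blue * 2
}
```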

JBRWilkinson