I am trying to write a program using opencv to calculate the distance from a webcam to a one inch white sphere. I feel like this should be pretty easy, but for whatever reason I'm drawing a blank. Thanks for the help ahead of time.
-
Have you tried "calibrating" your system? I mean you can measure the pixel size when the sphere is 50cm, 1m, 2m away from the webcam, and then compare the real time measurement to the calibration data? – Matthieu Napoli Jul 15 '11 at 23:08
-
This is a good idea, but if I'm planning on creating a cross-platform application that will be run on all sorts of different webcams, will each different camera need its own calibration? Or will different cameras of the same quality (say, VGA for instance) see the ball as the same number of pixels at each distance? – David Harbage Jul 15 '11 at 23:21
-
You'll need calibration for each camera unfortunately, because very few have the same "angle of vision". Some are more wide angle (zoomed) than others, see here for example: http://www.tomshardware.com/reviews/webcam-quality-test-shootout,878-10.html. So I understand this is not the best solution. – Matthieu Napoli Jul 16 '11 at 08:29
2 Answers
You can use triangle similarity to calibrate the camera angle and find the distance.
You know your ball's size: D units (e.g. cm). Place it at a known distance Z, say 1 meter = 100 cm, in front of the camera and measure its apparent width in pixels. Call this width d.
The focal length of the camera f (which differs slightly from camera to camera) is then f = d*Z/D.
When you see this ball again with this camera and its apparent width is d' pixels, then by triangle similarity you know that f/d' = Z'/D, and thus Z' = D*f/d', where Z' is the ball's current distance from the camera.
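The calibrate-then-measure procedure above can be sketched in a few lines of Python (the pixel widths used here are made-up example numbers, not real measurements):

```python
def focal_length_px(known_distance, known_width, measured_width_px):
    """One-time calibration: f = d * Z / D, giving the focal length in pixels."""
    return measured_width_px * known_distance / known_width

def distance_to_ball(f_px, known_width, measured_width_px):
    """Later measurements: Z' = D * f / d'."""
    return known_width * f_px / measured_width_px

# Calibration shot: a 2.54 cm (1 inch) ball at 100 cm appears, say, 20 px wide.
f = focal_length_px(100.0, 2.54, 20.0)

# Later the same ball appears 40 px wide, i.e. twice as big -> half the distance.
print(distance_to_ball(f, 2.54, 40.0))  # 50.0
```

Note that the real ball diameter D cancels out between the two formulas, so the accuracy of the result depends mainly on how precisely you can measure the apparent width in pixels.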

-
Isn't the issue getting the focal length of a camera? Camera calibration of your smartphone doesn't give you the focal length in terms of meters, but in terms of pixels along the x and y axes. They have been multiplied by a scaling factor. – Cameron Lowell Palmer Dec 12 '14 at 09:26
-
The first paragraph explains how to get the focal length in pixels. Triangle similarity essentially finds the conversion scaling factor from pixels to meters. – Adi Shavit Dec 12 '14 at 09:48
-
I see what you're saying. Still, isn't simply keeping a database of image sensor sizes and calibrations about the same amount of trouble? – Cameron Lowell Palmer Dec 12 '14 at 10:25
-
The two options are complementary. Once you calibrate you can keep a DB. – Adi Shavit Dec 12 '14 at 10:26
-
Am I the only one who can't read "You know your ball's size" with a straight face? – A_P Dec 17 '20 at 21:30
To my mind you will need a camera model, i.e. a calibration model, if you want to measure distance or other real-world quantities. The pinhole camera model is simple, linear and gives good results (but won't correct distortions, whether radial or tangential).
If you don't use one, you'll still be able to compute a disparity/depth map (for instance if you use stereo vision), but it is relative and doesn't give you an absolute measurement, only which object is in front of which....
Therefore, I think the answer is: you will need to calibrate it somehow. For example, you could ask the user to bring the sphere toward the camera until it exactly fills the image plane; with prior knowledge of the ball's measurements, you'd then be able to compute the distance....
Julien,
