1

Consider two subsequent frames of a video that look as follows:

[fig1 and fig2: two consecutive frames of the spoked wheel]

What is a possible way to track the movement of one of the spokes?

I am asking this because I don't have any experience with video processing, so any advice would be helpful! What exactly am I tracking? From what I have read, I usually have to detect the object I want to track first, and I am using a corner detection algorithm such as `goodFeaturesToTrack` for this. But how do I make sure I detect the correct spoke?
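
For reference, here is a minimal sketch of what a call to that detector could look like with OpenCV; the file name and parameter values are only placeholders, and this alone does not solve the problem of telling which spoke is which:

import cv2

# Placeholder file name: one frame of the video, loaded as greyscale
gray = cv2.imread('frame1.png', cv2.IMREAD_GRAYSCALE)

# Detect up to 50 strong corners; quality and distance values are guesses to be tuned
corners = cv2.goodFeaturesToTrack(gray, maxCorners=50, qualityLevel=0.01, minDistance=10)

# corners is an Nx1x2 float array of (x, y) positions, or None if nothing was found
if corners is not None:
    for x, y in corners.reshape(-1, 2):
        print(f"corner at ({x:.1f}, {y:.1f})")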

Some additional information: the framerate of the video is 30 fps, the wheel rotates in the clockwise direction only, and when I click through the video frame by frame it is pretty obvious that a spoke does not move by more than half of the angle between two spokes from frame to frame. Also, the radius of the wheel is 5 cm.
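
As a rough sanity check on what that constraint implies (assuming the wheel has five spokes, as the comments below suggest; the numbers are only illustrative):

FPS = 30            # frame rate of the video
N_SPOKES = 5        # assumption based on the image and the comments

max_step_deg = (360 / N_SPOKES) / 2     # at most half the spoke spacing per frame = 36 degrees
max_deg_per_s = max_step_deg * FPS      # 1080 degrees per second
max_rev_per_s = max_deg_per_s / 360     # so at most 3 unambiguous revolutions per second
print(max_step_deg, max_deg_per_s, max_rev_per_s)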

I have now tried out Mark's answer. I logged Tmax and the timestamp of each frame to a txt file and then wrote the following code to compute the corresponding velocity:

import numpy as np

RADIUS = 0.05  # wheel radius in metres (5 cm)

# ListOfAngles holds the logged Tmax values in degrees,
# VideoTimestamp the corresponding frame timestamps in seconds
ListOfVelocities = []
for idx in range(1, len(ListOfAngles)):
    if ListOfAngles[idx] < ListOfAngles[idx-1]:
        # the angle wrapped past 360 degrees between the two frames
        rotation = (360 - ListOfAngles[idx-1]) + ListOfAngles[idx]
    else:
        rotation = ListOfAngles[idx] - ListOfAngles[idx-1]
    timePassed = VideoTimestamp[idx] - VideoTimestamp[idx-1]
    # arc length travelled by a point on the rim, divided by the elapsed time
    velocity = 2*np.pi/360 * rotation * RADIUS / timePassed
    ListOfVelocities.append(velocity)
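
For example (illustrative numbers only): if two consecutive logged angles are 350 degrees and then 10 degrees, the wrap-around branch gives a rotation of 20 degrees, and with frames 1/30 s apart the rim velocity comes out as 2*pi/360 * 20 * 0.05 / (1/30) ≈ 0.52 m/s.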
Luk
  • Does it rotate in one direction only? Is the interval between frames guaranteed to be shorter than the time it takes the wheel to rotate by 1/5 (if unidirectional) or 1/10 (if bidirectional) of a revolution? If not, then you basically have the same aliasing issue as if you'd undersampled a periodic waveform. Painting the spokes different colours would help reduce the minimum sampling frequency needed to avoid that (although you would still have an issue telling apart movement by 0.2 revolution vs 1.2/2.2/...). Maybe with a fixed exposure time, the amount of motion blur could help distinguish those. – Dan Mašek May 26 '19 at 21:59
    Do you actually have an idea of the revs per minute of the wheel? And the framerate of the video? – Mark Setchell May 26 '19 at 22:58
  • hey guys, I updated my question to answer your remarks. What's still unclear to me: How can I detect a spoke in a frame? Can I use a corner detection algorithm? Honestly, I have no idea about the general approach to this problem – Luk May 27 '19 at 08:52
  • Wouldn't it be possible to track brightness somehow? The spokes are way brighter than the space between them. There should be a way to examine how the edge from bright to dark moves from frame to frame – Luk May 27 '19 at 12:13
  • “when I click through the video frame by frame it is pretty obvious that a spoke does not move by more than half of the angle between two spokes” You can’t tell whether the wheel moves more than that or not. This is the nature of aliasing. At 30 fps, the wheel could be moving veeeery slowly. If it does exactly 6 turns a second, the wheel will appear to stand still. With 8 turns a second, it will appear to turn at 2 turns a second. With 4 turns a second, it will appear to turn the other way at 2 turns a second. – Cris Luengo May 27 '19 at 15:25

3 Answers

2

I don't really perceive this as a tracking problem, because the wheel is constrained: it can't move about all over the frame, it can only change its angular position. So you only really need to know where some part of it is in one frame and how much it has rotated by in the next frame. Then, as you know the framerate, i.e. the time between frames, you can calculate the speed.

So the question is how to tell that you are looking at the same spoke you measured in the previous frame. As the area behind the spokes is dark, you want a light spoke that contrasts well so you can find it easily. I would therefore paint four of the spokes black, so you are just looking for a single light spoke on a dark background. I would also consider painting the centre of the wheel red (or another saturated colour), so you can easily find the middle.

Now, at the start of processing, find the centre of the wheel by looking for red and get its x,y coordinates in the image. Now choose a radius in pixels that you can alter later, and work out a list of the x,y coordinates of say 360 points (1 per degree) on the circumference of the circle centred on and going around the red point. These points and all the sines/cosines will not change all through your processing, so do this outside your main video processing loop.

Now at each frame, use indexing to pick up the brightness at each of the 360 points and, initially at least, take the brightest one as the spoke.

So, I have crudely painted on your image so the centre is red and just one spoke is white:

[image: the wheel with the centre painted red and one spoke left white]

Now the code looks something like this:

#!/usr/bin/env python3

import math
import numpy as np
from PIL import Image

# Open image and make Numpy version of it too
im = Image.open('wheel.png')
imnp = np.array(im)

# Find centre by looking for red pixels
# See https://stackoverflow.com/a/52183666/2836621
x, y = 193, 168

# Set up list of 360 points on a circle centred on red dot outside main processing loop
radius = 60
# List of X values and Y values on circumference
Xs = []
Ys = []
for theta in range(360):
    thetaRad = math.radians(theta)
    dx = int(radius * math.sin(thetaRad))
    dy = int(radius * math.cos(thetaRad))
    Xs.append(x+dx)
    Ys.append(y+dy)

# Your main loop processing frames starts here

# Make greyscale Numpy version of image
grey = np.array(im.convert('L'))

total = 0   # running total of brightness for the mean (avoids shadowing the built-in sum)
Bmax = 0    # brightest value found so far
Tmax = 0    # angle (degrees) at which it was found
for theta in range(360):
    brightness = grey[Ys[theta], Xs[theta]]
    total += brightness
    if brightness > Bmax:
        Bmax = brightness
        Tmax = theta
    print(f"theta: {theta}: brightness={brightness}")

# Calculate mean
Mgrey = total/len(Xs)
print(f"Mean brightness on circumf: {Mgrey}")

# Print peak brightness and matching theta
print(f"Peak brightness: {Bmax} at theta: {Tmax}")

And the output (abridged here to every 5 degrees) looks like this:

theta: 0: brightness=38
theta: 5: brightness=38
theta: 10: brightness=38
theta: 15: brightness=38
theta: 20: brightness=38
theta: 25: brightness=38
theta: 30: brightness=38
theta: 35: brightness=45
theta: 40: brightness=38
theta: 45: brightness=33
theta: 50: brightness=30
theta: 55: brightness=28
theta: 60: brightness=28
theta: 65: brightness=31
theta: 70: brightness=70
theta: 75: brightness=111
theta: 80: brightness=130
theta: 85: brightness=136
theta: 90: brightness=139    <--- peak brightness at 90 degrees to vertical as per picture - thankfully!
theta: 95: brightness=122
theta: 100: brightness=82
theta: 105: brightness=56
theta: 110: brightness=54
theta: 115: brightness=49
theta: 120: brightness=43
theta: 125: brightness=38
theta: 130: brightness=38
theta: 135: brightness=38
theta: 140: brightness=38
theta: 145: brightness=38
theta: 150: brightness=38
theta: 155: brightness=38
theta: 160: brightness=38
theta: 165: brightness=38
theta: 170: brightness=38
theta: 175: brightness=38
theta: 180: brightness=31
theta: 185: brightness=33
theta: 190: brightness=38
theta: 195: brightness=48
theta: 200: brightness=57
theta: 205: brightness=38
theta: 210: brightness=38
theta: 215: brightness=38
theta: 220: brightness=38
theta: 225: brightness=38
theta: 230: brightness=38
theta: 235: brightness=38
theta: 240: brightness=38
theta: 245: brightness=38
theta: 250: brightness=52
theta: 255: brightness=47
theta: 260: brightness=36
theta: 265: brightness=35
theta: 270: brightness=32
theta: 275: brightness=32
theta: 280: brightness=29
theta: 285: brightness=38
theta: 290: brightness=38
theta: 295: brightness=38
theta: 300: brightness=38
theta: 305: brightness=38
theta: 310: brightness=38
theta: 315: brightness=38
theta: 320: brightness=39
theta: 325: brightness=40
theta: 330: brightness=42
theta: 335: brightness=42
theta: 340: brightness=40
theta: 345: brightness=36
theta: 350: brightness=35
theta: 355: brightness=38
Mean brightness on circumf: 45.87222222222222
Peak brightness: 142 at theta: 89

If, in the next frame, the peak brightness is at 100 degrees to vertical, you know the wheel has rotated 10 degrees in 1/(frames_per_second) seconds.
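
To put numbers on that (just a sketch, reusing the 30 fps framerate and 5 cm radius from the question, and the 10-degree step from the example above):

import math

FPS = 30
RADIUS = 0.05                      # 5 cm wheel radius, in metres
deg_per_frame = 10                 # change in Tmax between two consecutive frames

deg_per_second = deg_per_frame * FPS               # 300 degrees per second
rpm = deg_per_second / 360 * 60                    # 50 revolutions per minute
rim_speed = math.radians(deg_per_second) * RADIUS  # about 0.26 m/s at the rim
print(deg_per_second, rpm, rim_speed)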

You may need to vary the radius for best results - experiment! The white radius shown on the image corresponds to the 60-pixel radius in the code.

Rather than taking the peak brightness, you may want to find the mean and standard deviation of the brightness of the 360 pixels on the circumference and then take the angle as the average of the angles where the brightness is more than some number of standard deviations above the mean. It depends on the resolution/accuracy you need.
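
One possible sketch of that idea (my rough interpretation, using a circular mean so a bright arc straddling 0/360 degrees is still averaged correctly):

import numpy as np

def spoke_angle(brightness, k=2.0):
    """Return the circular mean of the angles whose brightness exceeds
    mean + k*std, or None if no sample stands out.

    `brightness` is a 360-element sequence, one sample per degree around
    the circle (e.g. grey[Ys, Xs] from the code above)."""
    b = np.asarray(brightness, dtype=float)
    mask = b > b.mean() + k * b.std()
    if not mask.any():
        return None
    rad = np.radians(np.flatnonzero(mask))
    # circular mean, so a bright arc spanning the 0/360 boundary averages correctly
    return np.degrees(np.arctan2(np.sin(rad).sum(), np.cos(rad).sum())) % 360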

You can also collect all the brightnesses around the circle indexed by theta into a single 360-element array like this:

brightnessByTheta = grey[Ys[:],Xs[:]]

and you'll get:

array([ 38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,
        38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,
        38,  38,  38,  38,  38,  43,  49,  47,  46,  45,  44,  43,  43,
        40,  38,  36,  34,  33,  33,  33,  32,  31,  31,  29,  30,  28,
        29,  29,  29,  28,  28,  27,  29,  28,  28,  27,  28,  28,  29,
        31,  36,  42,  51,  60,  70,  81,  89,  98, 105, 111, 117, 122,
       126, 128, 130, 131, 132, 133, 135, 136, 138, 139, 141, 142, 139,
       136, 133, 129, 124, 122, 119, 113, 104,  93,  82,  72,  65,  60,
        59,  56,  56,  55,  55,  54,  54,  53,  52,  52,  50,  49,  47,
        46,  45,  44,  43,  42,  40,  39,  38,  38,  37,  38,  38,  37,
        38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,
        38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,
        38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,
        38,  38,  38,  38,  38,  38,  38,  38,  34,  31,  31,  31,  31,
        31,  31,  32,  33,  34,  35,  36,  37,  38,  42,  43,  44,  45,
        48,  49,  50,  51,  55,  57,  60,  64,  65,  38,  38,  38,  38,
        38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,
        38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,
        38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,
        38,  38,  38,  52,  56,  46,  46,  47,  47,  38,  39,  40,  40,
        36,  36,  36,  36,  36,  35,  35,  34,  34,  34,  32,  33,  33,
        33,  33,  32,  32,  31,  30,  29,  29,  28,  38,  38,  38,  38,
        38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,
        38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,  38,
        38,  38,  38,  38,  38,  38,  40,  40,  39,  38,  39,  39,  39,
        40,  40,  41,  41,  42,  42,  42,  41,  41,  42,  42,  41,  40,
        39,  40,  40,  38,  39,  38,  37,  36,  36,  35,  34,  33,  35,
        38,  38,  38,  38,  38,  38,  38,  38,  38], dtype=uint8)
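
With the samples collected into a single array like that, the peak search from the loop above reduces to a couple of NumPy calls (the same logic, just vectorised):

Tmax = int(np.argmax(brightnessByTheta))   # angle in degrees of the brightest sample
Bmax = int(brightnessByTheta[Tmax])        # its brightness
Mgrey = brightnessByTheta.mean()           # mean brightness around the circumference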
Mark Setchell
  • thank you so much, Mark! This is a very neat idea! ..Before you provided this code example, I have tried around a little bit and came up with sth. similar: I put a threshold on my wheel such that pixels are only 0 or 255 (for some reason they are not, but they should be). And then I detect the pixel values within just a segment of the circle. ... It's not done yet, but I hope it works as well. – Luk May 28 '19 at 08:42
  • Cool - good luck and feel free to post any improvements you come up with. – Mark Setchell May 28 '19 at 08:52
  • Mark, when I mark an ROI around my wheel and take a threshold within this ROI, s.a. `ret_thresh, thresh = cv2.threshold(ROI, 80, 255, cv2.THRESH_BINARY)`, all pixel values within this ROI should be either 0 or 255, right? I have further obtained a list of tuples, where each tuple gives (x,y) coordinates of points on a circular arch around the wheel axis. The problem is that the pixels on this arch are not only 0 or 255. Is this even possible? Only if I am doing sth. wrong, I assume – Luk May 28 '19 at 09:53
  • Start a new question (they are free) and put your OpenCV code in there so we can see what's going on. – Mark Setchell May 28 '19 at 09:58
  • ok, thx Mark! https://stackoverflow.com/questions/56340555/opencv-observe-pixel-values-within-a-thresholded-roi-where-some-values-differ – Luk May 28 '19 at 11:09
  • hey Mark, please see my edited question! This might sound like a stupid question to ask, but does the code I wrote to compute the velocity of the wheel seem to be correct to you? I am asking since I doubt the result – Luk May 31 '19 at 19:17
0

For the provided frames, it's impossible to track a single spoke, because all spokes have identical shape and color. The practical way to track one is to physically mark the spoke. Then, if your camera moves at all, you need image registration to align the frames. Tracking the spoke afterward is not difficult.

Edit: The physical mark can be a colored spot on the spoke (for simplicity, use a color that is unique in the image). Then use a thresholding technique to single out that color. You may then need some clean-up to remove noise.
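
A minimal sketch of that thresholding step, assuming OpenCV, a BGR frame and a red-ish marker; the file name and the HSV bounds are placeholders to be tuned for the real footage:

import cv2
import numpy as np

frame = cv2.imread('frame1.png')             # placeholder: one frame of the video
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Placeholder bounds for a red-ish marker; adjust to the actual paint/sticker colour
lower = np.array([0, 120, 70])
upper = np.array([10, 255, 255])
mask = cv2.inRange(hsv, lower, upper)

# Remove small specks of noise before locating the marker
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

# Centroid of the marked spoke via image moments; m00 is zero if nothing matched
M = cv2.moments(mask)
if M["m00"] > 0:
    cx, cy = M["m10"] / M["m00"], M["m01"] / M["m00"]
    print("marker centre:", cx, cy)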

Masoud
  • thx Masoud! When you say "physically" mark a spoke, would it suffice to stick a colored piece of paper on the spoke I want to track? Or what are you suggesting? And let's assume I have done that: how do I track the spoke then? Which methods would I need to apply? – Luk May 27 '19 at 08:54
0

Luk, what Masoud is explaining is that you should physically mark the wheel. This could be a small white sticker or a blob of paint; if you are converting to grayscale, white would be the best option, IMO. If it were me, I would find the wheel using a Hough circle transform and thresholding. Once you have the wheel, create a mask to remove the background, leaving just the wheel. Then locate the brightest spot (which should be the white sticker, blob of paint or whatever is used) and record its location, preferably its centre. Do the same for every frame, measure the change in location, and use that to work out the angular velocity.
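
A rough sketch of that pipeline, assuming OpenCV; the Hough parameters and the 'wheel.png' file name are placeholders that would need tuning for the real footage:

import cv2
import numpy as np

gray = cv2.imread('wheel.png', cv2.IMREAD_GRAYSCALE)   # placeholder file name
blur = cv2.medianBlur(gray, 5)

# Find the wheel as the dominant circle (parameter values are guesses)
circles = cv2.HoughCircles(blur, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                           param1=100, param2=50, minRadius=50, maxRadius=200)

if circles is not None:
    x, y, r = np.round(circles[0, 0]).astype(int)

    # Mask out everything except the wheel
    mask = np.zeros_like(gray)
    cv2.circle(mask, (x, y), r, 255, -1)

    # Brightest spot inside the wheel = the white sticker / blob of paint
    _, _, _, maxLoc = cv2.minMaxLoc(gray, mask=mask)
    print("marker at", maxLoc)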

Doug

AeroClassics