I'm trying to write a method that filters out all pixels below a given grayscale threshold (everything below the threshold becomes black, everything above becomes white). The method works, but it is not as fast as I feel it could be.
I decided to use the Parallel class, but no matter what I set MaxDegreeOfParallelism to, I don't get any speed benefit. I perform some other operations on the bitmap too, and the total time of all the operations is always around 170 ms, regardless of MaxDegreeOfParallelism. When debugging, the filtering itself takes around 160 ms, so I would expect parallelizing it to make a noticeable difference in the total time.
I'm using an i7 processor with 4 physical cores and 8 logical cores.
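For reference, this is roughly how I measure the timings (a simplified sketch; in my real code I time several operations together):

var sw = System.Diagnostics.Stopwatch.StartNew();
// ... run the filtering code shown below ...
sw.Stop();
Console.WriteLine("Filtering took " + sw.ElapsedMilliseconds + " ms");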
The code:
Color black = System.Drawing.Color.FromArgb(0, 0, 0);
Color white = System.Drawing.Color.FromArgb(255, 255, 255);
int lowerBound = (int)((float)lowerBoundPercent * 255.0 / 100.0);
int upperBound = (int)((float)upperBoundPercent * 255.0 / 100.0);

// split the rows into 8 horizontal bands, one per parallel iteration
int[][] border = new int[8][];
for (int i = 0; i < 8; i++)
{
    border[i] = new int[] { i * height / 8, (i + 1) * height / 8 - 1 };
}

Parallel.For(0, 8, new ParallelOptions { MaxDegreeOfParallelism = 8 }, i =>
{
    for (int k = 0; k < width; k++)
    {
        // only the rows belonging to band i
        for (int j = border[i][0]; j <= border[i][1]; j++)
        {
            Color pixelColor = color[k][j];
            int grayscaleValue = (pixelColor.R + pixelColor.G + pixelColor.B) / 3;
            if (grayscaleValue >= lowerBound && grayscaleValue <= upperBound)
                color[k][j] = white;
            else
                color[k][j] = black;
        }
    }
});
color[][] is a jagged array of System.Drawing.Color, indexed as color[x][y].
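For comparison, the equivalent single-threaded loop would look roughly like this (a sketch reusing the same variables as above, not my exact code):

for (int k = 0; k < width; k++)
{
    for (int j = 0; j < height; j++)
    {
        Color pixelColor = color[k][j];
        int grayscaleValue = (pixelColor.R + pixelColor.G + pixelColor.B) / 3;
        color[k][j] = (grayscaleValue >= lowerBound && grayscaleValue <= upperBound) ? white : black;
    }
}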
The question: is this normal? If not, what can I do to speed it up?
EDIT:
Pixel extraction:
Color[][] color = new Color[bitmap.Width][];
for (int i = 0; i < bitmap.Width; i++)
{
    color[i] = new Color[bitmap.Height];
    for (int j = 0; j < bitmap.Height; j++)
    {
        color[i][j] = bitmap.GetOriginalPixel(i, j);
    }
}
bitmap is an instance of my own Bitmap class:
public class Bitmap
{
    System.Drawing.Bitmap processed;
    //...
    public Color GetOriginalPixel(int x, int y) { return processed.GetPixel(x, y); }
    //...
}
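So the extraction above effectively boils down to one System.Drawing.Bitmap.GetPixel call per pixel, i.e. roughly this (a sketch assuming direct access to the underlying processed bitmap and an already-allocated color array):

for (int i = 0; i < processed.Width; i++)
{
    for (int j = 0; j < processed.Height; j++)
    {
        color[i][j] = processed.GetPixel(i, j);
    }
}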