According to this SO question, it is possible to convert a number from one range to another (a linear conversion) by calculating:
NewValue = (((OldValue - OldMin) * NewRange) / OldRange) + NewMin
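In C, that works out to something like this (just a sketch; `remap` is my own name, and I'm assuming 32-bit intermediates are wide enough for my values):

```c
#include <stdint.h>

/* Direct translation of the formula above. The division at the
   end is the expensive part on a core with no divide instruction. */
static int32_t remap(int32_t old_value,
                     int32_t old_min, int32_t old_max,
                     int32_t new_min, int32_t new_max)
{
    int32_t old_range = old_max - old_min;
    int32_t new_range = new_max - new_min;
    return ((old_value - old_min) * new_range) / old_range + new_min;
}
```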
However, I want to know if there is a faster way to do this.
Consider a microcontroller with no division instruction: converting a massive number of values from one range to another (e.g. 24-bit color pixels from an image file to 18-bit color pixels for an LCD display) would take some time. I was wondering whether there is any way to optimize this.
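For concreteness, here is roughly what the per-pixel work looks like with the formula above (a sketch; `rgb888_to_rgb666` is my own name, and I'm assuming the 18-bit result is packed into the low bits of a word):

```c
#include <stdint.h>

/* What I'm doing per pixel right now. Each 8-bit channel (0..255)
   is remapped to 6 bits (0..63) with the formula above, with
   OldMin = NewMin = 0, so the division runs three times per pixel. */
static uint32_t rgb888_to_rgb666(uint32_t rgb888)
{
    uint32_t r = (rgb888 >> 16) & 0xFF;
    uint32_t g = (rgb888 >>  8) & 0xFF;
    uint32_t b =  rgb888        & 0xFF;

    r = (r * 63) / 255;
    g = (g * 63) / 255;
    b = (b * 63) / 255;

    return (r << 12) | (g << 6) | b;
}
```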