In my game, I have what I would call a relatively basic lighting system. I know from experience that lighting always adds significantly to a game's overall CPU usage, but the way I've done it seems to halve the game's FPS, which is never a good thing.
My lighting system works by taking a light value (0 - 14) from an integer array, one entry per pixel, and editing that pixel's RGB value accordingly. Here is the line that does the RGB editing:
    setPixel(x, y,
        (0xFF << 24)
        | (((rgb[0] - rgb[0] / 14 * invL) & 0xFF) << 16)
        | (((rgb[1] - rgb[1] / 14 * invL) & 0xFF) << 8)
        |  ((rgb[2] - rgb[2] / 14 * invL) & 0xFF));
invL is just 14 minus the light value, and rgb[] is an integer array holding the red, green, and blue values for the pixel. Keep in mind that this operation runs once per pixel, meaning many thousands of times every 60th of a second.
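To make the question concrete, here is a self-contained sketch of what the line above does, using the same arithmetic; the framebuffer layout and all names (`W`, `H`, `pixels`, `light`, `shade`) are my own placeholders, not my actual engine code:

```java
// Minimal sketch of the per-pixel darkening described above.
// Assumptions: a plain int[] ARGB framebuffer and a per-pixel
// light map with values 0..14; names are hypothetical.
public class LightingSketch {
    static final int W = 320, H = 240;
    static int[] pixels = new int[W * H];   // packed ARGB
    static int[][] light = new int[W][H];   // light level 0..14 per pixel

    static void setPixel(int x, int y, int argb) {
        pixels[y * W + x] = argb;
    }

    // Same arithmetic as the one-liner: channel - channel/14 * invL,
    // where invL = 14 - lightLevel.
    static void shade(int x, int y, int r, int g, int b) {
        int invL = 14 - light[x][y];
        int rr = (r - r / 14 * invL) & 0xFF;
        int gg = (g - g / 14 * invL) & 0xFF;
        int bb = (b - b / 14 * invL) & 0xFF;
        setPixel(x, y, (0xFF << 24) | (rr << 16) | (gg << 8) | bb);
    }
}
```

Note that at full light (level 14, so invL = 0) the channels pass through unchanged, while the integer division `channel / 14` truncates, so darkened pixels keep a small residue rather than reaching exactly zero.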
And that doesn't even include calculating the light level for each pixel. So my question is: is there any way I can make this operation more efficient?
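For reference, one direction I've been wondering about is precomputing all 15 x 256 possible channel results once, so the per-pixel work is just three array reads instead of divisions and multiplications. A sketch of that idea (the class, the table shape, and the rounding choice `c * l / 14` are all my own assumptions):

```java
// Sketch of a precomputed lookup table: 15 light levels x 256
// channel values, filled once at startup. Per pixel, shading a
// channel becomes a single array index.
public class LightTable {
    static final int LEVELS = 15;                   // light values 0..14
    static final int[][] TABLE = new int[LEVELS][256];

    static {
        for (int l = 0; l < LEVELS; l++)
            for (int c = 0; c < 256; c++)
                TABLE[l][c] = c * l / (LEVELS - 1); // channel scaled by l/14
    }

    // Pack an ARGB pixel using the precomputed table.
    static int shade(int r, int g, int b, int lightLevel) {
        int[] t = TABLE[lightLevel];
        return 0xFF000000 | (t[r] << 16) | (t[g] << 8) | t[b];
    }
}
```

I'm not sure whether the memory access would actually beat the arithmetic in practice, which is part of what I'm asking.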
Thanks.