
Is there a way in OpenGL's glBlendFunc to scale up the destination RGB? (e.g. from (10, 50, 5) to (20, 100, 10))

It seems the only thing you can do is add the source RGB (multiplied by some blend factors that are at most 1).

Matt Peck
  • I fail to see why an operation for blending render targets should manage the scaling of color values. If you want the RGB values of the destination or source target to be scaled, why not perform this in the shader where you produce the output(s)? Also, realize that unless you are using integer textures, the component values (red, green, blue, alpha) of a render target will not occupy a discrete range (for instance from 0 to 255) but will instead hold floating-point values from 0.0 to 1.0. – Sir Digby Chicken Caesar Mar 31 '13 at 04:07
  • @SirDigbyChickenCaesar: A fragment shader has no access to destination framebuffer values. Maybe the next generation of GPUs will introduce a new shader stage called the "blending shader", but to this date, blending is a hardwired operation. – datenwolf Mar 31 '13 at 12:02
  • @SirDigbyChickenCaesar: Oh, and before you suggest using an FBO and sourcing the FBO's color attachment texture in the fragment shader: this doesn't work. A texture cannot be bound to a texture unit and an FBO at the same time. – datenwolf Mar 31 '13 at 12:03
  • I didn't mean that you should try to scale the color values for the PREVIOUS draw operation(s) in a fragment shader. I meant that if you desired the color components of ANY draw operation to be of certain values (in this example, with a "scale" applied), you can just handle that in the fragment shader associated with that specific draw operation. (With an idea of how the blend operation before / after this draw operation will affect the color values produced.) – Sir Digby Chicken Caesar Mar 31 '13 at 17:25
  • Now, if you just want to scale the color values after the fact, you can just perform all your draw/blending operations on a single render target (not the back buffer). Then, once these operations are finished you bind the back buffer as the draw buffer and bind the render target used in the previous steps to a sampler in a shader program that applies the scale to the color values. – Sir Digby Chicken Caesar Mar 31 '13 at 17:27
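
A minimal sketch of that last suggestion, assuming the whole scene was drawn into an off-screen render target which a full-screen pass then scales into the back buffer (the names uv, sceneColor and colorScale are illustrative, not from the question):

    // Full-screen post pass: scale the colors of an off-screen render target.
    // Host side (assumption): the scene texture is bound to the sampler below,
    // and a full-screen quad is drawn into the back buffer with blending disabled.
    #version 330 core
    in vec2 uv;                    // texture coordinate from the vertex shader
    out vec4 fragColor;
    uniform sampler2D sceneColor;  // the off-screen render target
    uniform vec3 colorScale;       // e.g. vec3(2.0) doubles the RGB values
    void main()
    {
        vec3 c = texture(sceneColor, uv).rgb;
        fragColor = vec4(c * colorScale, 1.0);
    }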

1 Answer


If you want to apply this multiplication to all draw operations / all objects (and the buffer holds an initial non-zero value, of course), what you could do is interpret the colors exponentially. Imagine that what is written in the color buffer is the exponent to some base, e.g. 2; then adding to the exponent with ordinary additive blending essentially multiplies the stored color by 2 to the power of what you added.
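
A minimal sketch of such a "multiplicative" draw pass, assuming a floating-point color attachment (e.g. GL_RGBA16F, since log2 values can be negative) and additive blending set up on the host side with glBlendEquation(GL_FUNC_ADD) and glBlendFunc(GL_ONE, GL_ONE):

    // Fragment shader for a draw call that should multiply the destination color.
    // The color buffer is assumed to hold log2 of the actual color, so adding
    // log2(scale) via additive blending multiplies the decoded color by 'scale'.
    #version 330 core
    out vec4 fragColor;
    uniform vec3 scale;   // e.g. vec3(2.0) doubles the (decoded) destination RGB
    void main()
    {
        fragColor = vec4(log2(scale), 0.0);   // exponents add -> colors multiply
    }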

Then you need to introduce a post-processing step, if you're not already doing this. There you read the color value and take 2 to the power of it, which gives you the final color.
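
A sketch of that post-processing pass, assuming the log2-encoded colors sit in a texture sampled by a full-screen shader (uv and logColorBuffer are illustrative names):

    // Post-processing: decode the log2-encoded color buffer back to linear color.
    #version 330 core
    in vec2 uv;                        // from the full-screen quad's vertex shader
    out vec4 fragColor;
    uniform sampler2D logColorBuffer;  // colors stored as log2 of the real values
    void main()
    {
        vec3 logColor = texture(logColorBuffer, uv).rgb;
        fragColor = vec4(exp2(logColor), 1.0);   // 2^x undoes the encoding
    }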

Initial values have to be logarithmic, of course.

If, however, you want to multiply the colors only for some draw operations (ideally only the last ones in your scene), you have to split the procedure further: first draw the "normal" content using ordinary color blending. Then convert the colors into a logarithmic scale (you can use a separate texture for this), use that as the target for the next draw operations which should multiply the colors (while telling OpenGL to add colors), and finally convert back to linear scale by taking the power.
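
The linear-to-logarithmic conversion in the middle could look roughly like this (a full-screen pass into a second, floating-point render target; the clamp against zero is my own assumption to avoid log2(0)):

    // Convert the normally blended (linear) result into log2 space so that the
    // following "multiplicative" draws can simply use additive blending.
    #version 330 core
    in vec2 uv;
    out vec4 fragColor;
    uniform sampler2D linearColorBuffer;   // result of the normal blending passes
    void main()
    {
        vec3 c = texture(linearColorBuffer, uv).rgb;
        vec3 safe = max(c, vec3(1e-6));    // avoid log2(0) = -infinity
        fragColor = vec4(log2(safe), 0.0);
    }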

Of course you can also use e as the base, in which case you use the natural logarithm and the exp function.

leemes