
I have 2 pictures with different sizes: say the bgImage is 300×200 and the modelImage is 50×20, and the model is placed on the background at (x, y) = (100, 50).
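
In the background's normalized texture coordinates that placement works out to the following sub-rectangle (my own arithmetic, just to make the setup concrete):

  // Rectangle of the 300×200 background covered by the 50×20 model at (100, 50),
  // expressed in normalized [0, 1] texture coordinates.
  const u0 = 100 / 300, v0 = 50 / 200;  // left/top:     (0.3333, 0.25)
  const u1 = 150 / 300, v1 = 70 / 200;  // right/bottom: (0.5,    0.35)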

Here is my GLSL code:

vs:

attribute vec4 a_position;
attribute vec2 a_texCoord;

uniform float u_flipY;

varying vec2 v_texCoord;

void main() {
   // u_flipY is 1.0 or -1.0 to flip vertically; keep x, z and w as-is.
   gl_Position = a_position * vec4(1.0, u_flipY, 1.0, 1.0);
   v_texCoord = a_texCoord;
}

fs:

  precision mediump float;

  uniform sampler2D u_bgImage;
  uniform sampler2D u_modelImage;

  varying vec2 v_texCoord;

  vec3 compose(vec4 bgRGBA, vec4 modelRGBA) {
     vec3 bgRGB = bgRGBA.rgb;
     float alpha = modelRGBA.a;
     vec3 modelRGB = modelRGBA.rgb;

     return modelRGB + bgRGB * (1.0 - alpha); // source-over, assuming modelRGBA is premultiplied alpha
  }

  void main() {
     vec4 bgRGBA = texture2D(u_bgImage, v_texCoord);
     vec4 modelRGBA = texture2D(u_modelImage, v_texCoord);
     gl_FragColor = vec4(compose(bgRGBA, modelRGBA), 1.0);
  }

It seems that when WebGL renders, bgImage and modelImage are always stretched to the same size, since both are sampled with the same v_texCoord over the full quad. So how do I blend textures of different sizes?
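
What I imagine is a single-pass version along these lines, where u_modelRect is a hypothetical uniform holding the model's rectangle in the background's normalized coordinates (I'm not sure this is the right way to do it):

  // Hypothetical: u_modelRect = (x, y, w, h) of the model in the background's
  // normalized [0, 1] texture space, e.g. (100/300, 50/200, 50/300, 20/200)
  // for the sizes above.
  const fs = `
    precision mediump float;

    uniform sampler2D u_bgImage;
    uniform sampler2D u_modelImage;
    uniform vec4 u_modelRect;

    varying vec2 v_texCoord;

    void main() {
       vec4 bgRGBA = texture2D(u_bgImage, v_texCoord);
       // Remap background coordinates into the model's own [0, 1] range.
       vec2 modelCoord = (v_texCoord - u_modelRect.xy) / u_modelRect.zw;
       vec4 modelRGBA = texture2D(u_modelImage, modelCoord);
       // Outside the model's rectangle the model contributes nothing.
       if (any(lessThan(modelCoord, vec2(0.0))) ||
           any(greaterThan(modelCoord, vec2(1.0)))) {
          modelRGBA = vec4(0.0);
       }
       gl_FragColor = vec4(modelRGBA.rgb + bgRGBA.rgb * (1.0 - modelRGBA.a), 1.0);
    }
  `;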

Or Assayag
  • typically you blend by drawing one texture at a time, enabling blending with `gl.enable(gl.BLEND)` and setting the blend function with `gl.blendFunc()` (see the sketch after these comments). Otherwise if you actually need to do the blending in the shader you'd probably pass in multiple sets of vertex coordinates. One for the 1st texture, one for the 2nd, possibly 2 texture matrices to manipulate those coordinates as well. Not an example of blending but an example of a texture matrix is [here](https://webglfundamentals.org/webgl/lessons/webgl-2d-drawimage.html) – gman Jun 24 '18 at 16:35
  • And [here's a sample with blending](https://stackoverflow.com/questions/39341564/webgl-how-to-correctly-blend-alpha-channel-png). One image is drawn in 2 different sizes. Could just as easily be 2 different images. – gman Jun 24 '18 at 16:43
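
A minimal sketch of the blend-state approach from the first comment, assuming a hypothetical drawQuad(texture, x, y, w, h) helper that draws a textured quad at the given pixel rectangle:

  // Draw the background first, opaquely.
  gl.disable(gl.BLEND);
  drawQuad(bgTexture, 0, 0, 300, 200);

  // Then draw the model on top with source-over blending;
  // ONE / ONE_MINUS_SRC_ALPHA matches the premultiplied-alpha
  // compose() in the question.
  gl.enable(gl.BLEND);
  gl.blendFunc(gl.ONE, gl.ONE_MINUS_SRC_ALPHA);
  drawQuad(modelTexture, 100, 50, 50, 20);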

0 Answers