
So, I read this answer, and while it seems to work, it dramatically lowers my FPS, and my program can generally only run for about 10 seconds before WebGL crashes completely.

Basically, I'm trying to implement lighting in my 2D top-down game. I'm doing this with two WebGL canvases. In the first one, each pixel ranges from pure black to pure white, representing the intensity of the light at that location. The second WebGL canvas is my regular game canvas, where everything in the game is rendered. Then, at runtime, the two canvases are blended together every frame to simulate lighting.
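Conceptually, the blend pass does something like the following (the uniform and varying names here are just for illustration, not my actual code):

```javascript
// Sketch of the blend pass: multiply each game pixel by the light
// intensity sampled at the same location in the lightmap.
const blendFragmentShader = `
  precision mediump float;
  uniform sampler2D u_scene;   // the regular game image
  uniform sampler2D u_light;   // grayscale light intensities
  varying vec2 v_texCoord;
  void main() {
    vec4 scene = texture2D(u_scene, v_texCoord);
    float intensity = texture2D(u_light, v_texCoord).r;
    gl_FragColor = vec4(scene.rgb * intensity, scene.a);
  }
`;
```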

I'm not using a single canvas because I'm not sure how to do it with just one. I can't simply darken my game canvas and then selectively brighten the lit areas, because the darkening shader destroys the original pixel colors. So instead I'm using two canvases: the first one accumulates all the lighting values, and once they are all determined, a single blending operation combines the two canvases (see the accumulation sketch below).
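The accumulation pass itself is the standard additive-blending setup (a sketch of the idea; that the lights combine additively is my assumption, and `lights` is a hypothetical collection):

```javascript
// Additive accumulation into the lighting target: overlapping lights sum.
gl.enable(gl.BLEND);
gl.blendFunc(gl.ONE, gl.ONE);   // dst = src + dst
for (const light of lights) {
  // ...draw this light's falloff quad into the lighting canvas...
}
gl.disable(gl.BLEND);
```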

Again, this works, but it's very slow and prone to crashes. I think the issue is that I'm creating two textures every frame (one from the lighting canvas and one from the game canvas), uploading all the texture data to each, and then blending.
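The expensive part is essentially this pattern (a reconstruction of what I described above, with illustrative names, not my literal code):

```javascript
// texImage2D copies the full canvas from the CPU to the GPU on every call.
function blendFrame(gl, lightCanvas, gameCanvas) {
  const lightTex = gl.createTexture();            // a brand-new texture each frame
  gl.bindTexture(gl.TEXTURE_2D, lightTex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA,
                gl.UNSIGNED_BYTE, lightCanvas);   // full upload #1

  const gameTex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, gameTex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA,
                gl.UNSIGNED_BYTE, gameCanvas);    // full upload #2

  // ...bind both textures and draw a fullscreen quad with the blend shader...
  // If the textures are never deleted, GPU memory also grows every frame,
  // which could explain the context being lost after a few seconds.
}
```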

Is there a more efficient way to perform this operation? It seems uploading all this texture data every frame is just completely killing performance.

Ryan Peschel
  • You need to look into [framebuffers](https://webglfundamentals.org/webgl/lessons/webgl-render-to-texture.html). – LJᛃ Apr 12 '21 at 09:20
  • Yeah, I ended up solving this by creating two frame buffers (one for all the lights and one for the rest of my game), and then creating a shader that blends them. – Ryan Peschel Apr 12 '21 at 19:31
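A minimal sketch of that framebuffer setup (all names are illustrative; the key point is that the textures and framebuffers are created once, rendered into every frame, and the per-frame `texImage2D` uploads disappear):

```javascript
// Create a texture-backed render target once, at startup.
function createRenderTarget(gl, width, height) {
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                gl.RGBA, gl.UNSIGNED_BYTE, null);  // allocate only, no upload
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

  const framebuffer = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, texture, 0);
  return { texture, framebuffer };
}

// Setup (once):
const scene = createRenderTarget(gl, gl.canvas.width, gl.canvas.height);
const light = createRenderTarget(gl, gl.canvas.width, gl.canvas.height);

// Per frame:
gl.bindFramebuffer(gl.FRAMEBUFFER, scene.framebuffer);
// ...draw the game into scene.texture...
gl.bindFramebuffer(gl.FRAMEBUFFER, light.framebuffer);
// ...accumulate the lights into light.texture...
gl.bindFramebuffer(gl.FRAMEBUFFER, null);          // back to the visible canvas
// ...bind scene.texture and light.texture and draw a fullscreen quad with
//    the blend shader; nothing crosses the CPU/GPU boundary anymore...
```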
