
I'm trying to figure out how exactly buffers work in WebGL, and I'm a little stuck. Below are my guesses - please confirm or correct them.

const positions = new Float32Array([
 -1, 1,
 -0.5, 0,
 -0.25, 0.25,
]);

let buffer = gl.createBuffer();

gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);
gl.bindBuffer(gl.ARRAY_BUFFER, null);
  1. We create an array of floats in RAM via JS.
  2. WebGL creates an empty buffer directly on the GPU and returns a reference to it; the buffer variable is now effectively a pointer.
  3. We bind the buffer to the gl.ARRAY_BUFFER bind point.
  4. Now we copy the data from RAM into the GPU buffer.
  5. We unbind the buffer from gl.ARRAY_BUFFER (the buffer is still available on the GPU and we can rebind it later).
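
For reference, here's roughly how I'd consume this buffer later at draw time (a sketch; `program` and the attribute name `a_position` come from shader setup I haven't shown):

// Rebind the buffer and describe its layout to the current program.
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);

// 'program' and 'a_position' are placeholders from shader setup not shown above.
const loc = gl.getAttribLocation(program, 'a_position');
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0); // 2 floats per vertex

gl.drawArrays(gl.TRIANGLES, 0, 3);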

So why can't we just call createBuffer() with positions instead of using gl.ARRAY_BUFFER as a bridge between JS and the GPU? Is this just a limitation of the OpenGL API, or is there a strong reason not to do it? Correct me if I'm wrong, but allocating memory of a known size should be faster than allocating some memory and then reallocating it to the size of positions when we call bufferData.

eclipseeer

1 Answer

"Because that's the API" is the only real answer.

Many people agree with you that a different API would be better. It's one reason there are newer APIs (DirectX 11/12, Vulkan, Metal, WebGPU).
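
For example, WebGPU lets you create a buffer and copy data into it without any global bind point. A rough sketch, assuming a `device` already obtained from a WebGPU adapter (not a complete program):

// WebGPU (sketch): no bind point needed to fill a buffer.
const buffer = device.createBuffer({
  size: positions.byteLength,
  usage: GPUBufferUsage.VERTEX | GPUBufferUsage.COPY_DST,
});
device.queue.writeBuffer(buffer, 0, positions); // copy the data in directly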

But the description in the question isn't technically correct:

  1. We create an array of floats in RAM via JS.

  2. WebGL creates an object that represents a GPU buffer (nothing is allocated on the GPU yet).

  3. We bind the buffer to the gl.ARRAY_BUFFER bind point.

  4. Now we allocate the buffer's storage and copy the data from RAM to the GPU buffer.

  5. We unbind the buffer from gl.ARRAY_BUFFER (the buffer is still available on the GPU and we can rebind it later).

Step 5 is not needed. There is no reason to unbind the buffer.
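
As for steps 2 and 4: you can see it's bufferData, not createBuffer, that allocates the storage, because bufferData also accepts just a size in bytes. A sketch:

gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
gl.bufferData(gl.ARRAY_BUFFER, 1024, gl.STATIC_DRAW); // allocate 1024 bytes, no data yet
gl.bufferSubData(gl.ARRAY_BUFFER, 0, positions);      // fill (part of) it later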

You can think of it like this. Imagine you had a JavaScript class that drew an image to a canvas, but the image was passed in the same way buffers are in your example. Here's the code:

class Context {
  constructor(canvas) {
    this.ctx = canvas.getContext('2d');
  }
  bindImage(img) {
    // Like gl.bindBuffer: just remember which image is "bound".
    this.img = img;
  }
  drawImage(x, y) {
    // Like gl.drawArrays: operates on whatever is currently bound.
    this.ctx.drawImage(this.img, x, y);
  }
}

Now let's say you want to draw 3 images:

const ctx = new Context(someCanvas);
ctx.bindImage(image1);
ctx.drawImage(0, 0);
ctx.bindImage(image2);
ctx.drawImage(10, 10);
ctx.bindImage(image3);
ctx.drawImage(20, 20);

This will work just fine. There's no reason to do this:

const ctx = new Context(someCanvas);
ctx.bindImage(image1);
ctx.drawImage(0, 0);
ctx.bindImage(null);    // not needed
ctx.bindImage(image2);
ctx.drawImage(10, 10);
ctx.bindImage(null);    // not needed
ctx.bindImage(image3);
ctx.drawImage(20, 20);
ctx.bindImage(null);    // not needed

It's the same in WebGL. There are times to bind null to something, for example

gl.bindFramebuffer(gl.FRAMEBUFFER, null);  // start drawing to the canvas

but most of the time unbinding is just a programmer's personal preference, not something the API itself requires.
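
For instance, a typical render-to-texture flow looks like this (a sketch; `fb`, `textureWidth`, and `textureHeight` are placeholders for a framebuffer with a texture attachment created elsewhere):

gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
gl.viewport(0, 0, textureWidth, textureHeight);
// ... draw the scene into the texture ...

gl.bindFramebuffer(gl.FRAMEBUFFER, null);  // back to drawing on the canvas
gl.viewport(0, 0, gl.canvas.width, gl.canvas.height);
// ... draw again, this time using the texture ...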


Note that even my description above isn't technically correct. Whether or not step 4 copies data to the GPU is undefined. The driver could just copy the data to RAM and, only at draw time, if the buffer is actually used and hasn't yet been copied to the GPU, copy it then. Plenty of drivers do that. For a more concrete example of a driver not copying data to the GPU when it seems like it would, see this answer and this one.
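
If you want to convince yourself that the upload timing is unobservable from JavaScript, here's a hedged sketch (assuming a program and attributes are already set up; the numbers vary wildly by driver and prove nothing on their own):

const t0 = performance.now();
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(1000000), gl.STATIC_DRAW);
const t1 = performance.now(); // often tiny, even for large data

gl.drawArrays(gl.POINTS, 0, 1);
gl.finish();                  // wait for queued GPU work to complete
const t2 = performance.now(); // a deferred upload's cost may land here
console.log('bufferData:', t1 - t0, 'ms, first draw:', t2 - t1, 'ms');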

gman
  • Thanks for the great answer! :) I agree about step 5 - it's not needed at all; it's just a demonstration that `gl.ARRAY_BUFFER` is just a pointer to `buffer`, and that the buffer won't be deleted if you replace it with another buffer reference. But why do you think the description in the question isn't technically correct? As I see it, we can't put data into a buffer directly, only via the WebGL state machine, can we? – eclipseeer Nov 07 '20 at 14:29
  • Or do you mean that we create a real buffer on the GPU only when we call `gl.bindBuffer`, and we can't set data before that? – eclipseeer Nov 07 '20 at 14:38
  • Your description said step 2 creates a buffer on the GPU. That is incorrect. It just creates some tracking info on the CPU. bufferData creates the actual buffer (but see the last paragraph). – gman Nov 07 '20 at 15:19