I am writing my own engine using OpenTK (essentially just OpenGL bindings for C#; gl* becomes GL.*), and I am going to be storing a lot of vertex buffers with several thousand vertices in each. I therefore need my own custom vertex format: a Vec3 of floats would take 12 bytes for the position alone, which adds up fast when I'm talking about millions of vertices.
What I want to do is to create my own vertex format with this layout:
Byte 0: Position X
Byte 1: Position Y
Byte 2: Position Z
Byte 3: Texture Coordinate X
Byte 4: Color R
Byte 5: Color G
Byte 6: Color B
Byte 7: Texture Coordinate Y
Here is the code in C# for the vertex:
// Requires: using System.Runtime.InteropServices;
[StructLayout(LayoutKind.Sequential, Pack = 1)] // make the byte-for-byte layout explicit
public struct SmallBlockVertex
{
    public byte PositionX;
    public byte PositionY;
    public byte PositionZ;
    public byte TextureX;
    public byte ColorR;
    public byte ColorG;
    public byte ColorB;
    public byte TextureY;
}
A byte as position for each axis is plenty, as I only need 32^3 unique positions.
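For illustration, a vertex for a chunk-local position can be built like this (a minimal sketch; the color and texture values here are just placeholders, not from my actual code):

// Minimal sketch: build a vertex at chunk-local position (x, y, z),
// where each coordinate is in the range 0..31 and fits easily in a byte.
static SmallBlockVertex MakeVertex(byte x, byte y, byte z)
{
    return new SmallBlockVertex
    {
        PositionX = x,
        PositionY = y,
        PositionZ = z,
        TextureX = 0,   // placeholder texture coordinate
        ColorR = 255,   // placeholder color (white)
        ColorG = 255,
        ColorB = 255,
        TextureY = 0,
    };
}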
I have written my own vertex shader, which takes two vec4s as inputs, one for each set of four bytes. My vertex shader is this:
attribute vec4 pos_data;
attribute vec4 col_data;

uniform mat4 projection_mat;
uniform mat4 view_mat;
uniform mat4 world_mat;

void main()
{
    vec4 position = pos_data * vec4(1.0, 1.0, 1.0, 0.0);
    gl_Position = projection_mat * view_mat * world_mat * position;
}
To isolate the problem, I have made my vertex shader as simple as possible. The shader-compilation code has been tested with immediate-mode drawing, and it works, so it can't be that.
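For completeness, by "the code for compiling shaders" I mean the standard OpenTK pattern, roughly this sketch (error handling trimmed down to printing the info logs):

// Rough sketch of the standard compile-and-link pattern in OpenTK.
int CompileProgram(string vertexSource, string fragmentSource)
{
    int vs = GL.CreateShader(ShaderType.VertexShader);
    GL.ShaderSource(vs, vertexSource);
    GL.CompileShader(vs);
    Console.WriteLine(GL.GetShaderInfoLog(vs)); // empty when compilation succeeds

    int fs = GL.CreateShader(ShaderType.FragmentShader);
    GL.ShaderSource(fs, fragmentSource);
    GL.CompileShader(fs);
    Console.WriteLine(GL.GetShaderInfoLog(fs));

    int program = GL.CreateProgram();
    GL.AttachShader(program, vs);
    GL.AttachShader(program, fs);
    GL.LinkProgram(program);
    Console.WriteLine(GL.GetProgramInfoLog(program)); // empty when linking succeeds

    return program;
}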
Here is my function which generates, sets up, and fills the vertex buffer with data, and establishes pointers to the attributes:
public void SetData<VertexType>(VertexType[] vertices, int vertexSize) where VertexType : struct
{
    // Generate and bind the vertex array object
    GL.GenVertexArrays(1, out ArrayID);
    GL.BindVertexArray(ArrayID);

    // Generate and bind the vertex buffer, then upload the data
    GL.GenBuffers(1, out ID);
    GL.BindBuffer(BufferTarget.ArrayBuffer, ID);
    GL.BufferData<VertexType>(BufferTarget.ArrayBuffer, (IntPtr)(vertices.Length * vertexSize), vertices, BufferUsageHint.StaticDraw);

    // Point each shader attribute at its four bytes of the vertex
    GL.VertexAttribPointer(Shaders.PositionDataID, 4, VertexAttribPointerType.UnsignedByte, false, 4, 0);
    GL.VertexAttribPointer(Shaders.ColorDataID, 4, VertexAttribPointerType.UnsignedByte, false, 4, 4);
}
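It is called like this (a sketch; BuildChunkVertices is a hypothetical stand-in for my mesh generation, and Marshal.SizeOf gives 8 bytes for this struct):

// Sketch of the call site; requires using System.Runtime.InteropServices for Marshal.
SmallBlockVertex[] vertices = BuildChunkVertices(); // hypothetical mesh-building helper
buffer.SetData(vertices, Marshal.SizeOf(typeof(SmallBlockVertex))); // vertexSize == 8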
From what I understand, this is the correct procedure:
1. Generate a Vertex Array Object and bind it
2. Generate a Vertex Buffer and bind it
3. Fill the Vertex Buffer with data
4. Set the attribute pointers
Shaders.*DataID is set with this code after compiling and using the shader.
PositionDataID = GL.GetAttribLocation(shaderProgram, "pos_data");
ColorDataID = GL.GetAttribLocation(shaderProgram, "col_data");
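One check worth having here: GL.GetAttribLocation returns -1 for an attribute that is missing or optimized out of the linked program, so a sanity check is cheap (sketch):

// GetAttribLocation returns -1 if the attribute is not active in the linked program.
if (PositionDataID == -1 || ColorDataID == -1)
    throw new InvalidOperationException("pos_data or col_data not found in the shader program");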
And this is my render function:
void Render()
{
    GL.UseProgram(Shaders.ChunkShaderProgram);

    Matrix4 view = Constants.Engine_Physics.Player.ViewMatrix;
    GL.UniformMatrix4(Shaders.ViewMatrixID, false, ref view);

    //GL.Enable(EnableCap.DepthTest);
    //GL.Enable(EnableCap.CullFace);
    GL.EnableClientState(ArrayCap.VertexArray);
    {
        Matrix4 world = Matrix4.CreateTranslation(offset.Position);
        GL.UniformMatrix4(Shaders.WorldMatrixID, false, ref world);

        GL.BindVertexArray(ArrayID);
        GL.BindBuffer(OpenTK.Graphics.OpenGL.BufferTarget.ArrayBuffer, ID);
        GL.DrawArrays(OpenTK.Graphics.OpenGL.BeginMode.Quads, 0, Count / 4);
    }
    //GL.Disable(EnableCap.DepthTest);
    //GL.Disable(EnableCap.CullFace);
    GL.DisableClientState(ArrayCap.VertexArray);

    GL.Flush();
}
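For reference, the Shaders.*MatrixID uniform locations used above are fetched the same way as the attribute locations (a sketch, matching the uniform names in the shader):

// Sketch: fetch the uniform locations after linking, using the names from the shader.
ProjectionMatrixID = GL.GetUniformLocation(shaderProgram, "projection_mat");
ViewMatrixID = GL.GetUniformLocation(shaderProgram, "view_mat");
WorldMatrixID = GL.GetUniformLocation(shaderProgram, "world_mat");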
Can anyone be so kind as to give me some pointers (no pun intended)? Am I doing this in the wrong order, or are there some functions I need to call?
I've searched all over the web but can't find a single good tutorial or guide explaining how to implement custom vertex formats. If you need any more information, please say so.