I'm trying to write a simple fragment shader that displays a grid (or rather a checker pattern) on a polygon. I want this pattern to stay fixed in world space, i.e. when the polygon itself moves, the squares remain in place, so the pattern effectively slides across the surface of the polygon.
I'm developing this in Java using LWJGL for an ARM-based embedded system, and I can debug both remotely on the ARM device connected to my PC and locally on the PC itself. I use IntelliJ for this.
On the PC my program defaults to an OpenGL 3.2 context. On ARM the context is OpenGL ES 3.0, and the GPU is a Vivante GC2000.
Here's the problem: locally, on my PC, the shader works flawlessly, exactly as I want. But on the ARM device the pattern jitters, distorts, and goes out of sync between the two triangles that make up my polygon. The interesting part is that the pattern changes and moves based on camera position, even though the shader uses only the modelMatrix and the vertex positions of the plane for its calculations, both of which stay exactly the same between frames (I checked). Yet camera position somehow affects the result dramatically, which shouldn't happen.
Here's my vertex shader:
#version 300 es
layout (location=0) in vec3 position;
uniform mat4 projectionMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 modelMatrix;
out highp vec3 vertexPosition;
void main()
{
// world-space position for the fragment shader
// (deliberately ignores the view and projection matrices)
vec4 vp = modelMatrix * vec4(position, 1.0);
vertexPosition = vp.xyz;
// position data for the OpenGL vertex drawing
gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
And here's the fragment shader:
#version 300 es
precision highp float;
in highp vec3 vertexPosition;
out mediump vec4 fragColor;
void main()
{
// checker parity from the world-space x and z, in 5-unit cells
highp float c = float((int(round(vertexPosition.x/5.0))+int(round(vertexPosition.z/5.0))) % 2);
// map the parity to two shades of grey
fragColor = vec4(vec3(c/2.0 + 0.3), 1.0);
}
As you can see, I've tried tinkering with the precision of the float operations, alas to no avail. Note also that only modelMatrix and the vertex positions of the polygon affect fragColor; I can guarantee that I checked them and they do not change between shader calls, yet somehow camera movement ends up affecting the resulting fragment colors/pattern.
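For reference, here's roughly how the matrices reach the shader every frame. This is a simplified LWJGL 3 sketch (I'm assuming JOML's Matrix4f here; the class and method names are abridged):

import org.joml.Matrix4f;
import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.GL20;

import java.nio.FloatBuffer;

public class ShaderUniforms {
    private final FloatBuffer buf = BufferUtils.createFloatBuffer(16);

    // Uploads one 4x4 matrix. modelMatrix goes through here with the
    // same values every frame; only modelViewMatrix changes with the camera.
    public void setMatrix4(int program, String name, Matrix4f m) {
        int location = GL20.glGetUniformLocation(program, name);
        GL20.glUniformMatrix4fv(location, false, m.get(buf));
    }
}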
It's also worth noting that no other textures on the objects in the scene seem to be affected by the issue.
Here are a couple of screenshots.

How it looks locally (everything works):

How it looks on the ARM device:
Notice how the pattern has shifted between the two triangles, and that there's a weird line between them that seems to be filled by a completely different set of rules entirely. The problem doesn't appear at all viewing angles, only at some. If I point the camera in other directions, I can sometimes move it around rather freely with no artifacts visible.
The other thing I've noticed is that the bigger my polygon is, the more jittering and artifacting occurs. This leads me to believe it has to do with precision: either in the vertex positions computed in the vertex shader, or in the interpolated position of each fragment relative to those vertices in the fragment shader.
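For intuition on the precision angle, here's a small host-side sketch. It's purely illustrative (I don't know what the GC2000 actually does internally): it shows how the gap between adjacent representable floats grows with coordinate magnitude, and that with an fp16-style 10-bit significand the gap exceeds my 5-unit cell size long before float32 has any trouble:

public class UlpDemo {
    public static void main(String[] args) {
        for (float x : new float[] {10f, 100f, 1000f, 10000f}) {
            float ulp32 = Math.ulp(x);                                // float32: 23-bit significand
            double ulp16 = Math.scalb(1.0, Math.getExponent(x) - 10); // fp16-like: 10-bit significand
            // at x = 10000 the fp16-like gap is 8.0, wider than a 5-unit cell
            System.out.printf("x=%7.0f  ulp(float32)=%.7f  ulp(fp16-like)=%.4f%n", x, ulp32, ulp16);
        }
    }
}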
Edit: I've checked the precision of float using glGetShaderPrecisionFormat(GL_FRAGMENT_SHADER, GL_HIGH_FLOAT, range, precision); and it's the same on both the local PC and the ARM device. So it shouldn't be that, unless there's some flag that has to be specifically enabled and I'm missing it.
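Here's the query as I run it on the desktop side through LWJGL (a sketch; on the ARM build I'm assuming the equivalent call goes through the org.lwjgl.opengles bindings):

import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.GL20;
import org.lwjgl.opengl.GL41;

import java.nio.IntBuffer;

public class PrecisionCheck {
    public static void printHighpFloat() {
        // range receives the log2 of the min/max representable magnitudes,
        // precision the number of significand bits (23 = full single precision)
        IntBuffer range = BufferUtils.createIntBuffer(2);
        IntBuffer precision = BufferUtils.createIntBuffer(1);
        GL41.glGetShaderPrecisionFormat(GL20.GL_FRAGMENT_SHADER, GL41.GL_HIGH_FLOAT, range, precision);
        System.out.println("highp float: range=(" + range.get(0) + ", " + range.get(1)
                + "), precision=" + precision.get(0) + " bits");
    }
}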
Edit: Another thing I've noticed is that locally, on the PC, the little test grass block appears in the center of one of the squares, while on the ARM device the squares are shifted by half a cell, so the block stands directly on an intersection (when I line up the camera so the artifact doesn't appear). I can't rightly explain this, because in my mind the calculation should yield the same result.
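To sanity-check what the expression should produce at a given spot, I can evaluate the same math on the CPU (a hypothetical Java helper, not part of my renderer). One caveat when comparing: Java's % keeps the dividend's sign, while GLSL ES leaves integer % undefined for negative operands, and GLSL's round() may break .5 ties differently from Math.round:

public class CheckerReference {
    // Mirrors the fragment-shader expression for a world-space (x, z).
    static float checker(float worldX, float worldZ) {
        int parity = (Math.round(worldX / 5.0f) + Math.round(worldZ / 5.0f)) % 2;
        return Math.abs(parity) / 2.0f + 0.3f; // abs() guards the negative-% case
    }

    public static void main(String[] args) {
        System.out.println(checker(11.0f, 0.0f)); // 0.3 (dark cell)
        System.out.println(checker(14.0f, 0.0f)); // 0.8 (light cell)
    }
}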
Either way, I need to solve this problem somehow, and I would appreciate any help.