I have been testing some OpenGL projects with LWJGL 3.1.1 (release). I noticed that when NVIDIA FXAA is enabled in the NVIDIA Control Panel, pixel-perfect shapes have strange edge artifacts. Is there any way to avoid this, or to disable FXAA manually from my program?

Here is an example program that produces this error on my machine:

import org.lwjgl.opengl.*;
import static org.lwjgl.glfw.GLFW.*;
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.system.MemoryUtil.NULL;

public class Test
{
    private static long    windowID;
    private static boolean running;
    int width = 640, height = 480;

    public Test()
    {
        if (!glfwInit())
        {
            System.err.println("Error starting GLFW");
            System.exit(1);
        }

        windowID = glfwCreateWindow(width, height, "Window", NULL, NULL);

        if (windowID == NULL)
        {
            System.err.println("Error creating a window");
            System.exit(1);
        }

        // Track framebuffer resizes so the viewport and projection stay in sync.
        glfwSetFramebufferSizeCallback(windowID, (window, width, height) -> {
            this.width = width;
            this.height = height;
        });

        glfwMakeContextCurrent(windowID);
        GL.createCapabilities();

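        // Enable vsync: swap the buffers at most once per display refresh.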
        glfwSwapInterval(1);
    }

    public void start() {
        running = true;

        float delta = 190f;
        float x = 190f, y = 170f;

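        // Render loop: each frame, resize the viewport, rebuild the orthographic projection, clear, and draw.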
        while (running && !glfwWindowShouldClose(windowID)) {
            glViewport(0, 0, width, height);
            glMatrixMode(GL_PROJECTION);
            glLoadIdentity();
            glOrtho(0, width, height, 0, -1, 1);
            glMatrixMode(GL_MODELVIEW);
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

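            // Draw a slowly moving magenta quad; with driver-forced FXAA its edges show the artifacts described above.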
            float s = 150f;
            x = (int) (delta += 0.01f);
            glColor4f(1, 0.1f, 1, 1);
            glBegin(GL_QUADS);
            {
                glVertex2f(+s + x, +s + y);
                glVertex2f(-s + x, +s + y);
                glVertex2f(-s + x, -s + y);
                glVertex2f(+s + x, -s + y);
            }
            glEnd();

            glfwPollEvents();
            glfwSwapBuffers(windowID);
        }

        glfwDestroyWindow(windowID);
        glfwTerminate();

        System.exit(0);
    }

    public void end()
    {
        running = false;
    }

    public static void main(String[] args)
    {
        new Test().start();
    }
}

Quick note: this happens with all shapes and textures. Any help is appreciated. Thanks.

  • I am not an expert, but once in a while I run into strange OpenGL behavior that only occurs in LWJGL. – javaLover May 13 '17 at 11:09

1 Answer

NVIDIA does not provide a way to affect their control panel FXAA settings through OpenGL. There might be some hidden NVIDIA API that you could tie into, but other than that, no.

You just have to hope that users who turn it on know how to turn it off for programs that don't work with it.
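
If all you want is anti-aliasing that the application itself controls, you can at least request a multisampled (MSAA) default framebuffer through GLFW before the window is created. The sketch below only illustrates that alternative, assuming LWJGL 3 as in the question; the class name and the 4x sample count are arbitrary choices, and this does not switch the driver's FXAA override off.

import org.lwjgl.opengl.GL;
import static org.lwjgl.glfw.GLFW.*;
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL13.GL_MULTISAMPLE;
import static org.lwjgl.system.MemoryUtil.NULL;

public class MsaaHintExample
{
    public static void main(String[] args)
    {
        if (!glfwInit())
            throw new IllegalStateException("Error starting GLFW");

        // Request a 4x multisampled default framebuffer; must be set before glfwCreateWindow.
        glfwWindowHint(GLFW_SAMPLES, 4);

        long window = glfwCreateWindow(640, 480, "MSAA window", NULL, NULL);
        if (window == NULL)
            throw new RuntimeException("Error creating a window");

        glfwMakeContextCurrent(window);
        GL.createCapabilities();

        // Multisample rasterization for the MSAA framebuffer (usually enabled by default).
        glEnable(GL_MULTISAMPLE);

        while (!glfwWindowShouldClose(window))
        {
            glClear(GL_COLOR_BUFFER_BIT);
            // ... draw the quad exactly as in the question ...
            glfwSwapBuffers(window);
            glfwPollEvents();
        }

        glfwDestroyWindow(window);
        glfwTerminate();
    }
}

In the Test class from the question, only the glfwWindowHint call (before glfwCreateWindow) and the glEnable(GL_MULTISAMPLE) call (after GL.createCapabilities()) would need to be added.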

Nicol Bolas
  • That's unfortunate. I'll see if I can find some documentation on the internal settings somewhere. – Joseph Straceski May 13 '17 at 18:38
  • It would be great if some documents/evidence were provided. I am also curious. – javaLover May 14 '17 at 01:55
  • I think the purpose of the NVIDIA control panel is to override unwanted settings, since for most settings there is also a `let application decide` option. The driver itself is always stronger than the program running on it, I guess. – Dynamitos Jul 17 '17 at 09:35