
I'm trying to set up my assignment from school, but the only guidance is for Windows. We're using Qt for the shaders, and I'm working in Visual Studio Code and compiling from the terminal. The problem is that whatever OpenGL version I try ends up with the same error:

*QOpenGLShader::compile(Vertex): ERROR: 0:1: '' :  version '150' is not supported* 

I'm using a MacBook (mid-2012), and it seems I'm on OpenGL 4.1. I've tried multiple versions such as 3.3, 2.1, 4.1 and so on, but nothing seems to do the trick. I have also tried to override the version, but that doesn't work. Am I looking at the problem the wrong way?

This is the source code that is causing the error:

#version 150 core // <---- the directive the error complains about

// input from application
uniform vec3 vecLight;
uniform sampler2D samTexture;

// input from geometry shader
smooth in vec3 normal;
smooth in vec3 vertex;
smooth in vec3 cartescoord;

// material constants
float ka = 0.1;
float kd = 0.6;
float ks = 0.3;
float spec_exp = 50.0;

// useful constants
float PI = 3.14159265;

// output color
out vec4 outFragColor;

vec2 cartesian2normspherical(vec3 cart)
{
    float fPhi = 0.0;
    float fTheta = 0.0;

    float fRadius = length(cart);
    fTheta = acos (cart.z / fRadius) / (PI);
    fPhi = atan(cart.y, cart.x)/ (PI);

    //transform phi from [-1,1] to [0,1]
    fPhi = (fPhi+1.0)/2.0;

    return vec2(fPhi, fTheta);
}

float calcPhongBlinn(vec3 vecV, vec3 vecN, vec3 vecL)
{
    float fLightingIntesity = 1.0;

    float fDiffuseIntensity = clamp(dot(vecN, vecL), 0.0, 1.0);

    vec3 vecHalfway = normalize(vecL + vecV);
    float fSpecularIntensity = pow(clamp(dot(vecHalfway, vecN), 0.0, 1.0), spec_exp);

    fLightingIntesity = ka + kd*fDiffuseIntensity + ks*fSpecularIntensity;

    return fLightingIntesity;
}

void main(void)
{
    
    vec2 sphCoord = cartesian2normspherical(cartescoord);
    vec2 texcoord = sphCoord.xy;

    // this vector is constant since we assume that we look orthogonally at the computer screen
    vec3 vecView = vec3(0.0, 0.0, 1.0);
    float fI = calcPhongBlinn(vecView, normal, vecLight);
    
    vec4 vecColor = texture(samTexture, texcoord.xy); // texture2D() is not available in 150 core

    outFragColor = vec4(vecColor.rgb*fI, 1.0);
}

This is the full error output:

QOpenGLShader::compile(Vertex): ERROR: 0:1: '' :  version '150' is not supported

*** Problematic Vertex shader source code ***

QOpenGLShader: could not create shader
QOpenGLShader::link: ERROR: One or more attached shaders not successfully compiled

My build script runs: make && ./myMain.app/Contents/MacOS/myMain

EDIT1: Adding my window file to the question

Window::Window()
{
    QGridLayout *mainLayout = new QGridLayout;


    QGLFormat glFormat;
    glFormat.setVersion(3, 3);
    glFormat.setProfile(QGLFormat::CoreProfile); 
    glFormat.setSampleBuffers(true);

    GLWidget *glWidget = new GLWidget(/*glFormat,0*/);
    mainLayout->addWidget(glWidget, 0, 0);
    
    setLayout(mainLayout);

    setWindowTitle(tr("Rendering with OpenGL"));
}

EDIT2:

After a lot of research, I have concluded that first and foremost I have to use OpenGL 3.3 for the shaders to work, which my OpenGL Extensions Viewer says my Mac does not support. I'm still wondering if there is a loophole I can exploit, so any information about whether or how this is possible would help.
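
For reference, a quick way to see which context was actually created is to print it from initializeGL(), once the context is current. This is only a sketch using the Qt4-style QGL classes from the code above; GLWidget is assumed to derive from QGLWidget:

// Inside GLWidget::initializeGL(); needs #include <QDebug>
const QGLFormat fmt = context()->format();
qDebug() << "Context version:" << fmt.majorVersion() << "." << fmt.minorVersion()
         << "core profile:" << (fmt.profile() == QGLFormat::CoreProfile);
qDebug() << "GL_VERSION:" << (const char *)glGetString(GL_VERSION);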

LAST EDIT:

I found the solution: you have to pass the QGLFormat to the QGLWidget as a constructor parameter. I found part of the solution here:

Qt5 OpenGL GLSL version error

Thank you for all the help!

  • GLSL version 150 corresponds to OpenGL 3.2. Are you *sure* you're creating a context of 3.2 or above? [Possible duplicate](https://stackoverflow.com/questions/44430594/mac-opengl-shader-error-version-150-is-not-supported). – Bartek Banachewicz Apr 17 '19 at 09:33
  • I'm not sure at all, as I'm very new to this and have only now started to understand how it actually works. The school is running 330 as the standard for the assignment, and it works with Qt 2017 on a Windows computer, if that answers anything? – ruubel Apr 17 '19 at 09:36
  • I don't know much about Qt, but there has to be a way to select a desired GL version before creating a window. – HolyBlackCat Apr 17 '19 at 09:40
  • Ok, so what version should I set for my Mac then? – ruubel Apr 17 '19 at 09:43
  • You could try to set the version to 120; there may not be anything in your shader that requires 150. – J.R. Apr 17 '19 at 10:16
  • You're right! It did work, but it seems that I need version 3.3 to get the shaders working. And the more I research, the more I lose hope of that working... – ruubel Apr 17 '19 at 10:27
  • Note that [OpenGL is deprecated](https://stackoverflow.com/questions/55287467/is-opengl-on-macos-deprecated/55287682) on Apple platforms. – Michael Kenzel Apr 17 '19 at 11:03
  • Which version of macOS are you running? You can check which OpenGL version should be supported on your system, e.g., here: https://developer.apple.com/opengl/OpenGL-Capabilities-Tables.pdf – Michael Kenzel Apr 17 '19 at 11:13
  • For what it's worth, my own MacBook Pro is circa mid-2012, is running macOS 10.13.6, and OpenGL Extensions Viewer shows that it fully supports all OpenGL versions from 3.0 to 4.1 inclusive. Even if you create a suitable OpenGL context, there's nothing in the code shown to indicate that that context is the *current* context when you attempt to compile the shaders. A [mcve] would be nice. – G.M. Apr 17 '19 at 12:23
  • Note also that you appear to be mixing OpenGL related classes from both Qt5 and Qt4 which is probably a bad idea -- the Qt4 classes are largely deprecated/obsoleted. Have a look at the [Qt5 examples](https://doc.qt.io/qt-5/examples-widgets-opengl.html). – G.M. Apr 17 '19 at 12:23
  • I'm running the latest version of Mojave; I updated it to be able to use something I didn't need. Well, in my viewer it only supports one out of 13 "methods" of OpenGL 3.3, and the further I go the less it supports. I have the same Mac as you. Maybe it was the update that changed everything? Also, it's not my choice to do it like that; this is the code from the assignment skeleton. I just want it to compile and display the given image, so I can start on the assignment at hand. – ruubel Apr 17 '19 at 16:56
  • MacOS 10.8 supports OpenGL 3, MacOS 10.10 and up are OpenGL 4. I've written shaders with #version 330 core on MacOS 10.8 that compiled without any errors. Your program asks for a Core Profile, so you'll get a version 3 or 4 context. Either should compile your shaders. So ... are you perhaps trying to compile these shaders *before* initializeGL? Until that method your code isn't guaranteed to have the GL context you asked for, it may instead be an old compatibility context – Hugh Fisher Apr 17 '19 at 22:55
  • @HughFisher Oh, that gives me hope of success after all. But I'm not familiar enough with OpenGL to check for that. From what you're describing, it doesn't look like that is the problem, because when I changed the version to 2.1 I didn't get a version error anymore; I only got syntax errors, since 2.1 doesn't have the methods that 3.3 has. How would I go about debugging this problem? – ruubel Apr 18 '19 at 09:35
  • I didn't make myself clear. Until your glWidget::initializeGL() method runs, you are not guaranteed to have version 3/4 OpenGL. If you compile the shaders before initializeGL(), you would be using the application default GL context, which AFAIK is version 2.x compatibility on MacOS. Which seems to be what is happening here, since you say version 2.1 doesn't give the version error. So do all your OpenGL setup - shaders, textures, etc - from initializeGL(), not before – Hugh Fisher Apr 18 '19 at 22:25
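
A minimal sketch of what the last comment suggests, assuming a QGLWidget subclass with a QGLShaderProgram member (the class, member and file names here are illustrative, not taken from the original code):

// Do all GL setup from initializeGL(), where the requested core profile context is current.
void GLWidget::initializeGL()
{
    m_program = new QGLShaderProgram(this); // member: QGLShaderProgram *m_program
    m_program->addShaderFromSourceFile(QGLShader::Vertex, "shader.vert");
    m_program->addShaderFromSourceFile(QGLShader::Fragment, "shader.frag");
    if (!m_program->link())
        qDebug() << m_program->log();
    // textures, buffers etc. also belong here, never in the widget's constructor
}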

2 Answers


Copied from the question above:

After a lot of research, I have concluded that first and foremost I have to use OpenGL 3.3 for the shaders to work, which my OpenGL Extensions Viewer says my Mac does not support. I'm still wondering if there is a loophole I can exploit, so any information about whether or how this is possible would help.

So the initial question is answered, and the second part regarding the "loophole" should be asked as a separate question (relating to this question).

Update:

LAST EDIT:

I found the solution: you have to pass the QGLFormat to the QGLWidget as a constructor parameter. I found part of the solution here:

Qt5 OpenGL GLSL version error

Thank you for all the help!
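
A minimal sketch of what that constructor change might look like, assuming GLWidget derives from QGLWidget and simply forwards the format (the GLWidget class itself is not shown in the question):

// GLWidget hands the requested format to the QGLWidget base class.
class GLWidget : public QGLWidget
{
public:
    explicit GLWidget(const QGLFormat &format, QWidget *parent = 0)
        : QGLWidget(format, parent) {}
    // ... initializeGL(), resizeGL(), paintGL() ...
};

// In Window::Window(), pass the format when constructing the widget:
QGLFormat glFormat;
glFormat.setVersion(3, 3);
glFormat.setProfile(QGLFormat::CoreProfile);
glFormat.setSampleBuffers(true);

GLWidget *glWidget = new GLWidget(glFormat);

With the format actually handed to the widget, the context created on macOS should be a 3.2+ core profile, and the #version 150 directive should then compile.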

YesThatIsMyName

Update 2023-01-01, Qt 5.15: with the newer QOpenGLWidget / QSurfaceFormat API, the requested version is set on the default surface format before the QApplication is constructed:

int main(int argc, char *argv[])
{
    QSurfaceFormat glFormat;
    glFormat.setVersion(3, 3); // or (4,1) or whatever
    glFormat.setDepthBufferSize(24);
    glFormat.setProfile(QSurfaceFormat::CoreProfile);
    QSurfaceFormat::setDefaultFormat(glFormat);

    QApplication app(argc, argv);
...
}

class MainWidget : public QOpenGLWidget, protected QOpenGLFunctions_3_3_Core // or QOpenGLFunctions_4_1_Core, etc.
{
 ...
protected:
    void initializeGL() override;
 ...
};


void MainWidget::initializeGL()
{
    QOpenGLWidget::initializeGL();
    initializeOpenGLFunctions(); // resolve the QOpenGLFunctions_3_3_Core entry points before using them below

    QOpenGLContext* glContext = this->context();
    int glMajorVersion = glContext->format().majorVersion();
    int glMinorVersion = glContext->format().minorVersion();

    qDebug() << "Running MyProgram";
    qDebug() << "Checking QOpenGLWidget:";
    qDebug() << "Widget OpenGL:" << QString("%1.%2").arg(glMajorVersion).arg(glMinorVersion);
    qDebug() << "Context valid:" << glContext->isValid();
    qDebug() << "OpenGL information:";
    qDebug() << "VENDOR:"       << (const char*)glGetString(GL_VENDOR);
    qDebug() << "RENDERER:"     << (const char*)glGetString(GL_RENDERER);
    qDebug() << "VERSION:"      << (const char*)glGetString(GL_VERSION);
    qDebug() << "GLSL VERSION:" << (const char*)glGetString(GL_SHADING_LANGUAGE_VERSION);

...
}

Possible output:

Running MyProgram 
Checking QOpenGLWidget: 
Widget OpenGL: "4.1" 
Context valid: true 
OpenGL information: 
VENDOR: ATI Technologies Inc. 
RENDERER: AMD Radeon R9 M370X OpenGL Engine 
VERSION: 4.1 ATI-4.8.101 
GLSL VERSION: 4.10
Edward LeBlanc