Consider a very simple OpenGL program:
#include <GL/glut.h>

static void RenderScene()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(500, 500);
    glutCreateWindow("OpenGL Test");
    glutDisplayFunc(RenderScene);
    glClearColor(0.3f, 0.3f, 0.3f, 0.0f);
    glutMainLoop();
    return 0;
}
This compiles and runs fine, displaying a grey window as expected.
Then I introduce three local variables at the top of main, before any of the OpenGL calls. The code becomes:
#include <string>
#include <GL/glut.h>

static void RenderScene()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    int x = 5;
    char y = 'a';
    std::string z = "abc";
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(500, 500);
    glutCreateWindow("OpenGL Test");
    glutDisplayFunc(RenderScene);
    glClearColor(0.3f, 0.3f, 0.3f, 0.0f);
    glutMainLoop();
    return 0;
}
This still compiles fine, but when I run it I get a segmentation fault. If I comment out the line std::string z = "abc";, it runs without any errors.
So, for some reason, declaring a std::string is causing a segmentation fault here, while declaring the int and char variables does not.
If I remove all the OpenGL code entirely, the program runs fine. But if I keep even a single OpenGL call, such as glutInit(&argc, argv);, the segmentation fault comes back.
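To make that reduction concrete, here is a sketch of what I believe is the smallest version that still crashes (assuming I have reduced it correctly; it is just the std::string declaration plus the single glutInit call described above):

```cpp
#include <string>
#include <GL/glut.h>

int main(int argc, char** argv)
{
    std::string z = "abc";  // commenting out this line makes the crash go away
    glutInit(&argc, argv);  // keeping even this one GLUT call triggers the segfault
    return 0;
}
```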
Any ideas on what is going on?