1

I have an OpenGL application and I would like to add a GUI to it.

My problem is that I want to keep my way of creating the window and my mainloop. In fact, I want to keep my application and just add some widgets to interact with the scene...

I know that we can use OpenGL as a renderer by overloading QApplication, but that creates a window and another OpenGL context that I don't want... I know that I can use a QGLWidget to render an OpenGL scene into it, but that doesn't fit what I want either...

I have searched on many forums without finding a solution. I want to know if it's possible and, if so, how to achieve it?

If it's not possible with Qt, is there another way to do it? It seems that CEGUI can, but in my opinion it looks a bit young, no?

Pierre Fourgeaud

2 Answers

1

My problem is that I want to keep my way of creating the window and my mainloop.

To use the Qt widget system you must use QApplication and the event loop it spawns. You can't have two event loops fighting over the same resources (windows, user input, signal reception and delivery).

In fact, I want to keep my application and just add some widgets to interact with the scene...

Then a beautiful Rolling Stones song applies to you: "You can't always get what you want…"

BTW: how are you creating the window right now? GLUT? Then deriving from QGLWidget and overriding its resizeGL, paintGL, the mouse*Event methods, and timerEvent (with a 0 ms timer to double as an idle callback) will give you much the same behaviour as GLUT.
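
A minimal sketch of such a QGLWidget subclass, assuming Qt 4 (the class name SceneWidget and the empty method bodies are placeholders, not from the answer):

    #include <QApplication>
    #include <QGLWidget>
    #include <QMouseEvent>

    // Widget that mimics GLUT's reshape/display/idle/mouse callbacks.
    class SceneWidget : public QGLWidget
    {
    public:
        explicit SceneWidget(QWidget *parent = 0) : QGLWidget(parent)
        {
            startTimer(0); // 0 ms timer fires whenever the event loop is idle (like glutIdleFunc)
        }

    protected:
        void initializeGL()                  { /* one-time GL state setup */ }
        void resizeGL(int w, int h)          { glViewport(0, 0, w, h); }     // like glutReshapeFunc
        void paintGL()                       { /* draw your scene here */ }  // like glutDisplayFunc
        void timerEvent(QTimerEvent *)       { updateGL(); }                 // schedule a repaint
        void mousePressEvent(QMouseEvent *e) { /* e->pos(), e->button() */ } // like glutMouseFunc
    };

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);
        SceneWidget w;
        w.show();
        return app.exec();
    }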

You might find that Qt's framework and event system is just what you need.

datenwolf
  • In fact, I'm creating a middleware and I want to provide a GUI matching the needs of my clients, but I don't know how my clients create their window... I was wondering if it is possible to do this by rendering the GUI into a texture... – Pierre Fourgeaud May 11 '13 at 10:16
  • @PierreFourgeaud: The problem with GUIs is that, to work, they must be integrated into the application's event processing. In principle you can use Qt to render to a texture, but Qt's signal/slot system permeates the whole application and needs to be in control of the event loop to work. Rendering a GUI to a texture is perfectly possible, but you still have the problem: how do you get the user input events into that GUI? – datenwolf May 11 '13 at 11:14
  • In fact we wanted to simulate the event system: when the user clicks on a button, we just fire the corresponding event. In the applications using our middleware the user will not use the mouse but Virtual Reality devices such as a wand, Razer Hydra, etc., and will interact with the GUI in 3D. The position and orientation of the devices, combined with a raycast, tell us what the user wants to do. – Pierre Fourgeaud May 13 '13 at 07:23
  • @PierreFourgeaud: In that case you should take a look at GUI systems for VR and game integration. I can't recommend a specific one, but after a short bit of Googling I found this one, which looks promising: https://code.google.com/p/begui/ – datenwolf May 13 '13 at 07:33
  • Thank you. We saw this one and CEGUI too, but we were wondering if they are mature enough and have an active enough community. That's why we were trying to use Qt or even GTK, but it seems we run into the same problem with the latter. Thank you for your time. – Pierre Fourgeaud May 13 '13 at 07:46
0

Did you try the hello-gl example? It has very good and clean code for OpenGL rendering and other basic operations (resize, loop, view). If you are having problems with GLUT, simply add glutInit(&argc, argv); right after entering int main().
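
A minimal sketch of that suggestion, assuming you are mixing GLUT utility calls into a Qt program (the widget-creation step is omitted as a placeholder):

    #include <QApplication>
    #include <GL/glut.h>

    int main(int argc, char *argv[])
    {
        glutInit(&argc, argv);        // initialize GLUT before any glut* call is made
        QApplication app(argc, argv);
        // ... create and show your OpenGL widget here ...
        return app.exec();
    }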