
The situation is as follows: there is a remote Linux server (no GUI) that builds an OpenGL scene, and the objective is to transfer the generated image(s) to a client Windows machine. I have read a lot of literature on offscreen rendering, but some things are still not clear to me. Using GLUT implies setting the DISPLAY variable, which, if I understand correctly, means remote rendering via X11. If I run an X11 server (XWin) on the Windows machine, everything works; if I try to run without it, I get:

    freeglut (./WFWorkspace): failed to open display 'localhost:11.0'

In any case, X11 is not suitable for my purposes. My questions:

  1. Do I need to create a graphics context (hardware rendering support is required)?

  2. How can I create a graphics context on a Linux server without GLUT/X11?

  3. Framebuffer object: is it suitable for my task, and is a graphics context necessary for it? (See the sketch after this list.)

  4. What is the most efficient way to solve this problem (rendering requires hardware support)?
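
For reference, this is the kind of FBO setup I have in mind for question 3 (just a sketch; `make_offscreen_target` is an illustrative name of mine, and it assumes a context is already current with GL 3.0+ or ARB_framebuffer_object):

    /* Sketch: an off-screen render target backed by a renderbuffer.
       GL_GLEXT_PROTOTYPES works on Linux/Mesa headers; elsewhere use
       a loader such as GLEW. */
    #define GL_GLEXT_PROTOTYPES
    #include <GL/gl.h>

    GLuint make_offscreen_target(GLsizei width, GLsizei height)
    {
        GLuint fbo, rbo;

        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);

        /* Back the color attachment with a renderbuffer (no window needed). */
        glGenRenderbuffers(1, &rbo);
        glBindRenderbuffer(GL_RENDERBUFFER, rbo);
        glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA8, width, height);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                  GL_RENDERBUFFER, rbo);

        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
            return 0;  /* incomplete: caller should clean up */

        return fbo;    /* render into it, then glReadPixels as usual */
    }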

Not an important issue, but nevertheless:

  • Pixel buffer object: I plan to use it to speed up reading the rendered image back from GPU memory. Is it worthwhile within my task? (See the sketch below.)
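
This is roughly the readback pattern I intend to use (again just a sketch; `read_pixels_async` is an illustrative name, and it assumes a current context with OpenGL 2.1 or later, where PBOs are core):

    /* Sketch: asynchronous readback through a pixel buffer object. */
    #define GL_GLEXT_PROTOTYPES
    #include <GL/gl.h>
    #include <string.h>

    void read_pixels_async(GLsizei width, GLsizei height, GLubyte *dst)
    {
        GLuint pbo;
        glGenBuffers(1, &pbo);
        glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
        glBufferData(GL_PIXEL_PACK_BUFFER, width * height * 4, NULL,
                     GL_STREAM_READ);

        /* With a PACK buffer bound, glReadPixels returns immediately;
           the copy proceeds on the GPU. */
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0);

        /* Mapping waits for the transfer; in real code, map a frame later
           (or ping-pong two PBOs) so the DMA overlaps with other work. */
        GLubyte *ptr = (GLubyte *) glMapBuffer(GL_PIXEL_PACK_BUFFER,
                                               GL_READ_ONLY);
        if (ptr) {
            memcpy(dst, ptr, (size_t) width * height * 4);
            glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
        }
        glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
        glDeleteBuffers(1, &pbo);
    }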

2 Answers


You need to modify your program to use OSMesa - it's a "null display" driver used by Mesa for software rendering. Consider this answer to a near-duplicate question as a starter:

https://stackoverflow.com/a/8442800/2702398

For a full example, you can check out the examples in the Mesa distribution itself, such as this: http://cgit.freedesktop.org/mesa/demos/tree/src/osdemos/osdemo.c
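
As a rough illustration, here is a cut-down sketch in the spirit of osdemo.c (not its exact code; link with -lOSMesa, and no display or X server is needed):

    /* Minimal sketch of off-screen software rendering with OSMesa.
       No DISPLAY is required: rendering goes into a plain memory buffer. */
    #include <GL/osmesa.h>
    #include <stdio.h>
    #include <stdlib.h>

    enum { WIDTH = 640, HEIGHT = 480 };

    int main(void)
    {
        /* RGBA context with a 16-bit depth buffer. */
        OSMesaContext ctx = OSMesaCreateContextExt(OSMESA_RGBA, 16, 0, 0, NULL);
        if (!ctx) { fprintf(stderr, "OSMesaCreateContextExt failed\n"); return 1; }

        GLubyte *buffer = malloc(WIDTH * HEIGHT * 4 * sizeof(GLubyte));
        if (!OSMesaMakeCurrent(ctx, buffer, GL_UNSIGNED_BYTE, WIDTH, HEIGHT)) {
            fprintf(stderr, "OSMesaMakeCurrent failed\n");
            return 1;
        }

        /* Ordinary GL calls work from here on. */
        glClearColor(1.f, 0.f, 0.f, 1.f);
        glClear(GL_COLOR_BUFFER_BIT);
        glFinish();  /* ensure the image has landed in `buffer` */

        /* `buffer` now holds the RGBA image: write it to disk or send it
           over the network to the Windows client. */

        OSMesaDestroyContext(ctx);
        free(buffer);
        return 0;
    }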

Update

It appears that VirtualGL (http://www.virtualgl.org) supports remote rendering of the OpenGL/GLX protocol and serves the rendered pixmaps to the client over VNC (and the VNC head can trivially be made virtual).

  • But if I understand correctly, there is no hardware acceleration: ..."Stand-alone Mesa is the original incarnation of Mesa. On systems running the X Window System it does all its rendering through the Xlib API: The GLX API is supported, but it's really just an emulation of the real thing. The GLX wire protocol is not supported and there's no OpenGL extension loaded by the X server. There is no hardware acceleration. The OpenGL library, libGL.so, contains everything (the programming API, the GLX functions and all the rendering code)."... – snk Dec 04 '13 at 04:33
  • What about http://www.virtualgl.org/About/Background ? It appears to do exactly what you want: remote GLX accelerated rendering server. – oakad Dec 04 '13 at 04:44
  • Thanks for the idea, I'll check it. The main objective for me is performance (both rendering and data transfer). – snk Dec 04 '13 at 05:11

If you want to use the full OpenGL spec, use X11 to create the context. Here is a tutorial showing how you can do this: http://arrayfire.com/remote-off-screen-rendering-with-opengl/
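
In outline, the approach looks roughly like this (a minimal sketch, not the tutorial's exact code; it assumes an X server is running headless on display :0 and links with -lX11 -lGL):

    /* Minimal sketch: an off-screen GLX context on a headless server.
       Assumes an X server is running on display :0. */
    #include <X11/Xlib.h>
    #include <GL/glx.h>
    #include <stdio.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(":0");   /* assumption: X runs on :0 */
        if (!dpy) { fprintf(stderr, "cannot open display :0\n"); return 1; }

        static int fb_attribs[] = { GLX_DRAWABLE_TYPE, GLX_PBUFFER_BIT,
                                    GLX_RENDER_TYPE,   GLX_RGBA_BIT,
                                    None };
        int count = 0;
        GLXFBConfig *cfg = glXChooseFBConfig(dpy, DefaultScreen(dpy),
                                             fb_attribs, &count);
        if (!cfg || count == 0) { fprintf(stderr, "no FBConfig\n"); return 1; }

        static int pb_attribs[] = { GLX_PBUFFER_WIDTH,  640,
                                    GLX_PBUFFER_HEIGHT, 480,
                                    None };
        GLXPbuffer pbuf = glXCreatePbuffer(dpy, cfg[0], pb_attribs);

        /* True asks for a direct (hardware-accelerated) context. */
        GLXContext ctx = glXCreateNewContext(dpy, cfg[0], GLX_RGBA_TYPE,
                                             NULL, True);
        glXMakeContextCurrent(dpy, pbuf, pbuf, ctx);

        /* Render here, then read the pixels back. */
        glClearColor(0.f, 0.f, 1.f, 1.f);
        glClear(GL_COLOR_BUFFER_BIT);
        glFinish();

        glXDestroyContext(dpy, ctx);
        glXDestroyPbuffer(dpy, pbuf);
        XFree(cfg);
        XCloseDisplay(dpy);
        return 0;
    }

Whether the context is actually direct (hardware-accelerated) depends on the driver behind the X server; glXIsDirect(dpy, ctx) reports it.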

  • Posting links is good for background information, but you should also summarize the linked content. An answer should still be useful if the link gets broken. – skrrgwasme Jul 29 '14 at 17:15