9

I was wondering whether it is possible to render an OpenGL scene in Qt and stream it to an HTML5 interface in real time (by which I mean that the scene is generated on the spot).

I have been trying to find information on how to do this, but without success...

If this is possible, is there any existing mechanism to compress the image and optimize bandwidth use? I am thinking of a solution along the lines of Citrix, but with an HTML5 client.

ixM
    [Qt Quick WebGL Streaming](http://blog.qt.io/blog/2017/02/22/qt-quick-webgl-streaming/) may be a (new) solution. – m7913d Mar 24 '18 at 14:21

4 Answers

12

This answer explains how this task can be accomplished with OpenGL, Qt and GStreamer. But before I start, there are two issues that need to be addressed right away:

  • Streaming video to HTML5 is still problematic. I suggest using Ogg for encoding, since it's better supported by modern browsers than H.264;
  • Encoding the video and streaming it over HTTP is quite a challenge without 3rd-party libraries to help you. Take a good look at GStreamer (a cross-platform library for handling multimedia files). It's what I use here to encode and stream a frame from OpenGL's framebuffer;

What does a roadmap to implement something like this look like?

Start by capturing frames from the framebuffer. There are different methods that can be used for this purpose, and googling for opengl offscreen rendering will return several interesting posts and documents. I won't get into technical details since this subject has been covered extensively, but for educational purposes I'm sharing the code below to demonstrate how to retrieve a frame and save it as a jpg on disk:

// GLWidget is a class based on QGLWidget.
void GLWidget::paintGL()
{
    /* Setup FBO and RBO */

    glGenFramebuffersEXT(1, &_fb);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, _fb);

    glGenRenderbuffersEXT(1, &_color_rb);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, _color_rb);

    GLint viewport[4];
    glGetIntegerv(GL_VIEWPORT, viewport);

    // Note: GL_RGBA8 is used here; GL_BGRA is a pixel format, not a valid internal format.
    glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA8, viewport[2], viewport[3]);
    glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_RENDERBUFFER_EXT, _color_rb);

    if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) != GL_FRAMEBUFFER_COMPLETE_EXT)
        std::cout << "!!! Framebuffer is not complete" << std::endl;

    /* Draw the scene (with transparency) */

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glLoadIdentity();
    glTranslatef(-2.0f, 0.0f, -7.0f);
    glRotatef(45, 1.0f, 1.0f, 0.0f);
    _draw_cube();

    glLoadIdentity();
    glTranslatef(2.0f, 0.0f, -7.0f);
    glRotatef(30, 0.5f, 1.0f, 0.5f);
    _draw_cube();

    glFlush();

    /* Retrieve pixels from the framebuffer */

    int imgsize = viewport[2] * viewport[3];
    std::cout << "* Viewport size: " << viewport[2] << "x" << viewport[3] << std::endl;

    glPixelStorei(GL_PACK_ALIGNMENT, 1); // PACK alignment: we are reading pixels back
    glReadBuffer(GL_COLOR_ATTACHMENT0_EXT);

    unsigned char* pixels = new unsigned char[sizeof(unsigned char) * imgsize * 4];
    glReadPixels(0, 0, viewport[2], viewport[3], GL_BGRA, GL_UNSIGNED_BYTE, pixels);

    // Use fwrite to dump the raw data ("wb": binary mode matters on Windows):
    FILE* fp = fopen("dumped.bin", "wb");
    fwrite(pixels, sizeof(unsigned char) * imgsize * 4, 1, fp);
    fclose(fp);

    // or use QImage to encode the raw data to jpg
    // (Format_RGB32 matches the BGRA byte order on little-endian hosts):
    QImage image((const unsigned char*)pixels, viewport[2], viewport[3], QImage::Format_RGB32);
    QImage flipped = image.mirrored();
    flipped.save("output2.jpg");

    // Disable FBO and RBO
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, 0);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);

    // Delete resources
    glDeleteRenderbuffersEXT(1, &_color_rb);
    glDeleteFramebuffersEXT(1, &_fb);
    delete[] pixels;
}

A QImage is used to convert the raw GL_BGRA frame to a jpg file. The _draw_cube() method simply draws a colored cube with transparency.

The next step is to encode the frame and stream it over HTTP. However, you probably don't want to have to save every single frame from the framebuffer to the disk before being able to stream it. And you are right, you don't have to! GStreamer provides a C API that you can use in your application to perform the operations done by gst-launch (introduced below). There's even a Qt wrapper for this library, named QtGStreamer, to make things even easier.
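For illustration, here is a rough, untested sketch of what pushing frames through GStreamer's appsrc element could look like from C++; the caps, frame size and encoder settings are assumptions mirroring the pipelines shown later in this answer:

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

// A minimal sketch: serve Theora/Ogg over TCP, fed with raw BGRA frames
// pushed from the application instead of read from the disk.
GstElement* create_pipeline()
{
    gst_init(NULL, NULL);
    GstElement* pipeline = gst_parse_launch(
        "appsrc name=src is-live=true do-timestamp=true "
        "caps=\"video/x-raw,format=BGRA,width=800,height=600,framerate=15/1\" "
        "! videoconvert ! videoflip method=vertical-flip "
        "! theoraenc quality=24 ! oggmux "
        "! tcpserversink host=127.0.0.1 port=8080", NULL);
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    return pipeline;
}

// Called once per rendered frame, e.g. right after glReadPixels().
void push_frame(GstElement* pipeline, const unsigned char* pixels, size_t size)
{
    GstElement* src = gst_bin_get_by_name(GST_BIN(pipeline), "src");
    GstBuffer* buffer = gst_buffer_new_allocate(NULL, size, NULL);
    gst_buffer_fill(buffer, 0, pixels, size);           // copy the frame into the buffer
    gst_app_src_push_buffer(GST_APP_SRC(src), buffer);  // takes ownership of the buffer
    gst_object_unref(src);
}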

GStreamer 1.0 provides a command-line application named gst-launch-1.0 that can be used to test its features before you jump into coding. Developers usually play with it to assemble the pipeline of instructions that makes the magic happen.

The following command shows how it can be used to decode a jpg, encode it with Ogg Theora, and stream that single image over HTTP in a way that an HTML5 page can play it:

gst-launch-1.0.exe -v filesrc location=output.jpg ! decodebin ! imagefreeze ! clockoverlay shaded-background=true font-desc="Sans 38" ! theoraenc ! oggmux ! tcpserversink host=127.0.0.1 port=8080

The third and last step is to open an HTML5 page crafted to display the stream. This step must be executed while gst-launch is running, so copy and paste the code below into a file and open that page in your browser (I tested this on Chrome). The page connects to localhost on port 8080 and starts receiving the stream. You might have noticed that the gst-launch pipeline overlays a clock on the original image:

<html>
    <head>
        <title>A simple HTML5 video test</title>
    </head>
    <body>
        <video autoplay controls width=320 height=240>
            <source src="http://localhost:8080" type="video/ogg">
            Your browser doesn't support the <code>video</code> element.
        </video>
    </body>
</html>

(Screenshot: the HTML5 page playing the stream, with the clock overlaid on the image.)

I'm just trying to figure out exactly how GStreamer can convert a raw BGRA frame to jpg (or other formats) before it is streamed.

Update:

Problem solved! It's possible to encode a raw BGRA frame to jpg or Ogg and stream it directly, without creating intermediate files on the disk. I took the liberty of capping the FPS at 15 and also decreasing the default quality of theoraenc by 50%. Note that blocksize=1920000 matches exactly one 800x600 BGRA frame (800 × 600 × 4 bytes):

gst-launch-1.0.exe -v filesrc location=dumped.bin blocksize=1920000 ! video/x-raw,format=BGRA,width=800,height=600,framerate=1/1 ! videoconvert ! video/x-raw,format=RGB,framerate=1/1 ! videoflip method=vertical-flip ! imagefreeze ! videorate ! video/x-raw,format=RGB,framerate=30/2 ! videoconvert ! clockoverlay shaded-background=true font-desc="Sans 38" ! theoraenc quality=24 ! oggmux ! queue ! tcpserversink host=127.0.0.1 port=8080 sync-method=2

There are a few operations in this pipeline that you don't really need. Nevertheless, some of the things you can do to optimize bandwidth are scaling the frame to a smaller size (400x300), setting a lower FPS limit, decreasing the quality of the encoded frame, and so on:

gst-launch-1.0.exe -v filesrc location=dumped.bin blocksize=1920000 ! video/x-raw,format=BGRA,width=800,height=600,framerate=1/1 ! videoconvert ! video/x-raw,format=RGB,framerate=1/1 ! videoflip method=vertical-flip ! videoscale ! video/x-raw,width=400,height=300 ! imagefreeze ! videorate ! video/x-raw,format=RGB,framerate=30/2 ! videoconvert ! clockoverlay shaded-background=true font-desc="Sans 38" ! theoraenc quality=24 ! oggmux ! tcpserversink host=127.0.0.1 port=8080 sync-method=2
karlphillip
  • And how is this "save to disk, invoke gstreamer from command line for each image" approach exactly going to achieve low latency? You should link against gstreamer and use its internal APIs to directly encode and stream. – dtech Nov 14 '14 at 17:09
  • I don't recommend doing that at all; the answer clearly states it. I just updated it to show a `gst-launch-1.0` pipeline that shows how to overcome that problem. – karlphillip Nov 14 '14 at 22:29
8

It is entirely achievable, though it depends on how much you are willing to stretch the concept of "real time". Qt will not be able to help a lot.

  • 1 - Getting an image out of GPU memory. This is pretty much the only place where Qt might be of some help. It provides two "out of the box" methods. The first: if you have incorporated your OpenGL rendering into an element inside a QQuickView or derived class, you can use grabWindow() to get a QImage from the framebuffer (see the sketch after this list). The second is to use the QScreen class, which provides a similar method but may be even slower than the first. On my system (fairly high-end), at a resolution of 720p it takes about 30 msec to get a raw image out of GPU memory; for lower resolutions it gets faster at a quadratic rate. If you are adept in OpenGL, you might want to investigate vendor-specific extensions, which will likely offer less overhead when copying each rendered frame from GPU to CPU memory; that is how companies like Sony or nVidia are able to achieve better graphics streaming.

  • 2 - Use FFmpeg to encode the incoming QImage data to video (preferably H.264) in order to minimize bandwidth. You may want to check out this wrapper, which should work out of the box with Qt. FFmpeg can also help with the actual streaming, eliminating the need for an extra library, although I am not sure whether that stream would be playable in an HTML player without a "relay" server to re-stream it.
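To illustrate point 1, here is a minimal, untested sketch (the QML file name and the one-second interval are placeholders; a real streamer would hand each QImage to the encoder instead of saving it):

#include <QGuiApplication>
#include <QQuickView>
#include <QTimer>
#include <QUrl>

int main(int argc, char* argv[])
{
    QGuiApplication app(argc, argv);

    QQuickView view;
    view.setSource(QUrl("qrc:/scene.qml")); // hypothetical QML scene
    view.show();

    // Grab a frame from the framebuffer once per second (for illustration only;
    // a real implementation would pass the QImage on to the video encoder).
    QTimer timer;
    QObject::connect(&timer, &QTimer::timeout, [&view]() {
        view.grabWindow().save("frame.jpg");
    });
    timer.start(1000);

    return app.exec();
}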

But you should not expect miracles. Graphics streaming works badly enough already on vendors' own devices using their proprietary tech over fast local networks. In a real-world scenario, prepare for "real time" with latency of half a second or more. Sure, there has been some effort lately dedicated to this pointless endeavor, but like so many others it is merely done for the sake of doing it, not because there is actual benefit. Streaming graphics might be a viable solution if you have a 10-gigabit network and special GPU hardware that can utilize it directly, but that would be expensive and inefficient; seeing how today $10 chips that consume 2-3 watts of power are capable of rendering OpenGL, local rendering will always be the preferable solution. And since you mention an HTML5 browser, chances are you can go for a WebGL solution, which IMO will be superior to streaming the graphics, as lousy as WebGL is at this point. Even better, since Qt already supports a huge number of platforms, you can easily implement your own rendering app and get better performance than you'd get from WebGL, and potentially more rendering features.

dtech
  • Thanks a lot for your answer. I had to award the bounty to @karlphillip because I think he put the most work into it, but I wish I could split it because you raise some excellent points. – static_rtti Nov 14 '14 at 20:23
  • 2
    @static_rtti IMO the code to render a trivial scene and save it to disk isn't much help, but it is your rep; if you think it helps you, that's good news ;) Meanwhile, you could just as easily `view.grabWindow().save(path)` without that wall of code and regardless of the complexity of the scene. I can only assume he went through the `hello gl` tutorial because he was really eager to get the bounty; my intent was only to provide guidance, not to simulate 12 hours worth of work on answering ;) – dtech Nov 14 '14 at 20:35
4

Well, OTOY has done a similar thing...

I remembered a simpler but working open-source project, but I could not locate the link. In this project, the video capture (or, in your case, the window buffer) is encoded as MPEG and sent to the browser over a WebSocket connection. The client-side JavaScript then decodes this MPEG stream and displays it. This one may give you more information about the subject...

Here it is...

Malkocoglu
  • Thanks! That looks promising indeed, but at first glance it doesn't seem to be available yet. Also, I'm very interested in knowing how to make the streaming feed available to the web application. I have zero knowledge in this domain, so maybe the answer is evident (in that case, could you provide a link that describes how it's done?) – ixM May 29 '14 at 09:25
2

Maybe you can use WebSockets, with the excellent Socket.IO layer for maximum cross-browser compatibility, to transmit your data between your Qt application and an HTML5 client.

You would have to encode your rendered picture in some way in your Qt application and send it through the socket to your HTML5 client, which would decode and render it.
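As a rough, untested sketch of the Qt side (this assumes the plain QtWebSockets module rather than Socket.IO proper, which adds its own framing on top of WebSockets, so the browser end would have to match):

#include <QWebSocket>
#include <QBuffer>
#include <QImage>
#include <QByteArray>

// Hypothetical helper: JPEG-encode a rendered frame and push it over an
// open WebSocket; client-side JS would decode it and draw it on a canvas.
void sendFrame(QWebSocket& socket, const QImage& frame)
{
    QByteArray payload;
    QBuffer buffer(&payload);
    buffer.open(QIODevice::WriteOnly);
    frame.save(&buffer, "JPEG", 70);   // quality 70: trade fidelity for bandwidth
    socket.sendBinaryMessage(payload);
}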

Socket.IO has a nice demo of something like that on their website. Check it here.

tomaoq