This answer explains how this task can be accomplished with OpenGL, Qt and GStreamer. But before I start, there are two issues that need to be addressed right away:
- Streaming video to HTML5 is still problematic. I suggest using Ogg for encoding, since it's better supported by modern browsers than H.264;
- Encoding the video and streaming it over HTTP is quite a challenge without a 3rd-party library to help you. Take a good look at GStreamer (a cross-platform library for handling multimedia files). It's what I use here to encode and stream a frame from OpenGL's framebuffer;
What does a roadmap to implement something like this look like?
Start by capturing frames from the framebuffer. There are different methods for this purpose, and googling for opengl offscreen rendering will return several interesting posts and documents. I will not get into technical details since this subject has been covered extensively, but for educational purposes I'm sharing the code below to demonstrate how to retrieve a frame and save it as a jpg on disk:
// GLWidget is a class based on QGLWidget.
void GLWidget::paintGL()
{
/* Setup FBO and RBO */
// Generate the FBO first, then bind it
glGenFramebuffersEXT(1, &_fb);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, _fb);
glGenRenderbuffersEXT(1, &_color_rb);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, _color_rb);
GLint viewport[4];
glGetIntegerv(GL_VIEWPORT, viewport);
// GL_BGRA is not a valid internal format; GL_RGBA8 is
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA8, viewport[2], viewport[3]);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_RENDERBUFFER_EXT, _color_rb);
/* Draw the scene (with transparency) */
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glMatrixMode(GL_MODELVIEW);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glLoadIdentity();
glTranslatef(-2.0f, 0.0f, -7.0f);
glRotatef(45, 1.0f, 1.0f, 0.0f);
_draw_cube();
glLoadIdentity();
glTranslatef(2.0f, 0.0f, -7.0f);
glRotatef(30, 0.5f, 1.0f, 0.5f);
_draw_cube();
glFlush();
/* Retrieve pixels from the framebuffer */
int imgsize = viewport[2] * viewport[3];
std::cout << "* Viewport size: " << viewport[2] << "x" << viewport[3] << std::endl;
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glReadBuffer(GL_COLOR_ATTACHMENT0);
unsigned char* pixels = new unsigned char[sizeof(unsigned char) * imgsize * 4];
glReadPixels(0, 0, viewport[2], viewport[3], GL_BGRA, GL_UNSIGNED_BYTE, pixels);
// Use fwrite to dump data:
FILE* fp = fopen("dumped.bin", "wb"); // binary mode, or the dump gets corrupted on Windows
fwrite(pixels, sizeof(unsigned char) * imgsize * 4, 1, fp);
fclose(fp);
// or use QImage to encode the raw data to jpg:
QImage image((const unsigned char*)pixels, viewport[2], viewport[3], QImage::Format_RGB32);
QImage flipped = image.mirrored();
flipped.save("output2.jpg");
// Disable FBO and RBO
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, 0);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
// Delete resources
glDeleteRenderbuffersEXT(1, &_color_rb);
glDeleteFramebuffersEXT(1, &_fb);
delete[] pixels;
}
A QImage is used to convert the raw GL_BGRA frame to a jpg file. The _draw_cube() method simply draws a colored cube with transparency.
The next step is to encode the frame and stream it over HTTP. However, you probably don't want to save every single frame from the framebuffer to disk before being able to stream it. And you are right, you don't have to! GStreamer provides a C API that you can use in your application to perform the same operations done by gst-launch
(introduced below). There's even a Qt wrapper for this library, named QtGstreamer, to make things even easier.
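As a rough sketch of what that C API route might look like, the snippet below replaces filesrc with an appsrc element, so frames can be pushed straight from memory into the pipeline. The pipeline string, caps and element names here are my own assumptions modeled on the gst-launch commands below (untested), and error handling is omitted for brevity:

```cpp
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

// Push one raw BGRA frame (e.g. the buffer filled by glReadPixels) into appsrc.
static void push_frame(GstElement* appsrc, const unsigned char* pixels, gsize size)
{
    GstBuffer* buffer = gst_buffer_new_allocate(NULL, size, NULL);
    gst_buffer_fill(buffer, 0, pixels, size);
    gst_app_src_push_buffer(GST_APP_SRC(appsrc), buffer); // takes ownership of buffer
}

int main(int argc, char* argv[])
{
    gst_init(&argc, &argv);

    // Same stages as the gst-launch pipeline, with filesrc replaced by appsrc.
    GstElement* pipeline = gst_parse_launch(
        "appsrc name=src caps=\"video/x-raw,format=BGRA,width=800,height=600,"
        "framerate=15/1\" ! videoconvert ! videoflip method=vertical-flip ! "
        "theoraenc ! oggmux ! tcpserversink host=127.0.0.1 port=8080", NULL);

    GstElement* appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "src");
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Push a dummy black frame; in a real application you would call
    // push_frame() with the pixels returned by glReadPixels() every time
    // paintGL() renders, then run a GMainLoop (or Qt's event loop) to
    // keep the pipeline alive.
    static unsigned char frame[800 * 600 * 4];
    push_frame(appsrc, frame, sizeof(frame));

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(appsrc);
    gst_object_unref(pipeline);
    return 0;
}
```

QtGstreamer wraps the same concepts (QGst::Pipeline, QGst::Utils::ApplicationSource) in a Qt-friendly API, so the structure stays the same.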
GStreamer 1.0 provides a command-line application named gst-launch-1.0
that can be used to test its features before you jump into coding. Developers usually play with it to assemble a pipeline of instructions that makes the magic happen before starting to code.
The following command shows how it can be used to decode a jpg, encode it to Ogg Theora, and stream that single image over HTTP in a way that an HTML5 page can play it:
gst-launch-1.0.exe -v filesrc location=output.jpg ! decodebin ! imagefreeze ! clockoverlay shaded-background=true font-desc="Sans 38" ! theoraenc ! oggmux ! tcpserversink host=127.0.0.1 port=8080
The third and last step is to open an HTML5 page crafted to display the stream. This step must be executed while gst-launch is running, so copy and paste the code below into a file and open that page in your browser (I tested this on Chrome). The page connects to localhost on port 8080 and starts receiving the stream. You might have noticed that the gst-launch
pipeline overlays a clock on the original image:
<html>
<head>
<title>A simple HTML5 video test</title>
</head>
<body>
<video autoplay controls width=320 height=240>
<source src="http://localhost:8080" type="video/ogg">
Your browser doesn't support the <code>video</code> element.
</video>
</body>
</html>

I'm just trying to figure out exactly how GStreamer can convert a raw BGRA frame to jpg (or other formats) before it is streamed.
Update:
Problem solved! It's possible to encode a raw BGRA frame to jpg or Ogg and stream it directly, without creating intermediate files on disk. I took the liberty of capping the FPS at 15 and also decreased the default quality of theoraenc
by 50%:
gst-launch-1.0.exe -v filesrc location=dumped.bin blocksize=1920000 ! video/x-raw,format=BGRA,width=800,height=600,framerate=1/1 ! videoconvert ! video/x-raw,format=RGB,framerate=1/1 ! videoflip method=vertical-flip ! imagefreeze ! videorate ! video/x-raw,format=RGB,framerate=30/2 ! videoconvert ! clockoverlay shaded-background=true font-desc="Sans 38" ! theoraenc quality=24 ! oggmux ! queue ! tcpserversink host=127.0.0.1 port=8080 sync-method=2
There are a few operations in this pipeline that you don't really need. Nevertheless, some of the things you can do to optimize bandwidth are scaling the frame to a smaller size (400x300), setting a lower FPS limit, decreasing the quality of the encoded frame, and so on:
gst-launch-1.0.exe -v filesrc location=dumped.bin blocksize=1920000 ! video/x-raw,format=BGRA,width=800,height=600,framerate=1/1 ! videoconvert ! video/x-raw,format=RGB,framerate=1/1 ! videoflip method=vertical-flip ! videoscale ! video/x-raw,width=400,height=300 ! imagefreeze ! videorate ! video/x-raw,format=RGB,framerate=30/2 ! videoconvert ! clockoverlay shaded-background=true font-desc="Sans 38" ! theoraenc quality=24 ! oggmux ! tcpserversink host=127.0.0.1 port=8080 sync-method=2