
I want to convert the output of an OpenGL program to H.264 and stream it. I collected most of the code from various sources, and I do get an output file, but I have no idea what to do with it, or whether it is even valid. Currently the output is just saved to file.h264.

Edit: "Global" Variables

    x264_param_t param;
    x264_t* encoder;
    x264_picture_t pic_in;
    x264_picture_t pic_out;

    x264_nal_t *headers;
    int i_nal;
    FILE* pFile;

My init function:

void initX264() {
    pFile = fopen("file.h264", "wb");

    // use the global 'param' declared above; a local copy here would shadow it
    x264_param_default_preset(&param, "veryfast", "zerolatency");
    param.i_threads = 1;
    param.i_width = 1024;
    param.i_height = 768;
    param.i_fps_num = 30;
    param.i_fps_den = 1;

    param.i_keyint_max = 30;
    param.b_intra_refresh = 1;

    param.rc.i_rc_method = X264_RC_CRF;
    param.rc.f_rf_constant = 25;
    param.rc.f_rf_constant_max = 35;

    param.b_annexb = 0;
    param.b_repeat_headers = 0;

    param.i_log_level = X264_LOG_DEBUG;

    x264_param_apply_profile(&param, "baseline");

    encoder = x264_encoder_open(&param);
    x264_picture_alloc(&pic_in, X264_CSP_I420, 1024, 768);

    x264_encoder_parameters( encoder, &param );

    x264_encoder_headers( encoder, &headers, &i_nal );

    int size = headers[0].i_payload + headers[1].i_payload + headers[2].i_payload;
    fwrite( headers[0].p_payload, 1, size, pFile);
}

This goes in the Render function and is executed about 30 times per second:

    GLubyte *data = new GLubyte[3 * 1024 * 768];
    GLubyte *PixelYUV = new GLubyte[3 * 1024 * 768];

    glReadPixels(0, 0, 1024, 768, GL_RGB, GL_UNSIGNED_BYTE, data);
    RGB2YUV(1024, 768, data, PixelYUV, PixelYUV + 1024 * 768, PixelYUV + 1024 * 768 + (1024 * 768) / 4, true);
    pic_in.img.plane[0] = PixelYUV;
    pic_in.img.plane[1] = PixelYUV + 1024 * 768;
    pic_in.img.plane[2] = PixelYUV + 1024 * 768 + (1024 * 768) / 4;

    x264_nal_t* nals;
    int i_nals;
    int frame_size = x264_encoder_encode(encoder, &nals, &i_nals, &pic_in, &pic_out);

    if( frame_size > 0 )  // negative return means an encode error
    {
        fwrite( nals[0].p_payload, 1, frame_size, pFile );
    }

    delete[] data;      // avoid leaking both buffers every frame
    delete[] PixelYUV;

I got the RGB2YUV function from http://svn.gnumonks.org/trunk/21c3-video/cutting_tagging/tools/mpeg4ip-1.2/server/util/rgb2yuv/rgb2yuv.c
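For reference, a minimal self-contained version of that kind of RGB24-to-I420 conversion (my own sketch using the common integer BT.601 studio-swing approximation, not the linked code) looks like this:

```c
#include <stdint.h>

/* Convert one RGB24 frame to planar I420 (YUV420P) using the usual
 * integer BT.601 studio-swing approximation. Chroma is subsampled by
 * taking the top-left pixel of every 2x2 block; w and h are assumed
 * even. */
void rgb24_to_i420(int w, int h, const uint8_t *rgb,
                   uint8_t *y, uint8_t *u, uint8_t *v)
{
    /* full-resolution luma plane */
    for (int j = 0; j < h; j++)
        for (int i = 0; i < w; i++) {
            const uint8_t *p = rgb + 3 * (j * w + i);
            int r = p[0], g = p[1], b = p[2];
            y[j * w + i] =
                (uint8_t)(((66 * r + 129 * g + 25 * b + 128) >> 8) + 16);
        }

    /* quarter-resolution chroma planes */
    for (int j = 0; j < h; j += 2)
        for (int i = 0; i < w; i += 2) {
            const uint8_t *p = rgb + 3 * (j * w + i);
            int r = p[0], g = p[1], b = p[2];
            u[(j / 2) * (w / 2) + i / 2] =
                (uint8_t)(((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128);
            v[(j / 2) * (w / 2) + i / 2] =
                (uint8_t)(((112 * r - 94 * g - 18 * b + 128) >> 8) + 128);
        }
}
```

A pure white pixel maps to Y=235, U=V=128, and pure black to Y=16, U=V=128, which is what a studio-swing converter should produce.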

The output looks like

x264 [debug]: frame=   0 QP=11.14 NAL=3 Slice:I Poc:0   I:3072 P:0    SKIP:0    size=21133 bytes
x264 [debug]: frame=   1 QP=20.08 NAL=2 Slice:P Poc:2   I:0    P:14   SKIP:3058 size=72 bytes
x264 [debug]: frame=   2 QP=18.66 NAL=2 Slice:P Poc:4   I:0    P:48   SKIP:3024 size=161 bytes
x264 [debug]: frame=   3 QP=18.23 NAL=2 Slice:P Poc:6   I:0    P:84   SKIP:2988 size=293 bytes

On Linux, running file file.h264 just reports "data".

user2660369

2 Answers


Your file format is wrong; nothing will be able to read it. A .264 file uses Annex B byte streams, with the SPS/PPS headers prepended to IDR frames.

Change these parameters:

param.b_annexb = 1;
param.b_repeat_headers = 1;

and delete these lines:

x264_encoder_headers( encoder, &headers, &i_nal );
int size = headers[0].i_payload + headers[1].i_payload + headers[2].i_payload;
fwrite( headers[0].p_payload, 1, size, pFile);

Finally, you should be able to take your output and convert it to an MP4:

ffmpeg -i file.264 -vcodec copy -an file.mp4
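If you want to sanity-check the rewritten file before handing it to ffmpeg, a small start-code scanner (a hypothetical helper, not part of x264) can count the NAL units; a valid Annex B stream is just start code, NAL, start code, NAL, and so on:

```c
#include <stddef.h>
#include <stdint.h>

/* Count NAL units in an Annex B byte stream by scanning for the
 * 00 00 01 and 00 00 00 01 start codes. Returns 0 for a buffer
 * that contains no start code at all (e.g. a length-prefixed
 * stream written with b_annexb = 0). */
int count_annexb_nals(const uint8_t *buf, size_t len)
{
    int count = 0;
    for (size_t i = 0; i + 3 <= len; i++) {
        if (buf[i] == 0 && buf[i + 1] == 0 &&
            (buf[i + 2] == 1 ||
             (buf[i + 2] == 0 && i + 4 <= len && buf[i + 3] == 1)))
        {
            count++;
            /* skip past the start code we just matched */
            i += (buf[i + 2] == 1) ? 2 : 3;
        }
    }
    return count;
}
```

With b_repeat_headers = 1 you should see at least three NALs (SPS, PPS, IDR slice) right at the start of the file.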
szatmary
  • FYI. This if for .264 files. If you are streaming to RTMP, you will need to create a sequence header from x264_encoder_headers(), and switch back to b_annexb = 0; – szatmary Aug 07 '13 at 18:51
  • For files this works great. How can I stream to RTMP, what Library (for Linux and Windows) should I use? – user2660369 Aug 07 '13 at 20:42
  • https://github.com/szatmary/RtmpBroadcaster. Look at rtmp.cpp, flvtag.cpp and encode.cpp. Are you streaming a video game? – szatmary Aug 07 '13 at 21:20
  • thanks I will have a look at that. It is not a Video game, just a simulation program that uses OpenGL. – user2660369 Aug 07 '13 at 21:34
x264 expects YUV420P data (I guess some other formats too, but that's the common one). You can use libswscale (from FFmpeg) to convert images to the right format. Initializing it looks like this (I assume RGB data at 24bpp).

That's most likely why you're not getting any usable video. You can find more about what to do here: How does one encode a series of images into H264 using the x264 C API?

To scale:

    sws_scale(convertCtx, &data, &srcstride, 0, h, pic_in.img.plane, pic_in.img.i_stride);
    // data is your RGB source; the converted YUV420P planes land in pic_in
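The initialization the answer refers to might look like this (a sketch only; the names convertCtx, w, h, and srcstride are assumptions chosen to match the sws_scale call above, with the question's 1024x768 RGB24 frames):

```c
#include <libswscale/swscale.h>

int w = 1024, h = 768;

/* source: 24bpp RGB straight from glReadPixels;
 * destination: the YUV420P layout x264 expects */
struct SwsContext *convertCtx =
    sws_getContext(w, h, AV_PIX_FMT_RGB24,
                   w, h, AV_PIX_FMT_YUV420P,
                   SWS_BICUBIC, NULL, NULL, NULL);

int srcstride = w * 3; /* bytes per RGB24 row */
```

Create the context once at startup and reuse it for every frame; it is not cheap to build.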
KrisSodroski
  • I wanted to get around using libffmpeg just for YUV420P conversion. So you think the RGB2YUV function is broken? – user2660369 Aug 07 '13 at 13:14
  • Not broken, but maybe different than what you think its doing. Which library are you getting that from? Are you sure its taking RGB and going to YUV420P? It could be doing YUV444 to RGB888, which might not work. – KrisSodroski Aug 07 '13 at 13:17
  • "I got the RGB2YUV function from http://svn.gnumonks.org/trunk/21c3-video/cutting_tagging/tools/mpeg4ip-1.2/server/util/rgb2yuv/rgb2yuv.c " – user2660369 Aug 07 '13 at 13:34
  • I think you have to scale it in order to get it to work. There seems to be some specifications about how the image should be scaled before encoding the image. Use the way in my link and I bet you it will work. – KrisSodroski Aug 07 '13 at 13:47
  • so how do I feed the RGB Data to sws_scale? – user2660369 Aug 07 '13 at 13:52
  • where does srcstride come from? data is the pointer to the beginning of the RGB data, h is 768 (in my case), pic_in is the output of sws_scale? pic_in.img.stride is undefined; did you mean pic_in.img.i_stride? – user2660369 Aug 07 '13 at 14:08
  • They might have changed the parameter name, so yes, whichever property represents the stride(which gets auto computed). – KrisSodroski Aug 07 '13 at 14:16