
I'm using ffmpeg in an Android project via JNI to decode a real-time H264 video stream. On the Java side I only pass the byte arrays to the native module. The native code runs a loop that checks the data buffers for new data to decode. Each data chunk is processed with:

int bytesLeft = data->GetSize();
int parserLength = 0;
int decodeDataLength = 0;
int gotPicture = 0;
const uint8_t* buffer = data->GetData();

while (bytesLeft > 0) {
    AVPacket packet;
    av_init_packet(&packet);

    // Let the parser split the raw byte stream into complete packets.
    parserLength = av_parser_parse2(_codecPaser, _codecCtx, &packet.data, &packet.size,
                                    buffer, bytesLeft, AV_NOPTS_VALUE, AV_NOPTS_VALUE, AV_NOPTS_VALUE);
    bytesLeft -= parserLength;
    buffer += parserLength;

    if (packet.size > 0) {
        // Decode the packet; gotPicture is set when a whole frame is available.
        decodeDataLength = avcodec_decode_video2(_codecCtx, _frame, &gotPicture, &packet);
    }
    else {
        break;
    }
    av_free_packet(&packet);
}

if (gotPicture) {
    // pass the frame to rendering
}
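
For reference, one way to spot the switch inside the loop could be to compare the dimensions reported by the parser with the ones in the codec context (a sketch only; it assumes the FFmpeg build is recent enough to expose width/height on AVCodecParserContext, and the log call is just illustrative):

// After av_parser_parse2() the parser knows the coded dimensions of the
// upcoming frame, so a mismatch with the codec context marks the switch.
if (_codecPaser->width > 0 && _codecCtx->width > 0 &&
    (_codecPaser->width != _codecCtx->width ||
     _codecPaser->height != _codecCtx->height)) {
    // Resolution change detected - log it and/or reinitialize the decoder here.
    __android_log_print(ANDROID_LOG_INFO, "decoder", "resolution change %dx%d -> %dx%d",
                        _codecCtx->width, _codecCtx->height,
                        _codecPaser->width, _codecPaser->height);
}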

The system works well until the incoming video's resolution changes. I need to handle transitions between 4:3 and 16:9 aspect ratios. My AVCodecContext is configured as follows:

_codecCtx->flags2 |= CODEC_FLAG2_FAST;
_codecCtx->thread_count = 2;
_codecCtx->thread_type = FF_THREAD_FRAME;

if (_codec->capabilities & CODEC_FLAG_LOW_DELAY) {
    _codecCtx->flags |= CODEC_FLAG_LOW_DELAY;
}
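
For context, the context and parser are created with the usual calls, roughly like this (a sketch with details trimmed; the member names match the snippets above):

// Sketch of the standard decoder/parser initialization for this pipeline.
_codec = avcodec_find_decoder(AV_CODEC_ID_H264);
_codecCtx = avcodec_alloc_context3(_codec);

_codecCtx->flags2 |= CODEC_FLAG2_FAST;
_codecCtx->thread_count = 2;
_codecCtx->thread_type = FF_THREAD_FRAME;

if (avcodec_open2(_codecCtx, _codec, NULL) < 0) {
    // handle the error
}

_codecPaser = av_parser_init(AV_CODEC_ID_H264);
_frame = av_frame_alloc();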

With this configuration I was not able to continue decoding new frames after the video resolution changed: the got_picture_ptr flag that avcodec_decode_video2 sets when a whole frame is available was never true after that point.
This ticket made me wonder whether the issue is connected with multithreading. The only useful thing I've noticed is that when I change thread_type to FF_THREAD_SLICE, the decoder is not always blocked after a resolution change; about half of my attempts were successful. Switching to single-threaded processing is not an option because I need the extra computing power; besides, setting the context to one thread does not solve the problem and the decoder then cannot keep up with the incoming data.
Everything works well again after an app restart.

I can only think of one workaround (it doesn't really solve the problem): unloading and reloading the whole library after the stream resolution changes (e.g. as mentioned here). I don't think it's a good idea though; it will probably introduce other bugs and take a lot of time (from the user's viewpoint).
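
A lighter variant of the same idea might be to tear down and recreate only the decoder objects when the freeze is detected, instead of reloading the whole library. A rough, untested sketch:

// Untested sketch: drop and rebuild only the codec context and parser.
// A short glitch until the next keyframe would be expected.
void ReinitDecoder() {
    av_parser_close(_codecPaser);
    avcodec_close(_codecCtx);
    av_free(_codecCtx);

    _codecCtx = avcodec_alloc_context3(_codec);
    _codecCtx->flags2 |= CODEC_FLAG2_FAST;
    _codecCtx->thread_count = 2;
    _codecCtx->thread_type = FF_THREAD_FRAME;
    avcodec_open2(_codecCtx, _codec, NULL);

    _codecPaser = av_parser_init(AV_CODEC_ID_H264);
}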

Is it possible to fix this issue?

EDIT:
I've dumped the stream data that is passed to the decoding pipeline and changed the resolution a few times while the stream was being captured. Playing the dump with ffplay showed that at the moment the resolution changed, where the preview in my application froze, ffplay managed to continue, although its preview was glitchy for a second or so. You can see the full ffplay log here. In this case the video preview stopped when I changed the resolution to 960x720 for the second time (Reinit context to 960x720, pix_fmt: yuv420p in the log).
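
The dump is essentially the raw byte stream; a sketch of how such a dump can be written from the native side (the path is only an example):

#include <stdio.h>

// Append each incoming chunk to a raw .h264 file so the exact byte stream
// can be replayed later with: ffplay dump.h264
static FILE* s_dump = NULL;

static void DumpChunk(const uint8_t* buf, int size) {
    if (!s_dump)
        s_dump = fopen("/sdcard/dump.h264", "wb");
    if (s_dump) {
        fwrite(buf, 1, (size_t)size, s_dump);
        fflush(s_dump);
    }
}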

  • This is supposed to work, the ffmpeg test suite has H264 files with resolution changes, and they decode fine, even with either type of threading enabled. The ticket you pointed to is also closed as "fixed". Which version are you using? Are you using FFmpeg API directly, or a wrapper? Can you show more code? Do you get any console messages? Does it work if you dump the NAL data to a file and play it using ffplay? – Ronald S. Bultje Oct 06 '16 at 20:28
  • @RonaldS.Bultje Thank you for looking into this. I cannot determine the version right now (unless there is a way to check it from code) - it was built on a different machine a year ago or so. In my native classes I'm using the ffmpeg library functions directly, without wrappers. There are no console messages that could point to the problem. I will dump the NALs to a file and check if ffplay can open it. Thanks for confirming that it actually should work, knowing that for sure helps a lot. When I'm sure the stream is ok I will move to building the latest ffmpeg. – Krzysztof Kansy Oct 07 '16 at 09:33
  • @RonaldS.Bultje I've updated the question with information recovered from dumped stream. – Krzysztof Kansy Oct 07 '16 at 11:59
  • So if ffplay manages to continue through some of the resize, that indicates to me your application (rather than ffmpeg as a set of libraries) is failing to deal with the resize, no? I don't immediately have any ideas on what would cause that, but posting a full set of code that we can run might help. – Ronald S. Bultje Oct 07 '16 at 12:42
  • @RonaldS.Bultje I didn't mention it, but the ffplay I used to play the stream dump is from the latest ffmpeg. The application uses some older version. Since the libraries differ, there is also a possibility that something is not 100% ok with my ffmpeg library, am I right? Unfortunately I'm not able to post the complete code now, I'll see about that next week. – Krzysztof Kansy Oct 07 '16 at 15:46
  • Yes that is also possible, indeed. – Ronald S. Bultje Oct 07 '16 at 19:10

0 Answers