
I'm writing my own iOS video player and I'm facing the following problem: decoding of frames is too slow. The player works perfectly fine at 30/60 fps, but 120 fps videos play in slow motion.

After some research, I established that the main problem is the decoding of the file. It's simply too slow.

Right now I have 3 threads:

1) reading packets from the file
2) avcodec_send_packet, avcodec_receive_frame and generating the raw picture data
3) video displaying

The problem lies in thread 2, in avcodec_send_packet and avcodec_receive_frame.

I removed "generating raw data of picture from AVFrame" from the 2 thread just to measure execution time of avcodec_send_packet/avcodec_receive_frame, and loop with those functions barely make 120 fps. And when I say "barely" I mean that in first secs of video will be lags.

Here's the log of the decoding thread:

2017-10-08 19:31:55.476538 VULCAM[910:85974] SEC! 106
2017-10-08 19:31:56.484544 VULCAM[910:85974] SEC! 110
2017-10-08 19:31:57.486751 VULCAM[910:85974] SEC! 119
2017-10-08 19:31:58.491270 VULCAM[910:85974] SEC! 122
2017-10-08 19:31:59.500061 VULCAM[910:85974] SEC! 119
2017-10-08 19:32:00.507599 VULCAM[910:85974] SEC! 126
2017-10-08 19:32:01.513110 VULCAM[910:85974] SEC! 126
2017-10-08 19:32:02.517102 VULCAM[910:85974] SEC! 125
2017-10-08 19:32:03.518195 VULCAM[910:85974] SEC! 121
2017-10-08 19:32:04.528745 VULCAM[910:85974] SEC! 120
2017-10-08 19:32:05.529884 VULCAM[910:85974] SEC! 122
2017-10-08 19:32:06.540589 VULCAM[910:85974] SEC! 123

I'm pretty sure there has to be some good trick to use FFmpeg more efficiently and reduce the execution time. Here are small code snippets, which should be enough to understand the whole idea.

Here's the method that launches the video decoding:

- (void)play:(PlayerState)state
{
    // Kick off the three loops: demuxing, decoding and displaying.
    [self decodePackets_loop];
    [self decodeFrames_loop];
    [self video_refresh];
}

Here's decodePackets_loop:

- (void)decodePackets_loop
{
    dispatch_queue_t pktsQueue = dispatch_queue_create("decodePacketsQueue", DISPATCH_QUEUE_SERIAL);
    dispatch_async(pktsQueue, ^{

        while (_state == PlayerPlayForward) {

            AVPacket packet;
            int ret = av_read_frame(_pFormatCtx, &packet);
            if (ret < 0) {

                if (ret == AVERROR_EOF) {
                    // Whole file has been read and queued.
                    _isCached = YES;
                    break;
                }
                else {
                    NSLog(@"av_read_frame error!");
                    continue;
                }
            }

            // Queue holds demuxed packets; the decoding thread consumes them.
            [_picturesQueue put:&packet];
        }
        NSLog(@"DONE!");
    });
}

Here's decodeFrames_loop:

- (void)decodeFrames_loop
{
    dispatch_queue_t fmsQueue = dispatch_queue_create("decodeFramesQueue", DISPATCH_QUEUE_SERIAL);
    dispatch_async(fmsQueue, ^{

        AVFrame *pFrame = av_frame_alloc();
        if (pFrame == NULL) {
            NSLog(@"Couldn't init pFrame!");
            return;
        }
        double pts = 0.0;

        NSTimeInterval currTime = [NSDate timeIntervalSinceReferenceDate];
        float sec = 0.0f;
        int frames_cnt = 0;

        while (_state == PlayerPlayForward) {

            AVPacket pkt;

            if ([_picturesQueue get:&pkt]) {

                if (avcodec_send_packet(_pCodecCtx, &pkt) != 0) {
                    NSLog(@"Couldn't send packet");
                    av_packet_unref(&pkt); // don't leak the packet on failure
                    continue;
                }

                // Drain every frame the decoder has ready: one packet may
                // produce zero or more frames, so loop until EAGAIN.
                while (avcodec_receive_frame(_pCodecCtx, pFrame) == 0) {

                    if (pkt.dts != AV_NOPTS_VALUE)
                    {
                        pts = av_frame_get_best_effort_timestamp(pFrame);
                    }
                    else
                    {
                        pts = 0.0f;
                    }
                    pts *= av_q2d(_activeStream->time_base);

                    [self synchronizeVideo:pFrame pts:pts];
                    //VideoPicture *vid_pic = [self generateVideoPictureFromFrame:pFrame withPts:pts withDts:pFrame->pkt_dts];
                    /*if (vid_pic) {
                        [_picturesPool addPicture:vid_pic];
                    }*/
                    //av_free(vid_pic.data);

                    // Count decoded frames per second for the log above.
                    frames_cnt++;
                    sec = [NSDate timeIntervalSinceReferenceDate] - currTime;
                    if (sec >= 1.0f) {
                        NSLog(@"SEC! %d", frames_cnt);
                        sec = 0.0f;
                        frames_cnt = 0;
                        currTime = [NSDate timeIntervalSinceReferenceDate];
                    }
                }

                av_packet_unref(&pkt);
            }
        }

        av_frame_free(&pFrame);
    });
}
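
For reference, here's the direction I'm considering for the decoder setup itself: libavcodec has built-in frame/slice threading that is independent of my GCD threads above. A minimal sketch of enabling it, assuming _pCodec is the AVCodec I open the context with (I haven't verified this alone fixes the 120 fps case):

// Must be set before avcodec_open2. 0 lets libavcodec pick a
// thread count based on the number of CPU cores.
_pCodecCtx->thread_count = 0;
// Frame threading decodes several frames in parallel; slice threading
// parallelizes within one frame. The codec uses what it supports.
_pCodecCtx->thread_type = FF_THREAD_FRAME | FF_THREAD_SLICE;

if (avcodec_open2(_pCodecCtx, _pCodec, NULL) < 0) {
    NSLog(@"Couldn't open codec!");
}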
  • You should use a cache or something in such a case... – Flash Thunder Oct 08 '17 at 16:59
  • Thank you for the answer, but I'm pretty sure that's not the case. The iOS native player can easily play 120 fps without any delays or the extra memory consumption that would be a clear sign of some caching technique. – Eugene Alexeev Oct 08 '17 at 17:01
  • Are you sure about "without any delays"? Copy a new vid to the device and try to play it for the first time... the delay before starting would be 2-5 seconds. – Flash Thunder Oct 08 '17 at 17:10
  • The iOS native player must use some kind of hardware decoder. Since you didn't post which codec you're using, I assume it's a normal software one. Another hint: the reason why ffmpeg splits the codec API into 2 functions, avcodec_send/receive_xxx, is to make them run in 2 separate threads. – halfelf Oct 09 '17 at 03:14
  • Thank you halfelf! By the way, that was a good point about the codec. "avcodec_send/receive_xxx, is to make them run in 2 separate threads" - I just separated this into two threads, but it didn't make any difference. But I think I should consider the codec thing. – Eugene Alexeev Oct 09 '17 at 19:02
  • Have a look at this problem and its answer, it may help: https://stackoverflow.com/a/69025802/2536681 – imikbox Oct 07 '21 at 09:09
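
Following up on halfelf's comment about hardware decoding: a minimal sketch of handing decoding over to VideoToolbox through FFmpeg's generic hwdevice API (this assumes an FFmpeg build with VideoToolbox support, it has to happen before avcodec_open2, and get_vt_format is a hypothetical helper name):

static enum AVPixelFormat get_vt_format(AVCodecContext *ctx,
                                        const enum AVPixelFormat *fmts)
{
    // Pick the VideoToolbox hardware format if the decoder offers it.
    for (const enum AVPixelFormat *p = fmts; *p != AV_PIX_FMT_NONE; p++) {
        if (*p == AV_PIX_FMT_VIDEOTOOLBOX)
            return *p;
    }
    return fmts[0]; // fall back to the first (software) format
}

AVBufferRef *hwDeviceCtx = NULL;
if (av_hwdevice_ctx_create(&hwDeviceCtx, AV_HWDEVICE_TYPE_VIDEOTOOLBOX,
                           NULL, NULL, 0) < 0) {
    NSLog(@"Couldn't create VideoToolbox device context");
}
else {
    _pCodecCtx->hw_device_ctx = av_buffer_ref(hwDeviceCtx);
    _pCodecCtx->get_format = get_vt_format;
}

With this set up, decoded frames come back with pix_fmt AV_PIX_FMT_VIDEOTOOLBOX and the CVPixelBufferRef in frame->data[3], so the display thread could render them without a software copy.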

0 Answers