I'm writing my own iOS video player and I'm facing the following problem: decoding of frames is too slow. The player works perfectly fine with 30/60 fps, but 120 fps videos play in slow motion.
After some research, I established that the main problem is decoding of the file. It's simply too slow.
Right now I have 3 threads: 1) reading packets from the file, 2) avcodec_send_packet/avcodec_receive_frame and generating the raw picture data, 3) video displaying.
The problem lies in thread 2, with avcodec_send_packet and avcodec_receive_frame.
I removed the "generate raw picture data from the AVFrame" step from thread 2 just to measure the execution time of avcodec_send_packet/avcodec_receive_frame alone, and even the loop with just those two calls barely makes 120 fps. And when I say "barely" I mean that there are lags in the first seconds of the video.
Here's the log of the decoding thread:
2017-10-08 19:31:55.476538 VULCAM[910:85974] SEC! 106
2017-10-08 19:31:56.484544 VULCAM[910:85974] SEC! 110
2017-10-08 19:31:57.486751 VULCAM[910:85974] SEC! 119
2017-10-08 19:31:58.491270 VULCAM[910:85974] SEC! 122
2017-10-08 19:31:59.500061 VULCAM[910:85974] SEC! 119
2017-10-08 19:32:00.507599 VULCAM[910:85974] SEC! 126
2017-10-08 19:32:01.513110 VULCAM[910:85974] SEC! 126
2017-10-08 19:32:02.517102 VULCAM[910:85974] SEC! 125
2017-10-08 19:32:03.518195 VULCAM[910:85974] SEC! 121
2017-10-08 19:32:04.528745 VULCAM[910:85974] SEC! 120
2017-10-08 19:32:05.529884 VULCAM[910:85974] SEC! 122
2017-10-08 19:32:06.540589 VULCAM[910:85974] SEC! 123
I'm pretty sure there has to be some good trick to use FFmpeg more efficiently and reduce the execution time. Here are small code snippets, which should be enough to understand the whole idea.
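For context, the decoder is opened with the usual avcodec_parameters_to_context/avcodec_open2 flow. Here's a simplified sketch (not my exact setup code; the two threading fields are options I know FFmpeg exposes but haven't experimented with yet, so treat them as an assumption):

- (void)openCodec
{
    AVCodec *codec = avcodec_find_decoder(_activeStream->codecpar->codec_id);
    _pCodecCtx = avcodec_alloc_context3(codec);
    avcodec_parameters_to_context(_pCodecCtx, _activeStream->codecpar);

    // Threading options (assumption: NOT set in my current code).
    // thread_count = 0 lets FFmpeg pick a count based on the CPU cores.
    _pCodecCtx->thread_count = 0;
    _pCodecCtx->thread_type = FF_THREAD_FRAME | FF_THREAD_SLICE;

    if (avcodec_open2(_pCodecCtx, codec, NULL) < 0) {
        NSLog(@"Couldn't open codec!");
    }
}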
Here's the method that launches video decoding:
- (void)play:(PlayerState)state
{
    [self decodePackets_loop];
    [self decodeFrames_loop];
    [self video_refresh];
}
Here's decodePackets_loop:
- (void)decodePackets_loop
{
    dispatch_queue_t pktsQueue = dispatch_queue_create("decodePacketsQueue", DISPATCH_QUEUE_SERIAL);
    dispatch_async(pktsQueue, ^{
        while (_state == PlayerPlayForward) {
            AVPacket packet;
            int ret = av_read_frame(_pFormatCtx, &packet);
            if (ret < 0) {
                if (ret == AVERROR_EOF) {
                    _isCached = YES;
                    break;
                }
                else {
                    NSLog(@"av_read_frame error!");
                    continue;
                }
            }
            // Despite its name, _picturesQueue is a blocking FIFO of packets;
            // put: takes ownership of the packet.
            [_picturesQueue put:&packet];
        }
        NSLog(@"DONE!");
    });
}
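The queue shared by the two loops is a simple blocking FIFO. Here's a hypothetical, simplified sketch of its put:/get: (the ivars _spaceSem, _fillSem and _packets are illustrative names, not my real class), mainly to show the ownership semantics:

// put: moves the packet into a queue-owned copy, so the caller's struct
// is left blank and nothing gets freed twice.
- (void)put:(AVPacket *)pkt
{
    AVPacket *copy = av_packet_alloc();
    av_packet_move_ref(copy, pkt); // O(1) ownership transfer, no data copy
    dispatch_semaphore_wait(_spaceSem, DISPATCH_TIME_FOREVER); // wait for room
    @synchronized (self) {
        [_packets addObject:[NSValue valueWithPointer:copy]];
    }
    dispatch_semaphore_signal(_fillSem);
}

- (BOOL)get:(AVPacket *)pkt
{
    dispatch_semaphore_wait(_fillSem, DISPATCH_TIME_FOREVER); // wait for a packet
    AVPacket *stored;
    @synchronized (self) {
        stored = (AVPacket *)[(NSValue *)_packets.firstObject pointerValue];
        [_packets removeObjectAtIndex:0];
    }
    dispatch_semaphore_signal(_spaceSem);
    av_packet_move_ref(pkt, stored); // hand ownership back to the caller
    av_packet_free(&stored);
    return YES;
}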
Here's decodeFrames_loop:
- (void)decodeFrames_loop
{
    dispatch_queue_t fmsQueue = dispatch_queue_create("decodeFramesQueue", DISPATCH_QUEUE_SERIAL);
    dispatch_async(fmsQueue, ^{
        AVFrame *pFrame = av_frame_alloc();
        if (pFrame == NULL) {
            NSLog(@"Couldn't init pFrame!");
            return;
        }
        double pts = 0.0;
        NSTimeInterval currTime = [NSDate timeIntervalSinceReferenceDate];
        NSTimeInterval sec = 0.0;
        int frames_cnt = 0;
        while (_state == PlayerPlayForward) {
            AVPacket pkt;
            if ([_picturesQueue get:&pkt]) {
                if (avcodec_send_packet(_pCodecCtx, &pkt) != 0) {
                    NSLog(@"Couldn't send packet");
                    av_packet_unref(&pkt); // don't leak the packet on failure
                    continue;
                }
                int ret = avcodec_receive_frame(_pCodecCtx, pFrame);
                if (ret != 0) {
                    // AVERROR(EAGAIN) just means the decoder needs more input
                    if (ret != AVERROR(EAGAIN)) {
                        NSLog(@"Couldn't receive frame");
                    }
                    av_packet_unref(&pkt);
                    continue;
                }
                if (pkt.dts != AV_NOPTS_VALUE) {
                    pts = av_frame_get_best_effort_timestamp(pFrame);
                }
                else {
                    pts = 0.0;
                }
                pts *= av_q2d(_activeStream->time_base);
                av_packet_unref(&pkt);
                [self synchronizeVideo:pFrame pts:pts];
                //VideoPicture *vid_pic = [self generateVideoPictureFromFrame:pFrame withPts:pts withDts:pFrame->pkt_dts];
                /*if (vid_pic) {
                    [_picturesPool addPicture:vid_pic];
                }*/
                //av_free(vid_pic.data);

                // Count decoded frames per wall-clock second (this produces
                // the "SEC!" log above).
                frames_cnt++;
                sec = [NSDate timeIntervalSinceReferenceDate] - currTime;
                if (sec >= 1.0) {
                    NSLog(@"SEC! %d", frames_cnt);
                    sec = 0.0;
                    frames_cnt = 0;
                    currTime = [NSDate timeIntervalSinceReferenceDate];
                }
            }
        }
        av_frame_free(&pFrame);
    });
}
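For what it's worth, I know that with the send/receive API a single packet can produce zero or several frames, so avcodec_receive_frame is supposed to be drained in a loop until AVERROR(EAGAIN). A minimal sketch of that pattern, adapted from the FFmpeg documentation (not yet wired into my player):

// Canonical drain pattern: one send, then receive until the decoder
// asks for more input (EAGAIN) or is fully flushed (EOF).
int ret = avcodec_send_packet(_pCodecCtx, &pkt);
av_packet_unref(&pkt); // the decoder keeps its own reference
if (ret < 0) {
    NSLog(@"send_packet error: %d", ret);
}
while (ret >= 0) {
    ret = avcodec_receive_frame(_pCodecCtx, pFrame);
    if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) {
        break; // need more input, or the stream is done
    }
    if (ret < 0) {
        NSLog(@"receive_frame error: %d", ret);
        break;
    }
    // ... timestamp handling and synchronizeVideo: as above ...
}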