I have MPEG-TS files on the device. I would like to cut a fairly exact amount of time off the start of the files, on-device. I'm hoping to achieve this using FFmpegWrapper as a base.
I'm a little lost in ffmpeg's C API, however. Where do I start?
I tried simply dropping all packets before the start PTS I was looking for, but that broke the video stream.
    // Rescale timestamps from the input stream's time base to the output's
    packet->pts = av_rescale_q(packet->pts, inputStream.stream->time_base, outputStream.stream->time_base);
    packet->dts = av_rescale_q(packet->dts, inputStream.stream->time_base, outputStream.stream->time_base);

    // Remember the PTS of the first packet as the file's start offset
    if (startPts == 0) {
        startPts = packet->pts;
    }

    // Drop every packet that falls before the requested cut point
    if (packet->pts < cutTimeStartPts + startPts) {
        av_free_packet(packet);
        continue;
    }
How do I cut off part of the start of the input file without destroying the video stream? When two cut segments are played back to back, I want them to run together seamlessly.
ffmpeg -i time.ts -c:v libx264 -c:a copy -ss $CUT_POINT -map 0 -y after.ts
ffmpeg -i time.ts -c:v libx264 -c:a copy -to $CUT_POINT -map 0 -y before.ts
These seem to do what I need. I think the re-encode is necessary so the video can start at an arbitrary point rather than at an existing keyframe. If there's a more efficient solution, that's great. If not, this is good enough.
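For reference, here's my understanding of what `-ss` plus re-encoding does in C API terms (a sketch only; `formatContext`, `videoStreamIndex`, and `cutPts` are hypothetical names, and `cutPts` is assumed to be in the stream's time base):

    // Seek to the nearest keyframe at or before the cut point...
    av_seek_frame(formatContext, videoStreamIndex, cutPts, AVSEEK_FLAG_BACKWARD);

    // ...then, inside the read/decode loop, decode everything but only
    // encode/write frames that land at or after the cut point. Frames
    // decoded before cutPts exist only to prime the decoder.
    if (frameFinished && frame->pkt_pts >= cutPts) {
        // encode and write this frame
    }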
EDIT: Here's my attempt. I'm cobbling together various pieces I don't fully understand, copied from here. I'm leaving the "cutting" piece out for now, to try to get audio + video decoded, encoded, and written without layering on complexity. I get EXC_BAD_ACCESS in avcodec_encode_video2(...):
    - (void)convertInputPath:(NSString *)inputPath
                  outputPath:(NSString *)outputPath
                     options:(NSDictionary *)options
               progressBlock:(FFmpegWrapperProgressBlock)progressBlock
             completionBlock:(FFmpegWrapperCompletionBlock)completionBlock {
        dispatch_async(conversionQueue, ^{
            FFInputFile *inputFile = nil;
            FFOutputFile *outputFile = nil;
            NSError *error = nil;

            inputFile = [[FFInputFile alloc] initWithPath:inputPath options:options];
            outputFile = [[FFOutputFile alloc] initWithPath:outputPath options:options];

            [self setupDirectStreamCopyFromInputFile:inputFile outputFile:outputFile];
            if (![outputFile openFileForWritingWithError:&error]) {
                [self finishWithSuccess:NO error:error completionBlock:completionBlock];
                return;
            }
            if (![outputFile writeHeaderWithError:&error]) {
                [self finishWithSuccess:NO error:error completionBlock:completionBlock];
                return;
            }

            AVRational default_timebase; // currently unused
            default_timebase.num = 1;
            default_timebase.den = AV_TIME_BASE;

            FFStream *outputVideoStream = outputFile.streams[0];
            FFStream *inputVideoStream = inputFile.streams[0];

            AVFrame *frame = avcodec_alloc_frame();
            AVPacket inPacket, outPacket;
            av_init_packet(&inPacket);

            while (av_read_frame(inputFile.formatContext, &inPacket) >= 0) {
                // Only the video stream is handled for now; audio packets
                // are read and discarded.
                if (inPacket.stream_index == 0) {
                    int frameFinished;
                    avcodec_decode_video2(inputVideoStream.stream->codec, frame, &frameFinished, &inPacket);
                    // if (frameFinished && frame->pkt_pts >= starttime_int64 && frame->pkt_pts <= endtime_int64) {
                    if (frameFinished) {
                        av_init_packet(&outPacket);
                        outPacket.data = NULL; // let the encoder allocate the output buffer
                        outPacket.size = 0;
                        int output;
                        // EXC_BAD_ACCESS is raised here:
                        avcodec_encode_video2(outputVideoStream.stream->codec, &outPacket, frame, &output);
                        if (output) {
                            if (av_write_frame(outputFile.formatContext, &outPacket) != 0) {
                                fprintf(stderr, "convert(): error while writing video frame\n");
                                [self finishWithSuccess:NO error:nil completionBlock:completionBlock];
                            }
                        }
                        av_free_packet(&outPacket);
                    }
                    // endtime_int64 is set elsewhere; the cut condition above
                    // is commented out while I get plain encoding working.
                    if (frame->pkt_pts > endtime_int64) {
                        av_free_packet(&inPacket);
                        break;
                    }
                }
                av_free_packet(&inPacket); // free each input packet, not just the last one
            }
            avcodec_free_frame(&frame);

            if (![outputFile writeTrailerWithError:&error]) {
                [self finishWithSuccess:NO error:error completionBlock:completionBlock];
                return;
            }
            [self finishWithSuccess:YES error:nil completionBlock:completionBlock];
        });
    }
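My current guess at the crash: setupDirectStreamCopyFromInputFile: presumably configures the output stream for stream copy, so its codec context was never opened with an actual encoder, and avcodec_encode_video2() dereferences an unopened context. If that's right, something like this sketch (assuming the FFmpeg 1.x/2.x-era API the rest of the code uses, and an H.264 output to match the command-line version) would need to run before the read/encode loop:

    // Sketch: open a real encoder on the output stream's codec context
    // before calling avcodec_encode_video2(). Names mirror the method above.
    AVCodecContext *encCtx = outputVideoStream.stream->codec;
    AVCodecContext *decCtx = inputVideoStream.stream->codec;
    AVCodec *encoder = avcodec_find_encoder(AV_CODEC_ID_H264);

    encCtx->width = decCtx->width;
    encCtx->height = decCtx->height;
    encCtx->pix_fmt = decCtx->pix_fmt;
    encCtx->time_base = decCtx->time_base;
    // Some muxers (MPEG-TS included) ask for global headers
    if (outputFile.formatContext->oformat->flags & AVFMT_GLOBALHEADER) {
        encCtx->flags |= CODEC_FLAG_GLOBAL_HEADER;
    }
    if (avcodec_open2(encCtx, encoder, NULL) < 0) {
        // bail out: no encoder available or bad parameters
    }

The decoder side has the same requirement: avcodec_decode_video2() only works if the input codec context was opened with avcodec_open2(), which I'm assuming FFmpegWrapper does somewhere.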