
I have a problem with a video whose audio stream is longer than its video stream, according to the ffmpeg command line:

ffmpeg -i input.mp4 -vcodec copy -an -f null -   <--- video stream duration
ffmpeg -i input.mp4 -acodec copy -vn -f null -   <--- audio stream duration

For example, the output of the first command includes a stream duration of 3.64 seconds, and the second reports 3.80 seconds.

I need to retrieve the video stream duration in Swift. I tried this:

// `asset` is an AVAsset created from the input file (input.mp4)
guard let videoAssetTrack = asset.tracks(withMediaType: AVMediaType.video).first else {
    return
}

let duration: CMTime = videoAssetTrack.timeRange.duration
let durationTime = CMTimeGetSeconds(duration)

It gives me 3.80, not the 3.64 I was expecting.
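
For what it's worth, here is a minimal sketch of the comparison I have in mind (the file path is hypothetical, and it uses the same synchronous AVAsset APIs as above); it prints the video track, audio track, and overall asset durations side by side:

import AVFoundation

// Hypothetical path; replace with the actual location of input.mp4.
let inputURL = URL(fileURLWithPath: "/path/to/input.mp4")
let asset = AVAsset(url: inputURL)

// Duration reported by the video track's time range.
if let videoTrack = asset.tracks(withMediaType: .video).first {
    print("video track duration:", CMTimeGetSeconds(videoTrack.timeRange.duration))
}

// Duration reported by the audio track's time range.
if let audioTrack = asset.tracks(withMediaType: .audio).first {
    print("audio track duration:", CMTimeGetSeconds(audioTrack.timeRange.duration))
}

// Overall asset duration, for comparison.
print("asset duration:", CMTimeGetSeconds(asset.duration))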

Thank you for your attention, I hope you can help me with this case.

Jordane H
  • Why don't you [use `ffprobe` to get the duration](https://stackoverflow.com/a/22243834/1109017)? – llogan Mar 14 '19 at 17:17
  • Thank you @llogan, this is a nice ffmpeg command. What is the best way to execute this command in Swift? I read about adding a wrapper or doing something like [this](https://stackoverflow.com/questions/6854190/ffmpeg-integration-on-iphone-ipad-project/15429359#15429359) (see also the sketch after these comments). – Jordane H Mar 15 '19 at 09:13
  • Sorry, but I don't know. I've never used Swift. – llogan Mar 15 '19 at 17:22
  • Well, think of it like this - the combined overall time length would be the maximum time (I assume) of any asset, i.e. the 3.8. I've never dealt with a stream/video that had different video and audio lengths; tbh, I find that quite odd haha – impression7vx Aug 03 '19 at 15:03
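
As a follow-up to my comment about running ffprobe from Swift, here is a rough sketch of what I had in mind. It assumes macOS (Foundation's Process is not available on iOS) and an installed ffprobe; the binary path and the helper name are placeholders, not tested code:

import Foundation

// Rough sketch: assumes macOS and an installed ffprobe binary.
// The path below is a guess; adjust it to wherever ffprobe lives.
func videoStreamDuration(of path: String) -> Double? {
    let process = Process()
    process.launchPath = "/usr/local/bin/ffprobe"
    process.arguments = [
        "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=duration",
        "-of", "default=noprint_wrappers=1:nokey=1",
        path
    ]

    let pipe = Pipe()
    process.standardOutput = pipe
    process.launch()
    process.waitUntilExit()

    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    guard let output = String(data: data, encoding: .utf8) else { return nil }
    return Double(output.trimmingCharacters(in: .whitespacesAndNewlines))
}

// Usage (hypothetical path):
// let duration = videoStreamDuration(of: "/path/to/input.mp4")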

0 Answers