
I have the following code, which works on iOS 6 and 7.x.

In iOS 8.1 I have a strange issue: if you capture a session for about 13 seconds or longer, the resulting AVAsset has only one track (video); the audio track is just not there.

If you record for a shorter period the AVAsset has 2 tracks (video and audio) as expected. I have plenty of disk space, the app has permission to use camera and microphone.

I created a new project with minimal code, and it reproduces the issue.

Any ideas would be greatly appreciated.

#import "ViewController.h"
@import AVFoundation;

@interface ViewController () <AVCaptureFileOutputRecordingDelegate>

@end

@implementation ViewController
{
    enum RecordingState { Recording, Stopped };
    enum RecordingState recordingState;

    AVCaptureSession *session;
    AVCaptureMovieFileOutput *output;
    AVPlayer *player;
    AVPlayerLayer *playerLayer;
    bool audioGranted;
}

- (void)viewDidLoad {
    [super viewDidLoad];

    [self setupAV];
    recordingState = Stopped;
}

-(void)setupAV
{
    session = [[AVCaptureSession alloc] init];
    [session beginConfiguration];
    AVCaptureDevice *videoDevice = nil;

    for ( AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] ) {
        if ( device.position == AVCaptureDevicePositionBack ) {
            videoDevice = device;
            break;
        }
    }
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    if (videoDevice && audioDevice)
    {
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
        [session addInput:input];

        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
        [session addInput:audioInput];

        NSURL *recordURL = [self tempUrlForRecording];
        [[NSFileManager defaultManager] removeItemAtURL:recordURL error:nil];

        output = [[AVCaptureMovieFileOutput alloc] init];
        output.maxRecordedDuration = CMTimeMake(45, 1);
        output.maxRecordedFileSize = 1028 * 1028 * 1000;
        [session addOutput:output];
    }
    [session commitConfiguration];
}

- (IBAction)recordingButtonClicked:(id)sender {
    if(recordingState == Stopped)
    {
        [self startRecording];
    }
    else
    {
        [self stopRecording];
    }
}

-(void)startRecording
{
    recordingState = Recording;
    [session startRunning];
    [output startRecordingToOutputFileURL:[self tempUrlForRecording] recordingDelegate:self];

}

-(void)stopRecording
{
    recordingState = Stopped;
    [output stopRecording];
    [session stopRunning];
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    AVAsset *cameraInput = [AVAsset assetWithURL:[self tempUrlForRecording]];
    //DEPENDING ON HOW LONG RECORDED THIS DIFFERS (<14 SECS - 2 Tracks, >14 SECS - 1 Track)
    NSLog(@"Number of tracks: %lu", (unsigned long)cameraInput.tracks.count);
}

-(NSURL *)tempUrlForRecording
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [paths objectAtIndex:0];

    NSString *path = @"camerabuffer.mp4";
    NSString *pathCameraInput =[documentsDirectoryPath stringByAppendingPathComponent: path];
    NSURL *urlCameraInput = [NSURL fileURLWithPath:pathCameraInput];

    return urlCameraInput;
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end
Si-N
  • I should also mention that no errors are reported (nil) in didFinishRecordingToOutputFileAtURL – Si-N Nov 04 '14 at 15:55
  • OK, setting the fragmentInterval to be greater than the recording length fixes it, but I'm sure I shouldn't need this: `CMTime fragmentInterval = CMTimeMake(5,1); [movieOutput setMovieFragmentInterval:fragmentInterval];` – Si-N Nov 04 '14 at 16:04
  • What happens if you don't use the `maxRecordedDuration` and stop the recording manually after 45 seconds? – Ja͢ck Nov 11 '14 at 08:44
  • I have the same issue. I found out that if you transcode the stream with `ffmpeg`, explicitly setting the volume (i.e. `ffmpeg -i movie.mp4 -vol 256 movie2.mp4`) you get sound back. – mvds Apr 17 '15 at 10:30

2 Answers

Setting the movie fragment interval to `kCMTimeInvalid` will fix it:

[movieOutput setMovieFragmentInterval:kCMTimeInvalid];
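
In the question's Objective-C code, this line belongs in `setupAV`, where the movie file output is configured (a sketch reusing the question's variable names, where the output is called `output`):

```objc
output = [[AVCaptureMovieFileOutput alloc] init];
output.maxRecordedDuration = CMTimeMake(45, 1);
output.maxRecordedFileSize = 1028 * 1028 * 1000;

// Disable periodic movie fragment writing; with the default 10-second
// fragment interval, iOS 8 appears to lose the audio track on
// recordings longer than one fragment.
output.movieFragmentInterval = kCMTimeInvalid;

[session addOutput:output];
```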

I think this is a bug. The documentation says the sample table is not written if the recording does not complete successfully, which implies it should be written automatically when the recording does complete successfully. But now it seems like it doesn't get written.

Any ideas?

Hugo
  • Wow. That worked -- I've been pulling my hair out on this bug. For others' reference, I didn't have a max duration or size. – Aaron Zinman Mar 27 '15 at 20:41
  • I have a max size *and* duration. I was previously using `movieOutput.movieFragmentInterval = CMTime(value: 2, timescale: 1)` and was occasionally getting audio but no video. Setting to kCMTimeInvalid solved 40% of my issues. My videos are 8 seconds long so fragments aren't needed. – atlex2 Feb 13 '16 at 01:14
  • Hi @atlex2 . Could you provide me a demo? I did not encounter this issue. – Hugo Feb 15 '16 at 06:42
  • @dusty I'd be happy to share a sample off-line. This bug also part of my trouble. http://stackoverflow.com/a/10962216/1433553 – atlex2 Feb 15 '16 at 19:15
  • `self.movieOutput!.movieFragmentInterval = kCMTimeInvalid` (in swift) – drpawelo Nov 04 '16 at 13:49

I had this issue, and the way to fix it in Swift 4 is the following:

  • Do not set movieFileOutput.maxRecordedDuration. There seems to be a bug where, if you set it, recordings longer than about 12-13 seconds have no audio.

  • Instead, use a timer to stop the recording, and set movieFragmentInterval like this:

movieFileOutput.movieFragmentInterval = CMTime.invalid

Here is a whole block of code just to show you how I did it:

var seconds = 20
var timer = Timer()
var movieFileOutput = AVCaptureMovieFileOutput()

func startRecording(){
    movieFileOutput.movieFragmentInterval = CMTime.invalid
    let url = URL(fileURLWithPath: getVideoFileLocation())
    // Remove any leftover file first; AVCaptureMovieFileOutput won't overwrite an existing file.
    try? FileManager.default.removeItem(at: url)
    movieFileOutput.startRecording(to: url, recordingDelegate: self)
    startTimer()
}

func stopRecording(){
    movieFileOutput.stopRecording()
    timer.invalidate()
}

func startTimer(){
    timer = Timer.scheduledTimer(timeInterval: 1, target: self, selector: (#selector(updateTimer)), userInfo: nil, repeats: true)
}

@objc func updateTimer(){
    seconds -= 1
    if(seconds == 0){
        stopRecording()
    }
}

func getVideoFileLocation() -> String {
    return NSTemporaryDirectory().appending("myrecording.mp4")
}


extension FTVideoReviewViewController : AVCaptureFileOutputRecordingDelegate{
    public func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
        print("Finished recording: \(outputFileURL)")
        // do stuff here when recording is finished
    }
}
BrettARose