
Let me see if I understood it correctly.

On the most advanced current hardware, iOS allows me to record at the following frame rates: 30, 60, 120 and 240 fps.

But these frame rates behave differently. If I shoot at 30 or 60 fps, I expect the video files created from those captures to play back at 30 and 60 fps respectively.

But if I shoot at 120 or 240 fps, I expect the video files created from those captures to play back at 30 fps, otherwise I will not see the slow motion.

A few questions:

  1. Am I right?
  2. Is there a way to shoot at 120 or 240 fps and play back at 120 or 240 fps respectively? I mean, play at the frame rate the videos were shot at, without slo-mo?
  3. How do I control that framerate when I write the file?

I am creating the AVAssetWriter input like this...

  NSDictionary *videoCompressionSettings = @{AVVideoCodecKey                  : AVVideoCodecH264,
                                             AVVideoWidthKey                  : @(videoWidth),
                                             AVVideoHeightKey                 : @(videoHeight),
                                             AVVideoCompressionPropertiesKey  : @{ AVVideoAverageBitRateKey      : @(bitsPerSecond),
                                                                                   AVVideoMaxKeyFrameIntervalKey : @(1)}
                                             };

    _assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings];

and there is no apparent way to control that.

NOTE: I have tried different numbers in place of that 1. I have tried 1.0/fps, I have tried fps, and I have removed the key entirely. No difference.

This is how I set up `AVAssetWriter`:

  AVAssetWriter *newAssetWriter = [[AVAssetWriter alloc] initWithURL:_movieURL fileType:AVFileTypeQuickTimeMovie
                                          error:&error];

  _assetWriter = newAssetWriter;
  _assetWriter.shouldOptimizeForNetworkUse = NO;

  CGFloat videoWidth = size.width;
  CGFloat videoHeight  = size.height;

  NSUInteger numPixels = videoWidth * videoHeight;
  NSUInteger bitsPerSecond;

  // Assume that lower-than-SD resolutions are intended for streaming, and use a lower bitrate
  //  if ( numPixels < (640 * 480) )
  //    bitsPerPixel = 4.05; // This bitrate matches the quality produced by AVCaptureSessionPresetMedium or Low.
  //  else
  CGFloat bitsPerPixel = 11.4; // This bitrate matches the quality produced by AVCaptureSessionPresetHigh. (An integer type would silently truncate 11.4 to 11.)

  bitsPerSecond = numPixels * bitsPerPixel;
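  // e.g. for a 720p capture (assumption): 1280 * 720 = 921,600 pixels * 11.4 bits/pixel ≈ 10.5 Mbit/s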

  NSDictionary *videoCompressionSettings = @{AVVideoCodecKey                  : AVVideoCodecH264,
                                             AVVideoWidthKey                  : @(videoWidth),
                                             AVVideoHeightKey                 : @(videoHeight),
                                             AVVideoCompressionPropertiesKey  : @{ AVVideoAverageBitRateKey      : @(bitsPerSecond)}
                                             };

  if (![_assetWriter canApplyOutputSettings:videoCompressionSettings forMediaType:AVMediaTypeVideo]) {
    NSLog(@"Couldn't add asset writer video input.");
    return;
  }

 _assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                              outputSettings:videoCompressionSettings
                                                            sourceFormatHint:formatDescription];
  _assetWriterVideoInput.expectsMediaDataInRealTime = YES;      

  NSDictionary *adaptorDict = @{
                                (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA),
                                (id)kCVPixelBufferWidthKey : @(videoWidth),
                                (id)kCVPixelBufferHeightKey : @(videoHeight)
                                };

  _pixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                         initWithAssetWriterInput:_assetWriterVideoInput
                         sourcePixelBufferAttributes:adaptorDict];


  // Add asset writer input to asset writer
  if (![_assetWriter canAddInput:_assetWriterVideoInput]) {
    return;
  }

  [_assetWriter addInput:_assetWriterVideoInput];

The captureOutput method is very simple. I get the image from the filter and write it to the file using:

if (videoJustStartWriting)
    [_assetWriter startSessionAtSourceTime:presentationTime];

  CVPixelBufferRef renderedOutputPixelBuffer = NULL;
  OSStatus err = CVPixelBufferPoolCreatePixelBuffer(nil,
                                                    _pixelBufferAdaptor.pixelBufferPool,
                                                    &renderedOutputPixelBuffer);

  if (err) return; //          NSLog(@"Cannot obtain a pixel buffer from the buffer pool");

  //_ciContext is a metal context
  [_ciContext render:finalImage
     toCVPixelBuffer:renderedOutputPixelBuffer
              bounds:[finalImage extent]
          colorSpace:_sDeviceRgbColorSpace];

   [self writeVideoPixelBuffer:renderedOutputPixelBuffer
                  withInitialTime:presentationTime];


- (void)writeVideoPixelBuffer:(CVPixelBufferRef)pixelBuffer withInitialTime:(CMTime)presentationTime
{

  if ( _assetWriter.status == AVAssetWriterStatusUnknown ) {
    // If the asset writer status is unknown, implies writing hasn't started yet, hence start writing with start time as the buffer's presentation timestamp
    if ([_assetWriter startWriting]) {
      [_assetWriter startSessionAtSourceTime:presentationTime];
    }
  }

  if ( _assetWriter.status == AVAssetWriterStatusWriting ) {
    // If the asset writer status is writing, append sample buffer to its corresponding asset writer input

      if (_assetWriterVideoInput.readyForMoreMediaData) {
        if (![_pixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime]) {
          NSLog(@"error", [_assetWriter.error localizedFailureReason]);
        }
      }
  }

  if ( _assetWriter.status == AVAssetWriterStatusFailed ) {
    NSLog(@"failed");
  }

}

I set the whole thing up to shoot at 240 fps. These are the presentation times of the frames being appended:

time ======= 113594.311510508
time ======= 113594.324011508
time ======= 113594.328178716
time ======= 113594.340679424
time ======= 113594.344846383

If you do the math on the differences between them, you will see that the frame rate is about 240 fps. So the frames are being stored with the correct times.

But when I watch the video, the movement is not in slow motion, and QuickTime says the video is 30 fps.

Note: this app grabs frames from the camera, the frames go into CIFilters, and the result of those filters is converted back to a sample buffer that is written to the file and displayed on screen.
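
For reference, the conversion is roughly this (a sketch; `filter` stands in for whatever CIFilter chain is applied):

    CVPixelBufferRef cameraPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *inputImage = [CIImage imageWithCVPixelBuffer:cameraPixelBuffer];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    CIImage *finalImage = filter.outputImage; // later rendered into renderedOutputPixelBuffer as shown above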

Duck
  • How are you calculating presentation time stamp? Is readyForMoreMediaData ever false? What is your bitrate? – Rhythmic Fistman Apr 13 '17 at 09:30
  • I am not calculating the presentation timestamp; it comes from the live video, from the sampleBuffer. `readyForMoreMediaData` on _assetWriterVideoInput is true while it is writing. Why? – Duck Apr 13 '17 at 09:53
  • I have added, at the bottom of the question, the presentation times I am getting when I shoot at 240 fps. These times are the ones that are being used to append the frames coming from the camera. As you can see, frames have the proper times for 240 fps. – Duck Apr 13 '17 at 10:05
  • Try simplifying your code by removing the CoreImage stuff and the `AVAssetWriterInputPixelBufferAdaptor`. Do you get the expected frame rate? If you do, look closer at them, if not you've simplified the problem. – Rhythmic Fistman Apr 13 '17 at 10:39
  • As far as I know I need it. The sample buffer is converted to a CIImage to serve as an input to a CIFilter. The output from the filter is a CIImage and I need core image to convert it back to a sample buffer so I can write it to file... I don't know any other way to do that. Sorry, I failed to mention that but now the question has this info. – Duck Apr 13 '17 at 10:45
  • So try skipping that - write the sample buffer straight to the `_assetWriterVideoInput` with `appendSampleBuffer`. Both success and failure will tell you something about your problem. It's win-win. – Rhythmic Fistman Apr 13 '17 at 10:51

2 Answers

5

I'm reaching here, but I think this is where you're going wrong. Think of your video capture as a pipeline.

(1) Capture buffer -> (2) Do Something With buffer -> (3) Write buffer as frames in video.

Sounds like you've successfully completed (1) and (2): you're getting the buffers fast enough and you're processing them so you can vend them as frames.

The problem is almost certainly in (3) writing the video frames.

https://developer.apple.com/reference/avfoundation/avmutablevideocomposition

Check out the frameDuration setting on your AVMutableVideoComposition; you'll need something like CMTime(1, 60) // 60 fps or CMTime(1, 240) // 240 fps to get what you're after (telling the video to WRITE this many frames and encode at this rate).
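
In Objective-C that looks roughly like this (a sketch; it only applies if you export or play through a video composition):

    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, 240); // one frame every 1/240 s, i.e. 240 fps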

Using AVAssetWriter it's exactly the same principle, but you set the frame rate as a property in the AVAssetWriterInput outputSettings by adding AVVideoExpectedSourceFrameRateKey.

NSDictionary *videoCompressionSettings = @{AVVideoCodecKey                    : AVVideoCodecH264,
                                           AVVideoWidthKey                    : @(videoWidth),
                                           AVVideoHeightKey                   : @(videoHeight),
                                           AVVideoExpectedSourceFrameRateKey  : @(60),
                                           AVVideoCompressionPropertiesKey    : @{ AVVideoAverageBitRateKey      : @(bitsPerSecond),
                                                                                   AVVideoMaxKeyFrameIntervalKey : @(1)}
                                           };
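
Note that on some SDK versions this key is only accepted inside AVVideoCompressionPropertiesKey (see the comments below, which report an "invalid keys" error otherwise); a sketch with it nested there:

    NSDictionary *videoCompressionSettings = @{AVVideoCodecKey  : AVVideoCodecH264,
                                               AVVideoWidthKey  : @(videoWidth),
                                               AVVideoHeightKey : @(videoHeight),
                                               AVVideoCompressionPropertiesKey : @{ AVVideoAverageBitRateKey          : @(bitsPerSecond),
                                                                                    AVVideoMaxKeyFrameIntervalKey     : @(1),
                                                                                    AVVideoExpectedSourceFrameRateKey : @(240) }
                                               };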

To expand a little more - you can't strictly control or sync your camera capture exactly to the output / playback rate; the timing just doesn't work that way and isn't that exact, and of course the processing pipeline adds overhead. When you capture frames they are timestamped, as you've seen, but in the writing / compression phase only the frames needed to produce the output specified for the composition are used.

It goes both ways: you could capture at only 30 FPS and write out at 240 FPS, and the video would display fine; you'd just have a lot of frames "missing" that get filled in by the algorithm. You can even vend only 1 frame per second and play back at 30 FPS; the two are separate from each other (how fast I capture vs. how many frames I present per second).

As for how to play it back at a different speed, you just need to tweak the playback rate - slow it down as needed.

If you've correctly set the time base (frameDuration), it will always play back "normal" - you're telling it "playback is X frames per second". Of course your eye may notice a difference (almost certainly between low FPS and high FPS), and the screen may not refresh that fast (above 60 FPS), but regardless the video will play at a "normal" 1x speed for its timebase. By slowing the video down: if my timebase is 120 and I slow it to 0.5x, I now effectively see 60 FPS, and one second of playback takes two seconds.

You control the playback speed by setting the rate property on AVPlayer: https://developer.apple.com/reference/avfoundation/avplayer
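
A minimal sketch (assuming `movieURL` points at the recorded file):

    AVPlayer *player = [AVPlayer playerWithURL:movieURL];
    player.rate = 0.25; // a non-zero rate starts playback; 0.25x shows 240 fps material at a 60 fps pace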

Tim Bull
  • why do you mention AVMutableComposition? I have never heard of using that for live camera video. I am using AVAssetWriter as Apple does, but Apple has no example of videos shot at higher fps. I will research that. – Duck Apr 14 '17 at 12:56
  • You may be right about one thing. When I shoot 240 fps for, let's say, 5 seconds, I am shooting 240 x 5 = 1200 frames in total, but when I watch that video it plays as if it was shot at 30 fps. So, in fact, the whole thing is not writing all the frames; it is dropping frames in a premeditated way, because the video appears to be a normal 30 fps video. Which is odd. – Duck Apr 14 '17 at 13:33
  • Nothing odd about it, you're not telling the output file what frame rate to encode at. I haven't tried it, but I think what you need is to add AVVideoExpectedSourceFrameRateKey into your video compression settings and set this to your desired frame rate. By default it's 0 which is "let the encoder choose". I've edited my answer to include this. – Tim Bull Apr 14 '17 at 18:53
  • You should definitely try the `AVVideoExpectedSourceFrameRateKey`, even though the header file says this is just a hint to the encoder and that durations determine the frame rate. It also says you should use this if you are encoding at higher rates than 30fps. – Rhythmic Fistman Apr 14 '17 at 23:33
  • @TimBull Hello! I've been struggling with the same problem @SpaceDog had (encoding video at 120+ fps while saving the correct fps), and your advice seems legit for solving my problem. However, when I add `AVVideoExpectedSourceFrameRateKey` to my `AVAssetWriterInput` settings, I'm getting this error: `Output settings dictionary contains one or more invalid keys: ExpectedFrameRate'`. I've been searching on the internet about that problem, but unfortunately - no luck. Did you face the same issue and if so, how did you solve it? Thank you. – Eugene Alexeev Mar 23 '18 at 08:00
  • Hi Eugene, it's been a while, so sorry - no help unfortunately. It's possible the key name has changed or is different in swift (that code is ObjectiveC). – Tim Bull Apr 17 '18 at 17:50
  • @EugeneAlexeev The key will be under AVVideoCompressionPropertiesKey – souvickcse Jul 30 '18 at 10:48
  • AVVideoCompressionPropertiesKey : @{ AVVideoAverageBitRateKey : @(bitsPerSecond), AVVideoMaxKeyFrameIntervalKey : @(1), AVVideoExpectedSourceFrameRateKey : @(60)} – souvickcse Jul 30 '18 at 10:51
  • I've tried all the possible combinations of the AVVideoCompressionPropertiesKey that you guys mention, but my video is always around 96fps. I'm appending CMSampleBuffer directly to the AVAssetWriterInput, not using the pixel buffer adaptor like above, though. – omarojo Oct 18 '19 at 07:11
2

The iOS screen refresh is locked at 60fps, so the only way to "see" the extra frames is, as you say, to slow down the playback rate, a.k.a. slow motion.

So

  1. yes, you are right
  2. the screen refresh rate (and perhaps limitations of the human visual system, assuming you're human?) means that you cannot perceive 120 & 240fps frame rates. You can play them at normal speed by downsampling to the screen refresh rate. Surely this is what AVPlayer already does, although I'm not sure if that's the answer you're looking for.
  3. you control the frame rate of the file when you write it, with the CMSampleBuffer presentation timestamps. If your frames are coming from the camera, you're probably passing the timestamps straight through, in which case check that you really are getting the frame rate you asked for (a log statement in your capture callback should be enough to verify this). If you're procedurally creating frames, then you choose the presentation timestamps so that they're spaced 1.0/desiredFrameRate seconds apart! See the sketch below.
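
A minimal sketch of both checks (`_previousPTS`, `frameIndex` and `pixelBuffer` here are assumed names, not from your code):

    // In the capture callback: log the gap between successive presentation timestamps
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (CMTIME_IS_VALID(_previousPTS)) {
      double delta = CMTimeGetSeconds(CMTimeSubtract(pts, _previousPTS));
      NSLog(@"instantaneous fps ~ %.1f", 1.0 / delta); // should hover around 240 if the camera really delivers 240 fps
    }
    _previousPTS = pts;

    // For procedurally generated frames: space the timestamps 1/desiredFrameRate apart
    int32_t desiredFrameRate = 240;
    CMTime frameTime = CMTimeMake(frameIndex, desiredFrameRate); // frame i gets timestamp i/240 s
    [_pixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];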

Is 3. not working for you?

p.s. you can discard & ignore AVVideoMaxKeyFrameIntervalKey - it's a quality setting and has nothing to do with playback framerate.

Rhythmic Fistman
  • 1. thanks, 3. I am getting the frames at 240 fps. I have NSLogged the timestamps and, by calculating the differences, I see I am at 240 fps; then I write these frames with these timestamps to the file, and when I play the file I see the video as if it were 30 fps and it is not in slow motion. – Duck Apr 12 '17 at 13:37
  • Can you open the file in Quicktime on the mac and see what the FPS is in the Movie Inspector (⌘-I)? – Rhythmic Fistman Apr 12 '17 at 13:40
  • I have shot 720p @ 240 fps. QuickTime says the video is 30 fps, which makes sense because the idea in slo-mo is to shoot fast, play slow. But the video I see is not in slo-mo. The video appears to have been shot at 30 fps... – Duck Apr 12 '17 at 13:50
  • I don't get results like that. I hit 240fps@720p in the file with `AVCaptureMovieFileOutput`, 215fps@720p with `AVCaptureVideoDataOutput`, and with no complaints from `input.isReadyForMoreMediaData` if I set `input.expectsMediaDataInRealTime`. Unless it's the bitrate's fault, I think you'll need to show some more code. – Rhythmic Fistman Apr 12 '17 at 21:19
  • 1
    At least show how you set up the capture session, the asset writer & input and the capture `didOutputSampleBuffer` delegate method. – Rhythmic Fistman Apr 12 '17 at 23:13
  • `didOutputSampleBuffer` - why does it matter if I set fps for the asset video writer? It should ignore frames if `didOutputSampleBuffer` sends too many – user924 Mar 27 '18 at 12:41
  • Is this a new question? Can you give some context on how you’re setting FPS? – Rhythmic Fistman Mar 27 '18 at 13:59