Questions tagged [cmsamplebufferref]

71 questions
28 votes · 5 answers

iOS - Scale and crop CMSampleBufferRef/CVImageBufferRef

I am using AVFoundation and getting the sample buffer from AVCaptureVideoDataOutput, I can write it directly to videoWriter by using: - (void)writeBufferFrame:(CMSampleBufferRef)sampleBuffer { CMTime lastSampleTime =…
vodkhang
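A common approach (one of several the answers discuss) is to crop and scale through Core Image rather than touching pixel bytes directly. A minimal sketch, assuming a BGRA pixel buffer and a preallocated `outputPixelBuffer` of the target size (both hypothetical names):

```objc
#import <CoreImage/CoreImage.h>
#import <CoreMedia/CoreMedia.h>

// Sketch: crop and scale the frame with Core Image, rendering back into a
// CVPixelBuffer. `outputPixelBuffer` is assumed preallocated; errors omitted.
static void scaleAndCrop(CMSampleBufferRef sampleBuffer,
                         CVPixelBufferRef outputPixelBuffer,
                         CGRect cropRect, CGFloat scale) {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *image = [CIImage imageWithCVPixelBuffer:imageBuffer];
    image = [image imageByCroppingToRect:cropRect];
    image = [image imageByApplyingTransform:CGAffineTransformMakeScale(scale, scale)];
    CIContext *context = [CIContext contextWithOptions:nil]; // cache in real code
    [context render:image toCVPixelBuffer:outputPixelBuffer];
}
```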
15 votes · 4 answers

Deep Copy of Audio CMSampleBuffer

I am trying to create a copy of a CMSampleBuffer as returned by captureOutput in a AVCaptureAudioDataOutputSampleBufferDelegate. The problem I am having is that my frames coming from delegate method…
Neil Galiaskarov
12 votes · 1 answer

How to get the current captured timestamp of Camera data from CMSampleBufferRef in iOS

I developed an iOS application which saves captured camera data to a file, and I used (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection…
Mr.G
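The presentation timestamp can be read straight off the buffer inside the delegate callback; a minimal sketch:

```objc
// Sketch: log each captured frame's timestamp (on the capture clock).
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    Float64 seconds = CMTimeGetSeconds(pts);
    NSLog(@"frame captured at %.6f s", seconds);
}
```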
11 votes · 1 answer

How to convert CMSampleBufferRef to NSData

How do you convert CMSampleBufferRef to NSData? I've managed to get the data for an MPMediaItem by following Erik Aigner's answer on this thread, however the data is of type CMSampleBufferRef. I know CMSampleBufferRef is a struct and is defined in…
RyanM
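For buffers backed by a CMBlockBuffer (audio or compressed data), the bytes can be copied out as follows; `dataFromSampleBuffer` is a hypothetical helper name. Raw video frames have no block buffer, so those would be read via the CVImageBuffer instead:

```objc
// Sketch: copy the sample buffer's backing block buffer into NSData.
static NSData *dataFromSampleBuffer(CMSampleBufferRef sampleBuffer) {
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    if (blockBuffer == NULL) return nil; // e.g. an uncompressed video frame
    size_t length = CMBlockBufferGetDataLength(blockBuffer);
    NSMutableData *data = [NSMutableData dataWithLength:length];
    CMBlockBufferCopyDataBytes(blockBuffer, 0, length, data.mutableBytes);
    return data;
}
```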
11 votes · 2 answers

How to set timestamp of CMSampleBuffer for AVWriter writing

I'm working with AVFoundation for capturing and recording audio. There are some issues I don't quite understand. Basically I want to capture audio from AVCaptureSession and write it using AVWriter, however I need some shifting in the timestamp of…
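One way to apply such a shift is to rebuild the buffer's timing with CMSampleBufferCreateCopyWithNewTiming before appending it to the writer input. A sketch, assuming the caller releases the returned buffer:

```objc
// Sketch: return a copy of `sampleBuffer` with every presentation
// timestamp shifted back by `offset`. Caller owns the returned buffer.
static CMSampleBufferRef createRetimedBuffer(CMSampleBufferRef sampleBuffer,
                                             CMTime offset) {
    CMItemCount count;
    CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, 0, NULL, &count);
    CMSampleTimingInfo *timing = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, count, timing, &count);
    for (CMItemCount i = 0; i < count; i++) {
        timing[i].presentationTimeStamp =
            CMTimeSubtract(timing[i].presentationTimeStamp, offset);
        timing[i].decodeTimeStamp = kCMTimeInvalid;
    }
    CMSampleBufferRef retimed = NULL;
    CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sampleBuffer,
                                          count, timing, &retimed);
    free(timing);
    return retimed;
}
```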
11 votes · 1 answer

Getting desired data from a CVPixelBuffer Reference

I have a program that views a camera input in real-time and gets the color value of the middle pixel. I use a captureOutput: method to grab the CMSampleBuffer from an AVCaptureSession output (which happens to be read as a CVPixelBuffer) and then I…
bbrownd
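Assuming the data output is configured for kCVPixelFormatType_32BGRA, the center pixel can be read like this (a sketch; error handling omitted):

```objc
// Sketch: read the BGRA value of the center pixel of a locked pixel buffer.
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
size_t width  = CVPixelBufferGetWidth(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
uint8_t *base = CVPixelBufferGetBaseAddress(pixelBuffer);
// 4 bytes per pixel in BGRA; index row by bytesPerRow, not width * 4.
uint8_t *pixel = base + (height / 2) * bytesPerRow + (width / 2) * 4;
uint8_t blue = pixel[0], green = pixel[1], red = pixel[2];
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
```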
9 votes · 1 answer

error converting AudioBufferList to CMBlockBufferRef

I am trying to take a video file, read it in using AVAssetReader, and pass the audio off to Core Audio for processing (adding effects and such) before saving it back out to disk using AVAssetWriter. I would like to point out that if I set the…
odyth
9 votes · 1 answer

Using AVAssetWriter with raw NAL Units

I noticed in the iOS documentation for AVAssetWriterInput you can pass nil for the outputSettings dictionary to specify that the input data should not be re-encoded. The settings used for encoding the media appended to the output. Pass nil to…
bsirang
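Passing nil outputSettings means each NAL unit must already arrive wrapped in a CMSampleBuffer carrying an H.264 format description. A sketch of building that description from the stream's parameter sets (`sps`, `pps`, and their sizes are assumed to have been extracted elsewhere; NAL payloads then need 4-byte big-endian AVCC length prefixes):

```objc
// Sketch: create a format description from SPS/PPS so raw NAL units can
// be wrapped in CMSampleBuffers for passthrough writing.
const uint8_t *paramSets[2] = { sps, pps };       // assumed extracted
const size_t paramSizes[2]  = { spsSize, ppsSize };
CMVideoFormatDescriptionRef formatDesc = NULL;
CMVideoFormatDescriptionCreateFromH264ParameterSets(
    kCFAllocatorDefault,
    2, paramSets, paramSizes,
    4,               // NAL unit header (length prefix) size in bytes
    &formatDesc);
```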
7 votes · 1 answer

Retaining CMSampleBufferRef cause random crashes

I'm using captureOutput:didOutputSampleBuffer:fromConnection: in order to keep track of the frames. For my use-case, I only need to store the last frame and use it in case the app goes to background. That's a sample from my code: @property…
Rizon
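The capture pipeline recycles a small pool of buffers, so holding retained frames can starve the pool and stall or crash capture. A sketch of keeping only the latest frame while releasing the previous one promptly:

```objc
// Sketch: keep only the most recent frame. A retained sample buffer pins
// one of the capture pool's buffers, so release the old one immediately.
@property (nonatomic) CMSampleBufferRef lastSampleBuffer;

- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    if (_lastSampleBuffer) CFRelease(_lastSampleBuffer);
    _lastSampleBuffer = (CMSampleBufferRef)CFRetain(sampleBuffer);
}
```

If more than a frame or two must outlive the callback, copying the pixel data out (rather than retaining the buffer) avoids starving the pool entirely.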
6 votes · 1 answer

How to get the timestamp of each video frame in iOS while decoding a video.mp4

Scenario: I am writing an iOS app to try decode a videoFile.mp4. I am using AVAssetReaderTrackOutput with AVAssetReader to decode frames from the video file. This works very well. I get each & every frame from videoFile.mp4 basically using the…
TheWaterProgrammer
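Each decoded frame's timestamp is available from the buffer returned by copyNextSampleBuffer; a sketch, assuming `trackOutput` is an AVAssetReaderTrackOutput attached to a reader that has started reading:

```objc
// Sketch: pull decoded frames and log each presentation timestamp.
CMSampleBufferRef sampleBuffer;
while ((sampleBuffer = [trackOutput copyNextSampleBuffer]) != NULL) {
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    NSLog(@"frame pts = %.6f s", CMTimeGetSeconds(pts));
    CFRelease(sampleBuffer); // copyNextSampleBuffer returns a +1 reference
}
```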
6 votes · 4 answers

How to create instance of Sample Buffer (CMSampleBufferRef)?

I am trying to write an iOS camera app, and I took part of the code from Apple: - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection { …
Andrew Kachalin
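A sample buffer can be built around an existing CVPixelBuffer roughly like this (a sketch, assuming a `pixelBuffer` and a presentation time `pts` are already in hand; error checking omitted):

```objc
// Sketch: wrap an existing CVPixelBuffer in a new CMSampleBuffer.
CMVideoFormatDescriptionRef formatDesc = NULL;
CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault,
                                             pixelBuffer, &formatDesc);
CMSampleTimingInfo timing = {
    .duration = CMTimeMake(1, 30),     // assumed 30 fps
    .presentationTimeStamp = pts,
    .decodeTimeStamp = kCMTimeInvalid
};
CMSampleBufferRef sampleBuffer = NULL;
CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer,
                                   true,  // data is ready
                                   NULL, NULL, formatDesc,
                                   &timing, &sampleBuffer);
CFRelease(formatDesc);
```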
6 votes · 2 answers

CMSampleBufferRef kCMSampleBufferAttachmentKey_TrimDurationAtStart crash

This has been bothering me for a while. I have a video converter that converts video into “.mp4” format, but there is a crash that happens on some videos, though not all. Here is the crash log: *** Terminating app due to uncaught exception…
Xu Yin
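This exception is commonly reported when passing AAC audio through without the trim-duration attachment that encoder delay requires; one workaround seen for it is attaching a trim duration to the first buffer. A hedged sketch (the 1024-frame priming value is typical for AAC at 44.1 kHz, not guaranteed for every stream):

```objc
// Sketch (workaround, not a guaranteed fix): attach a trim duration
// covering the AAC encoder's priming samples before appending.
CFDictionaryRef trim = CMTimeCopyAsDictionary(CMTimeMake(1024, 44100),
                                              kCFAllocatorDefault);
CMSetAttachment(sampleBuffer,
                kCMSampleBufferAttachmentKey_TrimDurationAtStart,
                trim, kCMAttachmentMode_ShouldPropagate);
CFRelease(trim);
```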
5 votes · 0 answers

Render failed because a pixel format YCC420f is not supported

I'm trying to convert a CVPixelBufferRef into a UIImage using the following snippet: UIImage *image = nil; CMSampleBufferRef sampleBuffer = (CMSampleBufferRef)CMBufferQueueDequeueAndRetain(_queue); if (sampleBuffer) { CVPixelBufferRef…
javidecas
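YCC420f is the bi-planar 4:2:0 video-range format, which CGContext-based conversion cannot render directly. Two common workarounds, sketched here: request BGRA from the data output, or let Core Image do the color conversion:

```objc
// Option A: configure the data output for BGRA so the buffer is
// directly renderable.
videoDataOutput.videoSettings =
    @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

// Option B: let Core Image convert the 420f planes to RGB.
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:ciImage
                                   fromRect:ciImage.extent];
UIImage *image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
```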
5 votes · 1 answer

Why AVSampleBufferDisplayLayer stops showing CMSampleBuffers taken from AVCaptureVideoDataOutput's delegate?

I want to display some CMSampleBuffers with an AVSampleBufferDisplayLayer, but it freezes after showing the first sample. I get the sample buffers from the AVCaptureVideoDataOutputSampleBufferDelegate: -(void)captureOutput:(AVCaptureOutput…
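A frequent cause is the layer waiting on timebase-relative timestamps it never reaches; one common workaround is marking each buffer for immediate display. A sketch, assuming `displayLayer` is the AVSampleBufferDisplayLayer:

```objc
// Sketch: mark the buffer to display immediately so the layer does not
// stall waiting on its (unsynchronized) timebase.
CFArrayRef attachments =
    CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, true);
CFMutableDictionaryRef dict =
    (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately,
                     kCFBooleanTrue);
[displayLayer enqueueSampleBuffer:sampleBuffer];
```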
5 votes · 1 answer

NSData or bytes from CMSampleBufferRef

Hello, I need to send a CMSampleBufferRef over a network. The client then plays the CMSampleBufferRef via Audio Queue Services. I have seen some examples on Stack Overflow, but most of them just send the buffer, and then some information is…
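The extra information typically needed besides the raw bytes is the stream format itself; a sketch of reading the AudioStreamBasicDescription so it can be serialized alongside the payload for the receiving Audio Queue:

```objc
// Sketch: extract the stream format the remote Audio Queue needs in
// order to play the raw bytes copied out of the block buffer.
CMAudioFormatDescriptionRef fmt = (CMAudioFormatDescriptionRef)
    CMSampleBufferGetFormatDescription(sampleBuffer);
const AudioStreamBasicDescription *asbd =
    CMAudioFormatDescriptionGetStreamBasicDescription(fmt);
// serialize *asbd (sample rate, format ID, channels, ...) together with
// the payload bytes into your network packet
```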