I developed an iOS application which saves captured camera data into a file. I used
    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
to capture each CMSampleBufferRef. The frames are encoded into H.264 format and saved to a file using AVAssetWriter.
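For context, a minimal sketch of what that callback does in this setup (assetWriterInput is just a placeholder name for the AVAssetWriterInput created elsewhere, not my exact code):

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Timestamp the capture session assigned to this frame
        CMTime captureTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        NSLog(@"captured frame at %f s", CMTimeGetSeconds(captureTime));

        // Hand the frame to the writer input, which encodes it to H.264
        if ([assetWriterInput isReadyForMoreMediaData]) {
            [assetWriterInput appendSampleBuffer:sampleBuffer];
        }
    }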
I followed sample source code to create this app.
Now I want to get the timestamps of the saved video frames so that I can create a new movie file. To do this, I have done the following:
Locate the file and create an AVAssetReader to read it:

    CMSampleBufferRef buffer;
    while ([assetReader status] == AVAssetReaderStatusReading) {
        buffer = [asset_reader_output copyNextSampleBuffer];
        if (buffer == NULL) {
            break;
        }
        // Presentation timestamp of this frame, converted to milliseconds
        CMTime presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(buffer);
        UInt32 timeStamp = (1000 * presentationTimeStamp.value) / presentationTimeStamp.timescale;
        NSLog(@"timestamp %u", (unsigned int)timeStamp);
        NSLog(@"reading");
        CFRelease(buffer);
    }
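For completeness, the reader used in the loop above is created roughly like this before the loop runs (a sketch, not my exact setup code; fileURL stands for the URL of the saved movie file):

    NSError *error = nil;
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
    AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:asset error:&error];

    // Read the first video track; nil output settings vend the samples
    // in their original (compressed) format without decoding.
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetReaderTrackOutput *asset_reader_output =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                                   outputSettings:nil];
    [assetReader addOutput:asset_reader_output];
    [assetReader startReading];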
The value printed by the loop above is a wrong timestamp; what I need is the frame's capture time.
Is there any way to get a frame's capture timestamp?
I've read an answer about getting the timestamp, but it does not properly address my question above.
Update:
I read the sample's timestamp before it was written to the file, and it gave me a value like 33333.23232. When I read the file back, it gave me a different value. Is there any specific reason for this?
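To make the comparison concrete, I log the raw value and timescale the same way on both sides with a small helper along these lines (hypothetical, not my exact code): once inside the capture callback, and once inside the reading loop above.

    #import <Foundation/Foundation.h>
    #import <CoreMedia/CoreMedia.h>

    // Prints a sample buffer's presentation timestamp in one consistent format,
    // so the before-write and after-read values can be compared directly.
    static void LogSampleTime(NSString *label, CMSampleBufferRef sampleBuffer)
    {
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        NSLog(@"%@: value=%lld timescale=%d (%f s)",
              label, pts.value, pts.timescale, CMTimeGetSeconds(pts));
    }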