The Video Toolbox framework (VideoToolbox.framework) provides direct access to hardware video encoding and decoding on iOS and macOS.
Questions tagged [video-toolbox]
117 questions
78
votes
7 answers
How to use VideoToolbox to decompress H.264 video stream
I had a lot of trouble figuring out how to use Apple's hardware-accelerated video framework to decompress an H.264 video stream. After a few weeks I figured it out and wanted to share an extensive example since I couldn't find one.
My goal is to…

Olivia Stork
- 4,660
- 5
- 27
- 40
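A minimal sketch of the decode path this question asks about, assuming you have already built a CMVideoFormatDescription from the stream's SPS/PPS (e.g. via CMVideoFormatDescriptionCreateFromH264ParameterSets) and wrap each AVCC-framed NAL unit in a CMSampleBuffer; the callback body is a placeholder:

```swift
import VideoToolbox
import CoreMedia

func makeDecompressionSession(
    formatDescription: CMVideoFormatDescription
) -> VTDecompressionSession? {
    // The output callback receives each decoded frame as a CVImageBuffer.
    var callback = VTDecompressionOutputCallbackRecord(
        decompressionOutputCallback: { _, _, status, _, imageBuffer, pts, _ in
            guard status == noErr, imageBuffer != nil else { return }
            print("decoded frame at \(pts.seconds)s")
        },
        decompressionOutputRefCon: nil)

    var session: VTDecompressionSession?
    let status = VTDecompressionSessionCreate(
        allocator: kCFAllocatorDefault,
        formatDescription: formatDescription,
        decoderSpecification: nil,
        imageBufferAttributes: nil,
        outputCallback: &callback,
        decompressionSessionOut: &session)
    return status == noErr ? session : nil
}

// Per frame, feed a CMSampleBuffer wrapping one access unit:
// VTDecompressionSessionDecodeFrame(session, sampleBuffer: sampleBuffer,
//                                   flags: [], frameRefcon: nil,
//                                   infoFlagsOut: nil)
```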
16
votes
3 answers
Optimally using hevc_videotoolbox and ffmpeg on OSX
I'm using ffmpeg 4.3.1 to convert videos from h264 to h265 and initially I was excited to discover that I can use my Mac's GPU to speed up the conversion with the flag hevc_videotoolbox.
My Mac hardware is the 10th generation Intel i5 with AMD…

Brajesh
- 271
- 1
- 3
- 11
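A sketch of the invocation under discussion (file names and bitrate are placeholders; exact flag support varies by ffmpeg version and hardware):

```shell
# Hardware HEVC encode on macOS via VideoToolbox.
# hevc_videotoolbox is bitrate-driven: it ignores -crf, so set -b:v;
# on Apple Silicon, newer ffmpeg builds also accept -q:v for constant quality.
# -tag:v hvc1 makes the output recognizable to QuickTime/Photos.
ffmpeg -i input.mp4 \
  -c:v hevc_videotoolbox -b:v 4000k \
  -tag:v hvc1 \
  -c:a copy \
  output.mp4
```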
10
votes
1 answer
How do you access hardware decoding on tvOS without VideoToolbox?
Since VideoToolbox isn't available for tvOS, how do I decode video?
I have an app where I have frames of h.264 in memory (streamed in over the network) and I was handling the decoding with VideoToolbox previously. What's the replacement?

David
- 27,652
- 18
- 89
- 138
9
votes
3 answers
Image buffer display order with VTDecompressionSession
I have a project where I need to decode h264 video from a live network stream and eventually end up with a texture I can display in another framework (Unity3D) on iOS devices. I can successfully decode the video using VTDecompressionSession and then…

Kaleb
- 1,855
- 1
- 18
- 24
9
votes
2 answers
What is the most efficient way to display CVImageBufferRef on iOS
I have CMSampleBufferRef(s) which I decode using VTDecompressionSessionDecodeFrame, which results in a CVImageBufferRef once a frame has been decoded, so my question is:
What would be the most efficient way to display these…

user2690268
- 91
- 1
- 5
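One common zero-copy approach (a sketch of one answer, not the only one): wrap the decoded CVImageBuffer back into a CMSampleBuffer and enqueue it on an AVSampleBufferDisplayLayer, which composites on the GPU. The zero PTS here is a placeholder; in a real player you would carry the frame's timestamp through:

```swift
import AVFoundation
import CoreMedia

func display(_ imageBuffer: CVImageBuffer,
             on layer: AVSampleBufferDisplayLayer) {
    // Describe the pixel buffer so it can travel inside a CMSampleBuffer.
    var formatDesc: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: imageBuffer,
        formatDescriptionOut: &formatDesc)
    guard let formatDesc = formatDesc else { return }

    var timing = CMSampleTimingInfo(duration: .invalid,
                                    presentationTimeStamp: .zero,
                                    decodeTimeStamp: .invalid)
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: imageBuffer,
        formatDescription: formatDesc,
        sampleTiming: &timing,
        sampleBufferOut: &sampleBuffer)

    // If the layer has no control timebase, also set
    // kCMSampleAttachmentKey_DisplayImmediately on the buffer so it
    // renders at once rather than waiting for its PTS.
    if let sampleBuffer = sampleBuffer, layer.isReadyForMoreMediaData {
        layer.enqueue(sampleBuffer)
    }
}
```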
9
votes
1 answer
AVFoundation Vs VideoToolbox - Hardware Encoding
So this is a more theoretical question/discussion, as I haven't been able to come to a clear answer reading other SO posts and sources from the web. It seems like there are a lot of options:
Brad Larson's comment about AVFoundation
Video Decode…

royherma
- 4,095
- 1
- 31
- 42
6
votes
1 answer
Encoding H.264 Compression Session with CGDisplayStream
I'm trying to create an H.264 Compression Session with the data from my screen. I've created a CGDisplayStreamRef instance like so:
displayStream = CGDisplayStreamCreateWithDispatchQueue(0, 100, 100, k32BGRAPixelFormat, nil, self.screenCaptureQueue,…

narner
- 2,908
- 3
- 26
- 63
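A sketch of the compression side this question is heading toward: create an H.264 VTCompressionSession and feed it the IOSurface-backed pixel buffers a CGDisplayStream handler delivers. Dimensions and the callback body are placeholders:

```swift
import VideoToolbox
import CoreMedia

var session: VTCompressionSession?
VTCompressionSessionCreate(
    allocator: nil,
    width: 1920, height: 1080,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: nil,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: { _, _, status, _, sampleBuffer in
        // Compressed H.264 arrives here as CMSampleBuffers.
        guard status == noErr, let sampleBuffer = sampleBuffer else { return }
        print("encoded frame: \(CMSampleBufferGetTotalSampleSize(sampleBuffer)) bytes")
    },
    refcon: nil,
    compressionSessionOut: &session)

// Call this for each frame the CGDisplayStream handler hands you.
func encode(_ pixelBuffer: CVPixelBuffer, pts: CMTime) {
    guard let session = session else { return }
    VTCompressionSessionEncodeFrame(
        session,
        imageBuffer: pixelBuffer,
        presentationTimeStamp: pts,
        duration: .invalid,
        frameProperties: nil,
        sourceFrameRefcon: nil,
        infoFlagsOut: nil)
}
```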
6
votes
2 answers
How to encode audio along with video to h264 format using VideoToolbox?
I am able to compress video captured from the camera to H.264 format using the Video Toolbox framework, but when I tried to play that H.264 file in VLC player I was not able to hear the audio of the video. I think audio compression should also be done…

Sivasagar Palakurthy
- 471
- 5
- 7
5
votes
0 answers
How to add colorProfile with AVAssetWriter to video recorded from screen using CGDisplayStream
I've written a screen-recording app that writes out H.264 movie files using VideoToolbox and AVWriter. The colors in the recorded files are a bit dull compared to the original screen. I know that this is because the colorProfile is not stored in the…

Thies Ḉ Arntzen
- 83
- 5
5
votes
3 answers
Why does AVSampleBufferDisplayLayer fail with Operation Interrupted (-11847)?
I'm using an AVSampleBufferDisplayLayer to decode and display H.264 video streamed from a server. When my app goes into the background and then returns to the foreground, the decoding process gets screwed up and the AVSampleBufferDisplayLayer fails.…

Greg
- 10,360
- 6
- 44
- 67
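A sketch of the usual workaround for this error, assuming `displayLayer` is your existing AVSampleBufferDisplayLayer: after backgrounding, the layer's status can become `.failed` with AVError -11847, so flush it when the app returns to the foreground and resume decoding from the next keyframe:

```swift
import AVFoundation
import UIKit

final class ForegroundRecoveryObserver {
    private let displayLayer: AVSampleBufferDisplayLayer
    private var token: NSObjectProtocol?

    init(displayLayer: AVSampleBufferDisplayLayer) {
        self.displayLayer = displayLayer
        token = NotificationCenter.default.addObserver(
            forName: UIApplication.willEnterForegroundNotification,
            object: nil, queue: .main) { [weak self] _ in
            guard let layer = self?.displayLayer else { return }
            if layer.status == .failed {
                layer.flush()  // clears the error so new buffers can enqueue
            }
        }
    }

    deinit {
        if let token = token {
            NotificationCenter.default.removeObserver(token)
        }
    }
}
```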
4
votes
2 answers
VTDecompressionSessionDecodeFrame returns imageBuffer = nil but OSStatus = noErr
I am trying to decode a raw H264 stream using VideoToolbox APIs in Swift (macOS).
In viewDidLoad() I set up my display layer and CMTimebase like so:
self.view.wantsLayer = true
self.VideoLayer = AVSampleBufferDisplayLayer()
self.VideoLayer.frame =…

user3339439
- 95
- 1
- 7
4
votes
2 answers
How to debug why Mac OS is not using Hardware H264 encoder
I'm trying to encode a video-only stream using H.264, and I'm willing to use the hardware encoder in order to compare both quality and resource consumption between hardware and CPU encoding. The thing is that I'm not able to force the OS to…

Gonzalo Larralde
- 3,523
- 25
- 30
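One way to diagnose this (a sketch, with placeholder dimensions): require the hardware encoder at session creation, so that if none is available for this codec and resolution the call fails instead of silently falling back to software:

```swift
import VideoToolbox

var session: VTCompressionSession?
// Require hardware; without this key VideoToolbox may fall back to software.
let spec = [
    kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder: true
] as CFDictionary
let status = VTCompressionSessionCreate(
    allocator: nil,
    width: 1920, height: 1080,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: spec,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,   // encode later with the output-handler variant
    refcon: nil,
    compressionSessionOut: &session)
print(status == noErr ? "hardware H.264 encoder available"
                      : "no hardware encoder (status \(status))")
```

On macOS 10.15+ you can also query an existing session's kVTCompressionPropertyKey_UsingHardwareAcceleratedVideoEncoder property to see which path was actually chosen.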
4
votes
3 answers
Set rate at which AVSampleBufferDisplayLayer renders sample buffers
I am using an AVSampleBufferDisplayLayer to display CMSampleBuffers which are coming over a network connection in the h.264 format. Video playback is smooth and working correctly, however I cannot seem to control the frame rate. Specifically, if I…

Amos Joshua
- 1,601
- 18
- 25
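A sketch of the standard mechanism here: AVSampleBufferDisplayLayer renders each buffer when its control timebase reaches the buffer's presentation timestamp, so playback rate is set on the timebase, not on the layer:

```swift
import AVFoundation
import CoreMedia

let layer = AVSampleBufferDisplayLayer()
var timebase: CMTimebase?
CMTimebaseCreateWithMasterClock(allocator: kCFAllocatorDefault,
                                masterClock: CMClockGetHostTimeClock(),
                                timebaseOut: &timebase)
if let timebase = timebase {
    layer.controlTimebase = timebase
    CMTimebaseSetTime(timebase, time: .zero)  // align with the stream's first PTS
    CMTimebaseSetRate(timebase, rate: 1.0)    // 0.5 = half speed, 2.0 = double
}
```

With the rate at 0 the layer holds the last frame; enqueued buffers whose PTS is behind the timebase are displayed immediately.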
3
votes
0 answers
AVSampleBufferDisplayLayer hangs / freezes
I am live-streaming H.264 video over the network from another device and rendering the frames using AVSampleBufferDisplayLayer.
Some more context: I've timed how long it takes to emit one full H264 frame and it's on average 14ms.
Since…

user3339439
- 95
- 1
- 7
3
votes
0 answers
VTDecompressionOutputCallback returns kVTVideoDecoderBadDataErr = -12909
I have a desktop server which encodes frames. My iOS client connects to this server and receives data using NWConnection like this:
NWListener using .udp parameter causes odd logs sometimes
Actual problem:
Device without error (video is ok):…

vpoltave
- 1,612
- 3
- 14
- 31