
In an effort to extract raw CMSampleBufferRefs from an HLS live stream (for re-encoding the video), I'm trying to use AVAssetReader to read the HLS stream (.m3u8 file). Since AVAssetReader doesn't support reading directly from a network stream, I'm first downloading the .ts files listed in the HLS .m3u8 index file to the local drive, then reading them back with AVAssetReader. However, when I try to open a .ts file with AVAssetReader, I get the error "This media format is not supported" (Error Domain=AVFoundationErrorDomain Code=-11828 "Cannot Open" UserInfo=0x7fd3aa723570 {NSLocalizedFailureReason=This media format is not supported}).
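Roughly, my reading code looks like the sketch below (the file path and output settings are placeholders, not my exact code):

```swift
import AVFoundation
import CoreVideo

// Placeholder path to one of the downloaded HLS segments.
let url = URL(fileURLWithPath: "/path/to/downloaded/segment0.ts")
let asset = AVURLAsset(url: url)

do {
    // This is where the .ts file fails with -11828 "Cannot Open".
    let reader = try AVAssetReader(asset: asset)

    guard let videoTrack = asset.tracks(withMediaType: .video).first else {
        fatalError("no video track found")
    }

    let output = AVAssetReaderTrackOutput(
        track: videoTrack,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                         kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange])
    reader.add(output)

    guard reader.startReading() else {
        fatalError("startReading failed: \(String(describing: reader.error))")
    }

    while let sampleBuffer = output.copyNextSampleBuffer() {
        // These are the CMSampleBuffers I want to re-encode.
        _ = sampleBuffer
    }
} catch {
    print("AVAssetReader error: \(error)")
}
```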

Does anyone know if AVAssetReader supports reading .ts (MPEG-2 transport stream) files from the local drive? If not, is there any other way to create/extract CMSampleBufferRefs from an HLS stream? Thanks!

clx
  • Here in the modern era, a similar question (I think): https://stackoverflow.com/a/53797643/294884. Hope it helps anyone googling here. – Fattie Dec 16 '18 at 16:53

1 Answer


As far as I know, it does not. However, it seems fairly straightforward to extract the audio and video streams from a TS manually. WWDC 2014 session 513 is a great session on VideoToolbox, and it also happens to talk about muxing/demuxing TS at a high level. Maybe I'm missing something, but it looks like you can take the first two blocks of data, extract the PPS and SPS from them, create a CMVideoFormatDescriptionRef from those, then change the start code of the following NAL units into a length header and feed them straight into a VTDecompressionSession. This answer goes through these steps in detail.
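To make that concrete, here is a rough Swift sketch of those steps. It assumes you've already demuxed the TS far enough to have the SPS, PPS, and individual Annex B NAL units as byte arrays; the TS/PES parsing itself and the sample timing are not shown:

```swift
import CoreMedia
import VideoToolbox

// Sketch only: sps/pps/annexBNAL are assumed to come from your own TS demuxer.
func decodeNALUnit(sps: [UInt8], pps: [UInt8], annexBNAL: [UInt8]) {
    // 1. Build a CMVideoFormatDescription from the parameter sets.
    var formatDesc: CMFormatDescription?
    var status = sps.withUnsafeBufferPointer { spsPtr in
        pps.withUnsafeBufferPointer { ppsPtr -> OSStatus in
            let pointers: [UnsafePointer<UInt8>] = [spsPtr.baseAddress!, ppsPtr.baseAddress!]
            let sizes = [sps.count, pps.count]
            return CMVideoFormatDescriptionCreateFromH264ParameterSets(
                allocator: kCFAllocatorDefault,
                parameterSetCount: 2,
                parameterSetPointers: pointers,
                parameterSetSizes: sizes,
                nalUnitHeaderLength: 4,
                formatDescriptionOut: &formatDesc)
        }
    }
    guard status == noErr, let format = formatDesc else { return }

    // 2. Create the decompression session. The nil callback is allowed because
    //    the output-handler variant of DecodeFrame is used below.
    var session: VTDecompressionSession?
    status = VTDecompressionSessionCreate(
        allocator: kCFAllocatorDefault,
        formatDescription: format,
        decoderSpecification: nil,
        imageBufferAttributes: nil,
        outputCallback: nil,
        decompressionSessionOut: &session)
    guard status == noErr, let decoder = session else { return }

    // 3. Replace the Annex B start code (00 00 00 01) with a big-endian
    //    4-byte length, matching nalUnitHeaderLength above.
    guard annexBNAL.count > 4, Array(annexBNAL.prefix(4)) == [0, 0, 0, 1] else { return }
    var avcc = withUnsafeBytes(of: UInt32(annexBNAL.count - 4).bigEndian) { Array($0) }
    avcc.append(contentsOf: annexBNAL.dropFirst(4))

    // 4. Wrap the length-prefixed NAL unit in a CMBlockBuffer + CMSampleBuffer.
    var blockBuffer: CMBlockBuffer?
    status = CMBlockBufferCreateWithMemoryBlock(
        allocator: kCFAllocatorDefault, memoryBlock: nil, blockLength: avcc.count,
        blockAllocator: nil, customBlockSource: nil, offsetToData: 0,
        dataLength: avcc.count, flags: 0, blockBufferOut: &blockBuffer)
    guard status == noErr, let block = blockBuffer else { return }
    status = CMBlockBufferReplaceDataBytes(with: avcc, blockBuffer: block,
                                           offsetIntoDestination: 0, dataLength: avcc.count)

    var sampleBuffer: CMSampleBuffer?
    var sampleSize = avcc.count
    status = CMSampleBufferCreate(
        allocator: kCFAllocatorDefault, dataBuffer: block, dataReady: true,
        makeDataReadyCallback: nil, refcon: nil, formatDescription: format,
        sampleCount: 1, sampleTimingEntryCount: 0, sampleTimingArray: nil,
        sampleSizeEntryCount: 1, sampleSizeArray: &sampleSize,
        sampleBufferOut: &sampleBuffer)
    guard status == noErr, let sample = sampleBuffer else { return }

    // 5. Decode; decoded frames arrive as CVImageBuffers in the handler.
    _ = VTDecompressionSessionDecodeFrame(decoder, sampleBuffer: sample, flags: [],
                                          infoFlagsOut: nil) { decodeStatus, _, imageBuffer, _, _ in
        if decodeStatus == noErr, let pixelBuffer = imageBuffer {
            print("decoded frame:", pixelBuffer)   // hand this to your re-encoder
        }
    }
}
```

A real implementation would also carry the PES timestamps into the sample buffer's timing info and handle access units that contain multiple NAL units; both are omitted here for brevity.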

Alternatively, here's a CocoaPod that remuxes TS into MP4.

nevyn