
Suppose we want to make an HTTP/1.0 or HTTP/1.1 range request for, say, bytes 20000 to 100000, or seconds 20 through 35, of a given media resource (for example, .webm or .ogg audio, or .mp4 video), where the response is a discrete media fragment capable of playback without the other portions of the resource. How can we achieve this?

For example:

let headers = new Headers();
let range = 1024 * 1024;
headers.append("Range", "bytes=0-" + range); 
let request = new Request(url, {headers:headers});
fetch(request)
.then(response => response.blob())
.then(data => audio.src = URL.createObjectURL(data));

The media loads and plays.

let headers = new Headers();
let range = 1024 * 1024;
headers.append("Range", "bytes=" + range + "-" + range * 2);
let request = new Request(url, {headers:headers});
fetch(request)
.then(response => response.blob())
.then(data => audio.src = URL.createObjectURL(data));

This logs successful 200 and 206 response statuses, though the media does not play at the `<audio>` element.
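One way to see what the server actually returned for the second request is to inspect the `Content-Range` response header, which states the byte span a 206 response covers. A minimal sketch (the function name is illustrative, not part of any API):

```javascript
// Parse a Content-Range header such as "bytes 1048576-2097152/4718592".
// Returns the byte span and total size, or null if absent or malformed.
// Per RFC 7233, an unknown total length is sent as "*".
function parseContentRange(header) {
  const m = /^bytes (\d+)-(\d+)\/(\d+|\*)$/.exec(header || "");
  if (!m) return null;
  return {
    start: Number(m[1]),
    end: Number(m[2]),
    total: m[3] === "*" ? null : Number(m[3]),
  };
}

// Usage with the fetch code above (same url/range assumed):
// fetch(new Request(url, { headers: { Range: "bytes=" + range + "-" + range * 2 } }))
//   .then(r => console.log(r.status, parseContentRange(r.headers.get("Content-Range"))));
```

A 200 status (rather than 206) would mean the server ignored the `Range` header and returned the full resource.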

How can we create a range request for given media that returns only the requested range as a discrete resource capable of playback without the other portions of the media, where the requested fragment can be any discrete portion of the resource?

guest271314
  • @charlietfl Yes http://plnkr.co/edit/h1MOoIPwgEf0WiHMdm6z?p=preview – guest271314 Jul 11 '17 at 03:46
  • Just figured it out that the 206 would probably mean it had to return that header – charlietfl Jul 11 '17 at 03:47
  • @charlietfl Composed a workaround, which has issues. Interested to view how other developers address the case. – guest271314 Jul 11 '17 at 03:48
  • @charlietfl [HTTP/1.1 Range Requests](https://tools.ietf.org/html/rfc7233) – guest271314 Jul 11 '17 at 04:01
  • @charlietfl fwiw, this is the workaround that followed code at Question when trying to implement getting media fragment as a standalone file https://stackoverflow.com/questions/45123057/lets-build-and-implement-an-offlinemediacontext – guest271314 Jul 16 '17 at 01:08

1 Answer


You simply can't.

You absolutely need the headers of your media file (metadata) for your browser to be able to decode the data it contains.

Different media formats have different parsing rules, with chunks of data ordered differently, and getting only a portion of the raw data breaks the whole structure. With some file formats you might be able to play the beginning of a media file by supplying only the beginning of the file, but not all formats allow it, and some don't even expect a start-to-end reading.
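To illustrate: common containers put their identifying header bytes at offset 0, which is why a range starting mid-file loses the decoder's entry point. A sketch that sniffs the container from the first bytes (the magic numbers come from the respective container specs; the function name is illustrative):

```javascript
// Identify a media container from its leading "magic" bytes.
// A range request that skips offset 0 would never see these bytes,
// so the browser cannot tell what it is being asked to decode.
function sniffContainer(bytes) {
  const u8 = new Uint8Array(bytes);
  const startsWith = sig => sig.every((b, i) => u8[i] === b);
  if (startsWith([0x1a, 0x45, 0xdf, 0xa3])) return "webm/mkv (EBML)";
  if (startsWith([0x4f, 0x67, 0x67, 0x53])) return "ogg";      // "OggS"
  if (startsWith([0x52, 0x49, 0x46, 0x46])) return "wav/riff"; // "RIFF"
  return "unknown";
}
```

Sniffing the magic bytes is only the first hurdle; even for formats recognizable mid-stream, seek tables and codec parameters elsewhere in the file are still needed for playback.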

What can be done, though, is to use the Media Fragment time range in the MediaElement's src:

#t=[starttime][,endtime]
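The fragment is just appended to the URL as a string; as a minimal sketch (the helper name is illustrative):

```javascript
// Build a Media Fragment URI for a temporal range in seconds,
// using the #t=start,end syntax from the W3C Media Fragments spec.
function withTimeRange(url, start, end) {
  return url + "#t=" + start + "," + end;
}

withTimeRange("audio.ogg", 20, 35); // → "audio.ogg#t=20,35"
```

Note the browser still fetches and parses the file's metadata; the fragment only restricts which portion is played.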

const url = 'https://upload.wikimedia.org/wikipedia/commons/4/4b/011229beowulf_grendel.ogg';

btn.onclick = e => {
  // fast input check for the demo
  if (max.value > aud.duration)
    max.value = aud.duration;
  if (min.value > max.value)
    min.value = max.value - 1;
  // construct our timerange parameter
  let range = '#t=' + min.value + ',' + max.value;
  // append it to our original url
  aud.src = url + range;
}
btn.onclick();
<audio id="aud" controls></audio><br>
<label>start: <input id="min" type="number" value="10" min="0" max="119"></label>
<label>end: <input id="max" type="number" value="20" min="1" max="120"></label>
<button id="btn">re-load</button>
Kaiido
  • End label should be "end"? Yes, familiar with Media Fragment URIs. Is it possible to create a recording of each time slice of media fragment which can be played individually and merge the fragments into a single file which can be played seamlessly? How can we extract, and possibly create or reuse the media headers for each media type? Why is it possible to play a range beginning at `0`, though not a range that does not begin at `0`? – guest271314 Jul 16 '17 at 03:02
  • @guest271314, yes you can obviously create a new MediaElement of each slice with this method, then use its `captureStream` method along with MediaRecorders to export these fragments. If you want to concatenate them, you'd have to control a single stream where you'd update the currentTime of the original source since you can't record at faster rate than *x1*. – Kaiido Jul 16 '17 at 03:12
  • For the *why it doesn't work with start range-requests other than 0*, it's only because you got lucky all the metadata required for **your file** was included in this range. If it had been at the end of the file, you wouldn't have been able to play anything from it. – Kaiido Jul 16 '17 at 03:13
  • How to extract the necessary metadata and copy and include the metadata at the ranges which require them https://stackoverflow.com/questions/44976805/wave-file-extended-part-of-fmt-chunk? – guest271314 Jul 16 '17 at 03:15
  • That doesn't make sense, to extract the metadata: you'll have something like `video chunk 2:00:00 to 2:01:05 is at bits position 24536 to 28543`, but if you break the bits position, the metadata will be wrong. – Kaiido Jul 16 '17 at 03:17
  • How can the necessary metadata be created? `MediaSource` is able to read array buffers in chunks and render the media, why can we not port the decoded media as a `Blob` capable of independent playback? – guest271314 Jul 16 '17 at 03:19
  • Because the browser does construct it from the original source. It does create the MediaStream itself, which has its own structure (including stream metadata which I don't know too much about to be honest). server-side, ffmpeg can create media stream, front-side you only get `captureStream` method and AudioContext's `createMediaStreamDestination`. But to create it yourself is an overly complicated task since you'd have to be able to parse all different structures of media, which do vary depending on each codecs. – Kaiido Jul 16 '17 at 03:28
  • Where do we find the metadata? How are codecs and containers related? We can proceed one at a time. – guest271314 Jul 16 '17 at 03:43
  • https://www.webmproject.org/docs/container/ for webm, which is probably the best to start since it has an nice xml like datastructure. But note that chrome itself [isn't even able to generate correct metadata](https://bugs.chromium.org/p/chromium/issues/detail?id=642012) for its recorded files. So if you happen to have enough time to do it, don't hesitate to let us know. (You can find examples of EBML parsers for nodejs which might help you in your task) – Kaiido Jul 16 '17 at 03:48
  • Can you provide link for reference documentation as to chrome failing to generate correct metadata? – guest271314 Jul 16 '17 at 03:50
  • This is what am trying to implement https://stackoverflow.com/questions/45217962/how-to-use-blob-url-mediasource-or-other-methods-to-play-concatenated-blobs-of – guest271314 Jul 20 '17 at 15:03