
I know it's doable with MediaSource, but MediaSource doesn't accept arbitrary video formats (it requires fragmented MP4, for example, rather than a plain MP4). Which is a problem because my application doesn't have a server that can convert the file; it's a client-side-only application.

const blob = await ipfs.getBlobFromStream(hash)
const url = URL.createObjectURL(blob)
this.setState({...this.state, videoSrc: url})

const getBlobFromStream = async (hash) => {
  return new Promise(async resolve => {
    let entireBuffer

    const s = await stream(hash)
    s.on('data', buffer => {
      // Accumulate every incoming chunk into one typed array
      if (!entireBuffer) {
        entireBuffer = buffer
      } else {
        entireBuffer = concatTypedArrays(entireBuffer, buffer)
      }
    })

    s.on('end', () => {
      const arrayBuffer = typedArrayToArrayBuffer(entireBuffer)
      // Blob expects an array of parts, not a bare ArrayBuffer
      const blob = new Blob([arrayBuffer])
      resolve(blob)
    })
  })
}

This is the code I'm using right now. It basically waits for the entire file, collects it into a single typed array, wraps that in a Blob, and passes the Blob to URL.createObjectURL.

  • What do you want to achieve? To start the video as soon as the first buffer(s) arrive? And to create a blob URL for each incoming buffer and pass that URL to a video element? – Serkan Sipahi Sep 12 '18 at 11:46
  • What does the `hash` variable contain? Is the `stream` function from a library? Could you please give more details? Thanks. – Serkan Sipahi Sep 12 '18 at 11:53
  • @Bitcollage I want the video to start playing before the entire buffer is downloaded. I want it to "buffer" or "stream" like on YouTube. The hash is the IPFS infoHash, and `stream` is a wrapper function I made around this method: https://github.com/ipfs/interface-ipfs-core/blob/master/SPEC/FILES.md#filesgetreadablestream – cooldude101 Sep 12 '18 at 16:41

2 Answers


You can do it if you restructure your code:

await ipfs.startBlobStreaming(hash);
this.setState({...this.state, videoComplete: true});

const startBlobStreaming = async (hash) => {
  return new Promise(async (resolve) => {
    let entireBuffer;
    const s = await stream(hash);
    s.on('data', buffer => {
      if (!entireBuffer) {
        entireBuffer = buffer;
      } else {
        entireBuffer = concatTypedArrays(entireBuffer, buffer);
      }
      // Rebuild the blob URL from everything received so far
      const arrayBuffer = typedArrayToArrayBuffer(entireBuffer);
      const blob = new Blob([arrayBuffer]); // Blob takes an array of parts
      const url = URL.createObjectURL(blob);
      // Revoke the previous URL so the old blobs can be garbage-collected
      if (this.state.videoSrc) URL.revokeObjectURL(this.state.videoSrc);
      this.setState({...this.state, videoSrc: url});
    });
    s.on('end', _ => resolve());
  });
}

I don't know how frequently the buffers come into `s.on('data', …)`, but you could also collect buffers over a certain interval (e.g. 1000 ms) and only then create the blob URL.
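
A minimal sketch of that batching idea, reusing the helpers from the question (`stream`, `concatTypedArrays`, `typedArrayToArrayBuffer`); the function name and the interval constant are assumptions, not something from the question:

const startBlobStreamingBatched = async (hash) => {
  const BATCH_INTERVAL_MS = 1000; // assumed tuning knob
  let entireBuffer;
  let lastUpdate = 0;

  const updateVideoSrc = () => {
    const blob = new Blob([typedArrayToArrayBuffer(entireBuffer)]);
    // Revoke the previous URL before replacing it
    if (this.state.videoSrc) URL.revokeObjectURL(this.state.videoSrc);
    this.setState({...this.state, videoSrc: URL.createObjectURL(blob)});
  };

  const s = await stream(hash);
  return new Promise(resolve => {
    s.on('data', buffer => {
      entireBuffer = entireBuffer ? concatTypedArrays(entireBuffer, buffer) : buffer;
      // Only rebuild the blob URL at most once per interval
      if (Date.now() - lastUpdate >= BATCH_INTERVAL_MS) {
        lastUpdate = Date.now();
        updateVideoSrc();
      }
    });
    s.on('end', () => { updateVideoSrc(); resolve(); }); // flush the tail
  });
}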

Serkan Sipahi
  • I had already tested this method, but for some reason the video did not play unless it had the entire buffer, as if creating an object URL from an incomplete buffer doesn't result in a playable video. Also, even if it did work, wouldn't that method restart the video on every single React update? – cooldude101 Sep 12 '18 at 16:39
  • I checked how YouTube works. The blob URL stays the same as chunks of the buffer arrive: `blob:https://www.youtube.com/e07d4e7f-0031-4354-9425-d08aa5d59a1e`. I have a guess how it works: I think you have to find a way to feed the incoming buffers into the same blob URL. – Serkan Sipahi Sep 12 '18 at 18:07
  • There is no way to append to a blob. – cooldude101 Sep 13 '18 at 10:51

Unfortunately, it is not currently possible to create a generally readable blob URL whose content is determined asynchronously.

If the goal is specifically media playback, there is the MediaSource API, which you mention you know about. You imply that it requires server-side processing, but that is not always true: you can generate fragmented MP4 from a normal MP4 file with client-side code, for example with something like mux.js (last time I used it, it generated a wrong/buggy fMP4 header, so I needed some custom code to fix its output), an Emscripten build of ffmpeg, or something else.
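
For illustration, here is a minimal sketch of feeding fragmented MP4 into a video element via MediaSource. It assumes the chunks coming out of the question's `stream(hash)` are already fragmented (e.g. the output of a client-side transmuxer), and the codec string is only an example that must match the actual file:

const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource); // stable blob URL, like YouTube's

mediaSource.addEventListener('sourceopen', async () => {
  // The MIME type/codecs must match the actual file; this one is an example
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f, mp4a.40.2"');
  const s = await stream(hash);
  const queue = [];
  let ended = false;

  // appendBuffer is asynchronous; only one append may be in flight at a time
  const appendNext = () => {
    if (sourceBuffer.updating) return;
    if (queue.length) {
      sourceBuffer.appendBuffer(queue.shift());
    } else if (ended) {
      mediaSource.endOfStream();
    }
  };

  sourceBuffer.addEventListener('updateend', appendNext);
  s.on('data', chunk => { queue.push(chunk); appendNext(); });
  s.on('end', () => { ended = true; appendNext(); });
});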

I agree with you that the MediaSource API has many drawbacks/differences compared with a generic stream concept:

  • the data cannot be in arbitrary formats or be split into arbitrary chunks; it must be in one of a few specific formats (i.e. fragmented MP4 or WebM), and its fragmentation must follow that format's specific requirements;
  • it cannot be read by generic URL-reading mechanisms like XHR or fetch; it is only usable by audio/video elements;
  • it can only be assigned to a single media element, and only once;
  • it can be read non-sequentially by the corresponding media element (e.g. when the user seeks);
  • you cannot control the data flow with stream-like mechanisms such as backpressure or pull events; instead you need to manually monitor the media element's current position in seconds and work out the corresponding data segments;
  • it buffers a copy of the data added to it, doubling memory usage in some use-cases (you can manually remove data from its buffer to mitigate this; see the sketch after this list).
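
For that last point, a hedged sketch of manually evicting already-played data, assuming a `video` element and `sourceBuffer` like in the sketch above (the 30-second window is an arbitrary choice):

// Keep memory bounded: drop everything more than 30 s behind the playhead.
// SourceBuffer.remove() is asynchronous, so only call it while not updating.
video.addEventListener('timeupdate', () => {
  const evictBefore = video.currentTime - 30;
  if (evictBefore > 0 && !sourceBuffer.updating) {
    sourceBuffer.remove(0, evictBefore);
  }
});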

Unfortunately, for now that is the only option.

vvv