In JavaScript there are currently two alternatives for streaming a `File` as blocks of `Uint8Array`:

1. `slice()`: `File.slice(start, end)` gives a sub-`Blob`, which can then be read via `new FileReader().readAsArrayBuffer(blob)`; the resulting `ArrayBuffer` can be cast to a `Uint8Array`.
2. `stream()`: `File.stream().getReader().read()` goes via `ReadableStream` -> `ReadableStreamDefaultReader`, and `read()`'s `.value` gives the `Uint8Array`.
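For reference, here is a minimal sketch of both chunked-reading loops. It uses a plain `Blob` in place of a user-supplied `File` (a `File` is a `Blob`, so `slice()` and `stream()` behave the same), and for brevity the `slice()` branch uses `Blob.prototype.arrayBuffer()`, the promise-based equivalent of `FileReader.readAsArrayBuffer()`; the generator names are my own:

```javascript
const blob = new Blob([new Uint8Array(200_000)]); // stand-in for a dropped File

// 1. slice(): random access, with a caller-chosen chunk size.
async function* sliceChunks(blob, chunkSize) {
  for (let start = 0; start < blob.size; start += chunkSize) {
    const buf = await blob.slice(start, start + chunkSize).arrayBuffer();
    yield new Uint8Array(buf);
  }
}

// 2. stream(): strictly sequential; the chunk size is chosen by the engine.
async function* streamChunks(blob) {
  const reader = blob.stream().getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) return;
    yield value; // a Uint8Array
  }
}

(async () => {
  let total = 0;
  for await (const chunk of sliceChunks(blob, 65_536)) total += chunk.length;
  console.log(total); // 200000

  total = 0;
  for await (const chunk of streamChunks(blob)) total += chunk.length;
  console.log(total); // 200000
})();
```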
(The motivation for this "chunked" reading is usually processing large `File`s which the user dragged-and-dropped into the browser, without reading them into memory all at once.)
Some differences I found:

- `slice()` is older, `stream()` is newer.
- `slice()` allows specifying a buffer size, while `stream()` is fixed to 64 KiB in current browsers.
- The older `slice()` seems more flexible on all axes, providing random access (permitting easy parallel processing) and allowing custom buffer sizes (libraries such as hash-wasm are much faster on larger buffers).
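To illustrate the random-access point: with `slice()`, independent byte ranges can be read concurrently, whereas `stream()` hands out chunks in order from a single reader. A hypothetical sketch (`readRange` and `readInParallel` are illustrative names, not standard APIs):

```javascript
// Read one byte range of a Blob/File; each call is independent,
// so several can be in flight at once.
async function readRange(blob, start, end) {
  return new Uint8Array(await blob.slice(start, end).arrayBuffer());
}

// Split the blob into `parts` ranges and read them concurrently.
async function readInParallel(blob, parts) {
  const partSize = Math.ceil(blob.size / parts);
  const jobs = [];
  for (let i = 0; i < parts; i++) {
    jobs.push(readRange(blob, i * partSize, Math.min((i + 1) * partSize, blob.size)));
  }
  return Promise.all(jobs); // array of Uint8Array chunks, in range order
}
```

Nothing comparable is possible with `stream()`, since a `ReadableStreamDefaultReader` only yields the next sequential chunk.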
## Question
Are there any benefits to using `stream()` for this kind of chunked `File` reading? Perhaps `stream()` can implement optimisations such as better read-ahead, or better support/performance when loading data from a local network mount? Evidence would be appreciated.