I would like to be able to live-stream video (or any other file that is large and continuously modified/appended) via dat.
Here it says,
The dat:// protocol doesn't support partial updates at the file-level, which means that with multiple records in a single file, every time a user adds a record, anyone who follows that user must sync and re-download the entire file. As the file continues to grow, performance will degrade. Putting each record in an individual file is much more efficient: when a record is created, peers in the network will only download the newly-created file.
However, it also says here that dat uses Rabin fingerprinting to create deterministic chunks of files, so presumably a dat client could identify the chunks it has already downloaded by their hashes, and should therefore only need to download the final chunk of the file if that is the only part that has changed.
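To illustrate the property I am relying on, here is a toy content-defined chunker (a simplified rolling hash, not Dat's actual Rabin implementation; the function names and parameters are just illustrative). Appending data to the end of the stream leaves all earlier chunk boundaries, and therefore their hashes, unchanged, so a peer should only need to fetch the trailing chunk(s):

```ts
// Toy content-defined chunker: a boundary is placed wherever a rolling hash
// of the last WINDOW bytes matches a mask, so appending data can only affect
// the final chunk(s) -- every earlier boundary (and hash) stays the same.
import { createHash, randomBytes } from "crypto";

const WINDOW = 16;   // rolling-hash window size (illustrative)
const MASK = 0x3ff;  // rough ~1 KiB target chunk size (illustrative)

function chunk(data: Buffer): Buffer[] {
  const chunks: Buffer[] = [];
  let start = 0;
  let hash = 0;
  for (let i = 0; i < data.length; i++) {
    // Simple additive rolling hash over the last WINDOW bytes
    hash = (hash + data[i]) >>> 0;
    if (i >= WINDOW) hash = (hash - data[i - WINDOW]) >>> 0;
    if ((hash & MASK) === MASK || i === data.length - 1) {
      chunks.push(data.subarray(start, i + 1));
      start = i + 1;
      hash = 0;
    }
  }
  return chunks;
}

const digest = (b: Buffer) => createHash("sha256").update(b).digest("hex");

// Simulate a growing stream: hash the chunks before and after an append.
const v1 = randomBytes(64 * 1024);
const v2 = Buffer.concat([v1, Buffer.from("new video frames appended")]);

const h1 = chunk(v1).map(digest);
const h2 = chunk(v2).map(digest);

// Anything the peer already has (identified by hash) can be skipped; only
// the trailing chunk(s) with previously unseen hashes need to be fetched.
const toFetch = h2.filter(x => !h1.includes(x));
console.log(`chunks before: ${h1.length}, after: ${h2.length}, to fetch: ${toFetch.length}`);
```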
And here in the FAQ, it says:
The type of Merkle tree used by Dat lets peers compare which pieces of a specific version of a dataset they each have and efficiently exchange the deltas to complete a full sync.
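As a rough sketch of how I imagine that comparison working (this is a toy binary hash tree, not hypercore's actual flat Merkle tree layout, and buildTree/diff are made-up names): peers can skip any subtree whose root hashes already match and only walk down where they differ, ending up at exactly the chunks that changed.

```ts
// Toy Merkle comparison: hash each chunk, build parent hashes pairwise, then
// walk both trees top-down. Subtrees with equal root hashes are pruned, so
// only the leaves that actually differ are reported as deltas to sync.
import { createHash } from "crypto";

const h = (s: string) => createHash("sha256").update(s).digest("hex");

type TreeNode = { hash: string; left?: TreeNode; right?: TreeNode; index?: number };

function buildTree(chunkHashes: string[]): TreeNode {
  let level: TreeNode[] = chunkHashes.map((hash, index) => ({ hash, index }));
  while (level.length > 1) {
    const next: TreeNode[] = [];
    for (let i = 0; i < level.length; i += 2) {
      const left = level[i];
      const right = level[i + 1];
      next.push(right
        ? { hash: h(left.hash + right.hash), left, right }
        : left); // odd node is carried up unchanged
    }
    level = next;
  }
  return level[0];
}

// Collect the leaf indices where two same-shaped trees disagree, pruning
// any subtree whose root hash already matches.
function diff(a: TreeNode, b: TreeNode, out: number[] = []): number[] {
  if (a.hash === b.hash) return out;                    // identical subtree: skip
  if (b.index !== undefined) { out.push(b.index); return out; }
  diff(a.left!, b.left!, out);
  diff(a.right!, b.right!, out);
  return out;
}

// Example: only the trailing chunk changed (data was appended into it),
// so the diff surfaces a single leaf instead of the whole file.
const oldChunks = Array.from({ length: 8 }, (_, i) => h(`chunk ${i}`));
const newChunks = [...oldChunks.slice(0, 7), h("chunk 7 + appended frames")];
console.log(diff(buildTree(oldChunks), buildTree(newChunks))); // -> [ 7 ]
```

If that kind of pruning works in practice, then appending to a long stream should only ever surface the last chunk or two as deltas, which is what I am hoping for.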
There is hypervision, but from my rudimentary understanding of how it works, it looks like it saves its own "bundle.js" file for the video data. I'm not sure how it achieves streaming, but it is not exactly what I'm trying to achieve, which is efficiently streaming an arbitrarily large and expanding file, for example a .ts or .mkv video stream.
So, my question is: is efficient live-streaming of video (i.e. without re-downloading already-downloaded chunks) something that is simply not supported yet and could be added in the future, or is it something that is inherently unachievable using the dat protocol?