I have a UWP application that lets me select videos from a local directory and then creates a copy of the selected video in the application data folder. It works fine, but if I try to copy a file larger than 2 GB the application crashes with the following error:
The arithmetic operation resulted in an overflow.
After doing some reading, I found the following information:
No array in .NET can contain more than 2^31 elements (System.Int32.MaxValue), which for a byte array works out to a maximum size of roughly 2 GB.
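To illustrate what I understand this to mean (a simplified snippet, not part of my service), sizing a single array from a file larger than 2 GB fails no matter how much RAM is available:

// Simplified illustration of the limit: a single .NET array cannot hold more
// than about int.MaxValue elements, so a byte[] the size of a 3 GB video is
// impossible to allocate.
ulong videoSize = 3UL * 1024 * 1024 * 1024; // e.g. a 3 GB file

// On x86 this throws "The arithmetic operation resulted in an overflow";
// on x64 it throws "Array dimensions exceeded supported range."
byte[] fileBytes = new byte[videoSize];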
I use the following service to save videos:
public class SaveImageService : ISaveImageService
{
    public SaveImageService() { }

    public async Task<byte[]> SaveVideo(Stream stream, string Id, bool OverwriteIfExist = false, FileType FileType = FileType.None)
    {
        if (FileType != FileType.None)
        {
            var inStream = stream.AsRandomAccessStream();
            // One array sized to the whole file - this is the allocation that
            // fails once the video is larger than 2 GB.
            var fileBytes = new byte[inStream.Size];
            using (DataReader reader = new DataReader(inStream))
            {
                await reader.LoadAsync((uint)inStream.Size);
                reader.ReadBytes(fileBytes);
            }
            if (FileType == FileType.Video)
            {
                var videoStoragePath = await Windows.Storage.ApplicationData.Current.LocalFolder.CreateFolderAsync("Videos", Windows.Storage.CreationCollisionOption.OpenIfExists);
                var video = await videoStoragePath.CreateFileAsync(Id + ".mp4", Windows.Storage.CreationCollisionOption.ReplaceExisting);
                await WriteBytes(video, fileBytes);
            }
            return fileBytes;
        }
        return null;
    }

    private async Task WriteBytes(StorageFile file, byte[] fileBytes)
    {
        using (var fs = await file.OpenAsync(Windows.Storage.FileAccessMode.ReadWrite))
        {
            var outStream = fs.GetOutputStreamAt(0);
            var dataWriter = new DataWriter(outStream);
            dataWriter.WriteBytes(fileBytes);
            await dataWriter.StoreAsync();
            dataWriter.DetachStream();
            await outStream.FlushAsync();
            outStream.Dispose();
            fs.Dispose();
        }
    }
}
What should I use instead of the byte array? Should I read the stream piece by piece? I would appreciate an example of how to proceed. Thanks.
Additional Information:
- I don't use File.Copy to copy files because UWP applications run in a sandbox and that method falls under its security restrictions (see the sketch after this list).
- Running the application as x64 does not solve the problem, because a single .NET array with more than roughly 2^31 elements cannot be created regardless of the target platform. When I try to copy a video larger than 2 GB in an x64 build, it shows the following error:
Array dimensions exceeded supported range.
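For completeness, if I still had the StorageFile returned by the picker (instead of only a Stream), I assume the sandbox-friendly equivalent of File.Copy would be StorageFile.CopyAsync, roughly like the sketch below; but my service only receives a Stream, so I'm not sure it applies:

// Hypothetical alternative, assuming the picked StorageFile is still available:
// CopyAsync copies the file inside the sandbox without ever loading it into
// memory, so the 2 GB array limit never comes into play.
private async Task<StorageFile> CopyPickedVideoAsync(StorageFile pickedFile, string id)
{
    var videosFolder = await Windows.Storage.ApplicationData.Current.LocalFolder
        .CreateFolderAsync("Videos", Windows.Storage.CreationCollisionOption.OpenIfExists);

    return await pickedFile.CopyAsync(
        videosFolder, id + ".mp4", Windows.Storage.NameCollisionOption.ReplaceExisting);
}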
In my opinion, what I need is a more efficient way to copy the file progressively, without declaring a byte array for the full file size up front, and that still complies with UWP's security restrictions. The problem is that I don't know how to do this in practice.
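What I have in mind is something like the untested sketch below, where the copy happens in fixed-size chunks through Stream.CopyToAsync instead of one giant array (the method name and the 1 MB buffer size are just placeholders I picked); I just don't know whether this is the right approach under UWP:

// Untested sketch: stream the source into the target file chunk by chunk so
// no buffer ever comes close to the 2 GB array cap.
// (Requires using System.IO; for OpenStreamForWriteAsync.)
public async Task SaveVideoStreamed(Stream stream, string id)
{
    var videosFolder = await Windows.Storage.ApplicationData.Current.LocalFolder
        .CreateFolderAsync("Videos", Windows.Storage.CreationCollisionOption.OpenIfExists);
    var target = await videosFolder.CreateFileAsync(
        id + ".mp4", Windows.Storage.CreationCollisionOption.ReplaceExisting);

    using (var targetStream = await target.OpenStreamForWriteAsync())
    {
        // CopyToAsync keeps only one 1 MB chunk in memory at a time.
        await stream.CopyToAsync(targetStream, 1024 * 1024);
        await targetStream.FlushAsync();
    }
}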
Buffer
I have tried using a buffer, as suggested by a user in the comments, but copying a 3 GB video takes a long time (more than 40 minutes). My computer has an i9 10900K, 32 GB of RAM and an SSD, so hardware is not the problem. Is there a way to optimize the buffer? Here is my adaptation of another Stack Overflow answer:
public async Task SaveMedia(Stream stream, string Id, bool OverwriteIfExist = false, FileType FileType = FileType.None)
{
    if (FileType != FileType.None)
    {
        try
        {
            if (FileType == FileType.Video)
            {
                var newFolder = await Windows.Storage.ApplicationData.Current.LocalFolder.CreateFolderAsync("Videos", Windows.Storage.CreationCollisionOption.OpenIfExists);
                var target = await newFolder.CreateFileAsync(Id + ".mp4", Windows.Storage.CreationCollisionOption.ReplaceExisting);
                var inputStream = stream.AsInputStream();
                using (var srcStream = stream.AsRandomAccessStream())
                using (var targetStream = await target.OpenAsync(FileAccessMode.ReadWrite))
                using (var reader = new DataReader(inputStream))
                {
                    var output = targetStream.GetOutputStreamAt(0);
                    // Load the entire source into the reader, then write it
                    // out again in 512 KB chunks.
                    await reader.LoadAsync((uint)srcStream.Size);
                    while (reader.UnconsumedBufferLength > 0)
                    {
                        uint dataToRead = reader.UnconsumedBufferLength > 1024 * 512
                            ? 1024 * 512
                            : reader.UnconsumedBufferLength;
                        IBuffer buffer = reader.ReadBuffer(dataToRead);
                        await output.WriteAsync(buffer);
                    }
                    await output.FlushAsync();
                }
            }
        }
        catch (Exception ex)
        {
        }
    }
}
Proportionally speaking, the byte array approach performs far better: copying a 1 GB video takes seconds, while the buffered version is vastly slower, so it is not an option unless there is some bottleneck in my code that explains the slowness.
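My guess (and it is only a guess) is that the single reader.LoadAsync((uint)srcStream.Size) call, which pulls the entire file into the DataReader before the loop, is part of the problem. Loading and writing one chunk per iteration, as in the untested sketch below, is what I would try next, but I don't know if that is the real bottleneck:

// Untested variant of the buffered copy: load and write one chunk per
// iteration instead of loading the whole file into the DataReader up front,
// so memory use stays flat at one chunk.
// (Requires using System.IO; Windows.Storage; Windows.Storage.Streams;)
private async Task CopyInChunksAsync(Stream source, StorageFile target)
{
    const uint chunkSize = 1024 * 1024; // 1 MB per round trip (arbitrary choice)

    using (var targetStream = await target.OpenAsync(FileAccessMode.ReadWrite))
    using (var output = targetStream.GetOutputStreamAt(0))
    using (var reader = new DataReader(source.AsInputStream()))
    {
        uint loaded;
        do
        {
            // Pull at most one chunk from the source...
            loaded = await reader.LoadAsync(chunkSize);
            if (loaded > 0)
            {
                // ...and push it straight to the target file.
                IBuffer buffer = reader.ReadBuffer(loaded);
                await output.WriteAsync(buffer);
            }
        } while (loaded > 0);

        await output.FlushAsync();
    }
}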