I am testing how big a collection can be in .NET. Technically, any collection object could grow to the size of the physical memory.
Then I tested the following code on a server with 16 GB of memory, running Windows Server 2003 and Visual Studio…
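A minimal sketch of such a probe (a hypothetical harness, not the original test code): allocate fixed-size chunks into a List<T> until either a cap is reached or the runtime throws.

```csharp
using System;
using System.Collections.Generic;

class CollectionSizeProbe
{
    // Allocates 1 MB chunks until either maxMB is reached or the CLR
    // throws OutOfMemoryException; returns the number of MB allocated.
    public static int ProbeMegabytes(int maxMB)
    {
        var chunks = new List<byte[]>();
        try
        {
            while (chunks.Count < maxMB)
                chunks.Add(new byte[1 << 20]); // 1 MB per chunk
        }
        catch (OutOfMemoryException)
        {
            // Stop at the point the allocator gave up.
        }
        return chunks.Count;
    }

    static void Main()
    {
        Console.WriteLine(ProbeMegabytes(64));
    }
}
```

Note that each chunk is a separate object, so this measures total heap headroom rather than the 2 GB single-object limit.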
I'm using a BinarySerializer with a pretty big (although not very deep) graph of items. I have 8 GB of RAM backed by 12 GB of swap, and I'm getting an OutOfMemoryException when serializing, which is expected (it's possible the graph could go near or…
I have a 64-bit PC with 128 GB of RAM, and I'm using C# and .NET 4.5.
I have the following code:
double[,] m1 = new double[65535, 65535];
long l1 = m1.LongLength;
double[,] m2 = new double[65536, 65536]; // Array dimensions exceeded supported range
long…
In .NET 4.5, gcAllowVeryLargeObjects was introduced to allow arrays greater than 2 GB in size on 64-bit systems. However, arrays were (and still are) limited to ~4.2 billion elements in total and ~2.1 billion in any dimension. Why?
Is there no interest for it…
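The per-dimension cap can be observed directly: since indexing APIs take Int32, no dimension can exceed int.MaxValue regardless of gcAllowVeryLargeObjects, and on modern runtimes (.NET 6+) the actual single-dimension limit is exposed as Array.MaxLength. A small sketch, assuming .NET 6 or later:

```csharp
using System;

class ArrayLimits
{
    static void Main()
    {
        // Since .NET 6, the runtime exposes the per-dimension cap directly.
        // The exact value is implementation-defined but slightly below int.MaxValue.
        Console.WriteLine(Array.MaxLength);

        // Indexing APIs take Int32, so no dimension can exceed this
        // regardless of gcAllowVeryLargeObjects.
        Console.WriteLine(int.MaxValue);
    }
}
```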
After several OutOfMemoryExceptions, I enabled gcAllowVeryLargeObjects, and it works perfectly fine. I am now wondering why it is not a default option in C# (on a 64-bit platform).
Is it for pure compatibility reasons? Or am I missing a major…
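For reference, on .NET Framework the setting lives in the runtime section of the application's config file (this is the documented shape):

```xml
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```

It only takes effect in 64-bit processes, and the Int32 per-dimension limits on arrays still apply.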
I'm using the Array.Copy(Array, Array, Int64) method overload. I have the following static method:
public static T[,] Copy<T>(T[,] a)
where T : new()
{
long n1 = a.GetLongLength(0);
long n2 = a.GetLongLength(1);
T[,] b = new T[n1, n2];
…
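The body presumably continues with the copy itself; the Int64 overload named above takes a long element count, so it avoids truncating large counts to Int32. A self-contained sketch of the completed method (with the generic type parameter spelled out):

```csharp
using System;

public static class ArrayUtil
{
    // Copies a 2-D array using the Array.Copy(Array, Array, Int64) overload,
    // which accepts a long element count and so works with large arrays.
    public static T[,] Copy<T>(T[,] a)
    {
        long n1 = a.GetLongLength(0);
        long n2 = a.GetLongLength(1);
        var b = new T[n1, n2];
        Array.Copy(a, b, a.LongLength); // copies all n1 * n2 elements in one call
        return b;
    }
}
```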
I want to resize an image on my website, but when I use Bitmap to load an image of 14032×19864 pixels (PNG), an OutOfMemoryException is thrown. My compiler configuration is Any CPU.
I was doubting whether the running environment is actually x64.
The code…
I'm unable to understand how to set the gcAllowVeryLargeObjects runtime parameter for a worker role. I set this parameter in app.config, but it doesn't work. As I understand it, I need to somehow configure it in the config file of the worker host.
Update: Final solution based on…
The question is regarding the allocation of arrays in .NET. I have a sample program below in which the largest array I can allocate has a certain length. If I increase that length by 1, it throws an OutOfMemoryException. But if I keep the length and remove the comments, I am…
Given
// r is a System.Data.IDataRecord
var blob = new byte[(r.GetBytes(0, 0, null, 0, int.MaxValue))];
r.GetBytes(0, 0, blob, 0, blob.Length);
and r.GetBytes(...) returns an Int64.
Since Array.zeroCreate and Array.init take an Int32, how do I create an…
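The usual workaround is to validate the Int64 count and then downcast, since managed array lengths are Int32 per dimension. A sketch in C# terms (the helper name is hypothetical; the IDataRecord usage is taken from the snippet above):

```csharp
using System;

public class BlobReader
{
    // Hypothetical helper: turn an Int64 byte count into an allocatable
    // array length, failing loudly instead of silently truncating.
    public static byte[] AllocateBlob(long byteCount)
    {
        if (byteCount < 0 || byteCount > int.MaxValue)
            throw new ArgumentOutOfRangeException(nameof(byteCount),
                "Blob is too large for a single managed array.");
        return new byte[(int)byteCount];
    }
}
```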
I have to operate on two-dimensional arrays of size 34000×34000 elements for math calculations.
The obvious problem is that the CLR cannot store such a big block of data in memory. I tried to use MemoryMappedFile, but it fails too while I'm…
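A minimal sketch of backing an n×n double matrix with a memory-mapped file, so the data never has to fit in the managed heap (the class and file path are assumptions, not code from the question; for the 34000×34000 case the capacity is 34000 × 34000 × 8 bytes ≈ 8.6 GiB):

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class MappedMatrix : IDisposable
{
    readonly MemoryMappedFile file;
    readonly MemoryMappedViewAccessor view;
    readonly int n;

    public MappedMatrix(string path, int n)
    {
        this.n = n;
        long capacity = (long)n * n * sizeof(double);
        // Map name must be null on non-Windows platforms.
        file = MemoryMappedFile.CreateFromFile(path, FileMode.Create, null, capacity);
        view = file.CreateViewAccessor();
    }

    // Row-major addressing: byte offset = (i * n + j) * sizeof(double).
    public double this[int i, int j]
    {
        get => view.ReadDouble(((long)i * n + j) * sizeof(double));
        set => view.Write(((long)i * n + j) * sizeof(double), value);
    }

    public void Dispose() { view.Dispose(); file.Dispose(); }
}
```

Usage: `using var m = new MappedMatrix("matrix.bin", 34000); m[0, 0] = 1.0;` — the OS pages the data in and out on demand.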
I wanted to show a colleague that you can allocate more than 2 GB of RAM, so I made a little test application.
let mega = 1 <<< 20
let konst x y = x
let allocate1MB _ = Array.init mega (konst 0uy)
let memoryHog = Array.Parallel.init 8192…
After a lot of searching, I have unfortunately not found a solution to this problem under .NET 6.0.
My application is running on an x64 Linux server, the dictionary contains just 3,192,915 elements, and I get an OutOfMemoryException.
I have tried…
I want to be able to set gcAllowVeryLargeObjects to true programmatically (or at least via the project properties). I know I can use the App.config file, but this is quite ugly, as it requires creating a separate file next to the main executable, which…