139

For any arbitrary instance (a collection of different objects, a composition, a single object, etc.), how can I determine its size in bytes?

(I currently have a collection of various objects and I'm trying to determine its aggregate size.)

EDIT: Has someone written an extension method for Object that could do this? That'd be pretty neat imo.

Janie
  • 3
    possible duplicate of [Getting the size of a field in bytes with C#](http://stackoverflow.com/questions/207592/getting-the-size-of-a-field-in-bytes-with-c-sharp) – AxelEckenberger Jan 11 '13 at 22:27

17 Answers

63

First of all, a warning: what follows is strictly in the realm of ugly, undocumented hacks. Do not rely on this working - even if it works for you now, it may stop working tomorrow, with any minor or major .NET update.

You can use the information from this article on CLR internals: MSDN Magazine Issue 2005 May - Drill Into .NET Framework Internals to See How the CLR Creates Runtime Objects - last I checked, it was still applicable. Here's how it's done (it retrieves the internal "Basic Instance Size" field via the TypeHandle of the type):

object obj = new List<int>(); // whatever you want to get the size of
RuntimeTypeHandle th = obj.GetType().TypeHandle;
int size = *(*(int**)&th + 1);
Console.WriteLine(size);

This works on 3.5 SP1 32-bit. I'm not sure if field sizes are the same on 64-bit - you might have to adjust the types and/or offsets if they are not.

This will work for all "normal" types, for which all instances have the same well-defined size. Those for which this isn't true include arrays and strings for sure, and I believe also StringBuilder. For them, you'll have to add the size of all contained elements to the base instance size.

tibx
Pavel Minaev
  • No. There's no "proper" way to do this, because it is not something a well-behaved .NET application should be concerned of in the first place. The above mucks directly with _internal_ data structures of a _particular implementation_ of CLR (which may easily change in next version of .NET, for example). – Pavel Minaev Mar 18 '10 at 18:46
  • 3
    is this supposed to work in C# or only managed c++? it's not happy in C# so far that I've tried it: `Cannot take the address of, get the size of, or declare a pointer to a managed type ('System.RuntimeTypeHandle')` – Maslow Apr 03 '14 at 17:19
  • The code is C#, but it looks like something has changed since I last wrote it - it compiled fine with C# 3 and .NET 3.5. You can probably still use unions (`LayoutKind.Explicit`) to "cast" the handle to a pointer and do arithmetic on it. – Pavel Minaev Apr 22 '14 at 09:18
  • 23
    The .NET 4 version of this doesn't even need unsafe code: `Marshal.ReadInt32(type.TypeHandle.Value, 4)` works for x86 and x64. I only tested struct and class types. Keep in mind that this returns the *boxed* size for value types. @Pavel Maybe you could update your answer. – jnm2 Mar 01 '15 at 01:52
  • @jnm2 I must be missing something obvious; your code doesn't compile for me. `The name 'type' doesn't exist in this context` is the error I get. I'm using .NET 4.5, however. – sab669 Feb 01 '17 at 14:18
  • 2
    @sab669 well, replace `type` with `obj.GetType()` in his example. It doesn't matter which framework you're using, only what CLR (v2 or v4 or CoreCLR). I haven't tried this on CoreCLR. – jnm2 Feb 01 '17 at 15:42
  • 1
    Does this answer actually match the intent of the question? I tested @jnm2's code, and it returns the size of the Type only (i.e. the sum of the sizes of all the reference handles, and perhaps also the size of pointers to methods?). But it does not include the size of the data in each of the object's fields. I would assume that what OP wants is to know the aggregate size of the object, including the size of sub-objects owned by the root? – Sam Goldberg Mar 09 '17 at 17:52
  • @SamGoldberg When I tested this on desktop CLR v4, it returned the size of all the class's fields plus the size of a single pointer to the type information. It does not contain the size of the method pointers; they are shared by all instances of that type and you follow the type pointer to get to them. – jnm2 Mar 09 '17 at 17:56
  • @SamGoldberg it aggregates the size of the class fields without following any references. Following references to subobjects would not make sense. You could have cyclical references. Of course, structs are embedded directly in the class so the size of nested struct fields is always the aggregate of the struct sizes. – jnm2 Mar 09 '17 at 17:58
  • @jnm2: I reached this post because I was looking to measure whether it would make a significant difference in my app to share one instance of the same object or to let each object have its own separate instance (i.e. a new object every time). The code to share a single instance would be much more than to just create a new one each time. (The instance(s) are long-lived.) So just getting the sizeof(Type) doesn't help evaluate that much, because an object owning a lot of sub-objects takes up much more space than an object with many references to primitives. – Sam Goldberg Mar 09 '17 at 18:08
  • 2
    @SamGoldberg Calculating this manually is a lot of work with a million edge cases. Sizeof tells you the static size of an object, not the memory consumption of a runtime graph of objects. VS2017's memory and CPU profiling are very good, as are ReSharper's and other tools, and that's what I'd use to measure. – jnm2 Mar 09 '17 at 18:23
  • Did not work for me when using it with a list of objects – Arvind Krmar Apr 23 '20 at 13:20
  • @PavelMinaev "it is not something a well-behaved .NET application should be concerned of in the first place". `Microsoft.Extensions.Caching.Memory.MemoryCache` requires a size to operate in some cases. – Little Endian Nov 17 '20 at 20:27
21

This doesn't directly answer the question, but for those who are interested in investigating object sizes while debugging:

  1. Start debugging in VS, make sure the Diagnostics Tools window is shown (Debug > Windows > Show Diagnostic Tools)
  2. Set a breakpoint (optional)
  3. Click Take Snapshot in the Memory Usage tab while paused
  4. Explore the snapshot (optionally sort the object list alphabetically to find the type you're interested in)


Steak Overflow
Alex from Jitbit
20

If you're working with serializable objects, you may be able to approximate the size by serializing them with a binary serializer, routing the output to oblivion.

class Program
{
    static void Main(string[] args)
    {
        A parent;
        parent = new A(1, "Mike");
        parent.AddChild("Greg");
        parent.AddChild("Peter");
        parent.AddChild("Bobby");

        System.Runtime.Serialization.Formatters.Binary.BinaryFormatter bf =
           new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
        SerializationSizer ss = new SerializationSizer();
        bf.Serialize(ss, parent);
        Console.WriteLine("Size of serialized object is {0}", ss.Length);
    }
}

[Serializable()]
class A
{
    int id;
    string name;
    List<B> children;
    public A(int id, string name)
    {
        this.id = id;
        this.name = name;
        children = new List<B>();
    }

    public B AddChild(string name)
    {
        B newItem = new B(this, name);
        children.Add(newItem);
        return newItem;
    }
}

[Serializable()]
class B
{
    A parent;
    string name;
    public B(A parent, string name)
    {
        this.parent = parent;
        this.name = name;
    }
}

class SerializationSizer : System.IO.Stream
{
    private int totalSize;
    public override void Write(byte[] buffer, int offset, int count)
    {
        this.totalSize += count;
    }

    public override bool CanRead
    {
        get { return false; }
    }

    public override bool CanSeek
    {
        get { return false; }
    }

    public override bool CanWrite
    {
        get { return true; }
    }

    public override void Flush()
    {
        // Nothing to do
    }

    public override long Length
    {
        get { return totalSize; }
    }

    public override long Position
    {
        get
        {
            throw new NotImplementedException();
        }
        set
        {
            throw new NotImplementedException();
        }
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        throw new NotImplementedException();
    }

    public override long Seek(long offset, System.IO.SeekOrigin origin)
    {
        throw new NotImplementedException();
    }

    public override void SetLength(long value)
    {
        throw new NotImplementedException();
    }
}
BlueMonkMN
  • 7
    Of course, this can get you a minimum size, but tells you nothing about the size in memory. – John Saunders Jul 14 '09 at 23:04
  • Lol, the next lightbulb I had before coming back to check the replies was using the binary serializer. John, how would this not give you the actual size in memory? – Janie Jul 14 '09 at 23:17
  • 2
    It would give you the serialized size, which will be the size the serializer wanted it, for "serializer" purposes. Those are likely different from the "sit-in-memory" purposes. Maybe serializer stores smaller integers in three bytes, for instance. – John Saunders Jul 14 '09 at 23:34
  • 5
    Like I said, it's only an approximation. It's not perfect, but I would disagree that it tells you "nothing" about the size in memory. I would say that it gives you *some* idea -- larger serializations would generally be correlated with larger in-memory sizes. There is *some* relationship. – BlueMonkMN Jul 15 '09 at 10:46
  • I agree - it's useful to get a ballpark estimate of the size of a .NET object graph. – Craig Shearer Jan 08 '11 at 21:25
13

For unmanaged types, a.k.a. value types (structs):

Marshal.SizeOf(object);

For managed objects, the closest I got is an approximation:

long start_mem = GC.GetTotalMemory(true);

aclass[] array = new aclass[1000000];
for (int n = 0; n < 1000000; n++)
    array[n] = new aclass();

double used_mem_median = (GC.GetTotalMemory(false) - start_mem)/1000000D;

Do not use serialization. A binary formatter adds headers so that you can change your class and still load an old serialized file into the modified class.

Also, it won't tell you the real size in memory, nor will it take memory alignment into account.

[Edit] By using BitConverter.GetBytes(propValue) recursively on every property of your class, you would get the contents in bytes. That doesn't count the weight of the class itself or its references, but it is much closer to reality. If size matters, I would recommend using a byte array for the data and an unmanaged proxy class that accesses the values via pointer casting. Note that this means unaligned memory, so on old computers it will be slow; but for HUGE datasets in modern RAM it should be considerably faster, since minimizing the amount of data read from RAM has a bigger impact than the unaligned access.

The_Black_Smurf
Aridane Álamo
7

A safe solution, with some optimizations, is the CyberSaving/MemoryUsage code. Some example cases:

/* test nullable type */      
TestSize<int?>.SizeOf(null); //-> 4 B

/* test StringBuilder */    
StringBuilder sb = new StringBuilder();
for (int i = 0; i < 100; i++) sb.Append("わたしわたしわたしわ");
TestSize<StringBuilder>.SizeOf(sb); //-> 3132 B

/* test Simple array */    
TestSize<int[]>.SizeOf(new int[100]); //-> 400 B

/* test Empty List<int>*/    
var list = new List<int>();  
TestSize<List<int>>.SizeOf(list); //-> 205 B

/* test List<int> with 100 items*/
for (int i = 0; i < 100; i++) list.Add(i);
TestSize<List<int>>.SizeOf(list); //-> 717 B

It works also with classes:

class twostring
{
    public string a { get; set; }
    public string b { get; set; }
}
TestSize<twostring>.SizeOf(new twostring() { a = "0123456789", b = "0123456789" }); //-> 28 B
AlexPalla
  • This is the approach I would take, too. You could add a set of previously encountered objects in a graph to avoid a) infinite recursion and b) avoid adding the same memory twice. – mafu Mar 28 '18 at 17:08
5

This doesn't apply to the current .NET implementation, but one thing to keep in mind with garbage collected/managed runtimes is the allocated size of an object can change throughout the lifetime of the program. For example, some generational garbage collectors (such as the Generational/Ulterior Reference Counting Hybrid collector) only need to store certain information after an object is moved from the nursery to the mature space.

This makes it impossible to create a reliable, generic API to expose the object size.

Sam Harwell
  • Interesting. So what do people do to dynamically determine the size of their objects/collections of objects? – Janie Jul 14 '09 at 22:42
  • 2
    It depends on what they need it for. If for P/Invoke (native code interop), they use Marshal.SizeOf(typeof(T)). If for memory profiling, they use a separate profiler that cooperates with the execution environment to provide the information. If you are interested in the element alignment in an array, you can use the SizeOf IL opcode in a DynamicMethod (I don't think there's an easier way in the .NET framework for this). – Sam Harwell Jul 14 '09 at 23:04
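The `SizeOf` IL opcode approach mentioned in the last comment can be sketched like this (a minimal sketch, not production code; `IlSize` is a made-up helper name; note that for reference types `sizeof` reports the size of the reference itself, not the instance):

```csharp
using System;
using System.Reflection.Emit;

static class IlSize
{
    // Builds a tiny dynamic method whose body is just "sizeof <type>; ret".
    public static int Of(Type t)
    {
        var dm = new DynamicMethod("SizeOf_" + t.Name, typeof(int), Type.EmptyTypes);
        ILGenerator il = dm.GetILGenerator();
        il.Emit(OpCodes.Sizeof, t); // the sizeof opcode accepts any type token
        il.Emit(OpCodes.Ret);
        return (int)dm.Invoke(null, null);
    }
}

// IlSize.Of(typeof(int))    -> 4
// IlSize.Of(typeof(object)) -> IntPtr.Size (size of a reference)
```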
4

This is impossible to do at runtime.

There are various memory profilers that display object size, though.

EDIT: You could write a second program that profiles the first one using the CLR Profiling API and communicates with it through remoting or something.

SLaks
  • 22
    If it's impossible to do at runtime, how are the memory profilers providing the information? – Janie Jul 14 '09 at 22:13
  • 2
    By using the Profiling API. However, a program cannot profile itself – SLaks Jul 14 '09 at 22:18
  • Interesting. What if I wanted to have the code deal with cases when objects were consuming too much memory? – Janie Jul 14 '09 at 22:26
  • 4
    Then you'd be dealing with self-aware software, and I'd be very afraid. :-) Seriously, "single responsibility principle" - let the program be the program, let some other piece of code watch for objects taking up too much memory. – John Saunders Jul 14 '09 at 22:41
  • 2
    @Janie: you would also be making assumptions about the significance of the size and how it relates to performance. I think you'd want to be a real low-level CLR performance expert (the kind who already knows about the Profiling API) before you do that. Otherwise, you might be applying your earlier experiences to a situation in which they do not apply. – John Saunders Jul 14 '09 at 23:06
3

Use Son of Strike (SOS), which has an ObjSize command.

Note that the actual memory consumed is always larger than what ObjSize reports, due to a syncblk that resides directly before the object data.

Read more about both here: MSDN Magazine Issue 2005 May - Drill Into .NET Framework Internals to See How the CLR Creates Runtime Objects.
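For example, attached to a process in WinDbg with SOS loaded, a typical session looks roughly like this (the type name is a placeholder, and `<address>` comes from the DumpHeap output):

```
0:000> .loadby sos clr
0:000> !dumpheap -type MyNamespace.MyType
0:000> !objsize <address>
```

ObjSize reports the size of the whole object graph reachable from that address, which is often what people actually want when they ask for an object's "size".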

tibx
weston
3

For anyone looking for a solution that doesn't require [Serializable] classes, and where the result is an approximation rather than exact science: the best method I could find is JSON serialization into a MemoryStream using UTF32 encoding.

private static long? GetSizeOfObjectInBytes(object item)
{
    if (item == null) return 0;
    try
    {
        // hackish solution to get an approximation of the size
        var jsonSerializerSettings = new JsonSerializerSettings
        {
            DateFormatHandling = DateFormatHandling.IsoDateFormat,
            DateTimeZoneHandling = DateTimeZoneHandling.Utc,
            MaxDepth = 10,
            ReferenceLoopHandling = ReferenceLoopHandling.Ignore
        };
        var formatter = new JsonMediaTypeFormatter { SerializerSettings = jsonSerializerSettings };
        using (var stream = new MemoryStream()) { 
            formatter.WriteToStream(item.GetType(), item, stream, Encoding.UTF32);
            return stream.Length / 4; // 32 bits per character = 4 bytes per character
        }
    }
    catch (Exception)
    {
        return null;
    }
}

No, this won't give you the exact size that would be used in memory. As previously mentioned, that is not possible. But it'll give you a rough estimation.

Note that this is also pretty slow.

Peter
2

AFAIK, you cannot without actually deep-counting the size of each member in bytes. But then again, does the size of a member (like an element inside a collection) count towards the size of the object, or does only a pointer to that member count? It depends on how you define it.

I have run into this situation before where I wanted to limit the objects in my cache based on the memory they consumed.

Well, if there is some trick to do that, I'd be delighted to know about it!

Charles Prakash Dasari
2

For value types, you can use Marshal.SizeOf. Of course, it returns the number of bytes required to marshal the structure in unmanaged memory, which is not necessarily what the CLR uses.
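For example (a quick sketch; `Point3` is a made-up struct, and remember the result reflects the marshaled layout, not necessarily the CLR's in-memory layout):

```csharp
using System;
using System.Runtime.InteropServices;

struct Point3
{
    public double X, Y, Z;
}

class Program
{
    static void Main()
    {
        // Marshaled (unmanaged) size: 3 doubles = 24 bytes here.
        Console.WriteLine(Marshal.SizeOf(typeof(Point3))); // 24
        // On newer frameworks, prefer the generic overload:
        // Console.WriteLine(Marshal.SizeOf<Point3>());
    }
}
```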

Mehrdad Afshari
  • SizeOf(Object) may be unavailable in future releases. Instead, use SizeOf(). For more info, go to http://go.microsoft.com/fwlink/?LinkID=296514 – Vinigas Aug 26 '19 at 08:02
2

I have created benchmark test for different collections in .NET: https://github.com/scholtz/TestDotNetCollectionsMemoryAllocation

Results are as follows for .NET Core 2.2 with 1,000,000 of objects with 3 properties allocated:

Testing with string: 1234567
Hashtable<TestObject>:                                     184 672 704 B
Hashtable<TestObjectRef>:                                  136 668 560 B
Dictionary<int, TestObject>:                               171 448 160 B
Dictionary<int, TestObjectRef>:                            123 445 472 B
ConcurrentDictionary<int, TestObject>:                     200 020 440 B
ConcurrentDictionary<int, TestObjectRef>:                  152 026 208 B
HashSet<TestObject>:                                       149 893 216 B
HashSet<TestObjectRef>:                                    101 894 384 B
ConcurrentBag<TestObject>:                                 112 783 256 B
ConcurrentBag<TestObjectRef>:                               64 777 632 B
Queue<TestObject>:                                         112 777 736 B
Queue<TestObjectRef>:                                       64 780 680 B
ConcurrentQueue<TestObject>:                               112 784 136 B
ConcurrentQueue<TestObjectRef>:                             64 783 536 B
ConcurrentStack<TestObject>:                               128 005 072 B
ConcurrentStack<TestObjectRef>:                             80 004 632 B

For memory tests, I found the best method to be:

GC.GetAllocatedBytesForCurrentThread()
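A minimal sketch of how that API can be used (available on .NET Core / modern .NET; the delta is approximate and includes allocation overhead such as object headers):

```csharp
using System;

class AllocProbe
{
    static void Main()
    {
        long before = GC.GetAllocatedBytesForCurrentThread();

        var data = new byte[100_000]; // the allocation under test

        long after = GC.GetAllocatedBytesForCurrentThread();
        // Roughly the array payload plus the array object's header overhead.
        Console.WriteLine($"Allocated ~{after - before} bytes");
        GC.KeepAlive(data);
    }
}
```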
Scholtz
2

For arrays of structs/values, I got different results with:

first = Marshal.UnsafeAddrOfPinnedArrayElement(array, 0).ToInt64();
second = Marshal.UnsafeAddrOfPinnedArrayElement(array, 1).ToInt64();
arrayElementSize = second - first;

(oversimplified example)

Whatever the approach, you really need to understand how .NET works to correctly interpret the results. For instance, the returned element size is the "aligned" element size, with some padding. The overhead, and thus the size, differs depending on the usage of a type: "boxed" on the GC heap, on the stack, as a field, or as an array element.

(I wanted to know the memory impact of using "dummy" empty structs (without any fields) to mimic "optional" arguments of generics. Running tests with different layouts involving empty structs, I can see that an empty struct uses (at least) 1 byte per element; I vaguely remember this is because .NET needs a distinct address for each field, which wouldn't work if a field really were empty/0-sized.)

haldo
Luc Rogge
1

You can use reflection to gather all the public member or property information (given the object's type). There is no way to determine the size without walking through each individual piece of data on the object, though.
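A rough sketch of such a walk (illustrative only: it ignores object headers, padding, and private fields declared on base classes, uses the marshaled size for primitives, and assumes .NET 5+ for `ReferenceEqualityComparer`):

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;
using System.Runtime.InteropServices;

static class ApproxSize
{
    // Very rough data-size estimate for an object graph, via reflection.
    public static long Of(object obj, HashSet<object> visited = null)
    {
        if (obj == null) return 0;
        visited ??= new HashSet<object>(ReferenceEqualityComparer.Instance);

        Type t = obj.GetType();
        if (t.IsPrimitive) return Marshal.SizeOf(t);      // approximate (e.g. bool -> 4)
        if (obj is string s) return sizeof(char) * (long)s.Length;
        if (!visited.Add(obj)) return 0;                  // already counted, or a cycle

        long total = 0;
        if (t.IsArray)
        {
            foreach (object element in (Array)obj)
                total += Of(element, visited);
            return total;
        }

        // Note: this misses private fields declared on base classes.
        foreach (FieldInfo f in t.GetFields(
            BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic))
        {
            total += Of(f.GetValue(obj), visited);
        }
        return total;
    }
}

// Example: ApproxSize.Of("hello") -> 10 (UTF-16 payload only)
```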

Charlie
1

From Pavel and jnm2:

private int DumpApproximateObjectSize(object toWeight)
{
   return Marshal.ReadInt32(toWeight.GetType().TypeHandle.Value, 4);
}

On a side note, be careful: it only works with contiguous-memory objects.

Antonin GAVREL
1

Simplest way is: int size = *((int*)type.TypeHandle.Value + 1)

I know this is an implementation detail, but the GC relies on it, and the value needs to stay close to the start of the method table for efficiency. Plus, considering how complex the GC code is, nobody will dare to change it in the future. In fact, it works for every minor/major version of .NET Framework and .NET Core. (I'm currently unable to test 1.0.)
If you want a more reliable way, emit a struct in a dynamic assembly with [StructLayout(LayoutKind.Auto)] and exactly the same fields in the same order, then take its size with the sizeof IL instruction. You may want to emit a static method within the struct which simply returns this value. Then add 2 * IntPtr.Size for the object header. This should give you the exact value.
But if your class derives from another class, you need to find the size of each base class separately and add them up, plus 2 * IntPtr.Size again for the header. You can do this by getting fields with the BindingFlags.DeclaredOnly flag.
For arrays and strings, you just add length * element size to that base size. For the cumulative size of aggregate objects, you need to implement a more sophisticated solution that involves visiting every field and inspecting its contents.

TakeMeAsAGuest
1

For anyone looking for a rough approximation to compare the sizes of disparate object graphs/collections, just serialize them to JSON, for example:

Console.WriteLine($"Size1:\t{JsonConvert.SerializeObject(someBusyObject).Length}");
Console.WriteLine($"Size2:\t{JsonConvert.SerializeObject(someOtherObject).Length}");

In my case I have a bunch of IEnumerable's being pulled during a login I'm benchmarking, and I just wanted to roughly size them to see their relative weight.

They're expensive operations and won't give you direct heap allocation size or anything like that, but it was good enough for my use case and was readily available.

user326608
  • 1
    Exactly what I was looking for -- measuring my objects to determine cookie size estimates, and this is much simpler than the other methods here. Plus it's a one-liner and is easily used in the C# Interactive / Immediate window! – Rider Harrison Feb 22 '23 at 19:16