
I'm looking to get an estimate of the size, in MB, of the data contained in a DataTable. I tried looping through each row and calling Marshal.SizeOf() on every value, but at runtime I get this error:

Type 'System.String' cannot be marshaled as an unmanaged structure; no meaningful size or offset can be computed.

Here is a sample of the code I'm using to get the size:

private static int dsSize(DataSet ds)
{
    int returnValue = 0;
    DataTable table = ds.Tables[ds.Tables.Count - 1];

    foreach (DataRow dr in table.Rows)
    {
        int rowSize = 0;
        for (int i = 0; i < table.Columns.Count; i++)
        {
            rowSize += System.Runtime.InteropServices.Marshal.SizeOf(dr[i]);
        }
        returnValue += rowSize;
    }

    return returnValue;
}

Any ideas on how to get the size of the data?

Thanks in advance.
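For context, the kind of per-value estimate I have in mind would look something like this (a rough sketch of my own, not the failing code above: strings counted by UTF-16 length, common value types by their fixed widths; it approximates data volume, not true memory usage):

```csharp
using System;
using System.Data;

public class SizeEstimate
{
    // Rough size of one cell's data. Marshal.SizeOf fails on strings,
    // so count them by character instead (UTF-16 is 2 bytes per char).
    public static long CellBytes(object value)
    {
        switch (value)
        {
            case null:
            case DBNull _:   return 0;
            case string s:   return 2L * s.Length;
            case bool _:
            case byte _:     return 1;
            case short _:    return 2;
            case int _:
            case float _:    return 4;
            case long _:
            case double _:
            case DateTime _: return 8;
            case decimal _:  return 16;
            case byte[] b:   return b.LongLength;
            default:         return 8; // fallback guess for anything else
        }
    }

    public static long TableBytes(DataTable table)
    {
        long total = 0;
        foreach (DataRow row in table.Rows)
            for (int i = 0; i < table.Columns.Count; i++)
                total += CellBytes(row[i]);
        return total;
    }

    static void Main()
    {
        var t = new DataTable("Demo");
        t.Columns.Add("Id", typeof(int));
        t.Columns.Add("Name", typeof(string));
        t.Rows.Add(1, "abc");             // 4 bytes + 2 * 3 bytes
        Console.WriteLine(TableBytes(t)); // prints 10
    }
}
```

The switch on boxed values avoids Marshal.SizeOf entirely, so non-blittable types like String no longer throw.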

orlando15767
  • [This](https://stackoverflow.com/questions/605621/how-to-get-object-size-in-memory) question should help you. – NtFreX Jul 20 '17 at 19:52
  • Size in what form? Imagine if the datatable contained a million string references, but they were all references to the same string - what would that count as? If you use one form of measurement when you're really trying to work out something else, it could be worse than useless. – Jon Skeet Jul 20 '17 at 19:53
  • Basically I'm trying to find the total size of the data contained in the DataTable. I'm writing the data out as an XML file that needs to be 5MB or less, and will need to split the data into multiple files if it's over the threshold. – orlando15767 Jul 20 '17 at 19:54
  • @orlando15767 Thats a very specific requirement. Wouldn't it be enough to paginate it? – NtFreX Jul 20 '17 at 19:56
  • Normally yes, but within the overarching dataset I have a mix of 29 different DataTables containing a range of data types (int, string, datetime, etc.) and an unknown row count for each. Each time I receive data to be exported, the amount and size can vary greatly. – orlando15767 Jul 20 '17 at 19:59
  • The size of a complex object in memory, and the size of that data output as an XML file can be vastly different. How will knowing the former allow you to obey limits in the latter? – hatchet - done with SOverflow Jul 20 '17 at 19:59
  • My thought process is that if I can limit the records to 2.5MB, then once they're output as XML the data should stay under the 5MB threshold; I'll check the file size before clearing data and moving to the next file. – orlando15767 Jul 20 '17 at 20:02
  • You could save the xml file and then split into chunks of 5mb, might be easier. – WooHoo Jul 20 '17 at 20:02
  • But I need to keep the elements intact, and splitting at 5MB would not guarantee this if I'm just splitting the raw data. It would need to be well formed and pass schema validation to be a valid data file for ingesting into my other system. – orlando15767 Jul 20 '17 at 20:04
  • 1
    I spotted this on codeproject, I've not tested it but may help you. https://www.codeproject.com/Articles/31114/Split-large-XML-files-into-small-files – WooHoo Jul 20 '17 at 20:06
  • The link to the CodeProject article is promising. I don't have it fully functioning yet, but I'm able to split out the files; just working it into my requirements now. Thank you for the link. – orlando15767 Aug 04 '17 at 12:57
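The approach discussed in the comments (measuring the serialized XML rather than in-memory size, and starting a new file before crossing the threshold) could be sketched like this. This is my own untested sketch: `WriteInChunks` and the `baseName.N.xml` naming scheme are invented for illustration, and it assumes no single row serializes larger than the limit.

```csharp
using System;
using System.Data;
using System.IO;

public class XmlSplitter
{
    // Split one table into well-formed XML files, each at most maxBytes.
    // Assumes every individual row serializes smaller than maxBytes.
    public static void WriteInChunks(DataTable source, string baseName, long maxBytes)
    {
        var chunk = source.Clone();          // same schema and name, no rows
        int fileIndex = 0;

        foreach (DataRow row in source.Rows)
        {
            chunk.ImportRow(row);
            if (chunk.Rows.Count > 1 && SerializedBytes(chunk) > maxBytes)
            {
                // The last row pushed us over: remove it, flush, start fresh.
                chunk.Rows.RemoveAt(chunk.Rows.Count - 1);
                Flush(chunk, baseName, fileIndex++);
                chunk = source.Clone();
                chunk.ImportRow(row);
            }
        }
        if (chunk.Rows.Count > 0)
            Flush(chunk, baseName, fileIndex);
    }

    // Measure the actual on-disk XML representation, not in-memory size,
    // since the two can differ greatly.
    static long SerializedBytes(DataTable t)
    {
        using (var ms = new MemoryStream())
        {
            t.WriteXml(ms);
            return ms.Length;
        }
    }

    static void Flush(DataTable t, string baseName, int index)
    {
        t.WriteXml($"{baseName}.{index}.xml");
    }

    static void Main()
    {
        var t = new DataTable("Item");       // WriteXml requires a table name
        t.Columns.Add("Id", typeof(int));
        for (int i = 0; i < 50; i++) t.Rows.Add(i);
        WriteInChunks(t, "demo", 512);       // tiny 512-byte limit for the demo
    }
}
```

Re-serializing the whole chunk after each row is O(n²) in rows per chunk, which is the price of measuring real output size; for large tables you could estimate per-row XML size once and only re-measure near the threshold.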

0 Answers