
I have a byte array that will always be 10 MB, no matter how much data is actually in it. The purpose has to do with the Large Object Heap (more information).

So if a 1 MB file is placed into the byte[], the last nine megs are zeroed out. The problem is that when I upload this to SQL Server, the uploaded file is always 10 MB. I'd much rather upload only the necessary bytes.

I have the size before it's copied into the byte array, so I could trim it if need be. The problem is that I don't know how to do that efficiently without creating a new byte[] on the LOH.
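
For context, the reuse pattern looks roughly like this (a minimal sketch; the class, field, and method names are illustrative, not from my actual code):

using System;
using System.IO;

public static class DocumentBuffer
{
    // One 10 MB array, allocated once, so the LOH only ever sees a single
    // large allocation that gets refilled for each file.
    private static readonly byte[] _buffer = new byte[10 * 1024 * 1024];

    // Copies a file into the shared buffer and returns the number of valid bytes.
    public static int Fill(string path)
    {
        Array.Clear(_buffer, 0, _buffer.Length); // zero out the previous file's leftovers

        using (FileStream fs = File.OpenRead(path))
        {
            int size = 0;
            int read;
            while ((read = fs.Read(_buffer, size, _buffer.Length - size)) > 0)
            {
                size += read;
            }
            return size; // known before the upload, so a trimmed upload is possible
        }
    }
}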

Ideas?

Updated #1

Here is some pseudocode. I've removed most of the unnecessary code. It uses the Microsoft.Practices.EnterpriseLibrary.Data library to access the database. (Legacy code; I can't change it.)

Param[] parametersArray = new Param[10];
// other params
parametersArray[4] = new Param("DocumentData", DbType.Binary, documentToSave.Data);
// other params
DataAccess.ExecuteNonQuery("esMD.proc_WS_UpdateDocument", parametersArray);

public static int ExecuteNonQuery(string spName, params Param[] parameters)
{
    int ret = -1;

    DbCommand storedProcedure = MedicareDatabase.Db.GetStoredProcCommand(spName);
    storedProcedure.CommandTimeout = commandTimeout;

    if (parameters != null)
    {
        foreach (Param parameter in parameters)
        {
            if (parameter != null)
            {
                MedicareDatabase.Db.AddInParameter(storedProcedure, parameter.ParameterName, parameter.DbType, parameter.Value);
            }
        }
    }

    try
    {
        ret = MedicareDatabase.Db.ExecuteNonQuery(storedProcedure);
    }
    finally
    {
        // always release the command, even if the call throws
        if (storedProcedure != null)
        {
            storedProcedure.Dispose();
        }
    }
    return ret;
}

Updated #2

I modified the database call above and also changed how I add the parameters.

// When the real payload size is known, pass it explicitly so that only
// that many bytes of the 10 MB buffer are sent to the server.
if (parameter.Size != null && parameter.Size > 0)
{
    MedicareDatabase.Db.AddParameter(storedProcedure, parameter.ParameterName, DbType.Binary, parameter.Size, ParameterDirection.Input, true, 0, 0, string.Empty, DataRowVersion.Default, parameter.Value);
}
else
{
    MedicareDatabase.Db.AddInParameter(storedProcedure, parameter.ParameterName, parameter.DbType, parameter.Value);
}

This seems to be working for me. Does anyone see any issues with this?

-Chad

  • Show us the code you use to "upload the file to SQL Server". – dtb Feb 07 '12 at 21:53
  • Updated with the database call. – Cyfer13 Feb 07 '12 at 22:02
  • Where did you find the information that it is more memory efficient to use an array larger than you actually need? I can't see anything in the answers to the question you linked that suggests so. On the contrary, the answers recommend using several small arrays over one large one, but that's obviously not applicable in your case. – PHeiberg Feb 07 '12 at 22:25
  • The need came from getting out-of-memory exceptions. If I process too many files at once, my LOH gets fragmented and I can't add new files to it, so it errors out. If I constantly refill an existing array instead, I don't run into that issue. – Cyfer13 Feb 07 '12 at 22:35
  • If you're using SQL Server 2008, [FileStream](http://msdn.microsoft.com/en-us/library/cc716724.aspx) could be an alternative, since it supports seeking and writing from a stream directly. – PHeiberg Feb 07 '12 at 22:35
  • How would you convert a filestream to SQL? – Cyfer13 Feb 08 '12 at 13:27
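
For reference, the SqlFileStream route suggested in the comments above might look roughly like this. This is a sketch under assumptions, not code from the thread: it assumes SQL Server 2008+ with a FILESTREAM-enabled DocumentData column, and the esMD.Documents table name and documentId key are made up.

using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;

static void UploadViaFileStream(SqlConnection connection, Stream source, int documentId)
{
    // FILESTREAM access requires an active transaction.
    using (SqlTransaction tx = connection.BeginTransaction())
    {
        string path;
        byte[] txContext;

        // Ask SQL Server where the row's FILESTREAM data lives and get a
        // transaction context token for direct file access.
        using (var cmd = new SqlCommand(
            "SELECT DocumentData.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() " +
            "FROM esMD.Documents WHERE DocumentId = @id", connection, tx))
        {
            cmd.Parameters.AddWithValue("@id", documentId);
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                reader.Read();
                path = reader.GetString(0);
                txContext = (byte[])reader[1];
            }
        }

        // Stream the data straight into the blob; nothing is materialized as a byte[].
        using (var sqlStream = new SqlFileStream(path, txContext, FileAccess.Write))
        {
            source.CopyTo(sqlStream);
        }

        tx.Commit();
    }
}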

1 Answer


Create a MemoryStream from the byte[] and desired length. Use the stream to construct a SqlBytes instance. Build the parameter using the SqlBytes.
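
A minimal sketch of that approach, using plain SqlClient rather than the question's Enterprise Library wrapper; the stored procedure and parameter names come from the question, while the method shape and connection handling are illustrative:

using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;

static void UploadDocument(SqlConnection connection, byte[] buffer, int actualLength)
{
    // Wrap only the valid region of the reusable buffer; no new byte[] is
    // allocated, and SqlBytes reads just actualLength bytes from the stream.
    using (var stream = new MemoryStream(buffer, 0, actualLength, writable: false))
    using (var command = new SqlCommand("esMD.proc_WS_UpdateDocument", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        // ... other parameters ...
        command.Parameters.Add(new SqlParameter("@DocumentData", SqlDbType.VarBinary, -1)
        {
            Value = new SqlBytes(stream)
        });
        command.ExecuteNonQuery();
    }
}

If the Enterprise Library wrapper can't be bypassed, passing the SqlBytes instance as the value to AddInParameter with DbType.Binary should behave the same way, since the wrapper hands the value through to the underlying SqlParameter.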

As a side comment, your large data should never be materialized as a byte[]; it should live as a stream throughout processing. See Download and Upload images from SQL Server via ASP.Net MVC for an actual example of how you can keep a large object as a stream, never materialized into a large byte[], end-to-end on both the download and upload paths.

Remus Rusanu
  • I'm using iTextSharp to cut the large PDFs into smaller chunks. iTextSharp requires a byte[] (I think). – Cyfer13 Feb 07 '12 at 22:33