
I'm trying to write a data migration app. There are a few dozen tables with a LOT of columns.

I'm having trouble importing the data from a datatable to the SQL Server table.

When trying to use SqlBulkCopy, I'm getting the error:

The given value of type String from the data source cannot be converted to type image of the specified target column.

I've constructed a test which generates the same error.

Create the table in a Test db.

CREATE TABLE [dbo].[CALTEST]
(
    [ACCOUNTNO] [VARCHAR](20) NULL,
    [NOTES] [IMAGE] NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

Paste the code into a button click event (the test needs using System.Data, System.Data.SqlClient and Newtonsoft.Json):

private void button3_Click(object sender, EventArgs e)
{
    //Convert json to Datatable
    //bulkcopy datatable into CALTEST

    string json =
@"[{
  ""ACCOUNTNO"": ""B1082380914D!J5_8BAR"",
  ""NOTES"": ""WyBDViBTZW50IF0NCkNhbmRpZGF0ZSA6IEFsYmVydCBHYXZpbnMNCkRlc2NyaXB0aW9uIDoNCg0KDQo=""
},
{
  ""ACCOUNTNO"": ""B1082380914D!J5_8BAR"",
  ""NOTES"": ""WyBJbnRlcnZpZXcgXQ0KQ2FuZGlkYXRlIDogQWxiZXJ0IEdhdmlucw0KRGVzY3JpcHRpb24gOg0KDQoNCg==""
},
{
  ""ACCOUNTNO"": ""B1090241471P!-R0?ELI"",
  ""NOTES"": ""WyBDViBTZW50IF0NCkNhbmRpZGF0ZSA6IEFsYmVydCBHYXZpbnMNCkRlc2NyaXB0aW9uIDoNCg0KDQo=""
}]
";

    string connectionString = "Server=SERVERNAME;Database=TEST;Trusted_Connection=True;";

    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();

        DataTable dt = new DataTable();

        using (SqlDataAdapter adapter = new SqlDataAdapter("SELECT TOP 0 * FROM CALTEST", conn))
        {
            adapter.Fill(dt);
        }

        DataTable dtJson = JsonConvert.DeserializeObject<DataTable>(json, new JsonSerializerSettings
        {
            NullValueHandling = NullValueHandling.Ignore,
            MissingMemberHandling = MissingMemberHandling.Ignore
        });

        //foreach (DataRow dr in dtJson.Rows)
        //{
        //  dt.Rows.Add(dr.ItemArray);
        //}

        using (var bulkCopy = new SqlBulkCopy(conn))
        {
             //  dt.Columns.Cast<DataColumn>().ToList().ForEach(x => bulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping(x.ColumnName, x.ColumnName)));
            bulkCopy.DestinationTableName = "CALTEST";
            bulkCopy.BatchSize = 50000;
            bulkCopy.BulkCopyTimeout = 60; //seconds

            bulkCopy.WriteToServer(dt);
        }
    }
}

I'm retrieving the table data structure from the database, which works. The NOTES column is of type image.

But when I bring in the deserialized JSON DataTable, that retrieved schema gets overwritten: the NOTES values come through as plain strings rather than the byte[] the image column expects.

How would I go about setting the datatypes of the fields in the DataTable retrieved from the JSON?
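
For what it's worth, this is roughly the row-by-row conversion I assume I'd need before the bulk copy, given that the NOTES values look like Base64-encoded binary (just a sketch, not tested):

// Untested sketch: copy each deserialized row into the table typed from the
// database schema, converting NOTES from a Base64 string to a byte[] so it
// matches the image column.
foreach (DataRow src in dtJson.Rows)
{
    DataRow dest = dt.NewRow();
    dest["ACCOUNTNO"] = src["ACCOUNTNO"];

    string notes = src["NOTES"] as string;
    dest["NOTES"] = notes == null ? (object)DBNull.Value : Convert.FromBase64String(notes);

    dt.Rows.Add(dest);
}

But with a few dozen tables and a lot of columns, hand-coding a conversion like that for every table doesn't seem practical.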

Should I give up on SqlBulkCopy and generate insert queries for each record?
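
If I did go down that road, I imagine it would end up looking roughly like this (again only a sketch, reusing the Base64-to-byte[] assumption from above):

// Sketch of the per-record alternative: one parameterized INSERT per row.
// SqlDbType.Image maps a byte[] parameter to the image column.
using (var cmd = new SqlCommand("INSERT INTO CALTEST (ACCOUNTNO, NOTES) VALUES (@acct, @notes)", conn))
{
    cmd.Parameters.Add("@acct", SqlDbType.VarChar, 20);
    cmd.Parameters.Add("@notes", SqlDbType.Image);

    foreach (DataRow src in dtJson.Rows)
    {
        cmd.Parameters["@acct"].Value = src["ACCOUNTNO"];
        string notes = src["NOTES"] as string;
        cmd.Parameters["@notes"].Value = notes == null ? (object)DBNull.Value : Convert.FromBase64String(notes);
        cmd.ExecuteNonQuery();
    }
}

I'd expect that to be a lot slower than SqlBulkCopy on the bigger tables, though.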

What is the best way of getting json data into a SQL Server table?

Thanks.

  • Have you checked this [https://stackoverflow.com/questions/34485420/how-do-you-put-an-image-file-in-a-json-object](https://stackoverflow.com/questions/34485420/how-do-you-put-an-image-file-in-a-json-object) – Mdyahiya Nov 29 '18 at 02:24
  • `ntext`, `text`, and `image` data types will be removed in a future version of SQL Server. Avoid using these data types in new development work, and plan to modify applications that currently use them. Use `nvarchar(max)`, `varchar(max)`, and `varbinary(max)` instead. [See details here](http://msdn.microsoft.com/en-us/library/ms187993.aspx) – marc_s Nov 29 '18 at 05:21
  • I don't understand why you would want to use a **binary** datatype (`image` - or preferably `VARBINARY(MAX)`) for your JSON data?? JSON is a **text-based** format - not binary. SQL Server 2016 and up uses `NVARCHAR(MAX)` (a **text-based** datatype) for its JSON data - you should, too! – marc_s Nov 29 '18 at 05:22
  • Thanks for the reply, Marc. Unfortunately I don't have control of the data types used. It's part of an established database. I don't understand why that datatype was used either. I'm just trying to work with what's there. – Don Nov 29 '18 at 06:01

0 Answers