
I'm getting the following error:

"The conversion of a char data type to a datetime data type resulted in an out-of-range datetime value."

When I run the SQL query on my desktop machine, it's fine. No errors or anything. However, when I execute the same code on the server that will be running it, it fails.

I've determined this is the section of SQL causing the issue. DateOfRun is a DateTime field.

    ,(SELECT intField
      FROM tableA
      WHERE RowIdentifier = ParentTable.RowIdentifier
        AND DateOfRun = Convert(Varchar(10), ParentTable.OfferOutcomeDateTime, 120)) AS Days

Why would this work on one machine, but not the other?

Matt
  • Why are you converting a `DateTime` field to a `varchar` and then comparing it to another `DateTime`? – Adam Robinson Jun 24 '11 at 03:52
  • Because DateOfRun doesn't have a time, whereas OfferOutcome does. 00:00:00 <> 14:23:22 – Matt Jun 24 '11 at 03:54
  • My guess is he is attempting to shear off the time component – billinkc Jun 24 '11 at 03:54
  • It shouldn't work on either of them. The SQL makes no sense, as Adam mentioned (converting a `DateTime` to a `Varchar(10)` and then comparing to a `DateTime`) - in addition, using `AS Days` in a `WHERE` clause isn't valid SQL AFAIK either. – Ken White Jun 24 '11 at 03:58
  • Converting to varchar is **bad**. Use date math like DateDiff and DateAdd. Or convert to a data type that doesn't have time. Or tell us your DBMS so we can give you specifics. Actually, you shouldn't even be converting, you should use a range expression instead. – ErikE Jun 24 '11 at 04:08
  • @billinkc: If he were using `101`, I would buy that. But `120` gives you the ODBC canonical format (http://msdn.microsoft.com/en-us/library/ms187928.aspx), which includes the time. – Adam Robinson Jun 24 '11 at 04:30
  • @adam: It doesn't matter if you use 101 or 120 – Matt Jun 24 '11 at 04:51
  • @Matt: `120` includes the time. The only reason you aren't getting the time in your case is because the size of your target (`varchar(10)`) is not long enough to hold it. – Adam Robinson Jun 24 '11 at 12:14
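
To illustrate what the last two comments are getting at, here is a minimal sketch (assuming SQL Server, which the `CONVERT` style codes imply; the value is made up) of what styles 101 and 120 actually produce and how `Varchar(10)` chops the time off style 120:

    DECLARE @d datetime;
    SET @d = '20110624 14:23:22';          -- unambiguous yyyymmdd literal

    SELECT CONVERT(varchar(10), @d, 101);  -- '06/24/2011'            (mdy, no time)
    SELECT CONVERT(varchar(10), @d, 120);  -- '2011-06-24'            (time truncated by varchar(10))
    SELECT CONVERT(varchar(19), @d, 120);  -- '2011-06-24 14:23:22'   (full ODBC canonical format)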

4 Answers


My guess would be that the DateTime conversion from the string fails because of different culture (regional) settings on the local and server environments.
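
For what it's worth, here is a minimal sketch (assuming SQL Server; the string and settings are made up) of how the session's date settings can turn the very same string into exactly this error:

    SET DATEFORMAT mdy;                     -- typical for us_english logins
    SELECT CAST('2011-06-24' AS datetime);  -- OK: 24 June 2011

    SET DATEFORMAT dmy;                     -- typical for British and many European language settings
    SELECT CAST('2011-06-24' AS datetime);  -- fails with the out-of-range error,
                                            -- because the 24 is read as a month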

Petr Abdulin

That's definitely odd and likely to do with regional settings, but it's like the old joke:

A man goes to a doctor's office. He says, "Doctor, it hurts when I raise my arm over my head." The doctor replies, "Then don't raise your arm over your head."

So don't do that. Do this instead:

    WHERE RowIdentifier = ParentTable.RowIdentifier
      AND DateOfRun =
          DATEADD(DAY, DATEDIFF(DAY, 0, ParentTable.OfferOutcomeDateTime), 0)

It's the best way to go; see Thomas' answer to "Most efficient way in SQL Server to get date from date+time?"

Then it will work regardless of regional settings, because the date never gets represented as a string.
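
A quick sketch of what that expression produces (the timestamp is made up):

    DECLARE @t datetime;
    SET @t = '20110624 14:23:22';                  -- unambiguous yyyymmdd literal

    SELECT DATEADD(DAY, DATEDIFF(DAY, 0, @t), 0);  -- 2011-06-24 00:00:00.000

`DATEDIFF(DAY, 0, @t)` counts whole days since the base date (1900-01-01), and adding that many days back onto the base date lands on the same calendar day at midnight, so it compares directly against `DateOfRun`.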

Conrad Frix

You don't mention your flavour of SQL, but the most likely candidate is a discrepancy between the input format of the char field and the system's locale settings.

"3/30/2011" is the 30th of March in the US, but makes no sense in the UK. "30/3/2011" is the 30th of March in the UK, but makes no sense in the US.

Neville Kuyt

This would work on one machine and not the other if the data is different on the different machines.

It would also not work if your regional settings are different.

You should do something like SET DATEFORMAT dmy to set the format you want to use.
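
For example, a small sketch (assuming SQL Server):

    SET DATEFORMAT ymd;                     -- pin the date order for this session
    SELECT CAST('2011-06-24' AS datetime);  -- now parsed the same way on every machine: 24 June 2011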

You could also convert/cast your date to local settings before you convert it to a varchar - http://msdn.microsoft.com/en-us/library/ms187928.aspx

SqlSandwiches