In the following T-SQL Code:
wt.clsdt_date >= CAST('1/1/' + CAST(2011 - 1 AS varchar) AS DateTime)
what does '1/1' mean?
(Please see the other answers for improvements to the SQL.)
'1/1/'
is a string literal, so the result of the + is the string
'1/1/<year>'
, e.g.
'1/1/' + CAST(2011 - 1 AS varchar)
-> '1/1/' + '2010'
-> '1/1/2010'
which is then cast (converted, really) to the appropriate type, here DATETIME.
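A caution worth adding (not part of the original answer): how SQL Server parses a slash-separated date string depends on the session's DATEFORMAT setting. '1/1/2010' happens to be safe because month and day are both 1, but the same pattern is ambiguous for other dates. A minimal sketch:

```sql
-- '1/1/2010' parses the same either way, but other dates do not:
SET DATEFORMAT dmy;
SELECT CAST('3/12/2010' AS datetime);  -- interpreted as 3 December 2010

SET DATEFORMAT mdy;
SELECT CAST('3/12/2010' AS datetime);  -- interpreted as 12 March 2010
```

This is why the other answers prefer either DATEFROMPARTS or the unseparated 'yyyymmdd' format.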
I would be surprised if there is not a prettier method to do this.
For SQL Server 2008 and earlier there does not appear to be a particularly nice standard method.
SQL Server 2012 adds DATEFROMPARTS,
as shown in this related SO post:
SELECT DATEFROMPARTS(@Year, @Month, @Day)
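Applied to the expression in the question, a hypothetical rewrite might look like this. DATEFROMPARTS builds a DATE from integer parts, so no string concatenation or regional parsing is involved:

```sql
-- SQL Server 2012+: January 1 of the prior year, built from integers
SELECT DATEFROMPARTS(2011 - 1, 1, 1);  -- 2010-01-01
```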
A much safer way to do this is:
WHERE wt.clsdt_date >= CONVERT(CHAR(4), 2011 - 1) + '0101';
You don't need to explicitly cast it to a datetime, but you can:
WHERE wt.clsdt_date >= CONVERT(DATETIME, CONVERT(CHAR(4), 2011 - 1) + '0101');
This uses a non-regional date format and also still makes use of an index on the column, if one exists.
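To illustrate why this format is non-regional (an addition to the answer above): SQL Server always parses the unseparated 'yyyymmdd' form the same way, regardless of the session's DATEFORMAT or language settings, unlike the slash-separated form:

```sql
-- 'yyyymmdd' is unambiguous even under a day-first date format:
SET DATEFORMAT dmy;
SELECT CONVERT(DATETIME, '20100101');  -- still 1 January 2010
```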
This technique is used to evaluate whether or not wt.clsdt_date falls in or after the year 2010. A more readable way to express the same condition is:
YEAR(wt.clsdt_date) >= 2010
Note, however, that wrapping the column in YEAR() prevents SQL Server from using an index on clsdt_date, so the range comparison above is usually preferable for performance.