
I have this SQL query:

  SELECT DISTINCT 
         [BatchCode]
         ,SUM([Quantity]) as 'Created'
         ,[TotalQuantity]
         ,[Status]
         ,[Destination]
         ,[DateCreated]
         ,[CreatedBy]
    FROM [FGIS].[dbo].[DropshipPackinglist]
GROUP BY BatchCode, TotalQuantity, Status, Destination, CreatedBy, ModifiedBy, DateCreated

The result is this:

BatchCode               Created   TotalQuantity   Status     Destination        DateCreated               CreatedBy
---------------------------------------------------------------------------------------------------------------
0005041007100AHWA11HG   86        86              CREATED    MediTelecom S.A.   2010-09-10 00:00:00.000     NULL
0005041007100AHWA11HGK  19        50              ALLOCATED  USA                2010-09-12 07:35:17.000     jy
0005041007100AHWA11HGK  31        50              ALLOCATED  USA                2010-09-12 07:35:20.000     jy

My problem now is that I can't group by DateCreated because the time portions differ.

I want to group it by date only. Example: 2010-09-12

Thanks and regards...

– Crimsonland

2 Answers


Use CAST or CONVERT to alter the DATETIME format so the time portion is omitted:

  SELECT [BatchCode],
         SUM([Quantity]) as 'Created',
         [TotalQuantity],
         [Status],
         [Destination],
         CONVERT(VARCHAR(10), [DateCreated], 101) AS datecreated,
         [CreatedBy]
    FROM [FGIS].[dbo].[DropshipPackinglist]
GROUP BY BatchCode, 
         TotalQuantity, 
         Status, 
         Destination, 
         CreatedBy, 
         ModifiedBy, 
         CONVERT(VARCHAR(10), [DateCreated], 101)
– OMG Ponies

  • I prefer `CONVERT(VARCHAR(8), [DateCreated], 112)` because then you can sort by it as well as group by it. Of course, on SQL Server 2008 you can use `CONVERT(date, [DateCreated])` and not worry about the string conversion (see the sketch after these comments). – Gabe Sep 12 '10 at 04:28
  • @Gabe: You're still sorting by VARCHAR using 112 – OMG Ponies Sep 12 '10 at 04:34
  • 1
    OMG Ponies: I was just saying that if you use `112` then you get a string where sorting in lexicographical order also sorts in chronological order. Sorting on the `101` converted string will not give you results in chronological order, so as a general rule `112` works better than `101`. – Gabe Sep 12 '10 at 04:52
  • 1
    Using char conversions to chop the time off dates is **meaningfully** slower than using `DateAdd(Day, DateDiff(Day, 0, DateCreated), 0)` – ErikE Sep 12 '10 at 05:31
  • Please see [this post for more detail](http://stackoverflow.com/questions/2775/whats-the-best-way-to-remove-the-time-portion-of-a-datetime-value-sql-server/3696991#3696991) on my claims about conversion performance. – ErikE Sep 13 '10 at 00:59
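
As Gabe's comment notes, on SQL Server 2008 and later you can convert to the `date` type and avoid string handling entirely. A minimal sketch of that approach against the question's table (the alias `DayCreated` is just illustrative):

  SELECT [BatchCode],
         SUM([Quantity]) AS [Created],
         CAST([DateCreated] AS date) AS [DayCreated]  -- the date type has no time portion
    FROM [FGIS].[dbo].[DropshipPackinglist]
GROUP BY [BatchCode],
         CAST([DateCreated] AS date)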

I guess it's worth posting this separately:

Using char conversions to chop the time off dates (cast or convert to varchar) is slower than using `DateAdd(Day, DateDiff(Day, 0, DateCreated), 0)`. I worked up a full script and performance-testing results to support this assertion.

SELECT DISTINCT 
   BatchCode
   ,SUM(Quantity) as Created
   ,TotalQuantity
   ,Status
   ,Destination
   ,DateAdd(Day, DateDiff(Day, 0, DateCreated), 0) as DayCreated
   ,CreatedBy
FROM FGIS.dbo.DropshipPackinglist
GROUP BY
   BatchCode,
   TotalQuantity,
   Status,
   Destination,
   CreatedBy,
   ModifiedBy,
   DateDiff(Day, 0, DateCreated) -- note that the DateAdd convert back to datetime is not needed

Also, please note that your GROUP BY list is not the same as your SELECT list, so some tweaking is needed.
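
For instance, a minimal sketch of one such tweak, assuming `ModifiedBy` doesn't belong in the grouping since it is never selected (and dropping the `DISTINCT`, which is redundant once you GROUP BY):

SELECT
   BatchCode
   ,SUM(Quantity) as Created
   ,TotalQuantity
   ,Status
   ,Destination
   ,DateAdd(Day, DateDiff(Day, 0, DateCreated), 0) as DayCreated
   ,CreatedBy
FROM FGIS.dbo.DropshipPackinglist
GROUP BY
   BatchCode,
   TotalQuantity,
   Status,
   Destination,
   CreatedBy,
   DateDiff(Day, 0, DateCreated)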

UPDATE

It seems that the CPU savings of using DateAdd vs. varchar conversions, while large in relative terms, isn't large in absolute terms (just fractions of a millisecond per row). However, it is still a performance difference, and it seems best to me to save every bit possible.
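
A minimal timing sketch along the lines of that testing, assuming a scratch environment where a throwaway temp table with a million sample rows is acceptable (the name `#Sample` is just illustrative, and the absolute numbers will vary by hardware):

SET STATISTICS TIME ON;  -- report CPU and elapsed time per statement

-- Build one million sample datetimes in a throwaway temp table
SELECT TOP (1000000)
       DateAdd(Second, ROW_NUMBER() OVER (ORDER BY (SELECT NULL)), '20100101') AS dt
INTO #Sample
FROM sys.all_objects a CROSS JOIN sys.all_objects b;

-- String-based truncation
SELECT MAX(CONVERT(VARCHAR(10), dt, 101)) FROM #Sample;

-- Arithmetic truncation
SELECT MAX(DateAdd(Day, DateDiff(Day, 0, dt), 0)) FROM #Sample;

SET STATISTICS TIME OFF;
DROP TABLE #Sample;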

– ErikE
  • I was going to make a similar post, but I don't have any timing information to back it up. I'd love to see your quantification of *meaningfully slower*. – Gabe Sep 12 '10 at 06:43
  • @Gabe on Monday I'll try to post some info for you. What I found in past tests is that it takes a noticeable amount of CPU more to do string conversions rather than those involving math. This is a general programming truism that I've found consistent across all languages and platforms. If there are two ways to do something and one involves math and the other involves string conversion, the math way almost always wins. Sometimes there *are* efficiencies using strings when the math calculations are otherwise laborious, but for this straightforward difference the math wins. – ErikE Sep 12 '10 at 16:51
  • So it looks like a string conversion takes about 2.5μs while your method takes about 0.9μs. In other words, your query has to have a million rows before the string conversion makes it *meaningfully* slower. – Gabe Sep 13 '10 at 01:52
  • Thanks for putting that in real world terms, Gabe. Maybe "meaningfully" was a poor choice of words. In real-world terms, saving just a tiny bit per row may not matter that much. On the other hand, making the switch to DateDiff is easy and any performance gain we get does ease the load on the server. Could you post your numbers for the 2.5 microseconds and 0.9 microseconds calculations? – ErikE Sep 13 '10 at 01:57
  • @Gabe Okay I see where you're getting it. I think subtracting the apparent base CPU cost makes sense, so I get about 1.8μs and 0.18μs. – ErikE Sep 13 '10 at 07:34
  • Where's your recursive test answer? I've been looking for it. – OMG Ponies Jan 25 '11 at 17:01
  • @OMG-Ponies I'm not understanding what you mean by "recursive test answer." Would you help me out? – ErikE Jan 25 '11 at 17:43