I have the following table on SQL Server:
ID | FROM | TO | OFFER NUMBER |
---|---|---|---|
1 | 2022.01.02 | 9999.12.31 | 1 |
1 | 2022.01.02 | 2022.02.10 | 2 |
2 | 2022.01.05 | 2022.02.15 | 1 |
3 | 2022.01.02 | 9999.12.31 | 1 |
3 | 2022.01.15 | 2022.02.20 | 2 |
3 | 2022.02.03 | 2022.02.25 | 3 |
4 | 2022.01.16 | 2022.02.05 | 1 |
5 | 2022.01.17 | 2022.02.13 | 1 |
5 | 2022.02.05 | 2022.02.13 | 2 |
Each range includes the start date but excludes the end date. The end date 9999.12.31 comes from another system; if it helps, the last day of the current quarter could be used instead. I need to determine, for each ID, the number of days on which the customer sees exactly one, two, or three offers. The picture below illustrates the method for ID 3:
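To make the half-open convention concrete: with [FROM, TO), the number of days an offer is visible is simply the date difference. For example, ID 5's second offer is visible 2022.02.05 through 2022.02.12 inclusive, i.e. 8 days (a quick Python sanity check, purely illustrative):

```python
from datetime import date

# Half-open interval [FROM, TO): the visible-day count is just TO - FROM.
# ID 5, offer 2: 2022.02.05 (inclusive) .. 2022.02.13 (exclusive)
days = (date(2022, 2, 13) - date(2022, 2, 5)).days
print(days)  # -> 8
```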
The expected results should look like this (keeping the 9999.12.31 end date rather than the last day of the quarter):
ID | Days with exactly 1 offer | Days with exactly 2 offers | Days with exactly 3 offers |
---|---|---|---|
1 | 2913863 | 39 | 0 |
2 | 41 | 0 | 0 |
3 | 2913861 | 24 | 17 |
4 | 20 | 0 | 0 |
5 | 19 | 8 | 0 |
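To make the expected numbers unambiguous, the counting rule can be sketched in Python (a brute-force sweep over the distinct boundary dates; purely illustrative, not the T-SQL I need):

```python
from datetime import date
from collections import defaultdict

# Sample data from the question; [start, end) half-open intervals per ID.
offers = {
    1: [(date(2022, 1, 2), date(9999, 12, 31)),
        (date(2022, 1, 2), date(2022, 2, 10))],
    2: [(date(2022, 1, 5), date(2022, 2, 15))],
    3: [(date(2022, 1, 2), date(9999, 12, 31)),
        (date(2022, 1, 15), date(2022, 2, 20)),
        (date(2022, 2, 3), date(2022, 2, 25))],
    4: [(date(2022, 1, 16), date(2022, 2, 5))],
    5: [(date(2022, 1, 17), date(2022, 2, 13)),
        (date(2022, 2, 5), date(2022, 2, 13))],
}

def days_by_overlap(intervals):
    """Between two consecutive boundary dates the number of visible
    offers is constant, so count each elementary interval's length
    toward that overlap level."""
    bounds = sorted({d for iv in intervals for d in iv})
    counts = defaultdict(int)  # overlap level -> number of days
    for lo, hi in zip(bounds, bounds[1:]):
        level = sum(1 for s, e in intervals if s <= lo and hi <= e)
        if level:
            counts[level] += (hi - lo).days
    return counts

for cid, ivs in sorted(offers.items()):
    c = days_by_overlap(ivs)
    print(cid, c.get(1, 0), c.get(2, 0), c.get(3, 0))
```

Running this reproduces the table above (e.g. ID 3 gives 2913861, 24, 17), so any T-SQL answer should match these numbers.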
I've found this article, but it did not enlighten me. Also, I have limited privileges: I cannot declare variables, for example, so I need to stick to "basic" T-SQL.
Please provide a detailed explanation along with the code.
Thanks in advance!