
I have some data on which I want to calculate a 7-day rolling count. Every row for a given date should count as one occurrence. My thought process here is to use something like:

val myWindow = Window.orderBy("Date").rangeBetween(currentRow,days(7))
val myData = df.withColumn("Count",df.count().over(myWindow))

But rangeBetween doesn't accept days(7) as a way to look 7 days ahead of the current date.

Any thoughts?

Input Data:

val df = Seq(
    ("08/04/2013",22),
    ("08/05/2013",24),
    ("08/06/2013",26),
    ("08/07/2013",29),
    ("08/08/2013",24),
    ("08/09/2013",24),
    ("08/10/2013",22),
    ("08/11/2013",24),
    ("08/11/2013",26)
    ).toDF("Date","Code")


+----------+----+
|      Date|Code|
+----------+----+
|08/04/2013|  22|
|08/05/2013|  24|
|08/06/2013|  26|
|08/07/2013|  29|
|08/08/2013|  24|
|08/09/2013|  24|
|08/10/2013|  22|
|08/11/2013|  24|
|08/11/2013|  26|
+----------+----+

Expected output:

+----------+----------+-----+
|     Start|       End|Count|
+----------+----------+-----+
|08/04/2013|08/10/2013|    7|
|08/05/2013|08/11/2013|    8|
+----------+----------+-----+
peakstatus

1 Answer


Since Spark 2.3 you can use long values with rangeBetween. As one day has 86400 seconds, you can convert the Date column to epoch seconds and express the range in seconds:

import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._
import spark.implicits._

// A 7-day window that includes the current day reaches 6 days (6 * 86400 seconds) ahead,
// which matches the expected counts of 7 and 8 above.
val myWindow = Window.orderBy("Date").rangeBetween(0, 6 * 86400)
val myData = df
  .withColumn("Date", to_date($"Date", "MM/dd/yyyy").cast("timestamp").cast("long"))
  .withColumn("Count", count($"*").over(myWindow))
  .withColumn("Date", $"Date".cast("timestamp").cast("date"))
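This gives one Count per input row. If you want the Start/End/Count layout from the expected output instead, one way (a sketch, assuming one summary row per distinct start date, and using the hypothetical name summary) is to derive the window end with date_add and drop duplicate start dates:

```scala
// Turn the per-row counts into one row per window:
// Start = the row's date, End = Start + 6 days (a 7-day inclusive window).
val summary = myData
  .withColumnRenamed("Date", "Start")
  .withColumn("End", date_add($"Start", 6))
  .select("Start", "End", "Count")
  .dropDuplicates("Start")
```

Dropping duplicates on Start is safe here because every row with the same date sees the same window and therefore gets the same Count.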
bottaio