
I have a table with a grouping variable and a time variable. For every observation, I want to calculate how long it is until the next observation within the same group. My idea was to join the table with itself using a data.table rolling join, but because each value is also present in the table I'm joining with (they are the same table, after all), this doesn't produce the desired result.

Here's an example of what I mean. group and time are the variables I have, and next_time is the variable I want to create.

 group                time           next_time
     a 2017-11-01 05:00:00 2017-11-01 06:00:00
     a 2017-11-01 06:00:00 2017-11-01 07:00:00
     a 2017-11-01 07:00:00                <NA>
     b 2017-11-01 05:00:00 2017-11-01 11:00:00
     b 2017-11-01 11:00:00                <NA>

Dummy code for the data I have:

structure(list(group = structure(c(1L, 1L, 1L, 2L, 2L), .Label = c("a", "b"), class = "factor"), time = structure(c(1509508800, 1509512400, 1509516000, 1509508800, 1509530400), class = c("POSIXct", "POSIXt"))), .Names = c("group", "time"), row.names = c(NA, -5L), class = c("data.table", "data.frame"), sorted = c("group", "time"))

Dummy code for the desired result:

structure(list(group = structure(c(1L, 1L, 1L, 2L, 2L), .Label = c("a", "b"), class = "factor"), time = structure(c(1509508800, 1509512400, 1509516000, 1509508800, 1509530400), class = c("POSIXct", "POSIXt")), next_time = structure(c(1509512400, 1509516000, NA, 1509530400, NA), class = c("POSIXct", "POSIXt"))), class = "data.frame", .Names = c("group", "time", "next_time"), row.names = c(NA, -5L))
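
For completeness, here is a minimal sketch of the rolling self-join described above, assuming the attempt looked roughly like this (the data is rebuilt from the displayed times, so the timezone is an assumption, and `matched_time` is just a helper column added for illustration). Because every `(group, time)` pair in the outer table also exists in the inner one, each row finds an exact match, namely itself, so the roll never kicks in:

    library(data.table)

    dt <- data.table(
      group = factor(c("a", "a", "a", "b", "b")),
      time  = as.POSIXct(c("2017-11-01 05:00:00", "2017-11-01 06:00:00",
                           "2017-11-01 07:00:00", "2017-11-01 05:00:00",
                           "2017-11-01 11:00:00"))
    )

    dt[, matched_time := time]               # copy of x's time to see which row was matched
    dt[dt, on = .(group, time), roll = -Inf] # matched_time == time on every row, never the next one
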
A. Stam
  • just use `?shift` from data.table to get the leading / lagging entry – talat Nov 21 '17 at 13:53
  • "how long it is until the next observation" - `diff`? – Henrik Nov 21 '17 at 13:55
  • `shift` works, especially because it can be combined with the `by` operator in the `data.table` syntax. If you want to submit that as an answer I can accept it as being correct – A. Stam Nov 21 '17 at 14:03
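
Following up on the comments, a minimal sketch of the `shift`-based approach (same rebuilt example data as above; the `gap_hours` column is only added to illustrate the "how long until the next observation" part):

    library(data.table)

    dt <- data.table(
      group = factor(c("a", "a", "a", "b", "b")),
      time  = as.POSIXct(c("2017-11-01 05:00:00", "2017-11-01 06:00:00",
                           "2017-11-01 07:00:00", "2017-11-01 05:00:00",
                           "2017-11-01 11:00:00"))
    )
    setkey(dt, group, time)  # ensure rows are ordered by time within each group

    # lead the time column within each group; the last row of each group gets NA
    dt[, next_time := shift(time, type = "lead"), by = group]

    # the actual quantity of interest: time until the next observation
    dt[, gap_hours := as.numeric(difftime(next_time, time, units = "hours"))]

Because `shift` is evaluated per group via `by = group`, the lead never crosses group boundaries. `diff(time)` within the same `by` would give the gaps directly, but its result is one element shorter than the group, which is why `shift` is slightly more convenient for creating a full-length column.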
