I am trying to perform a spatio-temporal kernel density estimation so that I can see how the kernel density distribution changes over time. I attempted this with the sparr package, running the following code:
library(spatstat)   # ppp(), as.owin()
library(sparr)      # LIK.spattemp()
library(sf)         # as_Spatial()

# Split observations into cases (smell == '1') and controls (smell == '0')
smell_Cases <- subset(newdata_proj, smell == '1', select = c(x, y, smell))
smell_controls <- subset(newdata_proj, smell == '0', select = c(x, y, smell))

smell_ppp <- list()
smell_ppp$cases <- ppp(smell_Cases$x, smell_Cases$y,
                       marks = as.integer(smell_Cases$smell),
                       window = as.owin(as_Spatial(boundary)))
smell_ppp$controls <- ppp(smell_controls$x, smell_controls$y,
                          window = as.owin(as_Spatial(boundary)))

smell_ppp_Cases <- smell_ppp$cases
hlam <- LIK.spattemp(smell_ppp_Cases)
I then get the following error:

Error in checkranin(tlim, tt, "tlim") : 'tlim[1]' must be < 'tlim[2]'
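A likely cause, for context: the spatio-temporal functions in sparr (including LIK.spattemp) interpret the marks of the ppp object as the observation times of the points. Here the marks are the constant case label smell == '1', so the derived temporal limits collapse to a single value and the check 'tlim[1]' < 'tlim[2]' fails. A minimal sketch of the fix, assuming newdata_proj also has an observation-time column (called `time` here purely for illustration):

```r
library(spatstat)
library(sparr)

# Keep the time column when subsetting the cases
smell_Cases <- subset(newdata_proj, smell == '1', select = c(x, y, time))

# Use the observation times (not the constant case label) as the marks,
# so the pattern has a genuine temporal extent
smell_cases_ppp <- ppp(smell_Cases$x, smell_Cases$y,
                       marks  = smell_Cases$time,
                       window = as.owin(as_Spatial(boundary)))

# Joint spatial/temporal bandwidth selection should now have a valid
# time range to work with (tlim is taken from the range of the marks)
hlam <- LIK.spattemp(smell_cases_ppp)
```

The column name `time` is an assumption; substitute whatever variable in your data records when each observation was made. If your data genuinely has no temporal component, a purely spatial bandwidth selector such as LSCV.density would be the appropriate tool instead.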