When making predictions from a linear statistical model we usually have a model matrix X of predictors corresponding to the points at which we want to make predictions; a vector of coefficients beta; and a variance-covariance matrix V. Computing the predictions is just X %*% beta.
. The most straightforward way to compute the variances of the predictions is
diag(X %*% V %*% t(X))
or, slightly more efficiently,
diag(X %*% tcrossprod(V,X))
However, this is very inefficient, because it constructs a full n*n matrix when all we really want is its diagonal. I know I could write some Rcpp-loopy thing that would compute just the diagonal terms, but I'm wondering whether there is an existing linear algebra trick in R that will nicely do what I want ... (if someone wants to write the Rcpp-loopy thing for me as an answer I wouldn't object, but I'd prefer a pure-R solution).
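For concreteness, here is the naive computation on a small simulated example (all objects and sizes here are made up purely for illustration):

```r
set.seed(101)
n <- 5; p <- 3
X <- matrix(rnorm(n * p), n, p)            # model matrix at the prediction points
beta <- rnorm(p)                           # coefficient vector
V <- crossprod(matrix(rnorm(p * p), p, p)) # a positive-definite p x p vcov matrix
pred <- X %*% beta                         # point predictions
v1 <- diag(X %*% V %*% t(X))               # builds the full n x n matrix ...
v2 <- diag(X %*% tcrossprod(V, X))         # ... as does this, just slightly faster
stopifnot(all.equal(v1, v2))
```

Both versions give identical variances; the problem is only that they allocate n^2 entries to keep n of them.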
FWIW predict.lm seems to do something clever: it multiplies X by the inverse of the R component of the QR decomposition of the lm fit. I'm not sure that's always going to be available, but it might be a good starting point (see here).
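A hedged sketch of that QR-based route (assuming the fit stores its qr component, as lm does by default); it reproduces the diagonal without ever forming the n x n matrix:

```r
fit <- lm(mpg ~ wt + hp, data = mtcars)    # illustrative model, not from the question
X <- model.matrix(fit)                     # here the prediction points are the fitted points
Rinv <- qr.solve(qr.R(fit$qr))             # inverse of R from the QR decomposition
# since vcov = sigma^2 * Rinv %*% t(Rinv), the diagonal of X vcov X' is
# sigma^2 times the row sums of squares of X %*% Rinv:
ip <- rowSums((X %*% Rinv)^2) * summary(fit)$sigma^2
# agrees with the direct (inefficient) computation:
all.equal(ip, diag(X %*% vcov(fit) %*% t(X)), check.attributes = FALSE)
```

Note this assumes a full-rank, unpivoted fit; with rank-deficient models predict.lm also has to deal with the pivoting information in the qr object.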