There is no exact equivalent in R for `polyfit` and `polyval`, as these MATLAB routines are rather primitive compared with R's statistical toolbox.
In MATLAB, `polyfit` mainly returns polynomial regression coefficients (though the covariance can be obtained if required). `polyval` takes the regression coefficients `p` and a set of new `x` values at which to evaluate the fitted polynomial.
In R, the workflow is: use `lm` to obtain a regression model (much broader, not restricted to polynomial regression); use `summary.lm` for the model summary, e.g. to obtain the covariance; use `predict.lm` for prediction.
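A minimal sketch of this trio on simulated data (the data frame `dat` and all values here are my own, purely for illustration):

```r
## Simulated data; `dat` is a stand-in for your own data frame
set.seed(1)
dat <- data.frame(x = runif(50))
dat$y <- 1 + 2 * dat$x - 3 * dat$x^2 + rnorm(50, sd = 0.1)

fit <- lm(y ~ poly(x, 2, raw = TRUE), data = dat)    ## step 1: fit the model
summary(fit)                                         ## step 2: model summary
vcov(fit)                                            ## covariance of the coefficients
predict(fit, newdata = data.frame(x = c(0.3, 0.7)))  ## step 3: predict at new `x`
```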
So here is the way to go in R:
## don't use `$` in formula; use the `data` argument
fit <- lm(y ~ poly(x, 3, raw = TRUE), data = Spectra_BIR)
Note that `fit` contains not only the coefficients but also other essential components of the fit (e.g. the QR factorization used in estimation). To extract the coefficients, use `coef(fit)`, or `unname(coef(fit))` if you don't want the coefficient names shown.
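For example (simulated data standing in for `Spectra_BIR`; the names are my own):

```r
set.seed(42)
dat <- data.frame(x = runif(30))
dat$y <- 2 - dat$x + 0.5 * dat$x^3 + rnorm(30, sd = 0.05)

fit <- lm(y ~ poly(x, 3, raw = TRUE), data = dat)

coef(fit)          ## named: "(Intercept)", "poly(x, 3, raw = TRUE)1", ...
unname(coef(fit))  ## bare numeric vector of length 4
```

One difference from MATLAB worth noting: `polyfit` returns coefficients in descending powers of `x`, whereas here they come in ascending order, intercept first.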
Now, to predict, we do:
x.new <- rnorm(5) ## some random new `x`
## note, `predict.lm` takes a "lm" model, not coefficients
predict.lm(fit, newdata = data.frame(x = x.new))
`predict.lm` is much more powerful than `polyval`: it can return confidence intervals, for example. Have a read of `?predict.lm`.
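For instance, a sketch of interval prediction on simulated data (the `fit` / `lwr` / `upr` columns are what `predict.lm` returns when an interval is requested):

```r
set.seed(1)
dat <- data.frame(x = runif(50))
dat$y <- 1 + 2 * dat$x + rnorm(50, sd = 0.1)
fit <- lm(y ~ poly(x, 3, raw = TRUE), data = dat)

x.new <- c(0.2, 0.5, 0.8)
## 95% confidence interval for the mean response
predict(fit, newdata = data.frame(x = x.new), interval = "confidence")
## 95% prediction interval for a new observation (wider than the CI)
predict(fit, newdata = data.frame(x = x.new), interval = "prediction")
```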
There are a few sensitive issues with the use of `predict.lm`. There have been countless questions and answers on this, and you can find the root question, as a duplicate of which I often close such questions:
So make sure you get into the good habit of using `lm` and `predict` at an early stage of learning R.
Extra
It is also not difficult to construct something identical to `polyval` in R. The function `g` in my answer Function for polynomials of arbitrary order does this; by setting `nderiv` we can also get derivatives of the polynomial.
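If you only want plain evaluation, a hand-rolled sketch could look like this (the name `pval` and the ascending-power convention are my own choices, matching `unname(coef(fit))` rather than MATLAB's descending order):

```r
## Evaluate a polynomial with coefficients `p` in ascending powers
## (p[1] + p[2] * x + p[3] * x^2 + ...) at `x`, via Horner's scheme
pval <- function(p, x) {
  if (length(p) == 1) return(rep(p, length(x)))
  y <- rep(p[length(p)], length(x))
  for (i in (length(p) - 1):1) y <- y * x + p[i]
  y
}

pval(c(1, 2, 3), 2)  ## 1 + 2*2 + 3*2^2 = 17
```

Horner's scheme evaluates the polynomial with one multiply and one add per coefficient, which is both faster and numerically steadier than summing explicit powers.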