ar.yw(x, aic=T, order.max=<<see below>>)
aic: if TRUE, use the Akaike information criterion to choose the best
order not greater than order.max. If FALSE, order.max will be the order
of the model fitted.
order.max: the maximum order of model to fit. The default is
10 * log10(n.used/ncol(x)).
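A small sketch, not from the original page, of how the default order.max works out for a concrete series; the log(lynx) series and its length (114) are used purely for illustration:

# For a univariate series the number of columns counts as 1, so the
# default maximum order is 10 * log10(n.used).
n.used <- length(log(lynx))        # 114 observations
10 * log10(n.used/1)               # about 20.6, so order.max is roughly 20

# With aic=F the fitted order is exactly order.max:
ar.yw(log(lynx), aic=F, order.max=5)$order   # 5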
The fitted model is returned as an object with components including the
following.

order: if aic=TRUE, this is the order less than or equal to order.max
which minimizes the AIC; otherwise it is order.max.
ar: an order by "nser" by "nser" array, where "nser" is the number of
univariate components of x (1 or ncol(x)). If order is 0, ar will have
dimensions 1 by "nser" by "nser" and will be filled with zeros. The
first level of the first dimension corresponds to one observation back
in time, the second level corresponds to two observations back, etc.
The second dimension corresponds to the predicted series, and the third
corresponds to the predicting series. (The indexing is illustrated in
the sketch following these component descriptions.)
aic: the AIC values for orders 0 through order.max. These have the
minimum value subtracted from all of them so the minimum is always
zero.
partialacf: the estimated partial autocorrelations for orders 1 through
order.max.
Note that the coefficients in ar are used in the forward direction on
the series with mean(s) removed.
"yule-walker"
.
series: the name of x. Transformations are included.
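As a hedged illustration (not part of the original examples), the following sketch fits a hypothetical bivariate matrix just to show the shapes of the ar and aic components; it assumes ar.yw accepts a plain numeric matrix as a multivariate series:

set.seed(1)
x2 <- cbind(rnorm(200), rnorm(200))     # 200 observations, "nser" = 2

fit <- ar.yw(x2, aic=F, order.max=2)    # force an AR(2) fit for illustration
dim(fit$ar)                             # 2 by 2 by 2: lag, predicted series, predicting series
fit$ar[1, 1, 2]                         # lag-1 coefficient of series 2 in the equation for series 1

fit2 <- ar.yw(x2)                       # order chosen by AIC
fit2$aic                                # AIC for orders 0 through order.max, minimum shifted to zero
(0:(length(fit2$aic) - 1))[fit2$aic == 0]   # the order at which the minimum occurs, i.e. fit2$order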
Continuous strings of missing values are removed from the ends of the
series.
Any remaining missing values will cause an error.
First the autocovariance matrices of the time series are estimated
and then Whittle's recursion
(a multivariate extension of the Levinson-Durbin method)
is used to estimate the autoregressive coefficients.
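A rough univariate sketch of that recursion (the scalar Levinson-Durbin case), written against sample autocovariances; the function name levinson.sketch and its interface are invented here for illustration and are not part of ar.yw:

levinson.sketch <- function(gamma, p)
{
    # gamma: autocovariances at lags 0 through p (a vector of length p+1)
    phi <- matrix(0, p, p)              # phi[k, j]: j-th coefficient of the order-k fit
    phi[1, 1] <- gamma[2]/gamma[1]
    v <- gamma[1] * (1 - phi[1, 1]^2)   # innovations variance after the order-1 fit
    if(p > 1) for(k in 2:p) {
        j <- 1:(k - 1)
        phi[k, k] <- (gamma[k + 1] - sum(phi[k - 1, j] * gamma[k + 1 - j]))/v
        phi[k, j] <- phi[k - 1, j] - phi[k, k] * phi[k - 1, k - j]
        v <- v * (1 - phi[k, k]^2)
    }
    list(ar = phi[p, ], partialacf = diag(phi), var.pred = v)
}

# Compare with ar.yw on the same series; any small differences come from
# details of the autocovariance estimate used.
g <- acf(log(lynx), lag.max=11, type="covariance", plot=F)$acf
levinson.sketch(g, 11)$ar
ar.yw(log(lynx), aic=F, order.max=11)$ar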
The output may be used in spec.ar to estimate the spectrum of the
process, or in acf.plot to produce a plot of the partial
autocorrelation function.
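For instance, assuming both functions accept the fitted object directly, as the sentence above suggests:

a <- ar.yw(log(lynx))
spec.ar(a)        # spectrum estimate based on the fitted AR coefficients
acf.plot(a)       # plot of the partial autocorrelations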
The estimation is performed using the sample mean of each univariate
series as the estimate of the mean. Remember that the coefficients in
ar are for the series with the mean(s) removed.
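For example, a hedged sketch of a one-step-ahead forecast done by hand, showing that the coefficients apply to the mean-removed series and that the mean has to be added back (the order-2 fit is arbitrary):

fit <- ar.yw(log(lynx), aic=F, order.max=2)
m <- mean(log(lynx))                 # the sample mean used in the fit
y <- log(lynx) - m                   # the series the coefficients refer to
n <- length(y)
p <- fit$order
# coefficient k multiplies the observation k steps back in time
pred <- sum(fit$ar[1:p, 1, 1] * y[n:(n - p + 1)]) + m
pred                                 # forecast of the next value of log(lynx)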
Whittle, P. (1983). Prediction and Regulation by Linear Least-Squares Methods, 2nd ed. Univ. Minnesota Press, Minn.
a <- ar.yw(log(lynx))
acf.plot(a, conf=T)    # look at the partial correlations
tsplot(a$aic)          # and at the shape of Akaike's criterion
llynx.ar.fit <- ar.yw(log(lynx), aic=F, order.max=11)   # fit an AR(11)