A Refined Efficiency Rate for Ordinary Least Squares and Generalized Least Squares Estimators for a Linear Trend with Autoregressive Errors

Document Type

Article

Publication Date

3-1-2012

Abstract

When a straight line is fitted to time series data, generalized least squares (GLS) estimators of the trend slope and intercept are attractive as they are unbiased and of minimum variance. However, computing GLS estimators is laborious, as their form depends on the autocovariances of the regression errors. On the other hand, ordinary least squares (OLS) estimators are easy to compute and do not involve the error autocovariance structure. It has been known for 50 years that OLS and GLS estimators have the same asymptotic variance when the errors are second-order stationary. Hence, little precision is gained by using GLS estimators in stationary error settings. This article revisits this classical issue, deriving explicit expressions for the GLS estimators and their variances when the regression errors are drawn from an autoregressive process. These expressions are used to show that OLS methods are even more efficient relative to GLS than previously thought. Specifically, we show that the difference between the OLS and GLS estimator variances converges to zero at a rate one polynomial degree faster than the rate at which the estimator variances themselves decay. We also refine Grenander's (1954) variance ratio. An example is presented in which our new rates cannot be improved upon. Simulations show that the results change little when the autoregressive parameters are estimated.
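The following Python sketch (not the authors' code) illustrates the comparison described in the abstract under assumed settings: a linear trend with AR(1) errors, autoregressive parameter phi = 0.6, unit innovation variance, and a handful of sample sizes. It computes the exact OLS and GLS slope variances from the error covariance matrix; the n^3 and n^4 scalings in the printout reflect the standard O(n^-3) rate for the slope variance and the abstract's claim that the variance difference decays one polynomial degree faster, so both scaled quantities should settle toward constants as n grows.

```python
# Minimal numerical sketch (assumptions noted above): compare exact OLS and GLS
# slope variances when y_t = b0 + b1*t + e_t and e_t is an AR(1) process.
import numpy as np

def slope_variances(n, phi, sigma2=1.0):
    """Exact OLS and GLS variances of the trend-slope estimator for AR(1) errors."""
    t = np.arange(1, n + 1, dtype=float)
    X = np.column_stack([np.ones(n), t])                 # design: intercept + linear trend
    # AR(1) autocovariance: gamma(h) = sigma2 * phi**|h| / (1 - phi**2)
    Sigma = sigma2 * phi ** np.abs(np.subtract.outer(t, t)) / (1.0 - phi ** 2)
    XtX_inv = np.linalg.inv(X.T @ X)
    var_ols = XtX_inv @ X.T @ Sigma @ X @ XtX_inv        # sandwich form for OLS
    var_gls = np.linalg.inv(X.T @ np.linalg.solve(Sigma, X))  # (X' Sigma^{-1} X)^{-1}
    return var_ols[1, 1], var_gls[1, 1]                  # slope component (index 1)

phi = 0.6  # illustrative AR(1) parameter, not taken from the article
for n in (50, 100, 200, 400):
    v_ols, v_gls = slope_variances(n, phi)
    print(f"n={n:4d}  n^3*Var(OLS)={n**3 * v_ols:9.3f}  "
          f"n^4*(Var(OLS)-Var(GLS))={n**4 * (v_ols - v_gls):9.3f}")
```

The GLS variance uses the true AR(1) covariance matrix, so the printed differences show how little efficiency OLS sacrifices when the error autocovariances are known; the article's simulations address the additional effect of estimating the autoregressive parameters.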
