Abstract

We consider a multiple regression model in which the explanatory variables are specified by time series. To sequentially test for the stability of the regression parameters in time, we introduce a detector based on the first excess time of a CUSUM-type statistic over a suitably constructed threshold function. The aim of this paper is to study the delay time associated with this detector. As our main result, we derive the limit distribution of the delay time, thereby providing a theory that extends the benchmark average run-length concept utilized in most of the sequential monitoring literature. To highlight the applicability of the limit results in finite samples, we present a Monte Carlo simulation study and an application to macroeconomic data.
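The first-excess-time idea described above can be illustrated with a minimal sketch. This is not the paper's exact detector: the boundary function, the critical value `c`, and the tuning parameter `gamma` below are illustrative placeholders in the style of standard CUSUM monitoring schemes (a historical sample of size m, followed by open-ended monitoring of residuals), and the simulated residuals with a mean shift are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a stable historical sample of size m is used to
# calibrate the procedure; monitoring starts afterwards.
m = 200
sigma_hat = 1.0  # residual scale estimate from the historical sample (assumed)

def boundary(k, m, c=2.0, gamma=0.25):
    """Illustrative threshold function for monitoring time k.

    Shaped like common CUSUM monitoring boundaries; c and gamma are
    placeholder values, not critical values from the paper.
    """
    t = k / m
    return c * np.sqrt(m) * (1 + t) * (t / (1 + t)) ** gamma

def first_excess_time(residuals, m, sigma):
    """Return the first monitoring time k at which the cumulative sum of
    post-historical residuals exceeds the threshold, or None if it never does.
    The delay time studied in the paper is the gap between the change point
    and this first excess time."""
    csum = 0.0
    for k, e in enumerate(residuals, start=1):
        csum += e
        if abs(csum) / sigma > boundary(k, m):
            return k
    return None

# Synthetic monitoring-period residuals: stable for 100 steps,
# then a mean shift (the structural break to be detected).
eps = np.concatenate([rng.normal(0.0, 1.0, 100),
                      rng.normal(1.5, 1.0, 200)])
tau = first_excess_time(eps, m, sigma_hat)
```

In this toy run the detector stops at `tau`, the first excess time; subtracting the (here known) change point 100 gives the realized delay, whose limit distribution is the object of study in the paper.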

  • Publication date: 2009-04