In a continuous-time nonlinear regression model, the residual correlogram is considered as an estimator of the covariance function of the stationary Gaussian random noise. For this estimator a functional central limit theorem is proved in the space of continuous functions. The result obtained shows that the limiting sample-continuous Gaussian process coincides with the limit process in the central limit theorem for the standard correlogram of the random noise in the specified regression model.
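The setting above can be sketched numerically. The following is a minimal illustration, not the paper's construction: all parameters are hypothetical, the trend is taken as a*sin(t), and the noise is a stationary Ornstein-Uhlenbeck process with covariance B(z) = exp(-|z|). The residual correlogram computed from the fitted model should then track B.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup (illustrative only): y(t) = a*sin(t) + eps(t) observed on a
# fine grid over [0, T], with eps a stationary Ornstein-Uhlenbeck process whose
# covariance function is B(z) = exp(-|z|) (unit variance, unit relaxation rate).
dt, T = 0.01, 2000.0
t = np.arange(0.0, T, dt)
n = t.size

# Exact discretization of the OU process on the grid.
phi = np.exp(-dt)
eps = np.empty(n)
eps[0] = rng.normal()
shocks = rng.normal(0.0, np.sqrt(1.0 - phi**2), n)
for k in range(1, n):
    eps[k] = phi * eps[k - 1] + shocks[k]

a_true = 3.0
s = np.sin(t)
y = a_true * s + eps

# Least-squares fit of the regression parameter, then residuals.
a_hat = (y @ s) / (s @ s)
r = y - a_hat * s

def correlogram(resid, z, dt):
    """Residual correlogram at lag z (discrete Riemann approximation)."""
    j = int(round(z / dt))
    return np.mean(resid[: resid.size - j] * resid[j:])

# The residual correlogram approximates the true noise covariance B(z) = exp(-z).
print(correlogram(r, 0.0, dt), correlogram(r, 1.0, dt), np.exp(-1.0))
```

With a long observation window the residual correlogram at lags 0 and 1 is close to B(0) = 1 and B(1) = exp(-1); the FCLT in the abstract describes the Gaussian fluctuations of this estimator around B as a process in the lag variable.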
A multivariate errors-in-variables (EIV) model with an intercept term and a polynomial EIV model are considered. The focus is on the structural homoskedastic case, where the vectors of covariates are i.i.d. and the measurement errors are i.i.d. as well. The covariates contaminated with errors are normally distributed, and the corresponding classical errors are also assumed normal. In both models, it is shown that the (inconsistent) ordinary least squares estimators of the regression parameters yield an a.s. approximation to the best prediction of the response given the values of the observable covariates. Thus, not only in the linear EIV model but also in polynomial EIV models, consistent estimators of the regression parameters are useless for the prediction problem, provided that the size and covariance structure of the observation errors for the predicted subject do not differ from those in the data used for model fitting.
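The linear scalar case of this phenomenon can be sketched in a short simulation. The setup below is a hypothetical instance, not the paper's: a latent covariate xi ~ N(0, tau^2) is observed as x = xi + delta with classical error delta ~ N(0, sigma^2), and y = b0 + b1*xi + eps. Under joint normality the best predictor E[y | x] is linear in x with slope attenuated by the reliability ratio K = tau^2/(tau^2 + sigma^2), and the naive OLS slope of y on the contaminated x converges to exactly that attenuated slope.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for a structural linear EIV model (illustrative only).
n = 200_000
b0, b1 = 1.0, 2.0                    # true regression coefficients
tau, sigma, s_eps = 1.0, 0.5, 0.3    # sd of latent covariate, measurement error, response error

xi = rng.normal(0.0, tau, n)         # latent (unobserved) covariate
x = xi + rng.normal(0.0, sigma, n)   # observed covariate with classical error
y = b0 + b1 * xi + rng.normal(0.0, s_eps, n)

# Naive OLS of y on the error-contaminated covariate x.
X = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Best predictor E[y | x] is linear with slope attenuated by the reliability ratio.
K = tau**2 / (tau**2 + sigma**2)
slope_best = b1 * K
print(beta_hat[1], slope_best)
```

The OLS slope is inconsistent for b1, yet it matches the slope of the best predictor: for predicting new subjects whose errors have the same size and covariance structure as in the fitting data, the naive fit is exactly what is wanted, while a consistent estimator of b1 would overshoot.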