A multivariate errors-in-variables (EIV) model with an intercept term and a polynomial EIV model are considered. The focus is on the structural homoskedastic case, where the vectors of covariates are i.i.d. and the measurement errors are i.i.d. as well. The error-contaminated covariates are normally distributed, and the corresponding classical errors are also assumed normal. In both models, it is shown that the (inconsistent) ordinary least squares estimators of the regression parameters yield an a.s. approximation to the best prediction of the response given the values of the observable covariates. Thus, not only in the linear but also in the polynomial EIV model, consistent estimators of the regression parameters are useless for prediction, provided the size and covariance structure of the observation errors for the predicted subject do not differ from those in the data used to fit the model.
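The phenomenon described above can be seen in a small simulation (a minimal sketch of the linear structural case; all parameter values below are illustrative assumptions, not taken from the paper). With a normal latent covariate and a normal classical error, the naive OLS slope computed from the observed covariate converges not to the true slope, but to the slope of the best predictor E[Y | W], which is the true slope attenuated by the reliability ratio:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
beta, s2x, s2u = 2.0, 1.0, 0.5   # assumed: true slope, Var(X), Var(error)

x = rng.normal(0.0, np.sqrt(s2x), n)       # latent covariate
w = x + rng.normal(0.0, np.sqrt(s2u), n)   # observed covariate (classical error)
y = beta * x + rng.normal(0.0, 0.3, n)     # response

# Naive OLS of y on the error-contaminated w (inconsistent for beta)
ols_slope = np.cov(w, y, bias=True)[0, 1] / np.var(w)

# Best prediction E[Y | W] is linear with slope beta times the
# reliability ratio Var(X) / (Var(X) + Var(error))
best_slope = beta * s2x / (s2x + s2u)

print(f"OLS slope:            {ols_slope:.4f}")
print(f"Best-predictor slope: {best_slope:.4f}")  # agree up to sampling noise
```

A consistent estimator of beta (e.g. one correcting for attenuation) would predict worse here, since the optimal prediction from the noisy covariate really does use the attenuated slope.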
We consider a multivariate functional errors-in-variables model $AX\approx B$, where the data matrices $A$ and $B$ are observed with errors and the matrix parameter $X$ is to be estimated. A goodness-of-fit test based on the total least squares estimator is constructed. The proposed test statistic is asymptotically chi-squared under the null hypothesis. The power of the test under local alternatives is also discussed.
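The paper's exact statistic and normalization are not reproduced here, but the flavor of such a test can be sketched (an illustration under strong simplifying assumptions: normal errors with variance $\sigma^2$ treated as known, and all sizes chosen arbitrarily). The minimal TLS misfit equals the sum of the $m$ smallest squared singular values of the compound matrix $[A,B]$, and under the null it is approximately chi-squared after scaling by $\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, m = 60, 2, 1        # illustrative sizes (assumptions)
sigma = 0.01              # error s.d., assumed known for this sketch

A0 = rng.normal(size=(n, d))       # true (unobserved) A
X0 = np.array([[1.5], [-0.7]])     # true parameter (assumed value)
A = A0 + sigma * rng.normal(size=(n, d))
B = A0 @ X0 + sigma * rng.normal(size=(n, m))

# TLS misfit: sum of the m smallest squared singular values of [A, B]
s = np.linalg.svd(np.hstack([A, B]), compute_uv=False)
stat = np.sum(s[d:] ** 2) / sigma**2

# Under the null (the EIV model holds), stat is approximately
# chi-squared with (n - d) * m degrees of freedom
df = (n - d) * m
print(stat, df)
```

Under a misspecified model (e.g. a nonlinear relation between the true rows of $A$ and $B$), the misfit inflates and the statistic exceeds the chi-squared quantiles, which is what drives the power analysis under local alternatives.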
We consider a multivariate functional measurement error model $AX\approx B$. The errors in the compound matrix $[A,B]$ are uncorrelated, row-wise independent, and have equal (unknown) variances. We study the total least squares estimator of $X$, which, in the case of normal errors, coincides with the maximum likelihood estimator. We give conditions for the asymptotic normality of the estimator as the number of rows of $A$ grows. Under mild assumptions, the covariance structure of the limiting Gaussian random matrix is nonsingular. For normal errors, the results can be used to construct an asymptotic confidence interval for a linear functional of $X$.
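The total least squares estimator studied in these abstracts admits a closed form via the SVD of the compound matrix $[A,B]$ (the classical Golub–Van Loan construction). A minimal sketch, with simulated data whose sizes and noise level are illustrative assumptions:

```python
import numpy as np

def tls(A, B):
    """Total least squares solution of A X ~ B via the SVD of [A, B]
    (classical Golub-Van Loan construction)."""
    n, d = A.shape
    m = B.shape[1]
    _, _, Vt = np.linalg.svd(np.hstack([A, B]), full_matrices=False)
    V = Vt.T
    V12 = V[:d, d:]   # d x m upper-right block of V
    V22 = V[d:, d:]   # m x m lower-right block, assumed nonsingular
    return -V12 @ np.linalg.inv(V22)

# Simulated functional EIV data with equal error variances in A and B
rng = np.random.default_rng(2)
n, d, m = 5000, 3, 2
X0 = rng.normal(size=(d, m))               # true parameter
A0 = rng.normal(size=(n, d))               # true (unobserved) A
sigma = 0.05                               # common error s.d. (assumption)
A = A0 + sigma * rng.normal(size=(n, d))
B = A0 @ X0 + sigma * rng.normal(size=(n, m))

X_hat = tls(A, B)
print(np.max(np.abs(X_hat - X0)))  # small for large n: TLS is consistent here
```

The asymptotic normality results mentioned above concern the distribution of $\sqrt{n}\,(\hat X - X)$ produced by exactly this kind of estimator; with normal errors and an estimate of the limit covariance, they yield confidence intervals for linear functionals of $X$.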