This paper establishes the conditions for the existence of a stationary solution to the first-order autoregressive equation on a plane as well as properties of the stationary solution. The first-order autoregressive model on a plane is defined by the equation
A stationary solution X to the equation exists if and only if $(1-a-b-c)(1-a+b+c)(1+a-b+c)(1+a+b-c)>0$. The stationary solution X satisfies the causality condition with respect to the white noise ϵ if and only if $1-a-b-c>0$, $1-a+b+c>0$, $1+a-b+c>0$ and $1+a+b-c>0$. A sufficient condition for X to be purely nondeterministic is provided.
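The existence and causality criteria above reduce to sign checks on four linear factors in the parameters; a minimal sketch in Python (function names are ours, not the paper's):

```python
from math import prod

def stationarity_factors(a, b, c):
    """The four linear factors appearing in the abstract's criteria."""
    return (1 - a - b - c, 1 - a + b + c, 1 + a - b + c, 1 + a + b - c)

def has_stationary_solution(a, b, c):
    # A stationary solution exists iff the product of the factors is positive.
    return prod(stationarity_factors(a, b, c)) > 0

def is_causal(a, b, c):
    # Causality with respect to the white noise requires every factor positive.
    return all(f > 0 for f in stationarity_factors(a, b, c))
```

Note that the product can be positive while an even number of factors are negative, so a stationary but noncausal solution is possible.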
An explicit expression for the autocovariance function of X on the axes is provided. Combined with the Yule–Walker equations, this makes it possible to compute the autocovariance function at all integer points of the plane. In addition, all situations are described where different parameters determine the same autocovariance function of X.
In this paper the study of a three-parametric class of Gaussian Volterra processes, started in Part I of the present paper, is continued. The class under consideration generalizes the fractional Brownian motion, which is a one-parametric process depending on the Hurst index H. On the one hand, the presence of three parameters gives us freedom to operate with the processes and widens the range of possible applications. On the other hand, it requires rather subtle methods, depending on the intervals where the parameters fall. Integration with respect to the processes under consideration is defined, and it is established for which parameters the processes are differentiable. Finally, the Volterra representation is inverted, that is, a representation of the underlying Wiener process via the Gaussian Volterra process is found. Consequently, it is shown that for all indices for which the Gaussian Volterra process is defined, it generates the same flow of sigma-fields as the underlying Wiener process, a property that has been used many times in the study of fractional Brownian motion.
is considered, where W is a standard Wiener process, $\alpha >-\frac{1}{2}$, $\gamma >-1$, and $\alpha +\beta +\gamma >-\frac{3}{2}$. It is proved that the process X is well defined and continuous. The asymptotic properties of the variances and bounds for the variances of the increments of the process X are studied. It is also proved that the process X satisfies the single-point Hölder condition up to order $\alpha +\beta +\gamma +\frac{3}{2}$ at the point 0, the “interval” Hölder condition up to order $\min \big(\gamma +\frac{3}{2},\, 1\big)$ on the interval $[{t_{0}},T]$ (where $0<{t_{0}}<T$), and the Hölder condition up to order $\min \big(\alpha +\beta +\gamma +\frac{3}{2},\, \gamma +\frac{3}{2},\, 1\big)$ on the entire interval $[0,T]$.
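The three Hölder orders stated above are elementary functions of the parameters; a small helper (hypothetical name, assuming the stated constraints $\alpha >-\frac{1}{2}$, $\gamma >-1$, $\alpha +\beta +\gamma >-\frac{3}{2}$):

```python
def holder_orders(alpha, beta, gamma):
    """Upper Hölder orders from the abstract: at the point 0, on [t0, T]
    with 0 < t0 < T, and on the whole interval [0, T]."""
    s = alpha + beta + gamma + 1.5
    at_zero = s                               # single-point condition at 0
    away_from_zero = min(gamma + 1.5, 1.0)    # "interval" condition on [t0, T]
    whole_interval = min(s, gamma + 1.5, 1.0)
    return at_zero, away_from_zero, whole_interval
```

For example, with all three parameters zero the orders are (1.5, 1.0, 1.0), i.e. the global regularity is capped at Lipschitz order 1.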
This paper deals with a homoskedastic errors-in-variables linear regression model and properties of the total least squares (TLS) estimator. We partly revise the consistency results for the TLS estimator previously obtained by the author [18] and present complete and comprehensive proofs of the consistency theorems. A theoretical foundation for the construction of the TLS estimator and its relation to the generalized eigenvalue problem are explained. In particular, the uniqueness of the estimate is proved. The Frobenius norm in the definition of the estimator can be replaced by the spectral norm, or by any other unitarily invariant norm, and the consistency results remain valid.
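For intuition, the classical TLS estimate (minimizing the Frobenius norm of the joint perturbation of data and response) can be read off from the SVD of the augmented data matrix. This is a generic sketch of that standard construction, not the paper's model-specific development:

```python
import numpy as np

def tls_estimate(A, b):
    """Classical TLS solution via the SVD of the augmented matrix [A | b].
    Generic construction only; degenerate cases raise an error."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float).reshape(-1, 1)
    n = A.shape[1]
    # Right singular vector for the smallest singular value of [A | b].
    _, _, Vt = np.linalg.svd(np.hstack([A, b]))
    v = Vt[-1]
    if np.isclose(v[n], 0.0):
        raise ValueError("TLS solution does not exist (degenerate case)")
    return -v[:n] / v[n]
```

On noise-free data the estimate reproduces the exact coefficients, since the augmented matrix is then rank-deficient and its null vector encodes the regression relation.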
We consider the two-line fitting problem. True points lie on two straight lines and are observed with Gaussian perturbations. For each observed point, it is not known on which line the corresponding true point lies. The parameters of the lines are estimated.
This model is a restriction of the conic section fitting model, because a pair of lines is a degenerate conic section. The following estimators are constructed: two projections of the adjusted least squares estimator in the conic section fitting model, the orthogonal regression estimator, the parametric maximum likelihood estimator in the Gaussian model, and the regular best asymptotically normal moment estimator.
The conditions for the consistency and asymptotic normality of the projections of the adjusted least squares estimator are provided. All the estimators constructed in the paper are equivariant. The estimators are compared numerically.
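As a point of reference for the orthogonal regression approach above, fitting a single line by orthogonal regression amounts to taking the first principal axis through the centroid; a minimal sketch (the two-line problem additionally has to resolve which line each point belongs to, which this sketch does not attempt):

```python
import numpy as np

def orthogonal_line_fit(points):
    """Fit one line by orthogonal regression (total least squares):
    the line through the centroid along the first principal axis.
    Returns (point_on_line, unit_direction)."""
    P = np.asarray(points, dtype=float)
    centroid = P.mean(axis=0)
    # Direction = right singular vector with the largest singular value.
    _, _, Vt = np.linalg.svd(P - centroid)
    return centroid, Vt[0]
```

This single-line fit is equivariant under rotations and translations of the data, the same invariance property the abstract notes for all the constructed estimators.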
We consider the Berkson model of logistic regression with Gaussian and homoscedastic error in regressor. The measurement error variance can be either known or unknown. We deal with both functional and structural cases. Sufficient conditions for identifiability of regression coefficients are presented.
Conditions for identifiability of the model are studied. In the case where the error variance is known, the regression parameters are identifiable if the distribution of the observed regressor is not concentrated at a single point. In the case where the error variance is unknown, the regression parameters are identifiable if the distribution of the observed regressor is not concentrated at three (or fewer) points.
The key analytic tools are relations between the smoothed logistic distribution function and its derivatives.
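The smoothed logistic function in question is the logistic link averaged over the Gaussian Berkson error. Under the usual Berkson specification (true regressor = observed regressor + independent Gaussian error; notation and function name are ours, not the paper's), it can be evaluated by Gauss–Hermite quadrature:

```python
import numpy as np

def smoothed_logistic(b0, b1, w, sigma, n_nodes=64):
    """P(y=1 | observed regressor w) in the Berkson model: the logistic link
    averaged over N(0, sigma^2) error in the regressor, via Gauss-Hermite
    quadrature for the probabilists' weight exp(-x^2/2)."""
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)
    # True regressor x = w + sigma * delta with delta ~ N(0, 1).
    z = b0 + b1 * (w + sigma * nodes)
    return np.sum(weights / (1.0 + np.exp(-z))) / np.sqrt(2 * np.pi)
```

With sigma = 0 this reduces to the plain logistic function; for sigma > 0 the smoothing pulls the curve toward 1/2, which is what makes the derivative relations informative for identifiability.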