
EIV regression with bounded errors in data: total 'least squares' with Chebyshev norm

Publication at Matematicko-fyzikální fakulta |
2020

Abstract

We consider the linear regression model with stochastic regressors and stochastic errors both in the regressors and in the dependent variable (the "structural EIV model"), where the regressors and errors are assumed to satisfy general conditions that differ from the traditional assumptions on EIV models (such as Deming regression). Notably, we require neither independence of the errors, nor identical distributions, nor zero means.
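For concreteness, a minimal simulation sketch of such a structural EIV model (not from the paper; all names and numbers are illustrative): the observed regressors and response carry errors that are dependent across variables and have non-zero means, yet stay within a common bound.

```python
# Illustrative structural EIV data: X = X_true + E, y = X_true @ beta_true + e,
# with |E|, |e| <= delta elementwise; the errors share a common factor
# (so they are dependent) and have non-zero means.
import numpy as np

rng = np.random.default_rng(0)

n, p = 200, 2                      # observations, erroneous regressors
beta_true = np.array([1.5, -0.7])
delta = 0.1                        # common error bound (unknown in practice)

X_true = rng.uniform(-1.0, 1.0, size=(n, p))

shared = rng.uniform(-delta, delta, size=(n, 1))          # common factor
E = np.clip(0.6 * shared + 0.4 * rng.uniform(0, delta, size=(n, p)), -delta, delta)
e = np.clip(0.5 * shared[:, 0] + 0.5 * rng.uniform(0, delta, size=n), -delta, delta)

X = X_true + E                     # observed regressors
y = X_true @ beta_true + e         # observed response
```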

The first main result is that the TLS estimator, with the traditional Frobenius norm replaced by the Chebyshev norm, is a consistent estimator of the regression parameters under the assumptions summarized below. The second main result is an algorithm that reduces computation of the estimator to a family of generalized linear-fractional programming problems, which are efficiently solvable by interior-point methods.
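A hedged sketch of this reduction, under the simplifying assumption that every regressor carries errors with a common bound: for fixed beta, the smallest admissible Chebyshev bound works out to max_i |y_i - x_i'beta| / (1 + ||beta||_1), and on each sign orthant of beta minimizing this ratio is a generalized linear-fractional program. The sketch below enumerates the orthants and solves each subproblem by bisection with an LP feasibility check; the paper's algorithm instead treats the fractional programs with interior-point methods, and the helper names here are ours.

```python
# Sketch of Chebyshev-norm TLS via sign-orthant enumeration + bisection + LP.
import itertools
import numpy as np
from scipy.optimize import linprog

def _feasible(X, y, s, eps):
    """LP feasibility: is there beta in orthant s (s_j * beta_j >= 0) with
       |y_i - x_i . beta| <= eps * (1 + s . beta) for all i?"""
    n, p = X.shape
    # Two-sided constraint rewritten as linear inequalities in beta,
    # plus the orthant constraints -s_j * beta_j <= 0.
    A = np.vstack([X - eps * s, -(X + eps * s), -np.diag(s)])
    b = np.concatenate([y + eps, eps - y, np.zeros(p)])
    res = linprog(np.zeros(p), A_ub=A, b_ub=b,
                  bounds=[(None, None)] * p, method="highs")
    return (res.status == 0), (res.x if res.status == 0 else None)

def chebyshev_tls(X, y, tol=1e-8):
    """Minimise eps(beta) = max_i |y_i - x_i.beta| / (1 + ||beta||_1)
       by bisection on eps within each sign orthant of beta."""
    n, p = X.shape
    best_eps = float(np.max(np.abs(y)))   # beta = 0 is feasible at this eps
    best_beta = np.zeros(p)
    for signs in itertools.product([-1.0, 1.0], repeat=p):
        s = np.array(signs)
        lo, hi = 0.0, best_eps
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            ok, beta = _feasible(X, y, s, mid)
            if ok:
                hi, best_eps, best_beta = mid, mid, beta
            else:
                lo = mid
    return best_beta, best_eps
```

On the simulated data above, `beta_hat, eps_hat = chebyshev_tls(X, y)` yields estimates of the coefficients and the common error bound. The orthant enumeration is exponential in the number of erroneous regressors, which is why the interior-point treatment of the fractional programs matters in practice.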

Roughly speaking, the conditions under which our estimator works are: it is known which regressors are affected by random errors and which are observed exactly; the regressors satisfy a certain asymptotic regularity condition; all error distributions, both in the regressors and in the endogenous variable, are bounded in absolute value by a common bound (which is unknown and is estimated); and with high probability we observe data points whose errors are close to that bound. We also generalize the method to the case where the bounds on the errors in the dependent variable and in the regressors differ, but their ratios are known or estimable.
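If the reduction sketched above carries over, the variant with known bound ratios only changes the denominator of the per-beta bound; the sketch below is ours and assumes illustrative ratios d0 (response) and d (regressors), with zero entries for exactly observed regressors, so that d0 = 1 and d = 1 recovers the denominator 1 + ||beta||_1 used earlier.

```python
# Hedged sketch of the weighted variant: with error bounds eps*d0 on the
# response and eps*d_j on regressor j, the smallest eps consistent with a
# given beta is max_i |y_i - x_i.beta| / (d0 + sum_j d_j * |beta_j|).
import numpy as np

def eps_weighted(beta, X, y, d0, d):
    """Smallest common error scale consistent with beta, given bound ratios."""
    resid = np.abs(y - X @ beta)
    return float(np.max(resid / (d0 + np.abs(beta) @ d)))
```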

The assumptions under which our estimator works cover many settings where traditional TLS is inconsistent.