We consider linear regression models where both input data (the observations of independent variables) and output data (the observations of the dependent variable) are affected by loss of information caused by uncertainty, indeterminacy, rounding or censoring. Instead of real-valued (crisp) data, only intervals are available.
We study a possibilistic generalization of the least squares estimator, the so-called OLS-set for the interval model. Investigating the OLS-set allows us to quantify whether replacing real-valued (crisp) data by interval values can have a significant impact on our knowledge of the value of the OLS estimator.
We show that in the general case, even very elementary questions about properties of the OLS-set are computationally intractable (assuming P ≠ NP). We therefore also consider restrictions of the general interval linear regression model, in particular the case of crisp inputs.
Taking advantage of the fact that in the crisp input–interval output model the OLS-set is a zonotope, we design both exact and approximate methods for its description. We also discuss special cases of the regression model, e.g. a model with repeated observations.
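To illustrate the zonotope structure mentioned above, consider the crisp input–interval output case: the OLS estimator β = (XᵀX)⁻¹Xᵀy is a linear map of y, so the image of the output box [y_lo, y_hi] is a zonotope, and exact per-coordinate bounds on each coefficient follow by interval arithmetic. The sketch below (with hypothetical data; `ols_set_bounds` is an illustrative helper, not from the paper) computes these bounds:

```python
import numpy as np

def ols_set_bounds(X, y_lo, y_hi):
    """Exact componentwise bounds of the OLS-set when X is crisp and
    the outputs range over the box [y_lo, y_hi] (a zonotope projection)."""
    # H maps an output vector y to the OLS estimate: beta = H @ y
    H = np.linalg.solve(X.T @ X, X.T)
    # Each beta_i = sum_j H[i, j] * y_j with the y_j varying independently,
    # so per-coordinate min/max decompose term by term.
    lo = np.minimum(H * y_lo, H * y_hi).sum(axis=1)
    hi = np.maximum(H * y_lo, H * y_hi).sum(axis=1)
    return lo, hi

# Hypothetical example: intercept-plus-slope model, outputs known to +/- 0.5
X = np.column_stack([np.ones(4), np.array([0.0, 1.0, 2.0, 3.0])])
y = np.array([1.0, 1.9, 3.1, 4.0])
lo, hi = ols_set_bounds(X, y - 0.5, y + 0.5)
```

Note that these bounds describe only the interval hull of the zonotope; the exact OLS-set is in general a strict subset of the box [lo, hi].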