The total least squares (TLS) method is a popular data-fitting approach for solving linear approximation problems $Ax \approx b$ (i.e., with a vector right-hand side) and $AX \approx B$ (i.e., with a matrix right-hand side) contaminated by errors. This paper introduces a generalization of the TLS formulation to problems with structured right-hand sides.
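For reference, a standard statement of the classical TLS problem with a vector right-hand side (the correction notation $E$, $f$ is used here only for illustration) reads
\[
\min_{E,\,f,\,x} \left\| \begin{bmatrix} E & f \end{bmatrix} \right\|_F
\quad \text{subject to} \quad (A+E)\,x = b + f,
\]
i.e., one seeks the smallest perturbation of the data $[\,b \;\; A\,]$ that makes the system compatible.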
First, we focus on the case where the right-hand side, and consequently also the solution, is a tensor. We show that while the basic solvability result can be obtained directly by matricization of both tensors, the generalization of the core problem reduction is more involved.
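As an illustration, assuming for concreteness a third-order right-hand side $\mathcal{B} \in \mathbb{R}^{n \times m_1 \times m_2}$ and one common unfolding convention, matricization rearranges the tensor entries into a matrix,
\[
B = \mathcal{B}_{(1)} \in \mathbb{R}^{n \times m_1 m_2},
\qquad
\big(\mathcal{B}_{(1)}\big)_{i,\,(k-1)m_1 + j} = \mathcal{B}_{i,j,k},
\]
so that the tensor problem can be read as a matrix problem $AX \approx B$ with a matrix right-hand side.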
The core reduction allows one to reduce the problem dimensions mathematically by removing all redundant and irrelevant data from the system matrix and the right-hand side. We prove that the core problems within the original tensor problem and its matricized counterpart are, in general, different.
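For orientation, in the matrix case the core problem of Paige and Strakoš is revealed by orthogonal transformations $P$ and $Q$ bringing the data to the block form
\[
P^{T} \begin{bmatrix} b & A Q \end{bmatrix}
=
\begin{bmatrix}
b_1 & A_{11} & 0 \\
0 & 0 & A_{22}
\end{bmatrix},
\]
where the subproblem $A_{11} x_1 \approx b_1$ carries all the necessary and sufficient information for solving the original problem, while the block $A_{22}$ collects the redundant and irrelevant part.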
Then, we concentrate on problems with even more structured right-hand sides, where the same model matrix $A$ corresponds to a set of various tensor right-hand sides (a schematic form is sketched below). Finally, relations between the matrix and tensor core problems are discussed.
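Schematically (the indexing used here is only illustrative), such a problem can be viewed as a collection of tensor problems
\[
A \mathcal{X}^{(i)} \approx \mathcal{B}^{(i)}, \qquad i = 1, \dots, k,
\]
sharing the single model matrix $A$.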