We consider the problem of minimizing a positive definite quadratic form, which has a unique optimal solution. The question addressed is: what are the largest allowable variations of the input data such that the optimal solution remains within given bounds? This problem is called global sensitivity analysis because, in contrast to traditional sensitivity analysis, it deals with variations of possibly all input coefficients.
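As a rough sketch of the setting (the notation $A$, $b$, the bound vectors $\underline{x}, \overline{x}$, and the common radius $\delta$ are assumptions introduced here for illustration only, not taken from the text), one may picture the problem as
\[
  x^{\ast}(A,b) \;=\; \operatorname*{arg\,min}_{x \in \mathbb{R}^n} \; \tfrac12\, x^{\mathsf T} A x + b^{\mathsf T} x,
  \qquad A \succ 0,
\]
and global sensitivity analysis then asks for the largest radius $\delta \ge 0$ such that
\[
  \underline{x} \;\le\; x^{\ast}(\tilde A, \tilde b) \;\le\; \overline{x}
  \quad \text{for all } \tilde A \succ 0,\ \tilde b
  \ \text{ with } \ \|\tilde A - A\| \le \delta, \ \ \|\tilde b - b\| \le \delta,
\]
where $\|\cdot\|$ is the matrix (and vector) norm under consideration.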
We propose a general framework for approaching the problem under an arbitrary matrix norm. We then focus on several commonly used norms and investigate for which of them the problem is efficiently solvable.
For the max-norm in particular, the problem is NP-hard, so we turn our attention to computationally cheap bounds.