A Quick Primer on Multi-Parametric Programming

I want to give a quick explanation of multiparametric programming and its relation to more standard mathematical programming. To start with, we need to define what mathematical programming is: solving an optimization problem of the following form, where we are trying to optimize $f(x)$ with respect to $x$ subject to the constraints $g(x) = 0$ and $h(x) \leq 0$.

\[\begin{aligned} \min_{x} \quad & f(x)\\ \textrm{s.t.} \quad & g(x) = 0\\ \quad & h(x) \leq 0 \\ \end{aligned}\]

The optimal point of this problem is denoted as $x^* $. Particular forms of this optimization problem are given special names. If $f(x)$ and the constraints are linear then it is called ‘linear programming’, if $f(x)$ is a quadratic function with linear constraints then it is called ‘quadratic programming’, and other cases can generally be wrapped under the term ‘nonlinear programming’. There are literally hundreds of algorithms to solve special forms of the above problem, and these fall under the general banner of deterministic optimization.
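To make this concrete, here is a minimal sketch of solving a single instance of such a problem numerically (the toy objective and constraints are my own illustration, and the choice of `scipy.optimize.minimize` is just one of many solvers):

```python
import numpy as np
from scipy.optimize import minimize

# One fixed instance: min x0^2 + x1^2  s.t.  g(x) = x0 + x1 - 1 = 0,  h(x) = -x0 <= 0.
res = minimize(
    fun=lambda x: x[0] ** 2 + x[1] ** 2,        # objective f(x)
    x0=np.zeros(2),                             # initial guess
    constraints=[
        {"type": "eq", "fun": lambda x: x[0] + x[1] - 1.0},  # g(x) = 0
        {"type": "ineq", "fun": lambda x: x[0]},             # scipy expects -h(x) >= 0
    ],
)
print(res.x)  # x* ≈ [0.5, 0.5]
```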

This is great and all, but what if we want to repeatedly solve the problem while some uncertain parameters of it change, such as when we need to make a decision based on the optimal value in rapid succession? In that case, it would be useful to solve the optimization problem multiparametrically; the ‘multiparametric’ comes from the multiple $\theta$ parameters. This leads to the following multiparametric optimization formulation.

\[\begin{aligned} \min_{x} \quad & f(x, \theta)\\ \textrm{s.t.} \quad & g(x, \theta) = 0\\ \quad & h(x, \theta) \leq 0 \\ \end{aligned}\]

With multiparametric programming, the optimal point is a function of $\theta$, written $x^* (\theta)$. In general, $x^* (\theta)$ is a piecewise function of $\theta$, where the optimization problem’s active set defines each piece.
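One way to see this piecewise structure is to sweep $\theta$ and re-solve numerically each time. A small sketch, using a hypothetical toy problem of my own ($\min_x (x - \theta)^2$ subject to $x \leq 1$): the solution follows one affine piece while the constraint is inactive and switches to another once it becomes active.

```python
import numpy as np
from scipy.optimize import minimize

# Sweep theta and re-solve  min_x (x - theta)^2  s.t.  x <= 1  each time.
# The printed optima trace out a piecewise function:
#   x*(theta) = theta  while the constraint is inactive (theta <= 1),
#   x*(theta) = 1      once the constraint becomes active (theta > 1).
for theta in np.linspace(-1.0, 3.0, 9):
    res = minimize(
        fun=lambda x, t=theta: (x[0] - t) ** 2,
        x0=[0.0],
        constraints=[{"type": "ineq", "fun": lambda x: 1.0 - x[0]}],  # 1 - x >= 0
    )
    print(f"theta = {theta:+.2f}  ->  x* = {res.x[0]:+.3f}")
```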

As a quick example of multiparametric programming in practice, we will solve the following simple multiparametric quadratic programming problem, using the combinatorial approach.

\[\begin{aligned} \min_{x} \quad & x^2\\ \textrm{s.t.} \quad & x \leq \theta\\ \end{aligned}\]

First, let’s consider the empty active set $\mathcal{I} = \emptyset$. With no constraint active, the unconstrained minimizer $x^* (\theta) = 0$ applies, and it is feasible whenever $0 < \theta$. The only other active set combination is $\mathcal{I} = \{ 0 \}$, where the constraint holds with equality, giving the simple relation $x = \theta$; the corresponding Lagrange multiplier, $\lambda = -2\theta$, is nonnegative exactly when $\theta \leq 0$, so that is where this piece is valid. So the multiparametric solution to this multiparametric optimization problem is the following.

\[x^*(\theta) = \begin{cases} \theta & \theta \leq 0 \\ 0 & \theta > 0 \end{cases}\]
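For a problem this small, we can also let a computer algebra system walk through the combinatorial approach: for each candidate active set, solve the stationarity condition together with the active constraints, then record where the inactive constraints and the multiplier sign conditions hold. A sketch using `sympy` (my choice of tool, not something prescribed by the method):

```python
import sympy as sp

x, lam, theta = sp.symbols("x lambda theta", real=True)
f = x**2
h = x - theta                                   # constraint h(x, theta) <= 0

# KKT stationarity: df/dx + lambda * dh/dx = 0
stationarity = sp.Eq(sp.diff(f, x) + lam * sp.diff(h, x), 0)

# Empty active set: multiplier is zero; the piece is valid where h <= 0 still holds.
sol = sp.solve([stationarity, sp.Eq(lam, 0)], [x, lam], dict=True)[0]
print(sol[x], sp.simplify(h.subs(sol)) <= 0)    # 0,  -theta <= 0  (i.e. theta >= 0)

# Active set {0}: constraint holds with equality; valid where the multiplier is >= 0.
sol = sp.solve([stationarity, sp.Eq(h, 0)], [x, lam], dict=True)[0]
print(sol[x], sol[lam] >= 0)                    # theta,  -2*theta >= 0  (i.e. theta <= 0)
```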

Here, we can trade out solving the optimization problem online for evaluating the above piecewise function. This has a large number of applications, such as Explicit MPC and simplifying multilevel optimization problems. For higher-dimensional problems, deriving the pieces involves linear algebra and the sensitivity theorem. The general algorithms used to solve these sorts of problems can be found in the references. This is still very much a topic of active research.
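As a sketch of what that trade looks like in code, the offline solution can be stored as (region, law) pairs, and the online step reduces to point location plus one function evaluation (this data structure is my own simplification of how explicit solutions are typically stored):

```python
# Offline: store the multiparametric solution as (region test, control law) pairs.
regions = [
    (lambda th: th <= 0.0, lambda th: th),    # active set {0}: x*(theta) = theta
    (lambda th: th > 0.0,  lambda th: 0.0),   # empty active set: x*(theta) = 0
]

# Online: find the region containing theta, then evaluate its law. No optimizer
# is in the loop, which is the whole point of the explicit solution.
def x_star(theta):
    for contains, law in regions:
        if contains(theta):
            return law(theta)
    raise ValueError("theta outside the characterized parameter space")

print(x_star(-1.3), x_star(2.0))  # -1.3 0.0
```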

Resources