---
title: Homework 4 -- Optimization Methods
author: Claudio Maggioni
header-includes:
  - \usepackage{amsmath}
  - \usepackage{hyperref}
  - \usepackage[utf8]{inputenc}
  - \usepackage[margin=2.5cm]{geometry}
  - \usepackage[ruled,vlined]{algorithm2e}
  - \usepackage{float}
  - \floatplacement{figure}{H}
  - \hypersetup{colorlinks=true,linkcolor=blue}
---

\maketitle

# Exercise 1

## Exercise 1.1

The Lagrangian is the following:

$$L(X,\lambda) = f(X) - \lambda \, c(X) = -3x^2 + y^2 + 2z^2 + 2(x+y+z) - \lambda x^2 - \lambda y^2 - \lambda z^2 + \lambda =$$
$$= (-3-\lambda)x^2 + (1-\lambda)y^2 + (2-\lambda)z^2 + 2(x+y+z) + \lambda$$

The KKT conditions are the following:

First we have the condition on the partial derivatives of the Lagrangian w.r.t. X:

$$\nabla_X L(X,\lambda) = \begin{bmatrix}2(-3-\lambda)x^* + 2\\ 2(1-\lambda)y^* + 2\\ 2(2-\lambda)z^* + 2\end{bmatrix} = 0 \Leftrightarrow \begin{bmatrix}x^*\\ y^*\\ z^*\end{bmatrix} = \begin{bmatrix}\frac{1}{3+\lambda}\\ -\frac{1}{1-\lambda}\\ -\frac{1}{2-\lambda}\end{bmatrix}$$

Then we have the condition on the equality constraint:

$$c(X^*) = {x^*}^2 + {y^*}^2 + {z^*}^2 - 1 = 0 \Leftrightarrow \|X^*\| = 1$$

Then we have the complementarity condition:

$$\lambda^* c(X^*) = 0 \Leftarrow c(X^*) = 0, \;\text{which holds whenever the condition above holds.}$$

Since we have no inequality constraints, we do not need to apply the KKT conditions related to inequality constraints.

## Exercise 1.2

To find feasible solutions to the problem, we apply the KKT conditions. Since the first KKT condition gives us a way to derive $X^*$ from $\lambda^*$, we look for the values of $\lambda$ that satisfy the second KKT condition:

$$c(X) = \left( \frac{1}{3+\lambda} \right)^2 + \left( -\frac{1}{1-\lambda} \right)^2 + \left(-\frac{1}{2-\lambda}\right)^2 - 1 = \frac{1}{(3+\lambda)^2} + \frac{1}{(1-\lambda)^2} + \frac{1}{(2-\lambda)^2} - 1 =$$
$$= \frac{(1-\lambda)^2(2-\lambda)^2 + (3+\lambda)^2(2-\lambda)^2 + (3+\lambda)^2 (1-\lambda)^2 - (3+\lambda)^2 (1-\lambda)^2 (2-\lambda)^2}{(3+\lambda)^2 (1-\lambda)^2 (2-\lambda)^2} = 0 \Leftrightarrow$$
$$\Leftrightarrow (1-\lambda)^2(2-\lambda)^2 + (3+\lambda)^2(2-\lambda)^2 + (3+\lambda)^2 (1-\lambda)^2 - (3+\lambda)^2 (1-\lambda)^2 (2-\lambda)^2 = 0 \Leftrightarrow$$
$$\Leftrightarrow (\lambda^4 - 6\lambda^3 + 13\lambda^2 - 12\lambda + 4) + (\lambda^4 + 2\lambda^3 - 11\lambda^2 - 12\lambda + 36) + (\lambda^4 + 4\lambda^3 - 2\lambda^2 - 12\lambda + 9) - (\lambda^6 - 14\lambda^4 + 12\lambda^3 + 49\lambda^2 - 84\lambda + 36) = 0 \Leftrightarrow$$
$$\Leftrightarrow -\lambda^6 + 17\lambda^4 - 12\lambda^3 - 49\lambda^2 + 48\lambda + 13 = 0 \Leftrightarrow$$
$$\Leftrightarrow \lambda = \lambda_1 \approx -0.224 \;\lor\; \lambda = \lambda_2 \approx -1.892 \;\lor\; \lambda = \lambda_3 \approx 3.149 \;\lor\; \lambda = \lambda_4 \approx -4.035$$
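As a quick numerical cross-check (not part of the assignment), the real roots of the sixth-degree polynomial above can be recovered with `numpy.roots`; a minimal sketch:

```python
# Real roots of  -l^6 + 17 l^4 - 12 l^3 - 49 l^2 + 48 l + 13
# (coefficients listed in decreasing order of degree).
import numpy as np

coeffs = [-1, 0, 17, -12, -49, 48, 13]
roots = np.roots(coeffs)
real_roots = np.sort(roots[np.abs(roots.imag) < 1e-9].real)
print(real_roots)  # expected: approximately [-4.035, -1.892, -0.224, 3.149]
```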

We then compute $X$ from each solution for $\lambda$ and evaluate the objective for each candidate:

$$X = \begin{bmatrix}\frac{1}{3+\lambda}\\ -\frac{1}{1-\lambda}\\ -\frac{1}{2-\lambda}\end{bmatrix} \Leftrightarrow$$
$$\Leftrightarrow X = X_1 \approx \begin{bmatrix}0.360\\-0.817\\-0.450\end{bmatrix} \lor X = X_2 \approx \begin{bmatrix}0.902\\-0.346\\-0.257\end{bmatrix} \lor X = X_3 \approx \begin{bmatrix}0.163\\0.465\\0.870\end{bmatrix} \lor X = X_4 \approx \begin{bmatrix}-0.966\\-0.199\\-0.166\end{bmatrix}$$

$$f(X_1) = -1.1304 \;\; f(X_2) = -1.59219 \;\; f(X_3) = 4.64728 \;\; f(X_4) = -5.36549$$
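These candidate points and objective values can be re-derived numerically from the approximate roots above; a small sketch, assuming the objective $f(X) = -3x^2 + y^2 + 2z^2 + 2(x+y+z)$ as read off the Lagrangian of Exercise 1.1:

```python
# Recompute X from each lambda root and evaluate the objective; the norm of each X
# should be ~1, confirming feasibility on the unit sphere.
import numpy as np

def f(v):
    x, y, z = v
    return -3 * x**2 + y**2 + 2 * z**2 + 2 * (x + y + z)

for lam in (-0.224, -1.892, 3.149, -4.035):
    X = np.array([1 / (3 + lam), -1 / (1 - lam), -1 / (2 - lam)])
    print(f"lambda={lam:7.3f}  X={np.round(X, 3)}  |X|={np.linalg.norm(X):.3f}  f(X)={f(X):.4f}")
```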

## Exercise 1.3

To find the optimal solution, we choose $(\lambda_4, X_4)$, since $f(X_4)$ is the smallest objective value among all the feasible points. Therefore, the solution to the constrained minimization problem is:

$$X \approx \begin{bmatrix}-0.966\\-0.199\\-0.166\end{bmatrix}$$
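The result can also be cross-checked with an off-the-shelf constrained solver; a hedged sketch using `scipy.optimize.minimize` (SLSQP) with the same objective and the unit-sphere constraint, restarted from several random points:

```python
import numpy as np
from scipy.optimize import minimize

def f(v):
    x, y, z = v
    return -3 * x**2 + y**2 + 2 * z**2 + 2 * (x + y + z)

constraint = {"type": "eq", "fun": lambda v: v @ v - 1.0}  # x^2 + y^2 + z^2 = 1

best = None
rng = np.random.default_rng(0)
for _ in range(20):
    v0 = rng.normal(size=3)
    v0 /= np.linalg.norm(v0)           # start on the unit sphere
    res = minimize(f, v0, method="SLSQP", constraints=[constraint])
    if res.success and (best is None or res.fun < best.fun):
        best = res

print(best.x, best.fun)  # expected: roughly [-0.966, -0.199, -0.166] and -5.365
```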

# Exercise 2

## Exercise 2.1

To reformulate the problem, we first write out the explicit values of $G$, $c$, $A$ and $b$.

We first define the matrix $G$ as a set of 9 unknowns and the vector $c$ as a set of 3 unknowns:

$$G = \begin{bmatrix}g_{11}&g_{12}&g_{13}\\ g_{21}&g_{22}&g_{23}\\ g_{31}&g_{32}&g_{33}\end{bmatrix} \;\;\; c = \begin{bmatrix}c_1\\c_2\\c_3\end{bmatrix}$$

We then define f(x) in the following way:

$$f(x) = \frac12 \cdot \begin{bmatrix}x_1&x_2&x_3\end{bmatrix} \begin{bmatrix}g_{11}&g_{12}&g_{13}\\ g_{21}&g_{22}&g_{23}\\ g_{31}&g_{32}&g_{33}\end{bmatrix} \begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix} + \begin{bmatrix}c_1&c_2&c_3\end{bmatrix}\begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix} =$$
$$= x_1^2 \cdot \frac{g_{11}}{2} + x_2^2 \cdot \frac{g_{22}}{2} + x_3^2 \cdot \frac{g_{33}}{2} + \left(\frac{g_{12} + g_{21}}{2}\right) x_1 x_2 + \left(\frac{g_{13} + g_{31}}{2}\right) x_1 x_3 + \left(\frac{g_{23} + g_{32}}{2}\right) x_2 x_3 + c_1 x_1 + c_2 x_2 + c_3 x_3$$

Then, we equate this polynomial to the given one, finding the following values and constraints for the coefficients of $G$ and $c$:

$$\begin{cases}g_{11} = 3 \cdot 2 = 6 \\ g_{22} = 2.5 \cdot 2 = 5 \\ g_{33} = 2 \cdot 2 = 4 \\ c_1 = -8 \\ c_2 = -3 \\ c_3 = -3 \\ g_{13} + g_{31} = 1 \cdot 2 = 2 \\ g_{12} + g_{21} = 2 \cdot 2 = 4 \\ g_{23} + g_{32} = 2 \cdot 2 = 4 \end{cases}$$

As can be seen from the system of equations above, there are infinitely many possibilities for choosing the components of $G$ that are not on the main diagonal. As a matter of preference, we choose them so that the resulting $G$ matrix is symmetric. We therefore obtain:

$$G = \begin{bmatrix}6&2&1\\2&5&2\\1&2&4\end{bmatrix} \;\;\; c = \begin{bmatrix}-8\\-3\\-3\end{bmatrix}$$
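As a sanity check (not required by the exercise), one can verify symbolically that $\frac12 x^T G x + c^T x$ with these $G$ and $c$ reproduces the quadratic implied by the coefficient system above; a minimal sympy sketch, where the target polynomial is the one reconstructed from that system:

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
x = sp.Matrix([x1, x2, x3])
G = sp.Matrix([[6, 2, 1], [2, 5, 2], [1, 2, 4]])
c = sp.Matrix([-8, -3, -3])

quadratic = sp.expand((sp.Rational(1, 2) * x.T * G * x + c.T * x)[0, 0])
# Quadratic implied by the coefficient system of Exercise 2.1:
target = (3*x1**2 + sp.Rational(5, 2)*x2**2 + 2*x3**2
          + 2*x1*x2 + x1*x3 + 2*x2*x3 - 8*x1 - 3*x2 - 3*x3)
print(sp.simplify(quadratic - target))  # expected: 0
```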

We perform a similar process for the matrix $A$ and the vector $b$:

$$Ax = b \Leftrightarrow \begin{bmatrix}a_{11}&a_{12}&a_{13}\\ a_{21}&a_{22}&a_{23}\end{bmatrix} \begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix} = \begin{bmatrix}b_1\\b_2\end{bmatrix} \Leftrightarrow \begin{cases}a_{11}x_1 + a_{12}x_2 + a_{13}x_3 = b_1\\ a_{21}x_1 + a_{22}x_2 + a_{23}x_3 = b_2\end{cases}$$

To make this system match the given system of equality constraints, we need to set the components of A and b in the following way:

$$\begin{cases}a_{11} = 1\\ a_{12} = 0\\ a_{13} = 1\\ a_{21} = 0\\ a_{22} = 1\\ a_{23} = 1 \\ b_1 = 3 \\ b_2 = 0\end{cases}$$

Therefore, we obtain the following A matrix and b vector:

$$A = \begin{bmatrix}1&0&1\\0&1&1\end{bmatrix} \;\;\; b = \begin{bmatrix}3\\0\end{bmatrix}$$

Then, substituting these values of $G$, $c$, $A$ and $b$ into the quadratic formulation given on the assignment sheet, the problem is restated in the desired form.

## Exercise 2.2

The Lagrangian for this problem is the following:

$$L(x, \lambda) = \frac12\langle x, Gx\rangle + \langle x, c \rangle - \lambda^T (Ax - b) =$$
$$= \frac12\begin{bmatrix}x_1&x_2&x_3\end{bmatrix} \begin{bmatrix}6&2&1\\2&5&2\\1&2&4\end{bmatrix} \begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix} + \begin{bmatrix}x_1&x_2&x_3\end{bmatrix} \begin{bmatrix}-8\\-3\\-3\end{bmatrix} - \lambda^T \left(\begin{bmatrix}1&0&1\\0&1&1\end{bmatrix} \begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix} - \begin{bmatrix}3\\0\end{bmatrix}\right)$$

The KKT conditions are the following:

First we have the condition on the partial derivatives of the Lagrangian w.r.t. $x$:

$$\nabla_x L(x, \lambda) = Gx + c - A^T \lambda = \begin{bmatrix} 6 x_1 + 2 x_2 + x_3 - 8 - \lambda_1\\ 2 x_1 + 5 x_2 + 2 x_3 - 3 - \lambda_2\\ x_1 + 2 x_2 + 4 x_3 - 3 - \lambda_1 - \lambda_2\end{bmatrix} = 0$$

Then we have the conditions on the equality constraints:

$$Ax - b = 0 \Leftrightarrow \begin{bmatrix}x_1 + x_3\\ x_2 + x_3\end{bmatrix} = \begin{bmatrix}3\\0\end{bmatrix}$$

Then we have the complementarity condition:

$$\lambda^T (Ax - b) = 0 \Leftarrow Ax - b = 0, \;\text{which holds whenever the condition above holds.}$$

Since we have no inequality constraints, we do not need to apply the KKT conditions related to inequality constraints.
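Because there are only equality constraints, the stationarity and feasibility conditions above form a linear system in $(x, \lambda)$; a minimal numpy sketch (not part of the assignment) that solves it directly:

```python
import numpy as np

G = np.array([[6.0, 2.0, 1.0], [2.0, 5.0, 2.0], [1.0, 2.0, 4.0]])
c = np.array([-8.0, -3.0, -3.0])
A = np.array([[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
b = np.array([3.0, 0.0])

# Block KKT system  [[G, -A^T], [A, 0]] [x; lambda] = [-c; b]
K = np.block([[G, -A.T], [A, np.zeros((2, 2))]])
sol = np.linalg.solve(K, np.concatenate([-c, b]))
x, lam = sol[:3], sol[3:]

print("x =", x, " lambda =", lam)
print("Ax - b =", A @ x - b)                            # equality constraints hold
print("Gx + c - A^T lambda =", G @ x + c - A.T @ lam)   # stationarity holds
```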

# Exercise 3

## Exercise 3.1

The Lagrangian of this problem is the following:

$$L(x, \lambda, s) = c^T x - \lambda^T (Ax - b) - s^T x$$

The KKT conditions are the following (a brief symbolic check of the first condition is sketched after the list):

1. The partial derivative of the Lagrangian w.r.t. $x$ is 0: $$\nabla_x L(x, \lambda, s) = c - A^T \lambda - s = 0 \Leftrightarrow A^T \lambda + s = c$$
2. Equality constraints hold: $$Ax - b = 0 \Leftrightarrow Ax = b$$
3. Inequality constraints hold: $$x \geq 0$$
4. The Lagrange multipliers for inequality constraints are non-negative: $$s \geq 0$$
5. The complementarity condition holds (here considering only inequality constraints, since the condition trivially holds for equality ones): $$s^T x = 0$$
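A short symbolic check of the stationarity condition, sketched with sympy on an instance with 3 variables and 2 equality constraints (the dimensions here are arbitrary, chosen only for illustration):

```python
import sympy as sp

x = sp.Matrix(sp.symbols('x1 x2 x3'))
lam = sp.Matrix(sp.symbols('lambda1 lambda2'))
s = sp.Matrix(sp.symbols('s1 s2 s3'))
A = sp.Matrix(2, 3, sp.symbols('a11 a12 a13 a21 a22 a23'))
b = sp.Matrix(sp.symbols('b1 b2'))
c = sp.Matrix(sp.symbols('c1 c2 c3'))

# Lagrangian of the primal LP:  L(x, lambda, s) = c^T x - lambda^T (A x - b) - s^T x
L = (c.T * x - lam.T * (A * x - b) - s.T * x)[0, 0]

grad = sp.Matrix([sp.diff(L, xi) for xi in x])  # nabla_x L
print(sp.simplify(grad - (c - A.T * lam - s)))  # expected: zero vector
```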

## Exercise 3.2

We define the dual problem in the following way:

$$\max b^T \lambda \;\; \text{ s.t. } \;\; c - A^T \lambda \geq 0 \Leftrightarrow A^T \lambda \leq c$$

We then introduce a slack variable $s$ to obtain the equality and inequality constraints:

$$\max b^T \lambda \;\; \text{ s.t. } \;\; A^T \lambda + s = c \; \text{ and } \; s \geq 0$$

To convert this maximization problem into a minimization one (in order to reach standard form), we flip the sign of the objective and find:

$$\min - b^T \lambda \;\; \text{ s.t. } \;\; A^T \lambda + s = c \; \text{ and } \; s \geq 0$$

We then compute the Lagrangian of the dual problem:

$$L(\lambda, x, s) = -b^T \lambda + x^T (A^T \lambda + s - c) - x^T s = - b^T \lambda + x^T (A^T \lambda - c)$$

The KKT conditions are the following:

1. The partial derivative of the Lagrangian w.r.t. $\lambda$ is 0: $$\nabla_{\lambda} L(\lambda, x) = - b^T + x^T A^T = 0 \Leftrightarrow Ax = b$$
2. Equality constraints hold: $$A^T \lambda + s = c$$
3. Inequality constraints hold: $$c - A^T \lambda \geq 0 \Leftrightarrow s \geq 0 \;\;\; \text{ using 2. to find that } s = c - A^T \lambda$$
4. The Lagrange multipliers for inequality constraints are non-negative: $$x \geq 0$$
5. The complementarity condition holds (here considering only inequality constraints, since the condition trivially holds for equality ones): $$x^T s = 0 \Leftrightarrow s^T x = 0$$

Then, comparing the KKT conditions of the primal problem with the ones above, we can match them and see that they are identical:

- Condition 1 from the dual is identical to condition 2 from the primal;
- Condition 2 from the dual is identical to condition 1 from the primal;
- Condition 3 from the dual is identical to condition 4 from the primal;
- Condition 4 from the dual is identical to condition 3 from the primal;
- Condition 5 from the dual is identical to condition 5 from the primal.

Therefore, the primal and the dual problem are equivalent.
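A small numerical illustration of this equivalence (with made-up data, not taken from the assignment): solving a toy primal LP and its dual with `scipy.optimize.linprog` and checking that the optimal objectives coincide and that complementarity holds.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data, for illustration only
A = np.array([[1.0, 1.0, 1.0],
              [1.0, 2.0, 0.0]])
b = np.array([4.0, 3.0])
c = np.array([2.0, 3.0, 4.0])

# Primal:  min c^T x   s.t.  A x = b,  x >= 0
primal = linprog(c, A_eq=A, b_eq=b, bounds=[(0, None)] * 3, method="highs")

# Dual:  max b^T lambda   s.t.  A^T lambda <= c   (solved as  min -b^T lambda)
dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=[(None, None)] * 2, method="highs")

x, lam = primal.x, dual.x
s = c - A.T @ lam

print("primal optimum:", c @ x)          # expected to equal the dual optimum
print("dual optimum:  ", b @ lam)
print("s >= 0:", bool(np.all(s >= -1e-8)))
print("s^T x = 0:", bool(abs(s @ x) < 1e-8))
```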