midterm: done all except 2.1{a,d,f}

This commit is contained in:
Claudio Maggioni 2021-05-10 17:30:38 +02:00
parent 414b351bf0
commit 5026da324a
3 changed files with 54 additions and 8 deletions

View file

@@ -376,7 +376,7 @@ We first show that the lemma holds for $\tau \in [0,1]$. Since
$$\|\tilde{p}(\tau)\| = \|\tau p^U\| = \tau \|p^U\| \text{ for } \tau \in [0,1]$$
Then the norm of the step $\tilde{p}$ clearly increases as $\tau$
increases. For the second criterion, we compute the quadratic model for a
generic $\tau \in [0,1]$:
@@ -388,17 +388,18 @@ term in function
of $\tau^2$ is negative and thus the model for an increasing $\tau \in [0,1]$
decreases monotonically (to be precise quadratically).
Now we show that the two claims on the gradients also hold for $\tau \in [1,2]$. We
define a function $h(\alpha)$ (where $\alpha = \tau - 1$) whose gradient has the
same sign as that of $\|\tilde{p}(\tau)\|$, and we show that this function increases:
$$h(\alpha) = \frac12 \|\tilde{p}(1 + \alpha)\|^2 = \frac12 \|p^U + \alpha(p^B -
p^U)\|^2 = \frac12 \|p^U\|^2 + \frac12 \alpha^2 \|p^B - p^U\|^2 + \alpha (p^U)^T
(p^B - p^U)$$
We now take the derivative of $h(\alpha)$ and we show it is always positive,
i.e. that $h(\alpha)$ always has a positive gradient and is thus
increasing w.r.t. $\alpha$:
$$h'(\alpha) = \alpha \|p^B - p^U\|^2 + (p^U)^T (p^B - p^U) \geq (p^U)^T (p^B - p^U)
= \frac{g^Tg}{g^TBg}g^T\left(- \frac{g^Tg}{g^TBg}g + B^{-1}g\right) =$$$$= \|g\|^2
@@ -445,7 +446,50 @@ Which, since $B$ is symmetric, in turn is equivalent to writing:
$$g^T g \leq (g^TBg) (g^T B^{-1} g)$$
which is what we needed to show to prove that the first gradient constraint
holds for $\tau \in [1,2]$.
For the second constraint, we adopt a similar strategy as before and we
define a new function $\hat{h}(\alpha) = m(\tilde{p}(1 + \alpha))$, thus
plugging the Dogleg step in the quadratic model:
$$\hat{h}(\alpha) = m(\tilde{p}(1+\alpha)) = f + g^T (p^U + \alpha (p^B - p^U)) +
\frac12 (p^U + \alpha (p^B - p^U))^T B (p^U + \alpha (p^B - p^U)) = $$$$ =
f + g^T p^U + \alpha g^T (p^B - p^U) + \frac12 (p^U)^T B p^U + \frac12 \alpha (p^U)^T B
(p^B - p^U) + \frac12 \alpha (p^B - p^U)^T B p^U + \frac12 \alpha^2
(p^B - p^U)^T B (p^B - p^U)$$
We now take the derivative of $\hat{h}(\alpha)$:
$$\hat{h}'(\alpha) = g^T (p^B - p^U) + \frac12 (p^U)^T B (p^B - p^U) + \frac12
(p^B - p^U)^T B p^U + \alpha (p^B - p^U)^T B (p^B - p^U) = $$$$
= (p^B - p^U)^T g + \frac12 ((p^U)^T B (p^B - p^U))^T +
\frac12 (p^B - p^U)^T B p^U + \alpha (p^B - p^U)^T B (p^B - p^U) = $$$$
= (p^B - p^U)^T g + \frac12 (p^B - p^U)^T B^T p^U +
\frac12 (p^B - p^U)^T B p^U + \alpha (p^B - p^U)^T B (p^B - p^U) = $$$$
= (p^B - p^U)^T (g + \frac12 \cdot 2 \cdot B p^U) + \alpha (p^B - p^U)^T B
(p^B - p^U) \leq $$$$
\leq (p^B - p^U)^T(g + B p^U + B (p^B - p^U)) = $$$$
=(p^B - p^U)^T(g+Bp^B) = 0$$
where the inequality holds since $\alpha \leq 1$ and $B$ is positive definite, and
the last equality follows from $Bp^B = -g$. We therefore obtain
$\hat{h}'(\alpha) \leq 0$, thus finding that $m(\tilde{p}(\tau))$ is indeed a
decreasing function of $\tau$ (equivalently of $\alpha = \tau - 1$) also for
$\tau \in [1,2]$, which completes the proof of the lemma.
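
As a sanity check (not part of the proof), the two properties can also be
verified numerically. The following is a minimal MATLAB sketch, assuming a
small randomly generated symmetric positive definite matrix $B$ and a random
gradient $g$ (all variable names here are illustrative): it samples the Dogleg
path and checks that $\|\tilde{p}(\tau)\|$ never decreases and
$m(\tilde{p}(\tau))$ never increases.

```matlab
% Illustrative numerical check of the lemma (not part of the proof).
n = 5;
A = randn(n);
B = A' * A + n * eye(n);             % symmetric positive definite by construction
g = randn(n, 1);
f = 0;                               % constant model term, irrelevant for monotonicity

pU = - (g' * g) / (g' * B * g) * g;  % unconstrained steepest descent (Cauchy) step
pB = - (B \ g);                      % full Newton step

taus = linspace(0, 2, 200);
norms = zeros(size(taus));
models = zeros(size(taus));
for i = 1:numel(taus)
    tau = taus(i);
    if tau <= 1
        p = tau * pU;                        % first leg of the Dogleg path
    else
        p = pU + (tau - 1) * (pB - pU);      % second leg of the Dogleg path
    end
    norms(i) = norm(p);
    models(i) = f + g' * p + 0.5 * p' * B * p;
end

% Both checks should print 1 (true): the step norm never decreases and the
% model value never increases along the Dogleg path.
disp(all(diff(norms) >= -1e-12));
disp(all(diff(models) <= 1e-12));
```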

View file

@@ -1,3 +1,5 @@
% Discussed with: Gianmarco De Vita (MATLAB solver for determining \tau)
function pk = dogleg(B, g, deltak)
pnewton = - (B \ g);
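
The hunk above shows only the first lines of `dogleg.m`. For reference, a
hedged sketch of how the rest of the Dogleg step selection could look,
including a `roots`-based solver for the boundary value of $\tau$ along the
lines hinted at in the comment above, is given below; the name
`dogleg_sketch` and its body are illustrative assumptions, not the contents of
the actual file.

```matlab
% Hypothetical sketch of a Dogleg step selection (not the actual dogleg.m).
function pk = dogleg_sketch(B, g, deltak)
    pb = - (B \ g);                          % full Newton step
    if norm(pb) <= deltak
        pk = pb;                             % Newton step lies inside the region
        return;
    end
    pu = - (g' * g) / (g' * B * g) * g;      % unconstrained steepest descent step
    if norm(pu) >= deltak
        pk = deltak * pu / norm(pu);         % truncate along the first leg
        return;
    end
    % Second leg: find s = tau - 1 in [0,1] with ||pu + s*(pb - pu)|| = deltak
    % by solving the quadratic boundary equation with the MATLAB roots solver.
    d = pb - pu;
    coeffs = [d' * d, 2 * (pu' * d), pu' * pu - deltak^2];
    s = roots(coeffs);
    s = s(abs(imag(s)) < 1e-12 & real(s) >= 0 & real(s) <= 1);
    pk = pu + real(s(1)) * d;
end
```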