Let $\bold{x} = \bold{x}^* + t\bold{u}$ with $\parallel\bold{u}\parallel = 1$. For sufficiently small $t$ we have
\begin{displaymath}
f(\bold{x}^* + t\bold{u}) = f(\bold{x}^*) + t\nabla f(\bold{x}^*)^T\bold{u}
+ \frac{t^2}{2}\bold{u}^T\nabla^2 f(\bold{x}^*)\bold{u} + o(t^2).
\end{displaymath}
But $\nabla f(\bold{x}^*) = 0$, giving
\begin{displaymath}
f(\bold{x}^* + t\bold{u}) = f(\bold{x}^*) + \frac{t^2}{2}\bold{u}^T\nabla^2 f(\bold{x}^*)\bold{u} + o(t^2).
\end{displaymath}
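This expansion is easy to check numerically. The sketch below uses a hypothetical test function $f(x_0, x_1) = x_0^2 + 2x_1^2$ (an assumption for illustration, not from the text), whose stationary point is the origin; since this $f$ is itself quadratic, the $o(t^2)$ remainder vanishes and the second-order term accounts for the entire increase in $f$.

```python
import numpy as np

# Hypothetical test function: f(x) = x0^2 + 2*x1^2.
# Its gradient vanishes at x* = 0 and its Hessian there is diag(2, 4).
def f(x):
    return x[0]**2 + 2.0 * x[1]**2

x_star = np.zeros(2)
H = np.diag([2.0, 4.0])      # Hessian of f at x*
u = np.array([0.6, 0.8])     # a unit direction, ||u|| = 1

for t in [1e-1, 1e-2, 1e-3]:
    quadratic = 0.5 * t**2 * (u @ H @ u)        # (t^2/2) u^T H u
    actual = f(x_star + t * u) - f(x_star)      # true change in f
    # For this quadratic f the o(t^2) remainder is exactly zero,
    # so the two quantities agree to machine precision.
    print(t, actual, quadratic)
```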
If $\nabla^2 f(\bold{x}^*)$ is positive definite, its smallest eigenvalue $\lambda$ obeys $\lambda > 0$. So we have
\begin{displaymath}
f(\bold{x}^* + t\bold{u}) - f(\bold{x}^*) \geq \frac{\lambda}{2}\parallel t\bold{u}\parallel^2 + o(t^2) > 0.
\end{displaymath}
Then $\bold{x}^*$ is a local minimizer for $f$.
We see that a sufficient condition for a local minimizer is $\nabla f(\bold{x}^*) = 0$ together with $\nabla^2 f(\bold{x}^*)$ (the Hessian) positive definite.
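These two conditions can be tested directly at a candidate point. The sketch below (an illustration, assuming the hypothetical function $f(x_0, x_1) = x_0^2 + x_0 x_1 + x_1^2$) checks that the gradient vanishes and that the smallest Hessian eigenvalue $\lambda$ is positive, which is equivalent to positive definiteness for a symmetric matrix.

```python
import numpy as np

# Hypothetical function f(x) = x0^2 + x0*x1 + x1^2, candidate point x* = 0.
def grad(x):
    return np.array([2.0 * x[0] + x[1], x[0] + 2.0 * x[1]])

def hessian(x):
    return np.array([[2.0, 1.0],
                     [1.0, 2.0]])

x_star = np.zeros(2)

# First condition: gradient vanishes at x*.
grad_is_zero = np.linalg.norm(grad(x_star)) < 1e-10

# Second condition: smallest eigenvalue lambda of the (symmetric) Hessian
# is positive, i.e. the Hessian is positive definite.
lam = np.linalg.eigvalsh(hessian(x_star)).min()

print(grad_is_zero, lam)   # both conditions hold, so x* is a local minimizer
```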
These conditions are very important and should guide us in the choice of an optimization strategy.
Quadratic functions form the basis for most algorithms in optimization, in particular for the quasi-Newton method detailed in this paper. It is therefore important to discuss some issues involved with these functions.
Now, if we pose a quadratic objective function
\begin{displaymath}
f(\bold{x}) = -\bold{x}^T\bold{b} + \frac{1}{2}\bold{x}^T\bold{H}\bold{x},
\end{displaymath}
we see that we want to solve
\begin{displaymath}
\nabla f(\bold{x}) = -\bold{b} + \bold{H}\bold{x} = 0.
\end{displaymath}
We may assume that the Hessian $\bold{H}$ is symmetric because
\begin{displaymath}
\bold{x}^T\bold{H}\bold{x} = \bold{x}^T\frac{\bold{H}^T + \bold{H}}{2}\bold{x}.
\end{displaymath}
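The identity above says the quadratic form only depends on the symmetric part of $\bold{H}$, which the following sketch demonstrates on a randomly drawn non-symmetric matrix (the matrix and seed are arbitrary choices for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)
H = rng.standard_normal((4, 4))      # a generic, non-symmetric matrix
H_sym = 0.5 * (H + H.T)              # its symmetric part (H^T + H)/2

x = rng.standard_normal(4)
# The quadratic form x^T H x "sees" only the symmetric part of H:
print(x @ H @ x, x @ H_sym @ x)      # the two values coincide
```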
So, the unique global minimizer is the solution of the system above if $\bold{H}$ (the Hessian) is symmetric positive definite (spd).
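For an spd $\bold{H}$, minimizing the quadratic thus reduces to solving the linear system $\bold{H}\bold{x} = \bold{b}$. The sketch below does this for a small hypothetical spd matrix and checks that moving away from the solution in a few directions increases $f$.

```python
import numpy as np

# Hypothetical spd Hessian and right-hand side, chosen for illustration.
H = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

# Unique global minimizer: the solution of H x = b.
x_min = np.linalg.solve(H, b)

def f(x):
    # f(x) = -x^T b + (1/2) x^T H x
    return -x @ b + 0.5 * x @ H @ x

# f increases in every tested direction away from x_min.
for u in [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]:
    assert f(x_min + 0.1 * u) > f(x_min)

print(x_min)
```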
Stanford Exploration Project
4/27/2000