# Line search methods


## Revision as of 22:10, 24 May 2015

Author names: Elizabeth Conger

Steward: Dajun Yue and Fengqi You


# Introduction

An algorithm is a line search method if it seeks the minimum of a defined nonlinear function by selecting a reasonable direction vector that, when computed iteratively with a reasonable step size, will provide a function value closer to the absolute minimum of the function. Varying these will change the "tightness" of the optimization. For example, given the function <math>f(x)</math>, an initial point <math>x_0</math> is chosen. To find a lower value of <math>f(x)</math>, the value of <math>x_k</math> is updated by the following iteration scheme

<math>x_{k+1} = x_k + \alpha_k p_k,</math>

in which <math>\alpha_k</math> is a positive scalar known as the step length and <math>p_k</math> defines the step direction.
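As a minimal sketch of this iteration scheme (the quadratic test function, the steepest-descent choice of <math>p_k</math>, and the fixed step length are illustrative assumptions, not prescribed by the text):

```python
import numpy as np

def line_search_step(grad, x, alpha):
    """One iteration x_{k+1} = x_k + alpha_k * p_k, with p_k = -grad f(x_k)."""
    p = -grad(x)            # steepest-descent direction (illustrative choice)
    return x + alpha * p

# Example: f(x) = x^2, so grad f(x) = 2x; fixed step length alpha = 0.25
grad = lambda x: 2.0 * x
x = np.array([4.0])
for _ in range(10):
    x = line_search_step(grad, x, 0.25)
print(x)  # iterates move toward the minimizer x = 0
```

Each step here halves the distance to the minimizer, showing how repeated application of the scheme drives the function value down.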

# Step Length

Choosing an appropriate step length has a large impact on the robustness of a line search method. To select the ideal step length, the following function could be minimized:

<math>\phi(\alpha) = f(x_k + \alpha p_k), \quad \alpha > 0,</math>

but this is generally not used in practical settings. It may give the most accurate minimum, but it would be very computationally expensive if the function has multiple local minima or stationary points, as shown in Figure 1.
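For a quadratic objective the one-dimensional function <math>\phi(\alpha) = f(x_k + \alpha p_k)</math> has a unique minimizer with a closed form, which makes the idea concrete; the quadratic and the search direction below are chosen for the example, not taken from the text:

```python
import numpy as np

def exact_step(A, b, x, p):
    """Exact minimizer of phi(alpha) = f(x + alpha*p) for f(x) = 0.5 x^T A x - b^T x.

    Setting phi'(alpha) = (A(x + alpha*p) - b)^T p = 0 gives the formula below.
    """
    g = A @ x - b                      # gradient of f at x
    return -(g @ p) / (p @ A @ p)

A = np.array([[2.0, 0.0], [0.0, 4.0]])  # symmetric positive definite
b = np.array([2.0, 4.0])
x = np.array([0.0, 0.0])
p = -(A @ x - b)                        # steepest-descent direction
alpha = exact_step(A, b, x, p)
```

For a general nonlinear <math>f</math> no such formula exists, which is exactly why the inexact conditions discussed next are used instead.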

A common and practical method for finding a suitable step length that is not too near the global minimum of the function is to require that the step length <math>\alpha_k</math> reduces the value of the target function, or that

<math>f(x_k + \alpha_k p_k) < f(x_k).</math>

This requirement alone does not ensure convergence to the function's minimum, so two conditions are employed to enforce a *sufficient decrease* during every iteration.

## Wolfe Conditions

These conditions, developed in 1969 by Philip Wolfe, are an inexact line search stipulation that requires <math>\alpha_k</math> to decrease the objective function by a sufficient amount. This amount is defined by

<math>f(x_k + \alpha_k p_k) \le f(x_k) + c_1 \alpha_k \nabla f_k^T p_k,</math>

where <math>c_1</math> is between 0 and 1. Another way of describing this condition is to say that the decrease in the objective function should be proportional to both the step length and the directional derivative of the function along the step direction. This inequality is also known as the *Armijo condition*. In general, <math>c_1</math> is a very small value, on the order of <math>10^{-4}</math>.

The *Armijo condition* must be paired with the *curvature condition*

<math>\nabla f(x_k + \alpha_k p_k)^T p_k \ge c_2 \nabla f_k^T p_k</math>

to keep the step length <math>\alpha_k</math> from being too short. In this condition, <math>c_2</math> is greater than <math>c_1</math> but less than 1.
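A common way to find a step length satisfying the sufficient-decrease (Armijo) condition is simple backtracking; the sketch below also checks both Wolfe conditions for the result. The test function, starting point, and backtracking factor are illustrative assumptions, not part of the text:

```python
import numpy as np

def wolfe_conditions(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    """Return (armijo, curvature) booleans for a candidate step length alpha."""
    g = grad(x)
    armijo = f(x + alpha * p) <= f(x) + c1 * alpha * (g @ p)
    curvature = grad(x + alpha * p) @ p >= c2 * (g @ p)
    return armijo, curvature

def backtracking(f, grad, x, p, alpha=1.0, c1=1e-4, rho=0.5):
    """Shrink alpha by factor rho until the Armijo condition holds."""
    g = grad(x)
    while f(x + alpha * p) > f(x) + c1 * alpha * (g @ p):
        alpha *= rho
    return alpha

# Example on f(x) = x^T x with a steepest-descent direction
f = lambda x: x @ x
grad = lambda x: 2.0 * x
x = np.array([3.0, -2.0])
p = -grad(x)
alpha = backtracking(f, grad, x, p)
print(alpha, wolfe_conditions(f, grad, x, p, alpha))
```

Note that backtracking alone enforces only the Armijo inequality; by starting from a full step and shrinking, it tends to avoid overly short steps, while a strict Wolfe line search would test the curvature condition inside the loop as well.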

# Step Direction

## Steepest Descent Method

## Newton Method

## Quasi-Newton Method

# Conclusion

# References

1. Sun, W. & Yuan, Y-X. (2006) Optimization Theory and Methods: Nonlinear Programming (Springer US) p 688.

2. Anonymous (2014) Line Search. (Wikipedia). http://en.wikipedia.org/wiki/Line_search.

3. Nocedal, J. & Wright, S. (2006) Numerical Optimization (Springer-Verlag New York, New York) 2 Ed p 664.

4. Wolfe, P. (1969) Convergence Conditions for Ascent Methods. SIAM Review 11(2):226-235.