L-BFGS
L-BFGS is an algorithm for quasi-Newton optimization. L-BFGS uses the Broyden–Fletcher–Goldfarb–Shanno update to approximate the Hessian matrix (L-BFGS stands for 'limited-memory BFGS'). L-BFGS is particularly well suited to optimization problems with a large number of dimensions, because it never explicitly forms or stores the Hessian matrix, which can be quite expensive when the number of dimensions <math>n\,\!</math> becomes large. Instead, L-BFGS maintains a history of the past <math>m\,\!</math> updates of the position <math>x\,\!</math> and gradient <math>\nabla f(x)</math>, where generally the history size <math>m\,\!</math> can be small, often less than 10. These updates are used to implicitly perform operations requiring the Hessian (or its inverse). Strictly speaking, a straightforward BFGS implementation at the <math>i\,\!</math>th iteration would represent the Hessian approximation as informed by all updates <math>0 \ldots i-1</math>; L-BFGS does quite well using updates from only the most recent iterations <math>i-m \ldots i-1</math>.
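For concreteness, this fixed-length history can be kept in a small ring buffer; below is a minimal Python sketch (the names `history` and `record_update` are illustrative, not from any particular library):

```python
from collections import deque

m = 10  # history length; in practice m is often less than 10
history = deque(maxlen=m)  # the m most recent (s, y) pairs

def record_update(x_k, x_prev, g_k, g_prev):
    # x_k, g_k are NumPy-style vectors; appending beyond maxlen
    # silently discards the oldest pair, so only the updates from
    # iterations k-m ... k-1 are retained.
    history.append((x_k - x_prev, g_k - g_prev))
```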
Representation
An L-BFGS implementation looks just like any other straightforward quasi-Newton algorithm, except for the process of obtaining the direction <math>d_k=-H_k g_k\,\!</math>. There are multiple published approaches to using a history of updates to form this direction vector. Here we give a common approach, the so-called "two-loop recursion", initially given by [1] and [2].
We'll take as given <math>x_k\,\!</math>, the position at the <math>k\,\!</math>th iteration, and <math>g_k=\nabla f(x_k)</math>, where <math>f\,\!</math> is the function being minimized and both are column vectors. We keep the updates <math>s_k = x_k - x_{k-1}\,\!</math> and <math>y_k = g_k - g_{k-1}\,\!</math>, and define <math>r_k = \frac{1}{y^T_k s_k} </math>. <math>H^0_k\,\!</math> will be the 'initial' approximate inverse Hessian that our estimate at iteration <math>k\,\!</math> begins with. Then we can compute the (uphill) direction as follows, where the stored updates are indexed <math>0 \ldots m-1</math> within the history window, with <math>m-1</math> the most recent:
- <math>q = g_k\,\!</math>
- For <math>i=m-1 \ldots 0</math>:
  - <math>a_i = r_i s^T_i q\,\!</math>
  - <math>q = q - a_i y_i\,\!</math>
- <math>z = H^{0}_k q</math>
- For <math>i=0 \ldots m-1</math>:
  - <math>b_i = r_i y^T_i z\,\!</math>
  - <math>z = z + s_i (a_i - b_i)\,\!</math>
- <math>H_k g_k = z\,\!</math>
Commonly, the initial approximate inverse Hessian <math>H^0_k\,\!</math> is represented as a diagonal matrix, so that initially setting <math>z\,\!</math> requires only an element-by-element multiplication; a sketch of the full recursion follows below.
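For illustration, here is a minimal NumPy sketch of the recursion above. The function name and argument layout are ours, and the initial inverse Hessian is taken to be the scalar matrix <math>H^0_k = \gamma I\,\!</math>, the simplest diagonal choice:

```python
import numpy as np

def two_loop_recursion(g_k, s_list, y_list, gamma=1.0):
    """Return z = H_k g_k via the L-BFGS two-loop recursion.

    s_list and y_list hold the m most recent update pairs, indexed
    oldest (0) to newest (m-1) as in the pseudocode above; gamma
    scales the initial inverse Hessian H0_k = gamma * I.
    """
    m = len(s_list)
    r = [1.0 / np.dot(y_list[i], s_list[i]) for i in range(m)]
    a = [0.0] * m
    q = g_k.copy()
    for i in reversed(range(m)):       # first loop: newest to oldest
        a[i] = r[i] * np.dot(s_list[i], q)
        q = q - a[i] * y_list[i]
    z = gamma * q                      # apply H0_k (diagonal case)
    for i in range(m):                 # second loop: oldest to newest
        b = r[i] * np.dot(y_list[i], z)
        z = z + (a[i] - b) * s_list[i]
    return z                           # the step direction is d_k = -z
```

One common choice of the scaling is <math>\gamma_k = \frac{s^T_{k-1} y_{k-1}}{y^T_{k-1} y_{k-1}}</math>, which tends to produce well-scaled initial steps.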
Unfortunately, this two-loop update only works for the inverse Hessian. Approaches to implementing L-BFGS using the direct approximate Hessian <math>B_k\,\!</math> have also been developed, as have other means of approximating the inverse Hessian.[3]
Implementation and variants
An early, open-source implementation of L-BFGS in Fortran exists. Multiple other open-source implementations have been produced as translations of this Fortran code (e.g. this one in Java). Other implementations exist (e.g. in Matlab), frequently as part of generic optimization libraries (e.g. Mathematica). A variant which can handle box constraints on the variables, L-BFGS-B, also exists as ACM TOMS algorithm 778,[4] and can be called from R's optim general-purpose optimizer routine by using method="L-BFGS-B".[5]
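Other generic libraries expose the bound-constrained variant in much the same way; for example, SciPy's general-purpose `minimize` routine accepts method="L-BFGS-B". A brief illustrative call (the Rosenbrock objective is just a stock test function):

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Classic non-convex test function with minimum at x = (1, ..., 1).
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

x0 = np.zeros(5)
bounds = [(-2.0, 2.0)] * 5  # box constraints handled by L-BFGS-B
result = minimize(rosenbrock, x0, method="L-BFGS-B", bounds=bounds)
print(result.x)  # approximate minimizer
```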
Works cited
1. H. Matthies and G. Strang. "The solution of nonlinear finite element equations" (1979), International Journal for Numerical Methods in Engineering, 14, pp. 1613–1626.
2. J. Nocedal. "Updating Quasi-Newton Matrices with Limited Storage" (1980), Mathematics of Computation, 35, pp. 773–782.
3. R. H. Byrd, J. Nocedal and R. B. Schnabel. "Representations of Quasi-Newton Matrices and their use in Limited Memory Methods" (1994), Mathematical Programming, 63, 4, pp. 129–156.
4. C. Zhu, R. H. Byrd and J. Nocedal. "Algorithm 778: L-BFGS-B, FORTRAN routines for large scale bound constrained optimization" (1997), ACM Transactions on Mathematical Software, 23, 4, pp. 550–560.
5. "General-purpose Optimization". R documentation. Comprehensive R Archive Network. http://finzi.psych.upenn.edu/R/library/stats/html/optim.html.
External links
- Jorge Nocedal
- Software by Jorge Nocedal's Research Team.
- DotNumerics: Optimization for C# and VB.NET. Unconstrained and bound-constrained optimization of multivariate functions (L-BFGS-B, Truncated Newton, and Simplex methods).