BFGS (O-BFGS) Is Not Necessarily Convergent
Clayton Bevins edited this page 3 weeks ago


Limited-memory BFGS (L-BFGS or LM-BFGS) is an optimization algorithm in the family of quasi-Newton methods that approximates the Broyden-Fletcher-Goldfarb-Shanno algorithm (BFGS) using a limited amount of computer memory. It is a popular algorithm for parameter estimation in machine learning. Whereas BFGS stores a dense n×n approximation to the inverse Hessian (n being the number of variables in the problem), L-BFGS stores only a few vectors that represent the approximation implicitly. Due to its resulting linear memory requirement, the L-BFGS method is particularly well suited for optimization problems with many variables.

The two-loop recursion formula is widely used by unconstrained optimizers because of its efficiency in multiplying by the inverse Hessian. However, it does not allow for the explicit formation of either the direct or inverse Hessian and is incompatible with non-box constraints. An alternative approach is the compact representation, which involves a low-rank representation of the direct and/or inverse Hessian. This represents the Hessian as a sum of a diagonal matrix and a low-rank update. Such a representation enables the use of L-BFGS in constrained settings, for example, as part of the SQP method.
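The two-loop recursion mentioned above can be sketched as follows. This is a minimal illustration, not any particular library's implementation; the function and variable names are my own, and it assumes every stored curvature pair (s_i, y_i) satisfies the curvature condition s_i·y_i > 0.

```python
import numpy as np

def lbfgs_two_loop(grad, s_list, y_list):
    """Multiply grad by the implicit inverse-Hessian approximation.

    s_list holds recent steps s_i = x_{i+1} - x_i and y_list the gradient
    differences y_i = g_{i+1} - g_i, oldest first. Only these vectors are
    stored, never an n-by-n matrix. Returns r ~ H_k * grad; the search
    direction is then -r.
    """
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: walk the pairs from newest to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Initial Hessian approximation H_0 = gamma * I, with the usual
    # scaling gamma = (s_k . y_k) / (y_k . y_k) from the newest pair.
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: walk the pairs from oldest to newest.
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return r
```

A quick sanity check: with a single stored pair, the resulting approximation satisfies the secant equation, so applying the recursion to the gradient-difference vector y returns the step vector s exactly.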


Since BFGS (and hence L-BFGS) is designed to minimize smooth functions without constraints, the L-BFGS algorithm must be modified to handle functions that include non-differentiable components or constraints. A popular class of modifications are called active-set methods, based on the concept of the active set. The idea is that when restricted to a small neighborhood of the current iterate, the function and constraints can be simplified. The L-BFGS-B algorithm extends L-BFGS to handle simple box constraints (also known as bound constraints) on variables.
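As an illustration of box constraints in practice, SciPy's `scipy.optimize.minimize` exposes an L-BFGS-B backend that accepts per-variable bounds. The objective below is a made-up toy example whose unconstrained minimum lies outside the box, so the active-set machinery of L-BFGS-B returns a solution on the boundary:

```python
import numpy as np
from scipy.optimize import minimize

# Minimize f(x, y) = (x - 2)^2 + (y + 1)^2 subject to 0 <= x <= 1, 0 <= y <= 1.
# The unconstrained minimizer (2, -1) violates both bounds, so both box
# constraints are active at the solution, which is clipped to (1, 0).
def f(v):
    return (v[0] - 2.0) ** 2 + (v[1] + 1.0) ** 2

def grad_f(v):
    return np.array([2.0 * (v[0] - 2.0), 2.0 * (v[1] + 1.0)])

res = minimize(f, x0=np.array([0.5, 0.5]), jac=grad_f,
               method="L-BFGS-B", bounds=[(0.0, 1.0), (0.0, 1.0)])
print(res.x)  # near [1.0, 0.0]
```

Supplying the analytic gradient via `jac` is optional but avoids the finite-difference gradient estimates the solver would otherwise perform.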