# filterSqp

Implementation of the FilterSQP constrained optimization algorithm, using the FADBAD++ library for automatic differentiation.

2015/Sep - This is still very much a work in progress, and there is nothing related to constrained optimization yet. I am implementing trust-region optimization first, adapting my old Python code from the Corisco project.

But we do have preliminary results. Here are some basic demos of the minimization of classic test functions.

Himmelblau function optimization from different starting points, using a fixed-step-length trust-region method.
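The demo code itself is not shown here; for reference, Himmelblau's function and its gradient can be sketched in Python (the project itself is C++) as follows. The function has four local minima, all with value 0.

```python
import numpy as np

def himmelblau(x):
    """Himmelblau's function: f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2."""
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

def himmelblau_grad(x):
    """Analytic gradient, for use by gradient-based optimizers."""
    a = x[0]**2 + x[1] - 11
    b = x[0] + x[1]**2 - 7
    return np.array([4 * x[0] * a + 2 * b, 2 * a + 4 * x[1] * b])

# One of the four minimizers, (3, 2), is exact; the other three are
# irrational (e.g. approximately (-2.805118, 3.131312)).
print(himmelblau(np.array([3.0, 2.0])))
```

Because the minima sit in different basins, runs started from different points can converge to different minimizers, which is what the demo above illustrates.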

On the Rosenbrock "banana" function: gradient descent is smooth but slow, taking too many iterations along the valley.
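As a minimal illustration of that slow valley crawl (a Python sketch with an assumed fixed step size and the classic starting point; the demo's actual parameters may differ):

```python
import numpy as np

def rosenbrock(x):
    """Rosenbrock "banana" function, minimum f = 0 at (1, 1)."""
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

x = np.array([-1.2, 1.0])  # classic starting point
step = 1e-3  # fixed step; must stay below ~2/L for the largest curvature L
for i in range(20000):
    g = rosenbrock_grad(x)
    if np.linalg.norm(g) < 1e-6:
        break
    x = x - step * g

# Even after 20000 iterations the gradient tolerance is still not met:
# the iterate creeps along the curved valley floor toward (1, 1).
print(i + 1, x, rosenbrock(x))
```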

Newton's method: faster, but clumsy; a smoother track would be better.
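Again as a sketch (same function and starting point as above), the pure Newton iteration reaches the minimum in a handful of steps, but with no step-size control the intermediate iterates jump wildly and the objective can temporarily increase, which is the "clumsy" track:

```python
import numpy as np

def rosenbrock_grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

def rosenbrock_hess(x):
    return np.array([
        [2 - 400 * (x[1] - 3 * x[0]**2), -400 * x[0]],
        [-400 * x[0], 200.0],
    ])

x = np.array([-1.2, 1.0])
for i in range(50):
    g = rosenbrock_grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    # Full Newton step: fast endgame (quadratic convergence near the
    # minimum), but erratic far from it.
    x = x - np.linalg.solve(rosenbrock_hess(x), g)

print(i, x)
```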

Fixed-step trust region from the same starting point as the Newton's method run. Much smoother, following the valley.

Fixed-step trust region from the same starting point as the gradient-descent run. A very similar track, but far fewer iterations.
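The repo's actual trust-region implementation is not shown in this README; as an illustrative stand-in only, one simple fixed-radius scheme is to take a (regularized) Newton step and clip its length to a constant radius `delta`. This keeps the Newton direction's awareness of curvature along the valley while preventing the wild jumps seen above:

```python
import numpy as np

def rosenbrock_grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

def rosenbrock_hess(x):
    return np.array([
        [2 - 400 * (x[1] - 3 * x[0]**2), -400 * x[0]],
        [-400 * x[0], 200.0],
    ])

def trust_region_step(x, delta):
    g = rosenbrock_grad(x)
    H = rosenbrock_hess(x)
    # Shift the Hessian until positive definite (Levenberg-style),
    # so the solved step is always a descent direction.
    lam = 0.0
    while True:
        try:
            np.linalg.cholesky(H + lam * np.eye(2))
            break
        except np.linalg.LinAlgError:
            lam = max(2 * lam, 1.0)
    d = np.linalg.solve(H + lam * np.eye(2), -g)
    n = np.linalg.norm(d)
    if n > delta:
        d *= delta / n  # clip to the fixed trust-region radius
    return x + d

x = np.array([-1.2, 1.0])
for i in range(1000):
    if np.linalg.norm(rosenbrock_grad(x)) < 1e-8:
        break
    x = trust_region_step(x, delta=0.25)

print(i, x)
```

Near the minimum the Newton step shrinks below `delta`, the clipping becomes inactive, and the iteration finishes with full Newton steps. A proper trust-region method would also adapt the radius based on how well the quadratic model predicts the actual decrease; this fixed-radius sketch deliberately omits that.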