# Pallas Solver

Version 0.1

C++ Global Optimization Algorithms
This project is a suite of global optimization algorithms for C++ inspired by SciPy's global optimization package. Currently supported algorithms are:

- `Basinhopping`
- `Brute`
- `DifferentialEvolution`
- `SimulatedAnnealing`
For more information on the options for each algorithm, please see the documentation at http://latture.github.io/pallas-solver.
To use this library, first install glog and CMake. Pallas is based on the Google Ceres project, which makes extensive use of glog for logging and debugging, and this functionality carries over into Pallas. Follow the instructions at http://ceres-solver.org/building.html to install Ceres. Once CMake, Ceres, and glog are built and installed, use the following steps to build Pallas:
1. In the directory containing this `README.md`, create a folder named `build`.
2. Enter the `build` folder.
3. Run `cmake .. -DCMAKE_PREFIX_PATH=/path/to/CeresConfig.cmake`, where `/path/to/CeresConfig.cmake` denotes the folder where the file `CeresConfig.cmake` is located. Currently on Linux, Ceres by default will place this file at `/usr/local/share/Ceres`, though this may change in the future and may be different on your machine.
4. Run `make` in the terminal. This should build Pallas. The folder `build/lib` will hold the library.

The Rosenbrock function (shown below) is a commonly used benchmark for optimization algorithms. The global minimum is in the middle of a narrow valley at f(x, y) = 0 when x = y = 1. Finding the valley is fairly easy; finding the global minimum, however, is quite a bit harder.
After compiling and running, the console will display a summary of the optimization results.
This example (among others) can be found in the examples folder.
Pallas global optimization algorithms take as inputs an `Options` struct specific to each optimizer, a `GradientProblem` which encapsulates the objective function to optimize, a `const double*` pointing to the initial starting point for the optimization (except for `Brute`, which takes a range of parameters), and a summary in which details of the optimization are stored. The `Options` struct is a subclass specific to each optimizer, exposing the options that can be changed to customize the optimization procedure. If `Basinhopping` is being used as the global optimizer, creating an instance of the default options is as simple as:
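The original code listing was lost in this copy; based on the optimizer-scoped naming used elsewhere in this README (e.g. `pallas::Basinhopping`), the default construction is likely just:

```cpp
// Sketch: default options for the Basinhopping optimizer.
pallas::Basinhopping::Options options;
```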
The default options can be changed by accessing member variables:
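A sketch of overriding the defaults; the member names below are illustrative assumptions, not verified against the Pallas headers:

```cpp
pallas::Basinhopping::Options options;
// Hypothetical member names, shown for illustration only.
options.max_iterations = 1000;
options.max_stagnant_iterations = 100;
```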
If the global optimizer employs a local minimizer, the options for the local minimizer are accessed through the `options.local_minimizer_options` member variable. `options.local_minimizer_options` is itself a struct containing the parameters that control the local minimization step(s). The local minimization options are taken from `ceres::GradientProblemSolver`, renamed to `pallas::GradientLocalMinimizer` to avoid confusion between the global and local solvers. If `DifferentialEvolution` is being used as the global optimizer, the `options` struct requires that upper and lower bounds be set for the current problem. Note, however, that if the final output is polished (`options.polish = true`), the local optimization will not respect the bounds of the global optimization, because the Ceres local optimization algorithms are restricted to unbounded problems.

Both the `SimulatedAnnealing` and `Basinhopping` algorithms use the `StepFunction` class to generate randomized candidate solutions. A `pallas::scoped_ptr` to a `DefaultStepFunction` is created by default; this is unlikely to give optimal results for your problem. If either of these algorithms is being used, a class should inherit from `StepFunction` and implement the `Step` method, which takes as inputs a pointer to the current solution and the number of parameters, then modifies the current solution in place. If a custom `StepFunction` is used by the global optimizer, the options struct has a helper method, `set_step_function`, that swaps the pointer to the default step function with the user-defined functor. The following shows how to create a step functor and install it as the step function in the options struct:
Subclassing `pallas::GradientCostFunction` and implementing the `Evaluate` and `NumParameters` methods defines your objective function. Create a `GradientProblem` using:
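A sketch, assuming a `Rosenbrock` class that subclasses `pallas::GradientCostFunction` as described above; following the Ceres convention, the problem likely takes ownership of the pointer:

```cpp
// Wrap the objective function so it can be passed to a global optimizer.
pallas::GradientProblem problem(new Rosenbrock());
```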
The gradient problem is what is then passed to the solver. The `parameters` for the global optimization represent an initial guess, required by the `Basinhopping` and `SimulatedAnnealing` algorithms. It should be a `double*` containing the same number of values as the `NumParameters` method returns. Each global optimizer contains a `Summary` class used to store the results of the global optimization. The summary is created in the same manner as the options struct, i.e.:
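Mirroring the optimizer-scoped `Options` pattern above, the summary is presumably created as:

```cpp
// Holds details of the optimization after the solver runs (sketch).
pallas::Basinhopping::Summary summary;
```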
This is then passed as the final parameter to the solver. There are two methods to optimize a cost function. An instance of the solver can be created, then the optimization run using the `global_optimizer.Solve` method. There is also a `pallas::Solve` function added for convenience; it is overloaded to create a global optimizer instance and run the optimization based on the parameters passed to the function. To summarize, the two methods of optimization are:
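A sketch of the two call patterns described above; the exact argument order is an assumption modeled on the Ceres `Solve` signature:

```cpp
// Method 1 (sketch): instantiate the optimizer and call its Solve method.
pallas::Basinhopping global_optimizer;
global_optimizer.Solve(options, problem, parameters, &summary);

// Method 2 (sketch): the overloaded convenience function creates the
// optimizer internally and runs the optimization.
pallas::Solve(options, problem, parameters, &summary);
```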
This library uses the local minimization algorithms from Google's Ceres solver. Implementations of the global optimization algorithms are based on SciPy's optimize package. Because of the similarities between the Pallas algorithms and `scipy.optimize`, much of the documentation was adapted from their sources.