add basic docs for optimizers and loops (#125)
* add basic docs for optimizers and loops
* add badges

---------

Signed-off-by: Grossberger Lukas (CR/AIR2.2) <Lukas.Grossberger@de.bosch.com>
LGro authored Mar 1, 2024
1 parent ebbb2a3 commit 7306df7
Showing 14 changed files with 151 additions and 60 deletions.
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -5,7 +5,7 @@ request model. For large contributions we do encourage you to file a ticket in
the GitHub issues tracking system prior to any code development to coordinate
with the blackboxopt development team early in the process. Coordinating up
front helps to avoid frustration later on.
Please follow the conventiones outlined by the pre-commit hooks mentioned in the
Please follow the conventions outlined by the pre-commit hooks mentioned in the
repository README and add tests for the functionality you would like to contribute.

Your contribution must be licensed under the Apache-2.0 license, the license
17 changes: 3 additions & 14 deletions README.md
@@ -2,6 +2,9 @@

[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](LICENSE)
[![CI/CD](https://github.com/boschresearch/blackboxopt/workflows/ci-cd-pipeline/badge.svg)](https://github.com/boschresearch/blackboxopt/actions?query=workflow%3Aci-cd-pipeline+branch%3Amain)
[![PyPI - Wheel](https://img.shields.io/pypi/wheel/blackboxopt)](https://pypi.org/project/blackboxopt/)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/blackboxopt)](https://pypi.org/project/blackboxopt/)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)

Various blackbox optimization algorithms with a common interface along with useful
helpers like parallel optimization loops, analysis and visualization scripts.
@@ -64,20 +67,6 @@ For HTML test coverage reports run
poetry run pytest tests/ --cov --cov-report html:htmlcov
```

### Custom Optimizers

When you develop an optimizer based on the interface defined as part of
`blackboxopt.base`, you can use `blackboxopt.testing` to directly test whether your
implementation follows the specification by adding a test like this to your test suite.

```python
from blackboxopt.testing import ALL_REFERENCE_TESTS

@pytest.mark.parametrize("reference_test", ALL_REFERENCE_TESTS)
def test_all_reference_tests(reference_test):
reference_test(CustomOptimizer, custom_optimizer_init_kwargs)
```

## Building Documentation

Make sure to install _all_ necessary dependencies:
2 changes: 1 addition & 1 deletion blackboxopt/__init__.py
@@ -1,4 +1,4 @@
__version__ = "5.0.3"
__version__ = "5.0.4"

from parameterspace import ParameterSpace

46 changes: 3 additions & 43 deletions docs/index.md
@@ -2,6 +2,9 @@

[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](LICENSE)
[![CI/CD](https://github.com/boschresearch/blackboxopt/workflows/ci-cd-pipeline/badge.svg)](https://github.com/boschresearch/blackboxopt/actions?query=workflow%3Aci-cd-pipeline+branch%3Amain)
[![PyPI - Wheel](https://img.shields.io/pypi/wheel/blackboxopt)](https://pypi.org/project/blackboxopt/)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/blackboxopt)](https://pypi.org/project/blackboxopt/)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)

The `blackboxopt` Python package contains blackbox optimization algorithms with a common
interface, along with useful helpers like parallel optimization loops, analysis and
@@ -26,49 +29,6 @@ implementation.
BOHB is provided as a cleaner replacement of the former implementation in
[HpBandSter](https://github.com/automl/HpBandSter).

#### Fidelities for BOHB & Hyperband

You can calculate the fidelity schedule resulting from these parameters:

<script>
function calculateFidelitiesBOHB() {
const min_fidelity = document.getElementById('minFidelityBOHB').value;
const max_fidelity = document.getElementById('maxFidelityBOHB').value;
const eta = document.getElementById('etaBOHB').value;

const max_num_stages = 1 + Math.floor(
Math.log(max_fidelity / min_fidelity) / Math.log(eta)
);
const num_configs_first_stage = Math.ceil(Math.pow(eta, max_num_stages - 1));
const num_configs_per_stage = Array.from({ length: max_num_stages }, (_, i) =>
Math.floor(num_configs_first_stage / Math.pow(eta, i))
);
const fidelities_per_stage = Array.from({ length: max_num_stages }, (_, i) =>
max_fidelity / Math.pow(eta, max_num_stages - 1 - i)
);

document.getElementById('fidelitiesBOHB').innerHTML = `Fidelities: ${fidelities_per_stage}`;
}
</script>
<table>
<tr>
<td>min_fidelity</td>
<td><input type="text" id="minFidelityBOHB"></td>
</tr>
<tr>
<td>max_fidelity</td>
<td><input type="text" id="maxFidelityBOHB"></td>
</tr>
<tr>
<td>eta</td>
<td><input type="text" id="etaBOHB"></td>
</tr>
<tr>
<td></td><td><button onclick="calculateFidelitiesBOHB();">Submit</button></td>
</tr>
</table>
<p id="fidelitiesBOHB"></p>

### Optimization Loops

As part of the `blackboxopt.optimization_loops` module compatible implementations for
8 changes: 8 additions & 0 deletions docs/optimization-loops/dask-distributed.md
@@ -0,0 +1,8 @@
# Dask Distributed Optimization Loop

If you are working with [dask](https://github.com/dask/dask/), this optimization loop
lets you run `blackboxopt` based optimization on your dask cluster.
See the corresponding [example](../../examples/dask-distributed) for more details.
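
The sketch below illustrates the intended wiring, assuming this module exposes a
`run_optimization_loop` analogous to the sequential loop that additionally takes a
`dask.distributed.Client` whose workers execute the evaluation function; all names and
keyword arguments here are assumptions, the API reference below documents the actual
entry point.

```python
import parameterspace as ps
from dask.distributed import Client

from blackboxopt import Objective
from blackboxopt.optimization_loops.dask_distributed import run_optimization_loop  # name assumed
from blackboxopt.optimizers.space_filling import SpaceFilling  # import path assumed

space = ps.ParameterSpace()
space.add(ps.ContinuousParameter(name="x", bounds=(-5.0, 5.0)))


def evaluate(eval_spec):
    # Executed on a dask worker for each suggested configuration.
    return eval_spec.create_evaluation(objectives={"loss": eval_spec.configuration["x"] ** 2})


with Client() as client:  # or Client("tcp://scheduler:8786") for an existing cluster
    evaluations = run_optimization_loop(
        SpaceFilling(space, objectives=[Objective(name="loss", greater_is_better=False)]),
        evaluate,
        dask_client=client,  # keyword assumed
        max_evaluations=20,
    )
```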

::: blackboxopt.optimization_loops.dask_distributed
8 changes: 8 additions & 0 deletions docs/optimization-loops/overview.md
@@ -0,0 +1,8 @@
# Optimization Loops

We include a handful of optimization loop implementations for different scenarios, from a
simple [sequential](sequential.md) loop to a [distributed](dask-distributed.md) one that
can run across the nodes of a cluster.
Additionally, a set of [reference tests](testing.md) is included for
anyone extending the selection of optimization loops, either as part of a contribution to
`blackboxopt` or in a separate project.
3 changes: 3 additions & 0 deletions docs/optimization-loops/sequential.md
@@ -0,0 +1,3 @@
# Sequential Optimization Loop
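
A minimal end-to-end sketch is shown below, assuming this module exposes a
`run_optimization_loop` that repeatedly asks the optimizer for an evaluation
specification, calls the user-provided evaluation function, and reports the result back;
the optimizer, the evaluation helpers, and the keyword arguments are assumptions, see the
API reference below for the actual signature.

```python
import parameterspace as ps

from blackboxopt import Evaluation, EvaluationSpecification, Objective
from blackboxopt.optimization_loops.sequential import run_optimization_loop  # name assumed
from blackboxopt.optimizers.space_filling import SpaceFilling  # import path assumed

space = ps.ParameterSpace()
space.add(ps.ContinuousParameter(name="x", bounds=(-5.0, 5.0)))
objective = Objective(name="loss", greater_is_better=False)


def evaluate(eval_spec: EvaluationSpecification) -> Evaluation:
    # Toy objective: minimize x^2 for the suggested configuration.
    loss = eval_spec.configuration["x"] ** 2
    return eval_spec.create_evaluation(objectives={"loss": loss})


evaluations = run_optimization_loop(
    SpaceFilling(space, objectives=[objective]),
    evaluate,
    max_evaluations=20,  # keyword assumed; a timeout based stopping criterion may also exist
)
```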

::: blackboxopt.optimization_loops.sequential
15 changes: 15 additions & 0 deletions docs/optimization-loops/testing.md
@@ -0,0 +1,15 @@
# Reference Tests

To test an optimization loop implementation across various reference scenarios, add a test like the following to your test suite:

```python
import pytest
from blackboxopt.optimization_loops.testing import ALL_REFERENCE_TESTS

@pytest.mark.parametrize("reference_test", ALL_REFERENCE_TESTS)
def test_all_reference_tests(reference_test):
reference_test(custom_optimization_loop, {"opt_loop_specific_kwarg": 123})
```

where you can include custom keyword arguments that are passed to the optimization loop
calls in the reference tests.
60 changes: 60 additions & 0 deletions docs/optimizers/bohb.md
@@ -0,0 +1,60 @@
# BOHB Optimizer

BOHB performs robust and efficient hyperparameter optimization at scale by combining the
speed of Hyperband searches with the guidance and convergence guarantees of Bayesian
optimization.
Instead of sampling new configurations at random, BOHB uses kernel density estimators to
select promising candidates.

This implementation is meant to supersede the initial release of
[HpBandSter](https://github.com/automl/HpBandSter/).
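
The following is a rough usage sketch; the import path and constructor arguments, in
particular the `min_fidelity`/`max_fidelity`/`eta` schedule parameters, are assumptions
based on the fidelity section that follows, so check the reference at the bottom of this
page for the actual signature:

```python
import parameterspace as ps

from blackboxopt import Objective
from blackboxopt.optimizers.bohb import BOHB  # import path assumed, see reference below

# Toy one-dimensional search space for illustration.
space = ps.ParameterSpace()
space.add(ps.ContinuousParameter(name="learning_rate", bounds=(1e-5, 1e-1)))

optimizer = BOHB(
    search_space=space,
    objective=Objective(name="loss", greater_is_better=False),
    min_fidelity=0.1,  # e.g. the smallest fraction of the full training budget
    max_fidelity=1.0,
    eta=3,
)
```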


## Fidelities

Here you can calculate the fidelity schedule resulting from BOHB's hyperparameters (a Python equivalent is sketched below the form):

<script>
function calculateFidelitiesBOHB() {
  // Read the schedule hyperparameters from the input fields below.
  const min_fidelity = document.getElementById('minFidelityBOHB').value;
  const max_fidelity = document.getElementById('maxFidelityBOHB').value;
  const eta = document.getElementById('etaBOHB').value;

  // Number of successive halving stages that fit between min and max fidelity.
  const max_num_stages = 1 + Math.floor(
    Math.log(max_fidelity / min_fidelity) / Math.log(eta)
  );
  // Number of configurations evaluated in each stage, shrinking by a factor of eta.
  const num_configs_first_stage = Math.ceil(Math.pow(eta, max_num_stages - 1));
  const num_configs_per_stage = Array.from({ length: max_num_stages }, (_, i) =>
    Math.floor(num_configs_first_stage / Math.pow(eta, i))
  );
  // Fidelity used in each stage, growing by a factor of eta up to max_fidelity.
  const fidelities_per_stage = Array.from({ length: max_num_stages }, (_, i) =>
    max_fidelity / Math.pow(eta, max_num_stages - 1 - i)
  );

  document.getElementById('fidelitiesBOHB').innerHTML = `Fidelities: ${fidelities_per_stage}`;
}
</script>
<table>
<tr>
<td>min_fidelity</td>
<td><input type="text" id="minFidelityBOHB"></td>
</tr>
<tr>
<td>max_fidelity</td>
<td><input type="text" id="maxFidelityBOHB"></td>
</tr>
<tr>
<td>eta</td>
<td><input type="text" id="etaBOHB"></td>
</tr>
<tr>
<td></td>
<td><button onclick="calculateFidelitiesBOHB();">Calculate</button></td>
</tr>
</table>
<p id="fidelitiesBOHB"></p>
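
For reference, the same schedule can be computed offline; the Python snippet below is a
direct transcription of the JavaScript formula above and is not part of the `blackboxopt`
API:

```python
import math


def bohb_fidelity_schedule(min_fidelity: float, max_fidelity: float, eta: float) -> list:
    """Fidelity used in each stage, mirroring the calculator above."""
    max_num_stages = 1 + math.floor(
        math.log(max_fidelity / min_fidelity) / math.log(eta)
    )
    return [
        max_fidelity / eta ** (max_num_stages - 1 - i) for i in range(max_num_stages)
    ]


# Three stages with fidelities of roughly 0.11, 0.33 and 1.0:
print(bohb_fidelity_schedule(min_fidelity=0.1, max_fidelity=1.0, eta=3))
```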


## Reference

::: blackboxopt.optimizers.bohb
10 changes: 10 additions & 0 deletions docs/optimizers/botorch.md
@@ -0,0 +1,10 @@
# BoTorch Base Optimizer

This optimizer is a basic Gaussian process based Bayesian optimization implementation
that leverages BoTorch and is compatible with the `blackboxopt` interface.
While it is a functional optimizer in its own right, it is primarily intended as a basis
for other BoTorch based optimizer implementations.

## Reference

::: blackboxopt.optimizers.botorch_base
14 changes: 14 additions & 0 deletions docs/optimizers/space-filling.md
@@ -0,0 +1,14 @@
# Space Filling Optimizer

The `SpaceFilling` optimizer covers the search space with a quasi-random, low-discrepancy
[Sobol sequence](https://en.wikipedia.org/wiki/Sobol_sequence).
This strategy requires a larger evaluation budget but can be a good initial
approach to get to know the optimization problem at hand.
While this implementation follows the overall interface, including the specification and
reporting of objectives and their values, the reported objective values are
inconsequential for the underlying Sobol sequence and do not guide the optimization.
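
A small ask/tell style sketch follows, assuming the optimizer implements the
`blackboxopt.base` interface with `generate_evaluation_specification()` and `report()`
methods and a constructor taking the search space and objectives; these names are
assumptions, see the reference below for the actual API:

```python
import parameterspace as ps

from blackboxopt import Objective
from blackboxopt.optimizers.space_filling import SpaceFilling  # import path assumed

space = ps.ParameterSpace()
space.add(ps.ContinuousParameter(name="x", bounds=(0.0, 1.0)))

optimizer = SpaceFilling(space, objectives=[Objective(name="loss", greater_is_better=False)])

for _ in range(16):
    eval_spec = optimizer.generate_evaluation_specification()
    loss = (eval_spec.configuration["x"] - 0.5) ** 2  # toy objective, ignored by the Sobol sequence
    optimizer.report(eval_spec.create_evaluation(objectives={"loss": loss}))
```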

## Reference

::: blackboxopt.optimizers.space_filling
14 changes: 14 additions & 0 deletions docs/optimizers/testing.md
@@ -0,0 +1,14 @@
# Reference Tests

When you develop an optimizer based on the interface defined as part of
`blackboxopt.base`, you can use `blackboxopt.testing` to directly test whether your
implementation follows the specification by adding a test like this to your test suite:

```python
import pytest
from blackboxopt.testing import ALL_REFERENCE_TESTS

@pytest.mark.parametrize("reference_test", ALL_REFERENCE_TESTS)
def test_all_reference_tests(reference_test):
reference_test(CustomOptimizer, optional_optimizer_init_kwargs)
```
10 changes: 10 additions & 0 deletions mkdocs.yml
@@ -31,6 +31,16 @@ nav:
- Overview: examples/overview.md
- examples/dask-distributed.md
- examples/multi-objective-multi-param.md
- Optimizers:
- Space Filling: optimizers/space-filling.md
- BOHB: optimizers/bohb.md
- BoTorch: optimizers/botorch.md
- optimizers/testing.md
- Optimization Loops:
- Overview: optimization-loops/overview.md
- Sequential: optimization-loops/sequential.md
- Dask Distributed: optimization-loops/dask-distributed.md
- optimization-loops/testing.md
- ...

plugins:
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "blackboxopt"
version = "5.0.3"
version = "5.0.4"
description = "A common interface for blackbox optimization algorithms along with useful helpers like parallel optimization loops, analysis and visualization scripts."
readme = "README.md"
repository = "https://github.com/boschresearch/blackboxopt"
