Merge pull request #283 from bouthilx/release-v0.1.6rc2
Release v0.1.6rc2
bouthilx authored Sep 13, 2019
2 parents acb106a + 26fed57 commit cebe0bb
Showing 112 changed files with 5,960 additions and 1,993 deletions.
9 changes: 5 additions & 4 deletions .travis.yml
@@ -56,11 +56,12 @@ before_install: |
pyenv install $PYTHON
export PYENV_VERSION=$PYTHON
export PATH="/Users/travis/.pyenv/shims:${PATH}"
pyenv-virtualenv venv
source venv/bin/activate
pyenv virtualenv venv
pyenv activate venv
python --version
brew install mongodb
brew services start mongodb
brew tap mongodb/brew
brew install mongodb-community
brew services start mongodb-community
fi
install:
10 changes: 5 additions & 5 deletions README.rst
@@ -4,12 +4,12 @@ Oríon

|pypi| |py_versions| |license| |rtfd| |codecov| |travis|

.. |pypi| image:: https://img.shields.io/pypi/v/orion.core.svg
:target: https://pypi.python.org/pypi/orion.core
.. |pypi| image:: https://img.shields.io/pypi/v/orion.svg
:target: https://pypi.python.org/pypi/orion
:alt: Current PyPi Version

.. |py_versions| image:: https://img.shields.io/pypi/pyversions/orion.core.svg
:target: https://pypi.python.org/pypi/orion.core
.. |py_versions| image:: https://img.shields.io/pypi/pyversions/orion.svg
:target: https://pypi.python.org/pypi/orion
:alt: Supported Python Versions

.. |license| image:: https://img.shields.io/badge/License-BSD%203--Clause-blue.svg
@@ -78,7 +78,7 @@ Installation

Install Oríon by running:

``pip install orion.core``
``pip install orion``

For more information read the `full installation docs`_.

14 changes: 6 additions & 8 deletions ROADMAP.md
@@ -1,20 +1,18 @@
# Roadmap
Last update July 5th, 2019
Last update August 26th, 2019

## Next releases - Short-Term

### v0.1.6

#### Auto-resolution of EVC

Make branching events automatically solved with sane defaults.

### v0.1.7

#### Preliminary Python API

Library API to simplify usage of algorithms without Oríon's worker.

#### Journal Protocol Plugins
Offering:
- No need to set up a DB; can use one's existing backend
- Can re-use tools provided by the backend for visualizations, etc.

## Next releases - Mid-Term

### v0.2: ETA End of summer 2019
4 changes: 3 additions & 1 deletion conda/conda_build.sh
@@ -12,7 +12,9 @@ conda update -q conda
conda info -a
conda install conda-build anaconda-client

conda build conda
conda build conda --python 3.5
conda build conda --python 3.6
conda build conda --python 3.7

if [[ -n "${TRAVIS_TAG}" ]]
then
4 changes: 0 additions & 4 deletions conda/conda_build_config.yaml

This file was deleted.

3 changes: 3 additions & 0 deletions conda/meta.yaml
@@ -15,6 +15,7 @@ requirements:
- python {{ python }}
- setuptools
- pytest-runner
- appdirs
run:
- python
- numpy
@@ -24,13 +25,15 @@ requirements:
- gitpython
- filelock
- tabulate
- AppDirs

test:
import:
- orion.core
- orion.core.cli
- orion.algo
- orion.client
- orion.storage
commands:
- orion --help

2 changes: 1 addition & 1 deletion docs/src/examples/pytorch_a2c_ppo.rst
@@ -4,7 +4,7 @@ Example with ikostrikov/pytorch-a2c-ppo-acktr

.. note ::
If Oríon not installed: pip install orion.core
If Oríon not installed: pip install orion
If the database is not setup, you can follow the instructions here:
:doc:`/install/database`.
2 changes: 1 addition & 1 deletion docs/src/examples/pytorch_cifar.rst
@@ -4,7 +4,7 @@ Example with pytorch-cifar

.. note ::
If Oríon not installed: pip install orion.core
If Oríon not installed: pip install orion
If the database is not setup, you can follow the instructions here:
:doc:`/install/database`.
1 change: 1 addition & 0 deletions docs/src/index.rst
@@ -18,6 +18,7 @@
user/monitoring
user/searchspace
user/algorithms
user/script
user/evc

.. toctree::
4 changes: 2 additions & 2 deletions docs/src/install/core.rst
@@ -6,11 +6,11 @@ Via PyPI
========

The easiest way to install Oríon is using the Python package manager. The core of Oríon is
registered on PyPI under `orion.core`.
registered on PyPI under `orion`.

.. code-block:: sh

    pip install orion.core
    pip install orion

This will install all the core components. Note that the only algorithm provided with it
is random search. To install more algorithms, you can look at section :doc:`/install/plugins`.
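
To quickly check that the installation worked, the ``orion`` entry point can be invoked; this is
the same command the conda recipe above uses as its test:

.. code-block:: sh

    orion --help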
139 changes: 116 additions & 23 deletions docs/src/user/algorithms.rst
@@ -2,6 +2,10 @@
Setup Algorithms
****************

.. contents::
   :depth: 2
   :local:

The default algorithm is a random search based on the probability
distribution given to a search parameter's definition.
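
For example (a sketch with hypothetical experiment and script names), attaching a ``loguniform``
prior to a command-line argument is enough for the default random search to sample it:

.. code-block:: console

    orion hunt -n random-demo ./train.py --lr~'loguniform(1e-5, 1.0)'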

@@ -27,6 +31,12 @@ yaml file as shown above with ``learning_rate``.
Included Algorithms
===================

.. contents::
   :depth: 1
   :local:

.. _random-search:

Random Search
-------------

@@ -64,10 +74,13 @@ to very optimal resource usage.

The most common way of using ASHA is to reduce the number of epochs,
but the algorithm is generic and can be applied to any multi-fidelity setting.
That is, you can use training time, specifying the fidelity with ``--epochs~fidelity()``
(assuming your script takes this argument in commandline), but you could also use other fidelity
such as dataset size ``--dataset-size~fidelity()`` (assuming your script takes this argument and
adapt dataset size accordingly). The placeholder ``fidelity()`` is a special prior for
That is, you can use training time, specifying the fidelity with
``--epochs~fidelity(low=1, high=100)``
(assuming your script takes this argument on the command line),
but you could also use another fidelity
such as dataset size ``--dataset-size~fidelity(low=500, high=50000)``
(assuming your script takes this argument and
adapts the dataset size accordingly). The placeholder ``fidelity(low, high)`` is a special prior for
multi-fidelity algorithms.
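
For instance, a minimal launch sketch (hypothetical experiment and script names) tuning a learning
rate with ASHA and using the number of epochs as the fidelity:

.. code-block:: console

    orion hunt -n asha-demo ./train.py --lr~'loguniform(1e-5, 1.0)' --epochs~'fidelity(low=1, high=100)'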


@@ -83,33 +96,32 @@ Configuration

.. code-block:: yaml

-    algorithms:
-       asha:
-          seed: null
-          max_resources: 100
-          grace_period: 1
-          reduction_factor: 4
-          num_brackets: 1
+    algorithms:
+       asha:
+          seed: null
+          num_rungs: null
+          num_brackets: 1
+
+    producer:
+       strategy: StubParallelStrategy
+
+.. note::
+
+   Notice the additional ``producer.strategy`` in configuration which is not mandatory for other
+   algorithms. See :ref:`StubParallelStrategy` for more information.

-``seed``
-
-Seed for the random number generator used to sample new trials. Default is ``None``.
-
-``max_resources``
-
-Maximum amount of resources that will be assigned to trials by ASHA. Only the best
-performing trial will be assigned the maximum amount of resources. Default is 100.
-
-``grace_period``
-
-The minimum number of resources assigned to each trial. Default is 1.
-
-``reduction_factor``
-
-The factor by which ASHA promotes trials. If the reduction factor is 4, it means
-the number of trials from one fidelity level to the next one is roughly divided by 4, and
-each fidelity level has 4 times more resources than the prior one. Default is 4.
+``seed``
+
+Seed for the random number generator used to sample new trials. Default is ``None``.
+
+``num_rungs``
+
+Number of rungs for the largest bracket. If not defined, it will be equal to ``(base + 1)`` of the
+fidelity dimension. In the original paper,
+``num_rungs == log(fidelity.high/fidelity.low) / log(fidelity.base) + 1``.

``num_brackets``

@@ -118,10 +130,11 @@ converging trials that do not lead to best results at convergence (stragglers).
To overcome this, you can increase the number of brackets, which increases the amount of resources
required for optimisation but decreases the bias towards stragglers. Default is 1.
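
As a worked example of the ``num_rungs`` formula above, with hypothetical values
``fidelity(low=1, high=16)`` and a base of 4, ``num_rungs == log(16/1) / log(4) + 1 == 3``,
i.e. the largest bracket would have three rungs.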


Algorithm Plugins
=================

.. _scikit-bayesopt:

Bayesian Optimizer
------------------

@@ -210,3 +223,83 @@ True if the target values' mean is expected to differ considerably from
zero. When enabled, the normalization effectively modifies the GP's
prior based on the data, which contradicts the likelihood principle;
normalization is thus disabled by default.

.. _parallel-strategies:

Parallel Strategies
===================

A parallel strategy is a method to improve parallel optimization
for sequential algorithms. Such algorithms can only observe
trials that are completed and have a corresponding objective.
To get around this, parallel strategies produce *lies*,
non-completed trials with fake objectives, which are then
passed to a temporary copy of the algorithm that will suggest
a new point. The temporary algorithm is then discarded.
The original algorithm never observes lies, and
the temporary copy always observes lies that are based on
the most up-to-date data.
The strategies differ in how they assign objectives
to the *lies*.

By default, the strategy used is :ref:`MaxParallelStrategy`.

NoParallelStrategy
------------------

Does not return any lie. This is useful to benchmark parallel
strategies and measure how they can help compared to no
strategy.
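
A configuration sketch, assuming the plain string form used for ``StubParallelStrategy`` in the
ASHA example above also applies to strategies that take no arguments:

.. code-block:: yaml

    producer:
        strategy: NoParallelStrategy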

.. _StubParallelStrategy:

StubParallelStrategy
--------------------

Assigns to *lies* an objective of ``None`` so that
non-completed trials are observed and identifiable by algorithms
that can leverage parallel optimization.

The value of the objective is customizable with ``stub_value``.

.. code-block:: yaml

    producer:
        strategy:
            StubParallelStrategy:
                stub_value: 'custom value'

.. _MaxParallelStrategy:

MaxParallelStrategy
-------------------

Assigns to *lies* the best objective observed so far.

The default value assigned to the objective when no trials have been
completed yet is configurable with ``default_result``. It
is ``float('inf')`` by default.

.. code-block:: yaml

    producer:
        strategy:
            MaxParallelStrategy:
                default_result: 10000

MeanParallelStrategy
--------------------

Assigns to *lies* the mean of all objectives observed so far.

The default value assigned to the objective when fewer than two trials
have been completed is configurable with ``default_result``. It
is ``float('inf')`` by default.

.. code-block:: yaml

    producer:
        strategy:
            MeanParallelStrategy:
                default_result: 0.5

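Putting the pieces together, a single experiment configuration can carry both the algorithm and
the parallel strategy; a sketch reusing the two snippets shown above:

.. code-block:: yaml

    algorithms:
        asha:
            seed: null
            num_rungs: null
            num_brackets: 1

    producer:
        strategy:
            MeanParallelStrategy:
                default_result: 0.5
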
36 changes: 35 additions & 1 deletion docs/src/user/cli/info.rst
@@ -6,10 +6,16 @@ Here is an example of all the sections provided by the command.

.. code-block:: console

    orion info orion-tutorial
    orion info --name orion-tutorial

.. code-block:: bash

    Identification
    ==============
    name: orion-tutorial
    version: 1
    user: <username>

    Commandline
    ===========
    --lr~loguniform(1e-5, 1.0)

@@ -59,3 +65,31 @@ Here is an example of all the sections provided by the command.
The last section contains information about the best trial so far, providing its
hyperparameter values and the corresponding objective.

The ``--name`` argument
~~~~~~~~~~~~~~~~~~~~~~~
To get information on an experiment, you need to call `info` with the `--name` or `-n` argument, as
shown in the previous example. This will fetch the latest version of the experiment with that name
from the database and display its content.
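
For example, the short form gives the same result:

.. code-block:: console

    orion info -n orion-tutorial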

The ``--version`` argument
~~~~~~~~~~~~~~~~~~~~~~~~~~
To specify which version of an experiment you wish to observe, you can use the `--version` argument.
If provided, this will fetch the experiment with a version number corresponding to that version
instead of fetching the latest one. Note that the `--version` argument cannot be used alone and that
an invalid version number, i.e. a version number greater than the latest version, will simply fetch
the latest one instead.

For example, suppose we have two experiments named `orion-tutorial` inside the database, one with
version `1` and one with version `2`. Then running the following command would simply give us the
latest version, so version `2`.

.. code-block:: console

    orion info --name orion-tutorial

Whereas running the next command will give us the first version instead:

.. code-block:: console

    orion info --name orion-tutorial --version 1