Merge pull request #611 from Epistimio/release-v0.1.15rc
Release candidate v0.1.15rc
bouthilx authored May 19, 2021
2 parents e8e198e + 4ed04b2 commit 9e6f283
Showing 149 changed files with 7,084 additions and 1,939 deletions.
2 changes: 1 addition & 1 deletion .pylintrc
@@ -51,7 +51,7 @@ confidence=
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use"--disable=all --enable=classes
# --disable=W"
disable=abstract-class-instantiated,useless-super-delegation,no-member,keyword-arg-before-vararg,unidiomatic-typecheck,redefined-outer-name,fixme,F0401,intern-builtin,wrong-import-position,wrong-import-order,
disable=abstract-class-instantiated,useless-super-delegation,no-member,keyword-arg-before-vararg,unidiomatic-typecheck,redefined-outer-name,fixme,F0401,intern-builtin,wrong-import-position,wrong-import-order,no-self-use,
C0415, F0010, R0205, R1705, R1711, R1720, W0106, W0107, W0127, W0706, C0330, C0326, W1203

# Enable the message, report, category or checker with the given id(s). You can
3 changes: 1 addition & 2 deletions MANIFEST.in
@@ -21,9 +21,8 @@ prune conda/
prune .github/

# Include src, tests, docs
recursive-include docs *.rst *.py *.gitkeep *.png
recursive-include docs *.rst *.py *.gitkeep *.png *.html *.txt *.gif
recursive-include examples *.rst
include docs/requirements.txt
prune docs/build
prune docs/src/reference
recursive-include src *.py
39 changes: 30 additions & 9 deletions README.rst
@@ -81,6 +81,22 @@ guide`_.

.. _installation guide: https://orion.readthedocs.io/en/stable/install/core.html

Presentations
=============

- 2021-07-14 - SciPy 2021
- 2021-05-19 - Dask Summit 2021
- 2021-03-16 - AICamp
(`Video
<https://www.youtube.com/watch?v=QQ69vxF3LTI>`__)
(`Slides
<https://docs.google.com/presentation/d/1Tq3KrWcp66wdlZJtCFaxfq1m5ydyhcPiDCGCOuh_REg/edit?usp=sharing>`__)
- 2019-11-28 - Tech-talk @ Mila
(`Video
<https://bluejeans.com/playback/s/4WUezzFCmb9StHzYgB0RjVbTUCKnRcptBvzBMP7t2UpLyKuAq7Emieo911BqEMnI>`__)
(`Slides
<https://docs.google.com/presentation/d/18g7Q4xRuhMtcVbwmFwDfH7v9gKS252-laOi9HrEQ7a4/edit?usp=sharing>`__)

Contribute or Ask
=================

@@ -102,29 +118,34 @@ If you use Oríon for published work, please cite our work using the following b

.. code-block:: bibtex
@software{xavier_bouthillier_2020_4265424,
@software{xavier_bouthillier_2021_0_1_15,
author = {Xavier Bouthillier and
Christos Tsirigotis and
François Corneau-Tremblay and
Thomas Schweizer and
Lin Dong and
Pierre Delaunay and
Mirko Bronzi and
Lin Dong and
Reyhane Askari and
Dendi Suhubdy and
Hadrien Bertrand and
Reyhane Askari and
Michael Noukhovitch and
Chao Xua and
Satya Ortiz-Gagné and
Olivier Breuleux and
Arnaud Bergeron and
Olexa Bilaniuk and
Steven Bocco and
Hadrien Bertrand and
Guillaume Alain and
Dmitriy Serdyuk and
Peter Henderson and
Pascal Lamblin and
Christopher Beckham},
title = {{Epistimio/orion: Plotting API and Database
commands}},
month = nov,
year = 2020,
title = {{Epistimio/orion: Asynchronous Distributed Hyperparameter Optimization}},
month = may,
year = 2021,
publisher = {Zenodo},
version = {v0.1.14},
version = {v0.1.15},
doi = {10.5281/zenodo.3478592},
url = {https://doi.org/10.5281/zenodo.3478592}
}
8 changes: 2 additions & 6 deletions ROADMAP.md
@@ -1,17 +1,13 @@
# Roadmap
Last update December 3rd, 2020
Last update May 19th, 2021

## Next releases - Short-Term

### v0.1.15
### v0.1.16

#### Quick release for bug fixes

### v0.2
#### Native Multi-Processing support

Added support for parallelism with auto-scaling. No more need to launch multiple workers
(though still supported) for parallelism, simply pass `--n-workers` or `workon(n_workers)`.
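The roadmap entry above names the API only in passing. As a rough, framework-free sketch of the single-call multi-worker pattern it describes — not Oríon's actual implementation; the `evaluate` objective and the toy `workon` below are hypothetical stand-ins — a single call can fan trial evaluations out to a pool instead of requiring separately launched workers:

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate(params):
    # Hypothetical objective function; Orion would instead sample real
    # trial parameters from the declared search space.
    return (params["x"] - 2) ** 2

def workon(objective, trials, n_workers=4):
    # Sketch of a workon(..., n_workers=...) style call: evaluate several
    # trials concurrently rather than launching worker processes by hand.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = list(pool.map(objective, trials))
    # Keep the best (lowest) objective value with its parameters.
    return min(zip(results, trials), key=lambda pair: pair[0])

best_value, best_params = workon(
    evaluate, [{"x": x} for x in range(5)], n_workers=2
)
print(best_value, best_params)  # best trial is x=2 with objective 0
```

The pool here is a stand-in for the auto-scaling worker management the release notes refer to; the point is only that parallelism becomes an argument rather than an orchestration task.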

#### Generic `Optimizer` interface supporting various types of algorithms

1 change: 1 addition & 0 deletions conda/meta.yaml
@@ -34,6 +34,7 @@ requirements:
- gunicorn
- scikit-learn
- psutil
- pytest >=3.0.0

test:
import:
6 changes: 6 additions & 0 deletions docs/requirements.txt
@@ -1,4 +1,10 @@
sphinx
sphinx_rtd_theme
sphinxcontrib.httpdomain
sphinx-autoapi
sphinx_gallery
numpydoc
plotly
matplotlib
kaleido
dask[complete]
171 changes: 171 additions & 0 deletions docs/scripts/build_database_and_plots.py
@@ -0,0 +1,171 @@
import argparse
import glob
import os
import shutil
import subprocess
import sys

from orion.client import get_experiment
from orion.core.cli.db.rm import process_exp_rm
from orion.core.utils.singleton import update_singletons
from orion.core.worker.trial import Trial
from orion.storage.base import get_storage, setup_storage

ROOT_DIR = os.path.abspath(os.path.dirname(__file__))
DOC_SRC_DIR = os.path.join(ROOT_DIR, "..", "src")
os.chdir(DOC_SRC_DIR)

EXAMPLE_DIR = os.path.abspath("../../examples")

MAIN_DB_HOST = f"{EXAMPLE_DIR}/db.pkl"
BASE_DB_HOST = f"{EXAMPLE_DIR}/base_db.pkl"
TMP_DB_HOST = f"{EXAMPLE_DIR}/tmp.pkl"

custom_plots = {
"hyperband-cifar10": {
"name": "params",
"kwargs": {
"kind": "partial_dependencies",
"params": ["gamma", "learning_rate"],
},
},
"dask": {
"name": "params",
"kwargs": {
"kind": "partial_dependencies",
"params": ["C", "gamma"],
},
},
}

# CP base db to database.pkl (overwrite database.pkl)

CODE_PATH = f"{EXAMPLE_DIR}/tutorials/{{example}}.py"

paths = glob.glob(CODE_PATH.format(example="code_*"))
names = sorted(os.path.splitext(os.path.basename(path))[0] for path in paths)


def execute(example):
command = f"python {EXAMPLE_DIR}/tutorials/{example}.py"
os.chdir(f"{EXAMPLE_DIR}")
print("executing", command)
process = subprocess.Popen(command.split(" "))
return_code = process.wait()
tmp_files = "tmp_" + "_".join(example.split("_")[2:])
print("removing tmp files", tmp_files)
shutil.rmtree(tmp_files, ignore_errors=True)
os.chdir(DOC_SRC_DIR)
print("done")
return return_code


def prepare_dbs():

if os.path.exists(MAIN_DB_HOST):
print("Removing", MAIN_DB_HOST)
os.remove(MAIN_DB_HOST)

print("Copying", BASE_DB_HOST, "->", TMP_DB_HOST)
shutil.copy(BASE_DB_HOST, TMP_DB_HOST)


def setup_tmp_storage(host):
# Clear singletons
update_singletons()

setup_storage(
storage={
"type": "legacy",
"database": {
"type": "pickleddb",
"host": host,
},
}
)

return get_storage()


def load_data(host):
print("Loading data from", host)
storage = setup_tmp_storage(host)
experiment_names = set()
data = {"experiments": {}, "trials": {}}
for experiment in storage.fetch_experiments({}):
data["experiments"][experiment["_id"]] = experiment
data["trials"][experiment["_id"]] = storage.fetch_trials(uid=experiment["_id"])
experiment_names.add((experiment["name"], experiment["version"]))

return experiment_names, data


def copy_data(data, host=TMP_DB_HOST):
print("Copying data to", host)
storage = setup_tmp_storage(host)
for exp_id, experiment in data["experiments"].items():
del experiment["_id"]
storage.create_experiment(experiment)
assert exp_id != experiment["_id"]
trials = []
for trial in data["trials"][exp_id]:
trial.experiment = experiment["_id"]
trials.append(trial.to_dict())
storage._db.write("trials", trials)


def plot_exps(experiment_names, host=TMP_DB_HOST):
print("Plotting experiments from", host)
storage = setup_tmp_storage(host)
# Plot exps
for experiment_name, version in experiment_names:
print(f" {experiment_name}-v{version}")
experiment = get_experiment(experiment_name, version=version)
for plot in ["regret", "lpi", "partial_dependencies", "parallel_coordinates"]:
experiment.plot(kind=plot).write_html(
f"_static/{experiment.name}_{plot}.html"
)

if experiment_name in custom_plots:
custom_plot = custom_plots[experiment_name]
kwargs = custom_plot["kwargs"]
name = (
f"_static/{experiment.name}_{kwargs['kind']}_{custom_plot['name']}.html"
)
experiment.plot(**kwargs).write_html(name)


def main(argv=None):

parser = argparse.ArgumentParser()
parser.add_argument("examples", nargs="*", default=[], choices=names + [[]])
options = parser.parse_args(argv)

if options.examples:
run = {name for name in options.examples if name in names}
else:
run = set()

prepare_dbs()

for example in names:

example_db_host = f"{EXAMPLE_DIR}/{example}_db.pkl"

if example in run:
execute(example)
print("Moving", MAIN_DB_HOST, "->", example_db_host)
os.rename(MAIN_DB_HOST, example_db_host)

experiment_names, data = load_data(example_db_host)
copy_data(data, TMP_DB_HOST)

if example in run:
plot_exps(experiment_names, TMP_DB_HOST)

print("Moving", TMP_DB_HOST, "->", MAIN_DB_HOST)
os.rename(TMP_DB_HOST, MAIN_DB_HOST)


if __name__ == "__main__":
main()
90 changes: 90 additions & 0 deletions docs/scripts/filter_database.py
@@ -0,0 +1,90 @@
"""
Script to turn the database ``examples/plotting/database.pkl`` into a clean
version ``examples/base_db.pkl`` for the examples.
"""
import pprint
import shutil

from orion.client import get_experiment
from orion.core.io.orion_cmdline_parser import OrionCmdlineParser
from orion.storage.base import get_storage, setup_storage


shutil.copy("./examples/plotting/database.pkl", "./examples/base_db.pkl")

setup_storage(
dict(
type="legacy",
database=dict(type="pickleddb", host="./examples/base_db.pkl"),
)
)

filter_exps = {
("lateral-view-pa4", 1): "2-dim-exp",
("lateral-view-dualnet2", 1): "2-dim-shape-exp",
("lateral-view-multitask2", 1): "4-dim-cat-shape-exp",
("lateral-view-multitask3", 1): "3-dim-cat-shape-exp",
}

storage = get_storage()


def update_dropout(experiment_config):
metadata = experiment_config["metadata"]
user_script = metadata.get("user_script", "")
user_args = metadata.get("user_args", [])
try:
index = user_args.index("--dropout")
except ValueError:
print(
f"No dropout for {experiment_config['metadata']}-v{experiment_config['version']}"
)
return

user_args[index + 1] = (
user_args[index + 1]
.replace("5,", "0.5,")
.replace(", discrete=True", ", precision=1")
)
cmdline_parser = OrionCmdlineParser(allow_non_existing_files=True)
cmdline_parser.parse([user_script] + user_args)
metadata["parser"] = cmdline_parser.get_state_dict()
experiment_config["space"] = metadata["priors"] = dict(cmdline_parser.priors)

# Update config in db
storage.update_experiment(uid=experiment_config["_id"], **experiment_config)

# Update all trials in db (arf)
n_trials_before = len(storage.fetch_trials(uid=experiment_config["_id"]))
for trial in storage.fetch_trials(uid=experiment_config["_id"]):
previous_id = trial.id
for param in trial._params:
if param.name == "/dropout":
param.value /= 10
assert 0 <= param.value <= 0.5, param.value

storage.delete_trials(uid=experiment_config["_id"], where=dict(_id=previous_id))
storage.register_trial(trial)

trials = storage.fetch_trials(uid=experiment_config["_id"])
assert len(trials) == n_trials_before, len(trials)
for trial in trials:
assert 0 <= trial.params["/dropout"] <= 0.5, trial


for experiment_config in storage.fetch_experiments({}):
key = (experiment_config["name"], experiment_config["version"])

if key in filter_exps:
print(
f"Saving {experiment_config['name']}-v{experiment_config['version']} as {filter_exps[key]}"
)
update_dropout(experiment_config)

storage.update_experiment(
uid=experiment_config["_id"], name=filter_exps[key], version=1
)
else:
print(f"Deleting {experiment_config['name']}-v{experiment_config['version']}")
storage.delete_experiment(uid=experiment_config["_id"])
storage._db.remove("trials", query={"experiment": experiment_config["_id"]})
3 changes: 3 additions & 0 deletions docs/scripts/requirements.txt
@@ -0,0 +1,3 @@
torch
torchvision

Binary file added docs/src/_static/cmdline.png
