Commit

Merge pull request #189 from dgasmith/models
Record objects as base document classes
dgasmith authored Mar 4, 2019

Verified: This commit was created on GitHub.com and signed with GitHub’s verified signature.
2 parents dfe151b + 58169e8 commit 15962b7
Showing 42 changed files with 1,960 additions and 1,660 deletions.
16 changes: 14 additions & 2 deletions docs/source/changelog.rst
Original file line number Diff line number Diff line change
@@ -16,15 +16,27 @@ Changelog
.. +++++++++
- 0.?.? / 2019-0?-??
+ 0.5.1 / 2019-03-04
------------------

New Features
++++++++++++
- (:pr:`177`) Adds a new ``qcfractal-template`` command to generate ``qcfractal-manager`` scripts.
- (:pr:`181`) Pagination is added to queries, defaults to 1000 matches.
- (:pr:`185`) Begins setup documentation.
- (:pr:`186`) Begins database design documentation.
- (:pr:`187`) Results add/update is now simplified to always store entire objects rather than update partials.
- (:pr:`189`) All database compute records now go through a single ``BaseRecord`` class that validates and hashes the objects.
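The ``BaseRecord`` entry above (the subject of this PR) can be illustrated with a minimal sketch. QCFractal's real class is pydantic-based and differs in detail; every name here (``required_fields``, ``get_hash_index``, the ``ResultRecord`` fields) is assumed for illustration only:

```python
import hashlib
import json


class BaseRecord:
    """Hypothetical sketch of a shared record base class: subclasses
    declare their required fields, and the base class provides
    validation plus a deterministic content hash."""

    required_fields = ()

    def __init__(self, **data):
        missing = [f for f in self.required_fields if f not in data]
        if missing:
            raise ValueError(f"Missing required fields: {missing}")
        self.data = data

    def get_hash_index(self):
        # Serialize with sorted keys so logically equal records
        # always produce the same hash, regardless of field order.
        packed = json.dumps(self.data, sort_keys=True).encode()
        return hashlib.sha1(packed).hexdigest()


class ResultRecord(BaseRecord):
    required_fields = ("program", "driver", "method", "molecule")
```

Routing every record type through one base class like this is what lets validation and duplicate detection live in a single place instead of being re-implemented per collection.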

Enhancements
++++++++++++

- (:pr:`175`) Refactors query massaging logic to a single function, ensures all program queries are lowercase, etc.
- (:pr:`175`) Keywords are now lazy reference fields.
- (:pr:`182`) Reworks models to have strict fields, and centralizes object hashing with many tests.
- (:pr:`183`) Centralizes duplicate checking so that accidental mixed case duplicate results could go through.
- (:pr:`190`) Adds QCArchive sphinx theme to the documentation.

Bug Fixes
+++++++++

@@ -65,7 +77,7 @@ New Features
- (:pr:`125`) QCElemental common pydantic models have been integrated throughout the QCFractal code base, making a common model repository for the prevalent ``Molecule`` object (and others) come from a single source.
Also converted QCFractal to pass serialized pydantic objects between QCFractal and QCEngine to allow validation and (de)serialization of objects automatically.
- (:pr:`130`, :pr:`142`, and :pr:`145`) Pydantic serialization has been added to all REST calls leaving and entering both QCFractal Servers and QCFractal Portals. This allows automatic REST call validation and formatting on both server and client sides.
- - (:pr:`141` and :pr:`152`) A new GridOptimization service has been added to QCFractal. This feature supports relative starting positions from the input molecule.
+ - (:pr:`141` and :pr:`152`) A new GridOptimizationRecord service has been added to QCFractal. This feature supports relative starting positions from the input molecule.

Enhancements
++++++++++++
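The REST-validation entries above (PRs 130, 142, and 145) describe pydantic models that serialize and validate payloads on both the server and client side. A rough plain-Python illustration of that pattern (a stand-in, not QCFractal's actual models; the class and field names are assumed):

```python
import json


class MoleculeGetBody:
    """Toy model of a validated REST body. A real pydantic model
    would declare typed fields; here a class-level dict of
    field name -> expected type plays that role."""

    fields = {"ids": list}

    def __init__(self, **data):
        # Reject payloads with missing or mistyped fields up front,
        # so malformed requests never reach the handler logic.
        for name, typ in self.fields.items():
            if name not in data or not isinstance(data[name], typ):
                raise TypeError(f"Field '{name}' must be a {typ.__name__}")
        self.__dict__.update(data)

    def json(self):
        # Serialize for the wire (client -> server or server -> client).
        return json.dumps({k: getattr(self, k) for k in self.fields})

    @classmethod
    def parse_raw(cls, raw):
        # Deserialize and re-validate on the receiving side.
        return cls(**json.loads(raw))
```

Because both ends round-trip through the same model, a payload that validates on the client is guaranteed to parse on the server, which is the symmetry these PRs added to every REST call.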
5 changes: 5 additions & 0 deletions docs/source/install.rst
@@ -38,6 +38,11 @@ or use ``pip`` for a local install::

pip install -e .

It is recommended to set up a testing environment using ``conda``. This can be accomplished by::

cd qcfractal
python devtools/scripts/conda_env.py -n=qcf_test -p=3.7 devtools/conda-envs/openff.yaml


Test
----
2 changes: 1 addition & 1 deletion examples/parsl_torsiondrive/README.md
@@ -1,4 +1,4 @@
- # TorsionDrive example using the Parsl backend
+ # TorsionDriveRecord example using the Parsl backend

This example computes a torsiondrive using a custom Parsl configuration. As
Parsl is not currently available to be configured from the command line this
27 changes: 15 additions & 12 deletions qcfractal/cli/qcfractal_template_generator.py
@@ -49,7 +49,7 @@
# -- Note ---
# Different Managers interpret this slightly differently, but that should not be your concern, just treat
# each item as though it were a CLI entry and the manager block will interpret
# ------------
SCHEDULER_OPTS = []
# Additional commands to start each task with. E.g. Activating a conda environment
@@ -91,7 +91,7 @@
fractal_client = None
else:
fractal_client = portal.FractalClient(FRACTAL_ADDRESS, verify=False)
# Build a manager
@@ -179,15 +179,15 @@ def dask_templates():

code_skeletal = dedent("""\
{BUILDER}
# Set up adaptation
# Workers are distributed across the cores via the subdivided processes
# Optimization may be needed
cluster.adapt(minimum=0, maximum=MAX_NODES)
# Integrate cluster with client
dask_client = Client(cluster)
""")

# SLURM
@@ -207,16 +207,17 @@ def dask_templates():
queue=SLURM_PARTITION,
processes=MAX_TASKS_PER_NODE, # This subdivides the cores by the number of processes we expect to run
walltime="00:10:00",
# Additional queue submission flags to set
job_extra=SCHEDULER_OPTS,
# Not sure of the validity of this, but it seems to be the only terminal-invoking way
# so python envs may be setup from there
# Commands to execute before starting the Dask worker
env_extra=TASK_STARTUP_COMMANDS,
# Uncomment and set this if your cluster uses non-standard ethernet port names
# for communication between the head node and your compute nodes
# interface="eth0"
extra=['--resources process=1'],
)
""")

@@ -233,16 +234,17 @@ def dask_templates():
project=TORQUE_ACCOUNT,
processes=MAX_TASKS_PER_NODE, # This subdivides the cores by the number of processes we expect to run
walltime="00:10:00",
# Additional queue submission flags to set
job_extra=SCHEDULER_OPTS,
# Not sure of the validity of this, but it seems to be the only terminal-invoking way
# so python envs may be setup from there
# Commands to execute before starting the Dask worker
env_extra=TASK_STARTUP_COMMANDS,
# Uncomment and set this if your cluster uses non-standard ethernet port names
# for communication between the head node and your compute nodes
# interface="eth0"
extra=['--resources process=1'],
)
""")

@@ -259,16 +261,17 @@ def dask_templates():
project=LSF_PROJECT,
processes=MAX_TASKS_PER_NODE, # This subdivides the cores by the number of processes we expect to run
walltime="00:10:00",
# Additional queue submission flags to set
job_extra=SCHEDULER_OPTS,
# Not sure of the validity of this, but it seems to be the only terminal-invoking way
# so python envs may be setup from there
# Commands to execute before starting the Dask worker
env_extra=TASK_STARTUP_COMMANDS,
# Uncomment and set this if your cluster uses non-standard ethernet port names
# for communication between the head node and your compute nodes
# interface="eth0"
extra=['--resources process=1'],
)
""")

@@ -329,7 +332,7 @@ def parsl_templates():
cores_per_worker=CORES_PER_NODE // MAX_TASKS_PER_NODE,
max_workers=MAX_NODES * MAX_TASKS_PER_NODE
)
],
)"""
)
8 changes: 8 additions & 0 deletions qcfractal/extras.py
@@ -20,3 +20,11 @@ def get_information(key):
raise KeyError("Information key '{}' not understood.".format(key))

return __info[key]


def provenance_stamp(routine):
    """Return a dictionary satisfying the QCSchema provenance spec;
    the generating routine's name is passed in through `routine`.
    """
    return {'creator': 'QCFractal', 'version': get_information('version'), 'routine': routine}
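The new `provenance_stamp` helper builds the standard QCSchema provenance block. A self-contained sketch of the same idea, with the version hard-coded since `get_information` lives in the surrounding module:

```python
def provenance_stamp(routine, version="0.5.1"):
    """Return a QCSchema-style provenance dictionary for `routine`.

    Standalone sketch: the real helper pulls `version` from the
    module's `get_information`; it is hard-coded here for illustration.
    """
    return {"creator": "QCFractal", "version": version, "routine": routine}


# Tag a result with the routine that produced it; the assumed routine
# name below is purely illustrative.
stamp = provenance_stamp("qcfractal.services.gridoptimization")
# -> {'creator': 'QCFractal', 'version': '0.5.1',
#     'routine': 'qcfractal.services.gridoptimization'}
```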
