Merge pull request #104 from matthiaskoenig/develop
pkdb-v0.2.6
matthiaskoenig authored Aug 31, 2018
2 parents c6fc623 + 7488f63 commit 8afb80a
Showing 4 changed files with 47 additions and 178 deletions.
116 changes: 43 additions & 73 deletions README.md
@@ -1,10 +1,12 @@
[![DOI](https://zenodo.org/badge/131752339.svg)](https://zenodo.org/badge/latestdoi/131752339)

# PKDB - Pharmacokinetics database

<b><a href="https://orcid.org/0000-0002-4588-4925" title="0000-0002-4588-4925"><img src="./docs/images/orcid.png" height="15"/></a> Jan Grzegorzewski</b>
and
<b><a href="https://orcid.org/0000-0003-1725-179X" title="https://orcid.org/0000-0003-1725-179X"><img src="./docs/images/orcid.png" height="15" width="15"/></a> Matthias König</b>

Database and web interface for storing pharmacokinetics information including
Database and web interface for storing (pharmaco-)kinetics information including
- study data (publication data)
- trial design
- subjects information
@@ -14,102 +16,70 @@ Database and web interface for storing pharmacokinetics information including
- timecourse data

<img src="./docs/images/data_extraction.png" width="600"/>
Figure 1: Overview of data extraction.

## Access
http://localhost:8000/api/v1/
http://localhost:8000/api/v1/studies/17955229/

Figure 1: Overview of the data extraction and curation workflow.

# Installation
The database, together with its backend and frontend, is available as a Docker container for local installation.


## Data model
Pharmacokinetics data is a special type of experimental data.
Pharmacokinetic parameters like clearance, half-life, ... (with units and error measurements) are either reported directly in publications
or can be calculated from time course data.
* The reported value (mean or median) as well as the error term associated with it (SD, SE, CV, range) can vary.
* An important piece of information is the number of subjects (n) underlying a measurement, which is required to convert between the
different error measures (see the sketch below).
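
As an illustration of the last point, here is a minimal sketch (not part of the repository; the function names are illustrative only) of how n links the different error measures, using the standard relations SD = SE·√n and CV = SD/mean:
```python
import math

def sd_from_se(se: float, n: int) -> float:
    """Standard deviation from standard error: SD = SE * sqrt(n)."""
    return se * math.sqrt(n)

def se_from_sd(sd: float, n: int) -> float:
    """Standard error from standard deviation: SE = SD / sqrt(n)."""
    return sd / math.sqrt(n)

def cv_from_sd(sd: float, mean: float) -> float:
    """Coefficient of variation (as a fraction of the mean): CV = SD / mean."""
    return sd / mean

# Hypothetical example: a clearance reported as mean 12.5 with SE 0.8 in n=10 subjects.
sd = sd_from_se(0.8, 10)    # ≈ 2.53
cv = cv_from_sd(sd, 12.5)   # ≈ 0.20
print(f"SD={sd:.2f}, CV={cv:.2f}")
```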

# Setup & Installation
## Requirements
- [Docker](https://docs.docker.com/docker-for-mac/install/)
- Python 3.6

## Virtual Environment
Setting up a virtual environment
```
mkvirtualenv pkdb --python=python3.6
(pkdb) pip install -r requirements.txt
## Build docker container
To build the dev server for local development:
```bash
git clone https://github.com/matthiaskoenig/pkdb.git
cd pkdb
docker-compose up --build
```
Add your virtual environment to the Jupyter kernels:
To update an existing version use
```bash
docker-compose down
./remove_migrations.sh
docker-compose up --build
```
(pkdb) ipython kernel install --user --name=pkdb
```
# Initialize the project

Start the dev server for local development:
To run an existing version use
```bash
docker-compose up
```

Create a superuser to log in to the admin:
PKDB can then be accessed via the locally running server at
http://localhost:8000/api/v1/
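
For example, the REST API can be queried from Python with `requests` (a minimal sketch; only the endpoints listed in this README are assumed, and the exact shape of the returned JSON is not documented here):
```python
import requests

BASE_URL = "http://localhost:8000/api/v1"

# List the resources exposed at the API root.
root = requests.get(f"{BASE_URL}/")
root.raise_for_status()
print(root.json())

# Retrieve a single study (example identifier taken from this README).
study = requests.get(f"{BASE_URL}/studies/17955229/")
study.raise_for_status()
print(study.json())
```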

To run commands inside the docker container use
```bash
docker-compose run --rm web ./manage.py createsuperuser
docker-compose run --rm web [command]
```

# Update after code change
For instance, create a superuser to log in to the admin:
```bash
docker-compose run --rm web ./manage.py createsuperuser
```
# reset migrations
sudo find . -path "*/migrations/*.py" -not -name "__init__.py" -delete && sudo find . -path "*/migrations/*.pyc" -delete && sudo rm -r media/study/
# rebuild container
docker-compose down
docker-compose up --build
or to run migrations
```bash
docker-compose run --rm web python manage.py makemigrations
```


# Fill database
From the console use the following:
## Fill database
The database can be filled via the `fill_database` scripts using curated data folders.
The curated data is currently not made publicly available; it is only accessible via the REST API.
```
workon pkdb
(pkdb) pip install -r requirements.txt --upgrade
(pkdb) python ./pkdb_app/data_management/fill_database.py
```


# Connect to the database from PyCharm
```
DataSource -> postgres
```
Use the port defined in `docker-compose.yml` (5433) as well as the database name and password from `docker-compose.yml`.
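
The same connection can also be made from Python, e.g. with `psycopg2`; a sketch, where only host and port are taken from this README and the remaining credentials are placeholders that have to be looked up in `docker-compose.yml`:
```python
import psycopg2

# Port 5433 as stated above; dbname, user and password are placeholders
# and must be replaced with the values from docker-compose.yml.
conn = psycopg2.connect(
    host="localhost",
    port=5433,
    dbname="<database name from docker-compose.yml>",
    user="<user from docker-compose.yml>",
    password="<password from docker-compose.yml>",
)

with conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone())

conn.close()
```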

# Local Development
Start the dev server for local development:

```bash
docker-compose up
```
Run a command inside the docker container:

```bash
docker-compose run --rm web [command]
## Python (Virtual Environment)
Set up a virtual environment to interact with the database via Python:
```
Example:

mkvirtualenv pkdb --python=python3.6
(pkdb) pip install -r requirements.txt
```
docker-compose run --rm web python manage.py makemigrations
Add your virtual environment to the Jupyter kernels:
```
(pkdb) ipython kernel install --user --name=pkdb
```

----
# Client
Check out ./client/README.md.
## Requirements
- node.js
- npm
- vue.js


## Frontend
Documentation of the `vue.js` frontend is available in `./pkdb_client/README.md`.


&copy; 2018 Jan Grzegorzewski & Matthias König.
&copy; 2017-2018 Jan Grzegorzewski & Matthias König.
104 changes: 0 additions & 104 deletions TODO.md

This file was deleted.

2 changes: 1 addition & 1 deletion pkdb_app/_version.py
@@ -1,4 +1,4 @@
"""
Definition of version string.
"""
__version__ = "0.2.5"
__version__ = "0.2.6"
3 changes: 3 additions & 0 deletions pkdb_app/subjects/serializers.py
@@ -154,6 +154,9 @@ def group_to_internal_value(group, study_sid):
except ObjectDoesNotExist:
msg = f'group: {group} in study: {study_sid} does not exist'
raise serializers.ValidationError(msg)
else:
msg = {"group": f'group is required on individual',"detail":group}
raise serializers.ValidationError(msg)
return group

def to_internal_value(self, data):
