
Confero

Tracking FEC Contribution Data

Local Development

Docker is great for deploying, CI, and debugging issues in those two environments, but it can be a pain for local development. Even if you're using Docker to run the server locally, you'll want a local environment for your editor and quality checkers. So let's set that up first.

Prerequisites

Install Python 3.7

For some of the formatting tooling, you'll also need to install Node.

Then install prettier, which is used for formatting:

npm install -g prettier

Setup

Install and use pipenv:

Install pipenv

brew install pipenv

or

pip install --user pipenv

Then set up the virtual env and install the dependencies:

pipenv install --dev

And enter the virtual env

pipenv shell

Setup Database

Since Postgres can be a little finicky to set up and upgrade, it may be easier to use the Docker version.

Start the Docker database:

./bin/db

The Docker DB runs on port 5429 (so it won't conflict with the local Postgres port, 5432). You'll also need to make sure DB_PORT is set in your .env file (see Configuration for more info on the .env file):

DB_PORT=5429

Finally, make sure to run migrations:

python manage.py migrate

Run Server

To start the local server with the Docker DB and pipenv, run:

./bin/start

Configuration

You can use environment variables to configure the application (the database in particular). In production, that generally means setting environment variables on the server. In development, we're using dotenv to make configuring the environment easier.

First run

cp .env.example .env

to create your .env file. Then edit the values, and restart the server to see the changes.

To use an environment variable in code, use

os.getenv("VARIABLE_NAME", "defaultValue")
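For instance, the database settings might pull the port from the environment like this (a minimal sketch assuming the python-dotenv package; the project's actual settings module may differ):

# settings.py -- illustrative sketch, not the project's actual settings
import os

from dotenv import load_dotenv  # provided by the python-dotenv package

# Load variables from the .env file into the process environment.
load_dotenv()

# Fall back to the default local Postgres port if DB_PORT isn't set.
DB_PORT = os.getenv("DB_PORT", "5432")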

Adding Dependencies

pipenv install NAME
pipenv install NAME --dev # for non-production dependencies

Local Docker

You can use Docker to run the project in the same environment it'll be in for production. This is particularly useful for debugging weird production errors.

To help with that, we're using docker-compose, which allows you to spin up the whole project, with a database, in a single command.

Prerequisites

Install Docker and docker-compose:

brew cask install docker

Or see the docs for Ubuntu or Windows.

Setup

This will build the Docker images needed to run the app.

docker-compose build

Run Server

This starts up the docker-compose cluster and runs the app on localhost:8000.

docker-compose up

Run a manage.py command

docker-compose run django COMMAND

# Example: run migrations:
docker-compose run django migrate

Run a system command

docker-compose run django passes commands to the python manage.py entrypoint by default. To override that:

docker-compose run --entrypoint COMMAND django
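For example, docker-compose run --entrypoint bash django opens an interactive shell in the app container (assuming the image includes bash).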

Code Quality

We're using tests, linters, and formatters to make sure everything is working as it should.

All of these checks run in CI and must pass before you can merge and deploy code.

To run all the quality checks locally and ensure CI will pass, run:

./bin/quality

Testing

See the Django Docs for info on writing tests.
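As a quick illustration, a model test might look something like this (a minimal sketch; the Candidate model and its name field are hypothetical stand-ins for the project's actual models):

# fec/tests.py -- illustrative sketch built around a hypothetical model
from django.test import TestCase

from fec.models import Candidate  # hypothetical import


class CandidateTestCase(TestCase):
    def test_str_returns_name(self):
        # Create a record in the test database, then check its string form.
        candidate = Candidate.objects.create(name="Jane Doe")
        self.assertEqual(str(candidate), "Jane Doe")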

Run all tests

./bin/test

Get a code coverage report

This will run all the tests, and then report on which lines of code are uncovered by tests.

./bin/coverage

After running ./bin/coverage, you can see a detailed report by opening ./htmlcov/index.html in your browser.

Run all tests from within Docker

If tests are failing in CI, running them in Docker could help you figure out what's up.

Warning: Running docker-compose commands may not work from within a virtualenv.

./bin/docker-test

or

./bin/docker-coverage

Linting/Formatting

Worrying about code style is lame, so let's make robots do it for us.

We're using a few code quality tools:

  • yapf - Google's auto-formatter for Python
  • prospector - Runs a bunch of linters with reasonable defaults
  • prettier - A JavaScript-based formatter, used only for .md files in this project
  • pre-commit - Runs the quality checks on every commit

Pre-Commit Hook

When you're committing, the pre-commit hook will run the linters and formatters on all the staged files. If anything fails, the hook will fail.

Note that if the yapf formatter fixes anything, that will cause the hook to fail. Check that the changes look good, re-stage them, then commit again to see the hook pass.

To set up the pre-commit hook:

pre-commit install

Note that once this is set up, you'll get an error if you try to commit outside of your virtualenv.

Run All

To run the quality checks on the whole project:

./bin/lint

Auto-format

If you just want to auto-format the project, run:

./bin/format

Local Admin

Django comes with a built-in admin system that lets you create and change database records.

For info on how to set up admin users, see the Django Docs.
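The short version: run python manage.py createsuperuser to create an admin user, then log in at /admin/ (Django's default admin URL) on the local server.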

Loading Data

A script for loading FEC bulk data files into the database lives in /fec/management/commands/load_bulk.py. To use it, download Bulk Data for Candidates Master, Committee Master, or Contributions by individuals.

To get started, there is some starter data under /fec/data.

Once you have data downloaded, you can load data with:

pipenv shell
./manage.py load_bulk candidates fec/data/candidates_2018.txt
./manage.py load_bulk committees fec/data/committees_2018.txt
./manage.py load_bulk contributions fec/data/contributions_samples_2018.txt

To add the ability to handle a new bulk report type, save its header file under fec/headers, and edit the LoadBulk._record_creator function to support the new report.
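The shape of that change might look something like the sketch below. Everything in it is hypothetical (the model, the column names, and how _record_creator dispatches); it's only meant to show the general idea of turning one row of a bulk file into a model instance.

# Hypothetical sketch only -- not the project's actual load_bulk code.
from fec.models import Expenditure  # hypothetical model for the new report type


def create_expenditure(row):
    # `row` is assumed to be one record from the bulk file, keyed by the
    # column names in the matching header file under fec/headers.
    return Expenditure(
        committee_id=row["CMTE_ID"],
        amount=row["TRANSACTION_AMT"],
    )

# LoadBulk._record_creator would then return create_expenditure when the
# new report type is passed on the command line.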