
Contributing

Summary

PRs welcome!

  • Consider starting a discussion to see if there's interest in what you want to do.
  • Submit PRs from feature branches on forks to the develop branch (see the sketch below this list).
  • Ensure PRs pass all CI checks.
  • Maintain test coverage at 100%.
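
A sketch of the fork-and-branch workflow described above (YOUR_USERNAME and the branch name are placeholders):

# clone your fork (replace YOUR_USERNAME with your GitHub username)
git clone git@github.com:YOUR_USERNAME/inboard.git
cd inboard
# create a feature branch for your changes
git checkout -b feature/descriptive-branch-name
# commit your changes, push the branch to your fork,
# then open a PR against br3ndonland/inboard:develop
git push -u origin feature/descriptive-branch-name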

Git

Code style

  • Python code is formatted with Black. Configuration for Black is stored in pyproject.toml.
  • Python imports are organized automatically with isort.
    • The isort package organizes imports in three sections:
      1. Standard library
      2. Dependencies
      3. Project
    • Within each of those groups, import statements occur first, then from statements, in alphabetical order.
    • You can run isort from the command line with poetry run isort ..
    • Configuration for isort is stored in pyproject.toml.
  • Other web code (JSON, Markdown, YAML) is formatted with Prettier.
  • Code style is enforced with pre-commit, which runs Git hooks.

    • Configuration is stored in .pre-commit-config.yaml.
    • Pre-commit can run locally before each commit (hence "pre-commit"), or on different Git events like pre-push.
    • Pre-commit is installed in the Poetry environment. To use:

      # after running `poetry install`
      path/to/inboard
      ❯ poetry shell
      
      # install hooks that run before each commit
      path/to/inboard
      .venv ❯ pre-commit install
      
      # and/or install hooks that run before each push
      path/to/inboard
      .venv ❯ pre-commit install --hook-type pre-push
      
    • Pre-commit is also useful as a CI tool. The hooks GitHub Actions workflow runs pre-commit hooks with GitHub Actions.
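
The formatters and hooks listed above can also be run manually across the entire repository, which approximates what the CI workflow does (assuming the Poetry environment is active, as in the example above):

# format Python code with Black and organize imports with isort
poetry run black .
poetry run isort .
# run all pre-commit hooks against all files (roughly what the hooks workflow does in CI)
pre-commit run --all-files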

Python

Poetry

This project uses Poetry for dependency management.

Install project with all dependencies: poetry install -E all.

Highlights

  • Automatic virtual environment management: Poetry automatically manages the virtualenv for the application.
  • Automatic dependency management: rather than having to run pip freeze > requirements.txt, Poetry automatically manages the dependency file (called pyproject.toml), and enables SemVer-level control over dependencies like npm. Poetry also manages a lockfile (called poetry.lock), which is similar to package-lock.json for npm. Poetry uses this lockfile to automatically track specific versions and hashes for every dependency.
  • Dependency resolution: Poetry will automatically resolve any dependency version conflicts. pip did not have dependency resolution until the end of 2020.
  • Dependency separation: Poetry can maintain separate lists of dependencies for development and production in the pyproject.toml. Production installs can skip development dependencies to speed up Docker builds (see the sketch after this list).
  • Builds: Poetry has features for easily building the project into a Python package.
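
As noted above, production installs can skip development dependencies. A minimal sketch (the flag name varies by Poetry version, so check poetry install --help):

# install production dependencies only, with all extras
# (--no-dev applies to Poetry 1.1; newer releases use --without dev or --only main)
poetry install --no-dev -E all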

Installation

The recommended installation method is through the Poetry custom installer, which vendorizes dependencies into an isolated environment, and allows you to update Poetry with poetry self update.
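
A sketch of the installer invocation, assuming the install script documented on the Poetry website (verify against the official installation docs before running):

# download and run the Poetry installer script
curl -sSL https://install.python-poetry.org | python3 -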

You can also install Poetry however you prefer to install your user Python packages (pipx install poetry, pip install --user poetry, etc). Use the standard update methods with these tools (pipx upgrade poetry, pip install --user --upgrade poetry, etc).

Key commands

# Basic usage: https://python-poetry.org/docs/basic-usage/
poetry install  # create virtual environment and install dependencies
poetry show --tree  # list installed packages
poetry add PACKAGE@VERSION  # add package to production dependencies
poetry add PACKAGE@VERSION --dev  # add package to development dependencies
poetry update  # update dependencies (not available with standard tools)
poetry version  # list or update version of this package
poetry shell  # activate the virtual environment, like source venv/bin/activate
poetry run COMMAND  # run a command within the virtual environment
poetry env info  # https://python-poetry.org/docs/managing-environments/
poetry config virtualenvs.in-project true  # install virtualenvs into .venv
poetry export -f requirements.txt --dev > requirements.txt  # export deps

Running the development server

The easiest way to get started is to run the development server locally with the VSCode debugger. The debugger config is stored in launch.json. After installing the Poetry environment as described above, start the debugger. Uvicorn enables hot-reloading and addition of debug breakpoints while the server is running. The Microsoft VSCode Python extension also offers a FastAPI debugger configuration, added in version 2020.12.0, which has been customized and included in launch.json. To use it, simply select the FastAPI config and start the debugger.
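
If you prefer to start the server outside VSCode, a rough sketch with Uvicorn is below. The application module path and port here are assumptions for illustration, so check launch.json or the inboard documentation for the exact target:

# module path and port are assumptions; see launch.json for the configured debugger target
poetry run uvicorn inboard.app.main_fastapi:app --reload --port 8000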

As explained in the VSCode docs, if developing on Linux, note that non-root users may not be able to expose ports less than 1024.

Testing with pytest
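
A minimal sketch of running the test suite locally, assuming pytest and pytest-cov are installed with the development dependencies:

# run the test suite within the Poetry environment
poetry run pytest
# report coverage (coverage flags are an assumption; the Summary above requires 100% coverage)
poetry run pytest --cov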

Docker

Docker basics

Some useful Docker commands:
# Log in with Docker Hub credentials to pull images
docker login
# List images
docker images
# List running containers: can also use `docker container ls`
docker ps
# View logs for the most recently started container
docker logs -f $(docker ps -q -n 1)
# View logs for all containers (docker logs accepts one container at a time)
for container in $(docker ps -aq); do docker logs "$container"; done
# Inspect a container (web in this example) and return the IP Address
docker inspect web | grep IPAddress
# Stop a container
docker stop # container hash
# Stop all running containers
docker stop $(docker ps -aq)
# Remove a downloaded image
docker image rm # image hash or name
# Remove a container
docker container rm # container hash
# Prune images
docker image prune
# Prune stopped containers (completely wipes them and resets their state)
docker container prune
# Prune everything
docker system prune
# Open a shell in the most recently started container (like SSH)
docker exec -it $(docker ps -q -n 1) /bin/bash
# Or, connect as root:
docker exec -u 0 -it $(docker ps -q -n 1) /bin/bash
# Copy file to/from container:
docker cp [container_name]:/path/to/file destination.file

Building development images

To build the Docker images for each stage:

git clone git@github.com:br3ndonland/inboard.git

cd inboard

docker build . --rm --target base -t localhost/br3ndonland/inboard:base && \
docker build . --rm --target fastapi -t localhost/br3ndonland/inboard:fastapi && \
docker build . --rm --target starlette -t localhost/br3ndonland/inboard:starlette

Running development containers

# Run Docker container with Uvicorn and reloading
cd inboard

docker run -d -p 80:80 \
  -e "BASIC_AUTH_USERNAME=test_user" \
  -e "BASIC_AUTH_PASSWORD=r4ndom_bUt_memorable" \
  -e "LOG_LEVEL=debug" \
  -e "PROCESS_MANAGER=uvicorn" \
  -e "WITH_RELOAD=true" \
  -v $(pwd)/inboard:/app/inboard localhost/br3ndonland/inboard:base

docker run -d -p 80:80 \
  -e "BASIC_AUTH_USERNAME=test_user" \
  -e "BASIC_AUTH_PASSWORD=r4ndom_bUt_memorable" \
  -e "LOG_LEVEL=debug" \
  -e "PROCESS_MANAGER=uvicorn" \
  -e "WITH_RELOAD=true" \
  -v $(pwd)/inboard:/app/inboard localhost/br3ndonland/inboard:fastapi

docker run -d -p 80:80 \
  -e "BASIC_AUTH_USERNAME=test_user" \
  -e "BASIC_AUTH_PASSWORD=r4ndom_bUt_memorable" \
  -e "LOG_LEVEL=debug" \
  -e "PROCESS_MANAGER=uvicorn" \
  -e "WITH_RELOAD=true" \
  -v $(pwd)/inboard:/app/inboard localhost/br3ndonland/inboard:starlette

# Run Docker container with Gunicorn and Uvicorn
docker run -d -p 80:80 \
  -e "BASIC_AUTH_USERNAME=test_user" \
  -e "BASIC_AUTH_PASSWORD=r4ndom_bUt_memorable" \
  localhost/br3ndonland/inboard:base
docker run -d -p 80:80 \
  -e "BASIC_AUTH_USERNAME=test_user" \
  -e "BASIC_AUTH_PASSWORD=r4ndom_bUt_memorable" \
  localhost/br3ndonland/inboard:fastapi
docker run -d -p 80:80 \
  -e "BASIC_AUTH_USERNAME=test_user" \
  -e "BASIC_AUTH_PASSWORD=r4ndom_bUt_memorable" \
  localhost/br3ndonland/inboard:starlette

# Test HTTP Basic auth when running the FastAPI or Starlette images:
http :80/status -a test_user:r4ndom_bUt_memorable

Change the port numbers to run multiple containers simultaneously (-p 81:80).

GitHub Actions workflows

GitHub Actions is a continuous integration/continuous deployment (CI/CD) service that runs on GitHub repos. It replaces other services like Travis CI. Actions are grouped into workflows and stored in .github/workflows. See Getting the Gist of GitHub Actions for more info.

Maintainers

  • The default branch is develop.
  • PRs should be merged into develop. Head branches are deleted automatically after PRs are merged.
  • The only merges to main should be fast-forward merges from develop.
  • Branch protection is enabled on develop and main.
    • develop:
      • Require signed commits
      • Include administrators
      • Allow force pushes
    • main:
      • Require signed commits
      • Include administrators
      • Do not allow force pushes
      • Require status checks to pass before merging (commits must have previously been pushed to develop and passed all checks)
  • To create a release:
    • Bump the version number in pyproject.toml with poetry version and commit the changes to develop.
    • Push to develop and verify all CI checks pass.
    • Fast-forward merge to main, push, and verify all CI checks pass.
    • Create an annotated and signed Git tag
      • Follow SemVer guidelines when choosing a version number.
      • List PRs and commits in the tag message:
        git log --pretty=format:"- %s (%h)" \
          "$(git describe --abbrev=0 --tags)"..HEAD
        
      • Omit the leading v (use 1.0.0 instead of v1.0.0)
      • Example: git tag -a -s 1.0.0
    • Push the tag. GitHub Actions will build and push the Python package and Docker images.
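
Putting the tagging steps together, a sketch for a hypothetical 1.0.0 release:

# create an annotated, signed tag (list PRs and commits in the tag message)
git tag -a -s 1.0.0
# push the tag to trigger the package and Docker image builds
git push origin 1.0.0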