Dev Environment Setup

This is a minimal Python development setup for me as a backend developer.

System Tools

On macOS, first install the Xcode Command Line Tools with `xcode-select --install`, then install Homebrew.
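
For reference, those two steps look like this (the Homebrew one-liner is the install script published on brew.sh; check the site for the current command before running it):

```shell
# Install the Xcode Command Line Tools (compilers, git, headers)
xcode-select --install

# Install Homebrew via its official install script (as published on brew.sh)
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
```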

On Windows I prefer to set up a WSLg environment.

For my editor, I prefer Visual Studio Code. It will prompt to install the Python extensions when opened in a Python project.

When deploying, I end up using Docker, and having docker compose is pretty handy for local development. Recent versions ship with the Compose plugin, so there's no need to install it separately.

Python

Use pyenv to install Python instead of relying on a system-installed Python. It doesn't support Windows but does work in a WSLg Linux environment.

This allows me to use multiple versions of Python easily.

I like the idea of being able to use a fully portable standalone Python, but there are a lot of issues with that approach. pyenv gives me enough flexibility for my needs without running into issues like static-linking build failures.

I prefer using the latest stable Python for most projects but have noticed that certain communities tend to run one version behind. As a result, I keep the two latest versions of Python on hand, with the newest as my default.
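
Sketched with pyenv (the version numbers below are placeholders for whatever is current; recent pyenv releases resolve a bare prefix like 3.12 to the newest patch release):

```shell
# Install the two most recent stable series
pyenv install 3.12
pyenv install 3.11

# Make both available; the first entry listed acts as the default
pyenv global 3.12 3.11
```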

Dependency Management

I use poetry to manage dependencies and virtual environments for each of my projects.

I don’t particularly like the project, but it has a lot of community support and is one of the two dependency-management tools that can currently create platform-agnostic lock files. I don’t have an Apple Silicon Mac yet, and I want to deploy to ARM instances in AWS when I can, so having a platform-agnostic lock file is handy.

My main gripes are:

I actually quite liked PDM, the other platform-agnostic tool, but when I last tried it the project did not support virtualenv. This has changed since PEP 582 was rejected, so PDM could be a viable, mostly standards-compliant alternative to Poetry. It just doesn’t have the community mindshare, and I’m not sure how much better it is.

The community seems to love rye these days. I like how it wraps known entities like pip-tools/uv, virtualenv, twine, etc., but since its lockfiles are platform-specific it’s a non-starter for me for now.

I put my virtual environments inside my project to keep things tidy.
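
That is a one-time Poetry setting; shown here per-project (drop `--local` to set it globally instead):

```shell
# Store each project's virtualenv in ./.venv instead of Poetry's cache directory
poetry config virtualenvs.in-project true --local
```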

Since poetry is a pretty heavy install, I try to avoid it in deployments. Instead, I tend to use the export plugin to generate a requirements file and build wheels I can distribute. I could use the bundle plugin, but I find it easier to rebuild wheels only when my dependencies change.
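
A sketch of that export-and-wheel step (the output file names and wheel directory are my conventions, not Poetry defaults):

```shell
# Pin exact versions from poetry.lock into a plain requirements file
poetry export -f requirements.txt -o requirements.txt --without-hashes

# Pre-build every pinned dependency into distributable wheels
pip wheel -r requirements.txt --wheel-dir dist/wheels
```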

Testing

I use pytest mostly because it used to be a default in Poetry and I’m more familiar with it than with unittest-style tests. Even in the past, I would use nose2 to extend unittest.

I almost always start off with pytest-mock so I don’t need to mix and match testing styles to get mocks. However, I try to use mocks judiciously.

Historically, I used coverage directly, but I have started using pytest-cov, which handles pytest-xdist much better than coverage alone. I rarely need pytest-xdist, but on larger, older projects it is a timesaver.
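
For comparison, the two invocations look roughly like this (`my_app` is a placeholder package name):

```shell
# coverage alone: fine for single-process runs
coverage run -m pytest

# pytest-cov composes cleanly with pytest-xdist's parallel workers
pytest -n auto --cov=my_app
```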

Linting and Formatting

I use ruff in all new projects to handle linting and formatting.

My default config is:

```toml
target-version = "py312"
extend-select = ["I", "B", "Q", "T20", "PT"]
ignore-init-module-imports = true
```

If I want to be very prescriptive (for team projects, etc.), I use the following:

```toml
target-version = "py312"
extend-select = [
    "I",    # isort
    "B",    # flake8-bugbear
    "Q",    # flake8-quotes
    "T20",  # flake8-print
    "PT",   # flake8-pytest-style
    "SIM",  # flake8-simplify
    "A",    # flake8-builtins
    "N",    # pep8-naming
    "UP",   # pyupgrade
    "DTZ",  # flake8-datetimez
    "LOG",  # flake8-logging
]
ignore-init-module-imports = true
```

I used to use black, isort, and flake8 with flake8-bugbear; ruff can replace all of those tools. I do feel bad that ruff effectively rewrote all of those projects in a faster language, leveraging the work of the previous developers instead of collaborating with them. But ruff is extremely fast, to the point that I can literally lint on save without noticing.

Type Checking

For new projects where I want to add type-checking I use pyright. This is mostly because of the superior integration with Visual Studio Code over mypy.

I generally don’t rely on type-checking for personal projects but use it in projects where I work with others.

I leverage the in-project .venv to work around annoying type-checking pathing limitations:

```toml
pythonVersion = "3.12"
venvPath = "."
venv = ".venv"
```

Git Hooks

I use pre-commit to manage my git hooks, with the following config covering all of the previous tools.

Make sure to update all the tags to the latest versions or versions that you have installed in your project.

```yaml
default_language_version:
  python: python3.12
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0
    hooks:
      - id: check-yaml
      - id: check-toml
      - id: end-of-file-fixer
      - id: trailing-whitespace
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.4.3
    hooks:
      - id: ruff
      - id: ruff-format
        args: ["--check"]
  - repo: https://github.com/RobertCraigie/pyright-python
    rev: v1.1.361
    hooks:
      - id: pyright
        additional_dependencies: []
  - repo: https://github.com/python-poetry/poetry
    rev: 1.8.2
    hooks:
      - id: poetry-check
      - id: poetry-lock
        args: ["--no-update"]
        files: (^poetry.lock$)|(^pyproject.toml$)
```

Because pre-commit uses its own virtual environment for each repo, you’ll sometimes run into issues with missing dependencies during type checking. The configuration above should avoid those issues, but adding additional_dependencies is a tried-and-true method I’ve carried over from mypy.

Makefile

I almost always include this Makefile in my projects. I’ve become too used to having make lint and make test. It still uses coverage directly for simplicity.

```make
.PHONY: all lint test types
all: lint types test

lint:
	ruff check --fix
	ruff format

types:
	pyright

test:
	coverage run -m pytest
```

I usually add two extra targets for lint-ci and test-ci that are read-only and output reports in formats that CI/CD pipelines can use.
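
A sketch of what those extra targets might look like (the report formats and file names here are assumptions; use whatever your pipeline ingests):

```make
.PHONY: lint-ci test-ci
lint-ci:
	ruff check --output-format=github
	ruff format --check

test-ci:
	coverage run -m pytest --junitxml=pytest-report.xml
	coverage xml
```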

Docker

I use the following Dockerfile and compose.yaml template for nearly all my projects.

```dockerfile
FROM python:3.12 AS poetry
# VIRTUAL_ENV and POETRY_VERSION are referenced below; values here are assumed
ENV VIRTUAL_ENV=/opt/venv
ARG POETRY_VERSION=1.8.2
RUN python3 -m venv $VIRTUAL_ENV
RUN pip install --no-cache-dir -U pip wheel; \
    pip install --no-cache-dir poetry=="$POETRY_VERSION" poetry-plugin-export
COPY pyproject.toml pyproject.toml
COPY poetry.lock poetry.lock
RUN poetry export --without-hashes -f requirements.txt --with dev -o requirements-dev.txt
RUN poetry export --without-hashes -f requirements.txt -o requirements.txt
RUN pip wheel -r requirements-dev.txt --wheel-dir=$VIRTUAL_ENV/wheels

FROM python:3.12 AS dev
RUN useradd -ms /bin/sh -u 1001 dev
WORKDIR /home/dev
USER dev
RUN python3 -m venv /home/dev/.venv
ENV PATH="/home/dev/.venv/bin:$PATH"
ENV PYTHONPATH="/home/dev/"
RUN --mount=type=bind,from=poetry,source=/opt/venv/wheels,target=/opt/venv/wheels \
    --mount=type=bind,from=poetry,source=requirements-dev.txt,target=requirements.txt \
    pip install --no-cache-dir --disable-pip-version-check --no-index -f /opt/venv/wheels -r requirements.txt
COPY --chown=dev:dev . .

FROM python:3.12-slim
RUN useradd -ms /bin/sh -u 1001 app
WORKDIR /home/app
USER app
ENV PATH="/home/app/.local/bin/:${PATH}"
RUN --mount=type=bind,from=poetry,source=/opt/venv/wheels,target=/opt/venv/wheels \
    --mount=type=bind,from=poetry,source=requirements.txt,target=requirements.txt \
    pip install --user --no-cache-dir --disable-pip-version-check --no-index -f /opt/venv/wheels -r requirements.txt
COPY --chown=app:app my_app /home/app/my_app
CMD ["uvicorn", "my_app.main:app", "--host", "0.0.0.0", "--port", "8080"]
```

```yaml
services:
  app:  # service name is a placeholder
    build:
      context: .
      dockerfile: Dockerfile
      target: dev
    command: python -m my_app
    develop:
      watch:
        - action: sync
          path: ./
          target: /home/dev
        - action: rebuild
          path: poetry.lock
        - action: rebuild
          path: Dockerfile
        - action: rebuild
          path: compose.yaml
        - action: sync+restart
          path: ./my_app/
          target: /home/dev/my_app/
        - action: sync+restart
          path: .env
          target: .env
    ports:
      - "8000:8000"
```

This file is based on tips from Python Speed, with a bit of extra nicety to work with docker compose.

The first stage is used to build the application and all of its dependencies into wheels. Any other build dependencies can be installed at this point (e.g. build-essential or libpq-dev).

The final stage is the actual distribution image. We install dependencies using the requirements file and wheels generated in the first stage. We can also install non-build requirements (e.g. libpq5 to match libpq-dev). I’ve considered using the Google Distroless python3 image, but it doesn’t let you pin specific versions of Python, which is a bit of a hassle. We could use the process documented here to make our own, but for my needs the slim image is good enough.

The second stage is used purely for local development with docker compose. We copy the entire project into the container and rely on the watch command to keep it in sync with our local files.
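
With that in place, local development is just (assuming a Compose version recent enough to support watch):

```shell
# Build the dev target, start it, and sync/rebuild on file changes
docker compose up --watch
```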