Dev Environment Setup
This is the minimal Python development setup I use as a backend developer.
Prerequisites
On macOS, first install the Xcode Command Line Tools with xcode-select --install, then install Homebrew.
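In shell form, the bootstrap looks like this (the Homebrew one-liner is the installer published at brew.sh):

# install the Xcode Command Line Tools
xcode-select --install

# install Homebrew (installer script from https://brew.sh)
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"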
On Windows I prefer to set up a WSLg environment.
For my editor, I prefer Visual Studio Code or a derivative. It will offer to install the Python extensions when opened in a Python project.
When deploying I end up using Docker, and having docker compose is pretty handy for local development. Recent versions ship with the compose plugin, so there’s no need to install it separately.
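A minimal compose file sketch for local development; the service names and images here are illustrative:

services:
  api:
    build: .
    ports:
      - "8080:8080"
    depends_on:
      - db
  db:
    image: postgres:17
    environment:
      POSTGRES_PASSWORD: dev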
Python
I use mise to install Python instead of relying on a system-installed Python. The Windows experience is a bit lacking but viable.
This allows me to use multiple versions of Python easily.
I prefer using the latest stable Python once the first patch release is out for most projects, but have noticed that certain communities tend to run one version behind. As a result, my default is to keep the two latest versions of Python on hand, with the latest as my default.
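A sketch of that setup with mise (the version numbers are illustrative):

# keep the two latest versions on hand
mise install python@3.13 python@3.12

# make the latest the global default
mise use --global python@3.13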
UV
I use uv to manage dependencies and virtual environments for each of my projects.
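The day-to-day workflow looks roughly like this (package names are illustrative):

uv init my-app       # scaffold a project with a pyproject.toml
uv add httpx         # add a dependency and update uv.lock
uv add --dev pytest  # add a development dependency
uv sync              # create or refresh .venv from the lockfile
uv run pytest        # run a command inside the project environment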
I used to use poetry despite all of the following gripes:
- doesn’t follow PEP 621 (partly because it predates it)
- the 1.x migration stunt that deliberately broke 5% of CI runs
- hostile to library authors (e.g. caret pinning by default)
- huge install footprint (50-ish packages)
- kinda slow
Now that uv has universal lockfiles, I can code on macOS ARM and deploy on Linux x86 (or ARM) trivially, just like I did with poetry, and have no real reason to keep using it.
Maintenance Guidelines
uv has good docs on how to actually manage dependencies. One thing that is a bit annoying is that there is no good way to upgrade just a group of dependencies; you can only upgrade everything or a single package at a time.
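In practice that means choosing between these two; httpx is just an example:

uv lock --upgrade                  # upgrade everything in the lockfile
uv lock --upgrade-package httpx    # upgrade a single package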
Any library that is used explicitly in a project should be added to pyproject.toml, either as a direct dependency or as an extra on another dependency. If possible, favor extras, since that allows upstream maintainers to specify version compatibility. Do not rely on transitive dependencies, since they can change at any time. Add transitive dependencies explicitly, even if you only use their features indirectly (e.g. via environment variables), to guarantee the availability of those features.
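A sketch of what that looks like; the package names are illustrative:

[project]
dependencies = [
    # extra: lets uvicorn declare which versions of its optional deps it supports
    "uvicorn[standard]",
    # explicit, even though another dependency would pull it in transitively
    "certifi",
]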
Development dependencies should be in the dev group. Third-party type stubs (e.g. those from typeshed) should only be added as development dependencies. Development dependencies should not be used in non-development code, nor should they appear in the final stage of a Dockerfile. typing.TYPE_CHECKING can be surprising, so try to avoid using it or importing type stubs directly when possible.
typing-extensions, which is effectively an extension of the standard library’s typing module, is the sole typing library that can be used outside of the development group.
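A sketch of the dev group in pyproject.toml (the stub package is illustrative):

[dependency-groups]
dev = [
    "pytest",
    "pytest-mock",
    # third-party type stubs stay here, never in [project] dependencies
    "types-requests",
]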
deptry has a good list of rules to run as a baseline, even if you don’t use the tool explicitly.
At any given time, try to keep your pyproject.toml in a state where it is safe to uv lock --upgrade. Tighten or even fully pin versions so that they do not change if you do not want them to. Keep in mind that not all libraries follow semver.
Similarly, make sure to raise the minimum versions in pyproject.toml if you are relying on newly added features (e.g. third-party SDK upgrades). This may mean bumping “minor” or “patch” versions even if they would lock safely.
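A sketch of what that tightening can look like; names and versions are illustrative:

[project]
dependencies = [
    "httpx>=0.27,<1",     # capped: no semver promise before 1.0
    "orjson==3.10.18",    # fully pinned: must not change on upgrade
    "some-sdk>=2.4",      # raised minimum: we rely on a feature added in 2.4
]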
Testing
I use pytest mostly because it used to be a default in Poetry and I’m more familiar with it than with unittest-style tests. Even in the past, I would use nose2 to extend unittest.
I almost always start off with pytest-mock so I don’t need to mix and match testing styles to get mocks. However, I try to use mocks judiciously.
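A minimal sketch of the pytest-mock style; the application module and function names are hypothetical:

import my_app  # hypothetical application package

def test_sends_welcome_email(mocker):
    # mocker is the pytest-mock fixture; no unittest.mock boilerplate needed
    send = mocker.patch("my_app.email.send")

    my_app.signup("ada@example.com")

    send.assert_called_once_with("ada@example.com")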
Historically, I have used coverage directly but have started trying pytest-cov, which handles pytest-xdist much better than coverage alone. I rarely need pytest-xdist, but on larger, older projects it is a timesaver.
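Put together, a typical invocation looks like this (my_app is a placeholder):

uv run pytest -n auto --cov=my_app --cov-report=term-missing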
Linting and Formatting
I use ruff in all new projects to handle linting and formatting.
My default config is:
[tool.ruff]
target-version = "py313"

[tool.ruff.lint]
extend-select = ["I", "B", "Q", "T20", "PT"]
If I want to be very prescriptive (for team projects, etc.) I use the following, removing any rules that aren’t relevant:
[tool.ruff]
target-version = "py313"

[tool.ruff.lint]
extend-select = [
    "I",     # https://docs.astral.sh/ruff/rules/#isort-i
    "B",     # https://docs.astral.sh/ruff/rules/#flake8-bugbear-b
    "Q",     # https://docs.astral.sh/ruff/rules/#flake8-quotes-q
    "T20",   # https://docs.astral.sh/ruff/rules/#flake8-print-t20
    "PT",    # https://docs.astral.sh/ruff/rules/#flake8-pytest-style-pt
    "SIM",   # https://docs.astral.sh/ruff/rules/#flake8-simplify-sim
    "A",     # https://docs.astral.sh/ruff/rules/#flake8-builtins-a
    "EM",    # https://docs.astral.sh/ruff/rules/#flake8-errmsg-em
    "N",     # https://docs.astral.sh/ruff/rules/#pep8-naming-n
    "UP",    # https://docs.astral.sh/ruff/rules/#pyupgrade-up
    "FAST",  # https://docs.astral.sh/ruff/rules/#fastapi-fast
    "ASYNC", # https://docs.astral.sh/ruff/rules/#flake8-async-async
    "DTZ",   # https://docs.astral.sh/ruff/rules/#flake8-datetimez-dtz
    "LOG",   # https://docs.astral.sh/ruff/rules/#flake8-logging-log
    "TRY",   # https://docs.astral.sh/ruff/rules/#tryceratops-try
]
I used to use black, isort, and flake8 with flake8-bugbear. ruff can replace all of those tools. I do feel bad that ruff effectively rewrote those projects in a faster language, leveraging the previous developers’ work instead of collaborating with them. But ruff is extremely fast, to the point that I can lint on save without noticing.
Type Checking
For new projects where I want to add type-checking I use pyright. This is mostly because of its superior Visual Studio Code integration compared to mypy.
I generally don’t rely on type-checking for personal projects but use it in projects where I work with others.
I leverage the in-project .venv to work around annoying type-checking pathing limitations:
[tool.pyright]
pythonVersion = "3.13"
venvPath = "."
venv = ".venv"
Pre-commit
I use pre-commit for pre-commit git hooks, with the following config covering all of the previous tools. Make sure to update the tags to the latest versions, or to the versions you have installed in your project. I used to prefer that pre-commit not change things for me (especially during rebases), but I’ve come around to letting the tool make all the changes while working with AI: fewer commands mean fewer tokens, etc.
default_language_version:
  python: python3.13

repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v5.0.0
    hooks:
      - id: check-yaml
      - id: check-toml
      - id: end-of-file-fixer
      - id: trailing-whitespace
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.12.8
    hooks:
      - id: ruff-check
        args: ["--fix"]
      - id: ruff-format
  - repo: https://github.com/RobertCraigie/pyright-python
    rev: v1.1.403
    hooks:
      - id: pyright
        additional_dependencies: []
  - repo: https://github.com/astral-sh/uv-pre-commit
    rev: 0.8.8
    hooks:
      - id: uv-lock
Because pre-commit uses its own virtual environment for each repo, you’ll sometimes run into issues with dependencies for type checking. The configuration above should avoid issues, but installing additional dependencies is a tried-and-true method I’ve carried over from mypy.
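If pyright can’t see a package inside pre-commit’s environment, listing it explicitly is the escape hatch (pydantic here is illustrative):

      - id: pyright
        additional_dependencies: ["pydantic"]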
Makefile
I almost always include this Makefile in my projects. I’ve become too used to having make lint and make test. It still uses coverage for simplicity, but pytest-cov is superior when you use pytest-xdist.
.PHONY: all lint test types

all: lint types test

lint:
	ruff check --fix
	ruff format

types:
	pyright

test:
	coverage run -m pytest
I usually add two extra targets, lint-ci and test-ci, that are read-only and output reports in formats that CI/CD pipelines can use.
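A sketch of those targets; the exact report formats depend on your pipeline:

lint-ci:
	ruff check --output-format=github
	ruff format --check

test-ci:
	coverage run -m pytest --junitxml=report.xml
	coverage xml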
Deployment
I use the following Dockerfile template for nearly all my projects.
FROM python:3.13 AS build

COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv

ENV UV_LINK_MODE=copy \
    UV_COMPILE_BYTECODE=1 \
    UV_PYTHON_DOWNLOADS=never \
    UV_PYTHON=python3.13 \
    UV_PROJECT_ENVIRONMENT=/app

# creates the venv in /app and installs only dependencies
RUN --mount=type=cache,target=/root/.cache \
    --mount=type=bind,source=uv.lock,target=uv.lock \
    --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
    uv sync \
        --locked \
        --no-dev \
        --no-install-project

# assumes you can build your app as a package, and then installs it
COPY . /src
WORKDIR /src
RUN --mount=type=cache,target=/root/.cache \
    uv sync \
        --locked \
        --no-dev \
        --no-editable

FROM python:3.13-slim

RUN useradd -ms /bin/sh -u 1001 app
WORKDIR /home/app
USER app

# put the venv's executables first on PATH
ENV PATH="/home/app/bin/:${PATH}"

COPY --from=build --chown=app:app /app /home/app

CMD ["uvicorn", "my_app.main:app", "--host", "0.0.0.0", "--port", "8080"]
This file is based on tips from Python Speed and Hynek’s blog.
The first stage builds the application and all of its dependencies. Any other build dependencies can be installed at this point (e.g. build-essential or libpq-dev).
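For example, assuming Debian packages, the first stage might gain a line like this before the uv sync steps:

# build stage only: compilers and headers for packages built from source
RUN apt-get update \
    && apt-get install -y --no-install-recommends build-essential libpq-dev \
    && rm -rf /var/lib/apt/lists/*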
The second stage is the actual distribution image. We can just copy the whole virtual environment over because we built the whole thing. We can also install non-build requirements (e.g. libpq5 to match libpq-dev).
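The runtime counterpart, which has to run while the stage is still root (i.e. before the USER app line):

# runtime stage only: shared libraries, no compilers
RUN apt-get update \
    && apt-get install -y --no-install-recommends libpq5 \
    && rm -rf /var/lib/apt/lists/*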
The base images used are the official Python images from Docker that are based on Debian. Historically, these have been the simplest to work with.
The biggest issue with the official image is that the Debian base often has multiple high-severity CVEs. The easiest way to mitigate this is to use a distroless image, but they all have downsides. The Python3 image from Google hasn’t been updated in a while and doesn’t let you pin specific versions of Python. Chainguard only offers the latest Python version for free, which works until you have a dependency that hasn’t upgraded yet, and now you have to pay to build your application. Rapidfort does not offer a free option, while the Iron Bank image is based on Red Hat and requires verification to access. Obviously, we could build our own following guides like this one, but that comes with a maintenance burden (probably why Google stopped updating theirs).
I used to try to use docker compose for local dev but found it painful when working with AI libraries. Things like heavy dependencies with build steps (pytorch, nltk, etc.) made the rebuild cycle too slow and tedious.