Dev Environment Setup
This is a minimal Python development setup for me as a backend developer.
Prerequisites
On macOS, first install the Xcode command line tools with xcode-select --install, then install Homebrew.
On Windows I prefer to set up a WSLg environment.
For my editor, I prefer Visual Studio Code. It will auto-install Python extensions when opened in a Python project.
When deploying I end up using Docker, and having docker compose around is pretty handy for local development. Recent versions ship with the compose plugin, so there’s no need to install it separately.
Python
Use pyenv
to install Python instead of relying on the system-installed Python. It doesn’t support Windows but does work in a WSLg Linux environment. This lets me switch between multiple versions of Python easily.
I like the idea of being able to use a fully portable standalone Python, but those builds still have a lot of issues, such as static linking build failures. pyenv gives me enough flexibility for my needs without running into them.
I prefer using the latest stable Python for most projects but have noticed that certain communities tend to run one version behind. As a result, I keep the two latest versions of Python on hand, with the newest set as my default.
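As a sketch, that setup looks something like this with pyenv (the version numbers are just whatever the two latest stable releases happen to be):

```shell
# Install the two latest stable Pythons (3.12/3.13 as an example)
pyenv install 3.13
pyenv install 3.12

# Make the newest the default, with the older one also on the PATH
pyenv global 3.13 3.12
```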
Poetry
I use poetry
to manage dependencies and virtual environments for each of my projects.
I don’t particularly like the project, but it has a lot of community support and is one of the two dependency management tools that can create platform-agnostic lock files right now. I don’t have an Apple Silicon Mac yet, and I want to deploy to ARM instances in AWS when I can, so having a platform-agnostic lock file is handy.
My main gripes are:
- doesn’t follow PEP 621 (partially because it predates it)
- the 1.x rollout that deliberately failed 5% of CI runs
- library hostile
- huge install footprint (50-ish packages)
- kinda slow.
I actually quite liked PDM, the other platform-agnostic tool, but when I last tried it the project did not support virtualenv. This has changed since PEP 582 was rejected, so PDM could be a viable, mostly standards-compliant alternative to Poetry. It just doesn’t have the community mindshare, and I’m not sure how much better it is.
The community seems to love rye
and uv
these days. Now that uv
finally has universal lockfiles I’d definitely consider trying it for my next project.
I put my virtual environments inside my project to keep things tidy.
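For Poetry, that is a one-time config flag (it can also live in a poetry.toml checked into the repo):

```shell
poetry config virtualenvs.in-project true
```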
Since poetry is a pretty heavy install I try to avoid it in deployments. Instead, I tend to use the export
plugin to generate a requirements file and build wheels I can distribute. I could use the bundle
plugin but I find it easier to rebuild wheels only when my dependencies change.
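A rough sketch of that flow, assuming the poetry-plugin-export plugin is installed:

```shell
# Pin exact versions from the lock file into a requirements file
poetry export --format requirements.txt --output requirements.txt

# Build wheels for all pinned dependencies, plus the project itself
pip wheel --requirement requirements.txt --wheel-dir dist/wheels
poetry build --format wheel
```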
Testing
I use pytest
mostly because it used to be a default in Poetry and I’m more familiar with it than with unittest
style tests. Even in the past, I would use nose2
to extend unittest.
I almost always start off with pytest-mock
so I don’t need to mix and match testing styles to get mocks. However, I try to use mocks judiciously.
Historically, I have used coverage
directly but have started trying to use pytest-cov
, which handles pytest-xdist
much better than coverage
alone. I rarely need pytest-xdist
but on larger, older projects it is a timesaver.
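On the command line the combination looks like this (the package name is a placeholder):

```shell
# run tests across all cores, collecting coverage for mypackage
pytest -n auto --cov=mypackage --cov-report=term-missing
```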
Linting and Formatting
I use ruff
in all new projects to handle linting and formatting.
My default config is:
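Something along these lines in pyproject.toml (the rule selection here is my taste, not a ruff default):

```toml
[tool.ruff]
line-length = 88

[tool.ruff.lint]
# E/W: pycodestyle, F: pyflakes, I: isort, B: flake8-bugbear
select = ["E", "W", "F", "I", "B"]
```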
If I want to be very prescriptive (for team projects, etc.) I would use the following:
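A stricter sketch enables broad rule families and opts out selectively (again, the specific rule choices are illustrative):

```toml
[tool.ruff.lint]
# start from "almost everything" and carve out exceptions
select = ["ALL"]
ignore = [
    "D",    # pydocstyle -- docstrings everywhere is too much
    "ANN",  # flake8-annotations -- the type checker handles this
]

[tool.ruff.lint.per-file-ignores]
"tests/*" = ["S101"]  # allow `assert` in tests
```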
I used to use black
, isort
, and flake8
with flake8-bugbear
. ruff
can replace all those tools. I do feel bad that ruff
effectively rewrote all of those projects in a faster language, leveraging the work from the previous developers instead of collaborating with them. But ruff
is extremely fast, to the point I can literally lint on save without noticing.
Type Checking
For new projects where I want to add type-checking I use pyright
. This is mostly because of the superior integration with Visual Studio Code over mypy
.
I generally don’t rely on type-checking for personal projects but use it in projects where I work with others.
I leverage the in-project .venv
to work around annoying type-checking pathing limitations:
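With an in-project .venv, pointing pyright at it from pyproject.toml is enough:

```toml
[tool.pyright]
# resolve imports against the project's own virtual environment
venvPath = "."
venv = ".venv"
```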
Pre-commit
I use pre-commit
for pre-commit git hooks. I use the following config that covers all of the previous tools.
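A sketch of such a config (the revs are illustrative and will be stale; pyright runs as a local hook so it uses the project venv rather than pre-commit’s isolated one):

```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.6.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.5.0
    hooks:
      - id: ruff
        args: [--fix]
      - id: ruff-format
  - repo: local
    hooks:
      # run pyright from the project venv so it sees our dependencies
      - id: pyright
        name: pyright
        entry: poetry run pyright
        language: system
        types: [python]
        pass_filenames: false
```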
Make sure to update all the tags to the latest versions or versions that you have installed in your project.
Because pre-commit
uses its own virtual environment for each repo, you’ll sometimes run into dependency issues with type checking. The configuration mentioned above should avoid them, but installing additional dependencies is a tried-and-true workaround I’ve carried over from mypy.
Makefile
I almost always include this Makefile
in my projects. I’ve become too used to having make lint
and make test
. It still uses coverage
for simplicity, but pytest-cov
is superior when you use pytest-xdist
.
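A sketch of that Makefile (recipe lines must be tab-indented; target names are just what I reach for out of habit):

```makefile
.PHONY: lint format test

lint:
	poetry run ruff check .
	poetry run ruff format --check .
	poetry run pyright

format:
	poetry run ruff check --fix .
	poetry run ruff format .

test:
	poetry run coverage run -m pytest
	poetry run coverage report
```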
I usually add two extra targets for lint-ci
and test-ci
that are read-only and output reports in formats that CI/CD pipelines can use.
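The CI variants only differ in being read-only and emitting machine-readable reports (the formats here assume GitHub Actions annotations and JUnit/Cobertura-style consumers):

```makefile
.PHONY: lint-ci test-ci

lint-ci:
	poetry run ruff check --output-format github .
	poetry run ruff format --check .

test-ci:
	poetry run coverage run -m pytest --junitxml=report.xml
	poetry run coverage xml
```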
Deployment
I use the following Dockerfile
and compose.yaml
template for nearly all my projects.
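A condensed sketch of the template (base image tag, app module name, and paths are placeholders):

```dockerfile
# syntax=docker/dockerfile:1

### First stage: build the app and its dependencies into wheels
FROM python:3.12-slim AS build
WORKDIR /app
# build-time system deps would go here (e.g. build-essential, libpq-dev)
RUN pip install --no-cache-dir poetry poetry-plugin-export
COPY pyproject.toml poetry.lock ./
RUN poetry export --format requirements.txt --output requirements.txt \
 && pip wheel --requirement requirements.txt --wheel-dir /wheels

### Second stage: local development with docker compose
FROM build AS dev
RUN pip install --no-cache-dir --requirement requirements.txt
COPY . .
CMD ["python", "-m", "myapp"]

### Final stage: slim distribution image
FROM python:3.12-slim
WORKDIR /app
# runtime-only system deps would go here (e.g. libpq5)
COPY --from=build /wheels /wheels
COPY --from=build /app/requirements.txt .
RUN pip install --no-cache-dir --no-index --find-links=/wheels \
      --requirement requirements.txt \
 && rm -rf /wheels
COPY . .
CMD ["python", "-m", "myapp"]
```

And the matching compose.yaml, using the watch feature (Compose 2.22+) to sync local edits into the dev container:

```yaml
services:
  app:
    build:
      context: .
      target: dev          # stop at the development stage
    develop:
      watch:
        - action: sync     # mirror local edits into the container
          path: .
          target: /app
```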
This file is based on tips from Python Speed, with a bit of extra nicety to make it work with docker compose.
The first stage is used to build the application and all of its dependencies into wheels. Any other build dependencies can be installed at this point (e.g. build-essential
or libpq-dev).
The final stage is the actual distribution image. We install dependencies using the requirements file and wheels generated in the first stage. We can also install non-build requirements (e.g. libpq5
to match libpq-dev
). I’ve considered using the Google Distroless python3 image gcr.io/distroless/python3
but it doesn’t let you pin specific versions of Python, which is a bit of a hassle. We could use the process documented here to make our own, but for my needs the slim
image is good enough.
The second stage is used purely for local development with docker compose
. We copy the entire environment into the container and rely on the watch
command to keep it in sync with our local files.