# Makefile

```make
.DELETE_ON_ERROR:
SHELL := /bin/bash
WITH_VENV := source venv/bin/activate &&

.PHONY: clean
clean:
	rm -f .make.*
	rm -rf venv*

# Environment:

venv/bin/activate:
	/usr/bin/python3.8 --version
	virtualenv --python=/usr/bin/python3.8 venv

.make.venv: venv/bin/activate
	touch .make.venv

.make.venv.pip-tools: .make.venv requirements/pip-tools.txt
	${WITH_VENV} pip install -r requirements/pip-tools.txt
	touch .make.venv.pip-tools

.make.venv.dev: .make.venv.pip-tools
.make.venv.dev: requirements/pip-tools.txt requirements/base.txt requirements/dev.txt
	@ echo 'NOTE: `touch requirements/{base,deploy,dev}.txt` to snooze dependency upgrade when `.in` files are modified.'
	${WITH_VENV} pip-sync requirements/pip-tools.txt requirements/base.txt requirements/dev.txt
	touch .make.venv.dev

# Requirements:

requirements/base.txt: requirements/pip-tools.txt
requirements/base.txt: requirements/base.in
requirements/base.txt: | .make.venv.pip-tools
	${WITH_VENV} pip-compile requirements/base.in

requirements/deploy.txt: requirements/pip-tools.txt requirements/base.txt
requirements/deploy.txt: requirements/deploy.in
requirements/deploy.txt: | .make.venv.pip-tools
	${WITH_VENV} pip-compile requirements/deploy.in

requirements/dev.txt: requirements/pip-tools.txt requirements/base.txt requirements/deploy.txt
requirements/dev.txt: requirements/dev.in
requirements/dev.txt: | .make.venv.pip-tools
	${WITH_VENV} pip-compile requirements/dev.in

.PHONY: requirements
requirements: requirements/base.txt requirements/dev.txt requirements/deploy.txt
	@ echo 'NOTE: `rm requirements/{base,deploy,dev}.txt` before `make requirements` to upgrade all the dependencies.'

# Entrypoints:

.PHONY: test-unit
test-unit: .make.venv.dev
	@ ${WITH_VENV} python -c 'import pytest; print("pytest would run as version " + pytest.__version__ + "!")'
```
The `Makefile` above automates the `pip-tools` dependency-management workflow during development.
Say we just cloned a repo set up this way. We can run unit tests straight away* with `make test-unit`. It will:

- create a `venv`,
- install `pip-tools` in it,
- `pip-compile` all the `.in` requirements (application, deployment and development) to `.txt` files, in the correct order,
- `pip-sync` all the requirements,
- run `pytest` in the `venv` environment.

A subsequent `make test-unit` will simply invoke `pytest`, since everything would be already set up and up to date.

\* Provided the system has the `virtualenv` dependency and the correct version of Python, here - 3.8.
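The incremental behaviour comes from the `.make.*` stamp files: each expensive step touches a stamp file, and Make reruns the step only when a prerequisite is newer than the stamp. A toy sketch of that idiom (made-up target names, no venv or pip involved; the Makefile is generated with `printf` so the required recipe tabs are explicit):

```shell
# Build a two-rule Makefile that mimics the stamp-file idiom:
# an "expensive" setup step guarded by a stamp, and a test target on top.
mkdir -p /tmp/stamp-demo && cd /tmp/stamp-demo
printf '.make.setup: config.txt\n\t@echo "running expensive setup"\n\t@touch .make.setup\n\ntest-unit: .make.setup\n\t@echo "running tests"\n' > Makefile
echo 'v1' > config.txt

make test-unit   # first run: setup fires, then the tests
make test-unit   # second run: stamp is up to date, only the tests run
```

Removing the stamps (as the real `clean` target does with `rm -f .make.*`) forces the setup steps to run again.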
Now, let's say we're testing out some new library, say `google-cloud-speech`. All we have to do is `echo 'google-cloud-speech~=1.3.2' >> requirements/base.in` and then simply call `make test-unit` again:

- the modified `base.in` will be detected and compiled to `base.txt`,
- because of the new `base.txt`, `dev.in` will get recompiled too, to respect the new application requirements,
- the environment will be `pip-sync`-ed with the new requirements,
- `from google.cloud import speech_v1` will work.
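Why a bare `echo … >>` is enough to trigger all of this: appending updates the mtime of `base.in`, making it newer than `base.txt`, which is exactly the staleness test Make applies. A coreutils-only illustration (paths assumed for the sketch, no `pip-compile` actually invoked):

```shell
mkdir -p /tmp/req-demo/requirements && cd /tmp/req-demo
echo 'Flask~=1.1' > requirements/base.in
touch -d '2020-01-01' requirements/base.txt   # pretend base.txt was compiled earlier

# The one-liner from the text: add a dependency by appending to base.in.
echo 'google-cloud-speech~=1.3.2' >> requirements/base.in

# Make's rebuild decision boils down to this mtime comparison:
if [ requirements/base.in -nt requirements/base.txt ]; then
  echo 'base.txt is stale: pip-compile would run'
fi
```

This is also why the Makefile's note suggests `touch requirements/{base,deploy,dev}.txt` to snooze an upgrade: touching the `.txt` files flips the comparison the other way.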
GNU Make was designed precisely for compilation tasks, and therefore it works really well with `pip-tools`. After all, `pip-compile` is `pip-compile`…
Application requirements (`base.in`) compile to a corresponding `.txt` file, upon which the development requirements depend: `dev.in` must respect what's inside the already-compiled `base.txt`. It is very easy to express such a relationship with GNU Make; all we have to do is configure the prerequisites correctly.
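Expressed as prerequisites, the layering looks like this (a simplified excerpt of the dev rule from the Makefile above, with the pip-tools prerequisites omitted):

```make
# dev.txt must be rebuilt whenever base.txt or deploy.txt change,
# mirroring the -c constraints inside dev.in:
requirements/dev.txt: requirements/base.txt requirements/deploy.txt
requirements/dev.txt: requirements/dev.in
	${WITH_VENV} pip-compile requirements/dev.in
```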
The "uncompiled" requirement files might look like this:
```
# Note that requirement files are tucked inside "requirements"
# directory to avoid flooding project root.

# requirements/base.in
-c pip-tools.txt
Flask~=1.1
loginpass~=0.3
numpy~=1.1
tensorflow~=2.0
google-cloud-storage~=1.2
google-cloud-ndb~=1.1

# requirements/deploy.in
-c pip-tools.txt
-c base.txt
gunicorn~=19.8

# requirements/dev.in
-c pip-tools.txt
-c base.txt
-c deploy.txt
pytest~=5.4
coverage~=5.0
freezegun~=0.3.13
ipython~=7.1
pandas~=0.23
```

Note that the `Makefile` prerequisites directly map to the `-c` constraints.

`pip-tools.in` doesn't "participate" in the requirements compilation flow, to avoid a chicken-and-egg problem: we couldn't `pip-compile` `pip-tools.in` without having `pip-tools` in place already. Think of `pip-tools` as the "base" environment, upon which we can build the workflow automation. Therefore, `pip-tools.txt` is generated manually only once, when setting up the project.
## Why not `Pipenv` or `Poetry`?
`Pipenv` reviews on the web were so unfavorable that I ruled that option out straight away, without even testing it.
As for `Poetry` - I did try it out, but encountered a deal-breaker bug. I created an issue (#2080), which didn't attract any attention for several months.
Recently, just as I was starting to roll up my sleeves to take up the holy challenge of contributing to open source by fixing that bug myself, I noticed a PR from the project author linking to my issue. Two PRs, actually: a refactor in the core and the actual fix. (Gotta appreciate it when people refactor their code! That PR was actually just the first step of a larger-scale refactor, as the author put it in a comment.)
Needless to say, my ambition of contributing such a fix was not based in reality, to put it mildly.
But the point is - the project is being actively developed and well maintained.
I believe that `Poetry` will eventually become the Python packaging and dependency-management standard.
I still appreciate simple setups like the one above, though. For one, `pip-tools` is battle-tested software, and relying on it is never a bad idea. I've been using this setup for quite some time already, and I don't have any serious complaints about it. Also, it's very simple and does only what I need it to; packaging and distribution are not among those needs.
Third, the `Makefile` is right there in the project to be fiddled with. If something breaks, you can fix it on the spot.
But there are downsides too, of course: for teams, it might be easier to onboard new developers with `Poetry` - it works much like `npm`, and that's much less intimidating than my setup. As `Poetry` matures and gains more adoption, it will become a de facto standard. At that point, deviating from it will no longer be pragmatic, especially once all the bugs are dealt with (and I don't think there are many left).
(A template of sorts is available on my gh repo.)
In this blog post, James Cooke describes his `Makefile`. It's cleaner and more elegant by virtue of using wildcards.