Why are there no Makefiles for automation in Python projects?

Question:

As a long-time Python programmer, I wonder if a central aspect of Python culture has eluded me all this time: what do we do instead of Makefiles?

Most Ruby projects I’ve seen (not just Rails) use Rake; shortly after Node.js became popular, there was Cake. Many other languages, compiled and interpreted alike, use classic Makefiles.

But in Python, no one seems to need such infrastructure. I randomly picked Python projects on GitHub, and they had no automation beyond the installation provided by setup.py.

What’s the reason behind this?

Is there nothing to automate? Do most programmers prefer to run style checks, tests, etc. manually?

Some examples:

  • dependencies sets up a virtualenv and installs the dependencies
  • check calls the pep8 and pylint command-line tools.
  • the test task depends on dependencies, enables the virtualenv, starts a Selenium server for the integration tests, and calls nosetest
  • the coffeescript task compiles all CoffeeScript files to minified JavaScript
  • the runserver task depends on dependencies and coffeescript
  • the deploy task depends on check and test and deploys the project.
  • the docs task calls sphinx with the appropriate arguments

Some of them are just one- or two-liners, but IMHO they add up. Thanks to the Makefile, I don’t have to remember them.

To clarify: I’m not looking for a Python equivalent of Rake; I’m happy with Paver. I’m looking for the reasons.

Asked By: keppla


Answers:

The original PEP where this was raised can be found here. Distutils has become the standard method for distributing and installing Python modules.

Why? It just happens that Python is a wonderful language for performing the installation of Python modules.

Answered By: Lewis Norton

Is there nothing to automate?

Not really. All but two of the examples are one-line commands.

tl;dr Very little of this is really interesting or complex. Very little of this seems to benefit from “automation”.

Due to documentation, I don’t have to remember the commands to do this.

Do most programmers prefer to run style checks, tests, etc. manually?

Yes.

generating documentation,
the docs task calls sphinx with the appropriate arguments

It’s one line of code. Automation doesn’t help much.
sphinx-build -b html source build/html. That’s a script. Written in Python.
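If one did want to wrap that one-liner in Python rather than a Makefile target, a minimal sketch could look like this (the paths are the ones from the command above; the injectable `runner` parameter is an assumption added here for testability, not part of any real tool):

```python
import subprocess

def build_docs(source="source", out="build/html", runner=subprocess.check_call):
    """Run the one-line Sphinx build; `runner` is injectable for testing."""
    cmd = ["sphinx-build", "-b", "html", source, out]
    runner(cmd)
    return cmd
```

The point stands either way: it is one command, whether typed, documented, or scripted.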

We do this rarely. A few times a week. After “significant” changes.

running style checks (Pylint, Pyflakes and the pep8 command-line tool).
check calls the pep8 and pylint command-line tools

We don’t do this. We use unit testing instead of pylint.
You could automate that three-step process.

But I can see how SCons or make might help someone here.

tests

There might be room for “automation” here. It’s two lines: the non-Django unit tests (python test/main.py) and the Django tests (manage.py test). Automation could be applied to run both lines.
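As a sketch, that two-line “automation” is itself a tiny Python script (the two commands are the ones quoted above; the `runner` hook is an assumption added for testability):

```python
import subprocess

def run_all_tests(runner=subprocess.check_call):
    """Run the non-Django suite, then the Django suite."""
    commands = [
        ["python", "test/main.py"],       # non-Django unit tests
        ["python", "manage.py", "test"],  # Django tests
    ]
    for cmd in commands:
        runner(cmd)
    return commands
```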

We do this dozens of times each day. We never knew we needed “automation”.

dependencies sets up a virtualenv and installs the dependencies

Done so rarely that a simple list of steps is all that we’ve ever needed. We track our dependencies very, very carefully, so there are never any surprises.

We don’t do this.

the test task depends on dependencies, enables the virtualenv, starts a Selenium server for the integration tests, and calls nosetest

Starting the server and running nosetest as a two-step “automation” makes some sense. It saves you from entering the two shell commands to run both steps.
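A hedged sketch of that start-test-stop sequence in plain Python (the command names are placeholders; a real version would also wait for the server to start accepting connections before running the tests):

```python
import subprocess

def integration_tests(server_cmd=("selenium-server",),
                      test_cmd=("nosetests", "tests/integration")):
    """Start the server, run the tests, always stop the server afterwards."""
    server = subprocess.Popen(list(server_cmd))
    try:
        return subprocess.call(list(test_cmd))  # exit code of the test run
    finally:
        server.terminate()
        server.wait()
```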

the coffeescript task compiles all coffeescripts to minified javascript

This is something that’s very rare for us. I suppose it’s a good example of something to be automated. Automating the one-line script could be helpful.

I can see how SCons or make might help someone here.

the runserver task depends on dependencies and coffeescript

Except that the dependencies change so rarely that this seems like overkill. I suppose it can be a good idea if you’re not tracking dependencies well in the first place.

the deploy task depends on check and test and deploys the project.

It’s an svn co and python setup.py install on the server, followed by a bunch of customer-specific copies from the subversion area to the customer /www area. That’s a script. Written in Python.

It’s not a general make or SCons kind of thing. It has only one actor (a sysadmin) and one use case. We wouldn’t ever mingle deployment with other development, QA or test tasks.

Answered By: S.Lott

Any decent test tool has a way of running the entire suite in a single command, and nothing is stopping you from using rake, make, or anything else, really.

There is little reason to invent a new way of doing things when existing methods work perfectly well – why re-invent something just because YOU didn’t invent it? (NIH).

Answered By: Arafangion

There are a number of options for automation in Python. I don’t think there is a culture against automation; there is just not one dominant way of doing it. The common denominator is distutils.

The one closest to your description is buildout. It is mostly used in the Zope/Plone world.

I myself use a combination of the following: Distribute, pip and Fabric. I mostly develop with Django, which has manage.py for automation commands.

Packaging is also being actively worked on in Python 3.3.

Answered By: nfg

Setuptools can automate a lot of things, and for things that aren’t built-in, it’s easily extensible.

  • To run unittests, you can use the setup.py test command after having added a test_suite argument to the setup() call. (documentation)
  • Dependencies (even if not available on PyPI) can be handled by adding an install_requires/extras_require/dependency_links argument to the setup() call. (documentation)
  • To create a .deb package, you can use the stdeb module.
  • For everything else, you can add custom setup.py commands.
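To illustrate the last bullet: a custom setup.py command is just a small class. Everything here (the command name, the cleaned paths) is hypothetical, not taken from the answer:

```python
from setuptools import Command

class CleanAll(Command):
    """A hypothetical custom command, invoked as `python setup.py clean_all`."""
    description = "remove build artifacts"
    user_options = []  # no command-line options

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        import shutil
        for path in ("build", "dist"):
            shutil.rmtree(path, ignore_errors=True)

# It would be registered in setup.py via:
#   setup(..., cmdclass={"clean_all": CleanAll})
```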

But I agree with S.Lott: most of the tasks you’d wish to automate (except maybe dependency handling, the only one I find really useful) are tasks you don’t run every day, so there wouldn’t be any real productivity improvement from automating them.

Answered By: mdeous

Actually, automation is useful to Python developers too!

Invoke is probably the closest tool to what you have in mind, for automation of common repetitive Python tasks: https://github.com/pyinvoke/invoke

With Invoke, you can create a tasks.py like this one (borrowed from the Invoke docs):

from invoke import run, task

@task
def clean(docs=False, bytecode=False, extra=''):
    patterns = ['build']
    if docs:
        patterns.append('docs/_build')
    if bytecode:
        patterns.append('**/*.pyc')
    if extra:
        patterns.append(extra)
    for pattern in patterns:
        run("rm -rf %s" % pattern)

@task
def build(docs=False):
    run("python setup.py build")
    if docs:
        run("sphinx-build docs docs/_build")

You can then run the tasks at the command line, for example:

$ invoke clean
$ invoke build --docs

Another option is to simply use a Makefile. For example, a Python project’s Makefile could look like this:

docs:
    $(MAKE) -C docs clean
    $(MAKE) -C docs html
    open docs/_build/html/index.html

release: clean
    python setup.py sdist upload

sdist: clean
    python setup.py sdist
    ls -l dist

Answered By: coffee-grinder

The make utility is an optimization tool which reduces the time spent building a software image. The reduction in time is obtained when all of the intermediate materials from a previous build are still available, and only a small change has been made to the inputs (such as source code). In this situation, make is able to perform an “incremental build”: rebuild only a subset of the intermediate pieces that are impacted by the change to the inputs.

When a complete build takes place, all that make effectively does is to execute a set of scripting steps. These same steps could just be deposited into a flat script. The -n option of make will in fact print these steps, which makes this possible.

A Makefile isn’t “automation”; it’s “automation with a view toward optimized incremental rebuilds.” Anything scripted with any scripting tool is automation.

So why would Python projects eschew tools like make? Probably because Python projects don’t struggle with long build times that they are eager to optimize. Also, the compilation of a .py file to a .pyc does not have the same web of dependencies as a .c to a .o.

A C source file can #include hundreds of dependent files; a one-character change in any one of these files can mean that the source file must be recompiled. A properly written Makefile will detect when that is or is not the case.

A big C or C++ project without an incremental build system would mean that a developer has to wait hours for an executable image to pop out for testing. Fast, incremental builds are essential.

In the case of Python, probably all you have to worry about is when a .py file is newer than its corresponding .pyc, which can be handled by simple scripting: loop over all the files, and recompile anything newer than its byte code. Moreover, compilation is optional in the first place!
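That “simple scripting” loop can be sketched with nothing but the standard library (the directory layout is an assumption; `cache_from_source` maps a .py file to its bytecode path):

```python
import os
import py_compile
from importlib.util import cache_from_source

def rebuild_stale(root):
    """Recompile every .py whose cached bytecode is missing or older --
    the same timestamp freshness check that make performs."""
    rebuilt = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".py"):
                continue
            src = os.path.join(dirpath, name)
            pyc = cache_from_source(src)
            if (not os.path.exists(pyc)
                    or os.path.getmtime(src) > os.path.getmtime(pyc)):
                py_compile.compile(src, cfile=pyc)
                rebuilt.append(src)
    return rebuilt
```

A second invocation on an unchanged tree recompiles nothing, which is the whole of make’s incremental-build trick in a dozen lines.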

So the reason Python projects tend not to use make is that their need to perform incremental rebuild optimization is low, and they use other tools for automation; tools that are more familiar to Python programmers, like Python itself.

Answered By: Kaz

Here are a few examples of Makefile usage with Python:

https://blog.horejsek.com/makefile-with-python/

https://krzysztofzuraw.com/blog/2016/makefiles-in-python-projects.html

I think most people are not aware of the “Makefile for Python” approach. It could be useful, but the “sexiness ratio” is too small for it to spread rapidly (just my personal point of view).

Answered By: Quant Christo