Decorate behaviour of subclass methods of an abstract base class

Question:

I have an inheritance scenario that looks like this:

import abc


class A(metaclass=abc.ABCMeta):
    def __init__(self, x: int | None = None) -> None:
        self.x = x

    @abc.abstractmethod
    def foo(self, x: int | None = None) -> None:
        pass


class B(A):
    def foo(self, x: int | None = None) -> None:
        x = x or self.x
        if x is None:
            raise ValueError("x must be defined")

        print(x)


class C(B):
    ...  # defines a bunch of other stuff

The idea is that it will be possible for users (other developers) to use my class in two different ways. Either they interact directly with B (or another equivalent subclass of A), in which case they have to pass certain values to its methods (in this case, they need to pass x to the B.foo method). Alternatively, they can subclass B (like C in this example), set x directly in the init, and then there is no need to pass x to C.foo.

In the end, both of these usages should be valid:

b = B()
b.foo(x=1)

c = C(x=1)
c.foo()

Now, the problem is that I would like to avoid having to repeat this clause in different methods and different subclasses of A:

x = x or self.x
if x is None:
    raise ValueError("x must be defined")

Is there any way I can do something in A that will essentially run this check and effectively pass along the value of x to the foo method if no value for x has been given?

I would essentially want to make self.x the default value for x, but I don’t want to have to re-write the if x is None check if I can avoid it.

Something that comes to mind would be if I could somehow automatically register a decorator onto the methods of my subclasses, but I don't think that is possible?
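To illustrate what I mean, the repeated check factored out into a decorator (use_default_x is just a made-up name, and I would still have to apply it to every method by hand) might look roughly like this:

import functools


def use_default_x(method):
    # Hypothetical decorator: fall back to self.x when x is not passed explicitly.
    @functools.wraps(method)
    def wrapper(self, x: int | None = None, *args, **kwargs):
        if x is None:
            x = self.x
        if x is None:
            raise ValueError("x must be defined")
        return method(self, x, *args, **kwargs)

    return wrapper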

Asked By: Fredrik Nilsson


Answers:

It’s not foo you want to override, it’s a method used by foo.

Let’s start by updating A.

class A(metaclass=abc.ABCMeta):
    def __init__(self, x: int | None = None) -> None:
        self.x = x

    def foo(self, x: int | None = None) -> None:
        # Resolve x from the argument or fall back to the instance value.
        if x is None:
            if self.x is None:
                raise ValueError("x must be provided")
            x = self.x
        self.foo_work(x)

    @abc.abstractmethod
    def foo_work(self, x: int) -> None:
        ...

Note that foo_work must receive an x; it won’t accept None as an argument. But that’s OK, because your end user will never call foo_work directly: they will only call foo, which either finds a valid argument for foo_work or raises an exception.

class B(A):
    def foo_work(self, x: int) -> None:
        print(x)


B().foo()  # ValueError
B(1).foo()  # OK
B().foo(1)  # OK

This might seem a little awkward: you have what is essentially an implementation detail of A being exposed as part of its public interface. This stems from the fact that foo is really two methods being forced together into one:

  1. One that takes a value as an argument
  2. One that takes a value from its own internal state

Arguably, you should keep the two separate:

class A(metaclass=abc.ABCMeta):
    def __init__(self, x: int | None = None) -> None:
        self.x = x

    @abc.abstractmethod
    def foo(self, x: int) -> None:
        ...

    def foo_with_default(self) -> None:
        if self.x is None:
            raise ValueError("x must be defined")
        return self.foo(self.x)

Now, both methods are part of the public interface, subclasses are required to implement foo, and foo_with_default is essentially final and should not be overridden (though abc doesn’t provide a way to mark it as such). The cost is that the end-user must be aware of which function to use.
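As a minimal sketch (the subclass and calls below are only illustrative), the split design could be used like this:

class B(A):
    def foo(self, x: int) -> None:
        print(x)


B().foo(1)               # caller supplies x explicitly
B(1).foo_with_default()  # x is taken from the instance
B().foo_with_default()   # raises ValueError: x must be defined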

Answered By: chepner