Where to place shared test annotation/code when using pytest?

Question:

I’m using pytest to write some unit tests, and some of them can only be run in the cloud under a special runtime (a Databricks cluster).

I want to skip these tests automatically when I run the tests locally. I already know how to determine programmatically whether I’m running locally.

This is my project structure:

.
├── poetry.lock
├── poetry.toml
├── pyproject.toml
├── README.md
└── src
    ├── pkg1
    │   ├── __init__.py
    │   ├── conftest.py
    │   ├── module1.py
    │   ├── module2.py
    │   ├── test_module1.py
    │   ├── test_module2.py
    │   └── utils
    │       ├── aws.py
    │       └── common.py
    └── pkg2
        ├── __init__.py
        ├── ...

test_module1.py:

from pkg1 import module1
from pkg1.utils.common import skip_if_running_locally

def test_everywhere(module1_instance):
    pass  # do test..

@skip_if_running_locally
def test_only_in_cloud(module1_instance):
    pass  # do test..

common.py:

import pytest
from pyspark.sql import SparkSession

my_spark = SparkSession.getActiveSession()
running_locally = (
    my_spark is None
    or my_spark.conf.get('spark.app.name') != 'Databricks Shell'
)

skip_if_running_locally = pytest.mark.skipif(running_locally, reason='running locally')

And I do the same in test_module2.py to mark tests that should be skipped locally.

  • I don’t really like putting this in common.py, because that file contains common application code (not test code).
  • I thought about putting it in a base class, but then it would have to be a class attribute (not a self. instance attribute).
  • If I put it in a test_common.py, it’ll be picked up by pytest as a file containing test cases.
  • If I put it in conftest.py, how do I import it? from conftest import skip_...?

What is the right way of doing this? Where should I store common code/annotations dedicated to testing, and how do I use them?

Asked By: Kashyap


Answers:

Generally, conftest.py is the place to put common test logic. There is nothing wrong with using util/common modules, but conftest.py has two advantages:

  1. It is executed automatically by pytest.
  2. It is the standard place for shared test code, so developers know to look there.
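
Note that conftest.py is not meant to be imported by test modules; pytest discovers it automatically and makes its fixtures and hooks available without any import statement. If you do want an importable helper instead, you can keep the marker in a plain module whose filename doesn’t match test_*.py, so pytest won’t collect it as a test file. A minimal sketch, assuming a hypothetical helper module src/pkg1/testing_utils.py:

# src/pkg1/testing_utils.py (hypothetical helper module; pytest won't
# collect it because the filename doesn't match test_*.py)
import pytest
from pyspark.sql import SparkSession

my_spark = SparkSession.getActiveSession()
running_locally = (
    my_spark is None
    or my_spark.conf.get('spark.app.name') != 'Databricks Shell'
)

# test modules can then do:
#     from pkg1.testing_utils import skip_if_running_locally
skip_if_running_locally = pytest.mark.skipif(running_locally, reason='running locally')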

With that said, I believe you can use the approach below (adapted from pytest’s documented pattern for platform-specific markers) to enable or disable tests according to the environment via custom markers.

Your tests would then look like this (note that nothing needs to be imported from conftest.py; the tests just use the locally and cloud markers):

import pytest


@pytest.mark.locally
def test_only_runs_locally():
    pass


@pytest.mark.cloud
def test_only_runs_on_the_cloud():
    pass


def test_runs_everywhere():
    pass

Then, inside conftest.py, you enable/disable the appropriate tests:

from pyspark.sql import SparkSession

import pytest

ALL = set("locally cloud".split())
my_spark = SparkSession.getActiveSession()
running_on = "locally" if (
        my_spark is None
        or my_spark.conf.get('spark.app.name') != 'Databricks Shell'
) else "cloud"

# runs before every test
def pytest_runtest_setup(item):
    # look for all the relevant markers of the test
    supported_platforms = ALL.intersection(mark.name for mark in item.iter_markers())

    if supported_platforms and running_on not in supported_platforms:
        pytest.skip(
            f"We're running on {running_on}, cannot run {supported_platforms} tests")
Answered By: Peter K