Test Pydantic settings in FastAPI
Question:
Suppose my main.py is like this (this is a simplified example; in my app I use an actual database, and I have two different database URIs for development and testing):
from fastapi import FastAPI
from pydantic import BaseSettings

app = FastAPI()

class Settings(BaseSettings):
    ENVIRONMENT: str

    class Config:
        env_file = ".env"
        case_sensitive = True

settings = Settings()

databases = {
    "dev": "Development",
    "test": "Testing"
}

database = databases[settings.ENVIRONMENT]

@app.get("/")
def read_root():
    return {"Environment": database}
while the .env is:

ENVIRONMENT=dev
Suppose I want to test my code and I want to set ENVIRONMENT=test
to use a testing database. What should I do? In FastAPI documentation (https://fastapi.tiangolo.com/advanced/settings/#settings-and-testing) there is a good example but it is about dependencies, so it is a different case as far as I know.
My idea was the following (test.py):
import pytest
from fastapi.testclient import TestClient

from main import app

@pytest.fixture(scope="session", autouse=True)
def test_config(monkeypatch):
    monkeypatch.setenv("ENVIRONMENT", "test")

@pytest.fixture(scope="session")
def client():
    return TestClient(app)

def test_root(client):
    response = client.get("/")
    assert response.status_code == 200
    assert response.json() == {"Environment": "Testing"}
but it doesn’t work. Furthermore, I get this error:

ScopeMismatch: You tried to access the 'function' scoped fixture 'monkeypatch' with a 'session' scoped request object, involved factories
test.py:7: def test_config(monkeypatch)
env/lib/site-packages/_pytest/monkeypatch.py:16: def monkeypatch()
while according to the official pytest documentation it should work (https://docs.pytest.org/en/3.0.1/monkeypatch.html#example-setting-an-environment-variable-for-the-test-session). I have the latest version of pytest installed.
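Two separate things go wrong here. First, the built-in monkeypatch fixture is function-scoped, so a session-scoped fixture cannot request it. Second, and more fundamentally, `from main import app` runs `Settings()` at import (collection) time, before any fixture executes, so even a correctly scoped fixture would set the variable too late. A minimal stand-in (plain Python, no FastAPI needed) illustrating the import-time problem:

```python
import os

# Stand-in for main.py: a module-level Settings() reads ENVIRONMENT
# once, when the module is imported.
os.environ["ENVIRONMENT"] = "dev"          # what .env provides

class Settings:
    def __init__(self):
        self.ENVIRONMENT = os.environ["ENVIRONMENT"]

settings = Settings()                      # runs at import time in main.py
os.environ["ENVIRONMENT"] = "test"         # a fixture would run this too late
print(settings.ENVIRONMENT)                # still prints: dev
```

This is why the answers below either inject the settings through a dependency, mutate the already-created object, or set the variables before the module is ever imported.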
I tried to use specific test environment variables because of this: https://pydantic-docs.helpmanual.io/usage/settings/#field-value-priority.
To be honest I’m lost, my only real aim is to have a different test configuration (in the same way Flask works: https://flask.palletsprojects.com/en/1.1.x/tutorial/tests/#setup-and-fixtures). Am I approaching the problem the wrong way?
Answers:
It’s really tricky to mock the environment with pydantic involved.
I only achieved the desired behaviour with dependency injection in FastAPI and a get_settings function, which itself seems to be good practice, since even the documentation recommends it.
Suppose you have:

...

class Settings(BaseSettings):
    ENVIRONMENT: str

    class Config:
        env_file = ".env"
        case_sensitive = True

def get_settings() -> Settings:
    return Settings()

databases = {
    "dev": "Development",
    "test": "Testing"
}

# get_settings must be resolved through Depends; if you call it at
# module level, app.dependency_overrides has nothing to replace.
@app.get("/")
def read_root(settings: Settings = Depends(get_settings)):
    return {"Environment": databases[settings.ENVIRONMENT]}
And in your tests you would write:

import pytest

from main import Settings, app, get_settings

def get_settings_override() -> Settings:
    return Settings(ENVIRONMENT="dev")

@pytest.fixture(autouse=True)
def override_settings() -> None:
    app.dependency_overrides[get_settings] = get_settings_override
You can use scope="session" if you’d like.
This overrides only the ENVIRONMENT variable and doesn’t touch the rest of the configuration variables.
Pydantic Settings objects are mutable, so you can simply override them in your test.py:

from main import settings

settings.ENVIRONMENT = 'test'
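One caveat with mutating a shared singleton: the change leaks into every other test that imports the same object. A hedged sketch of the save-and-restore pattern (a plain Python stand-in here; a pydantic v1 BaseSettings accepts attribute assignment the same way by default):

```python
# Stand-in for the module-level settings singleton from main.py.
class Settings:
    ENVIRONMENT = "dev"

settings = Settings()

# Override for the duration of a test, then restore so the change
# doesn't leak into tests expecting the original configuration.
original = settings.ENVIRONMENT
settings.ENVIRONMENT = "test"
assert settings.ENVIRONMENT == "test"   # test body runs here
settings.ENVIRONMENT = original
print(settings.ENVIRONMENT)             # back to: dev
```

In a real suite you would typically wrap this in a fixture so the restore happens even when a test fails.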
This is a simple way that works for me. Consider that you have a configuration file named APPNAME.cfg with the following settings:

DEV_DSN='DSN=my_dev_dsn; UID=my_dev_user_id; PWD=my_dev_password'
PROD_DSN='DSN=my_prod_dsn; UID=my_prod_user_id; PWD=my_prod_password'

Set the environment variable according to your OS or Docker setup. On Linux you could enter:

export MY_ENVIRONMENT=DEV
Now consider the following settings.py:

from pydantic import BaseSettings
import os

class Settings(BaseSettings):
    DSN: str

    class Config:
        env_prefix = f"{os.environ['MY_ENVIRONMENT']}_"
        env_file = "APPNAME.cfg"
Your app would simply need to do the following:

import pyodbc

from settings import Settings

s = Settings()
db = pyodbc.connect(s.DSN)
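The prefix trick can be checked without a database at all. The sketch below (plain os.environ, with hypothetical values standing in for APPNAME.cfg) shows the resolution env_prefix performs inside BaseSettings: the field name DSN is looked up as prefix + "DSN", so MY_ENVIRONMENT selects which setting is read:

```python
import os

# Hypothetical values standing in for APPNAME.cfg / exported variables.
os.environ["MY_ENVIRONMENT"] = "DEV"
os.environ["DEV_DSN"] = "DSN=my_dev_dsn; UID=my_dev_user_id"
os.environ["PROD_DSN"] = "DSN=my_prod_dsn; UID=my_prod_user_id"

# What env_prefix does: field "DSN" is resolved as f"{prefix}DSN".
prefix = f"{os.environ['MY_ENVIRONMENT']}_"
dsn = os.environ[f"{prefix}DSN"]
print(dsn)  # DSN=my_dev_dsn; UID=my_dev_user_id
```

Switching MY_ENVIRONMENT to PROD (or to a TEST prefix for a test suite) makes the same field resolve to the other DSN.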
Bumping an old thread because I found a solution that was a bit cleaner for my use case. I was having trouble getting test-specific dotenv files to load only while tests were running, when I also had a local development dotenv in the project dir.
You can do something like the below, where test.environment is a special dotenv file that is NOT an env_file path in the settings class Config. Because environment variables take priority over dotenv files for BaseSettings, this will override any settings from a local .env, as long as it is run in conftest.py before your settings class is imported. It also guarantees that your test environment is only active while tests are being run.
# conftest.py
from dotenv import load_dotenv

load_dotenv("tests/fixtures/test.environment", override=True)

from app import settings  # singleton instance of the BaseSettings class
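For the running example from the question, the tests/fixtures/test.environment file would simply contain the test values, e.g.:

```
ENVIRONMENT=test
```

Since load_dotenv(..., override=True) exports these as real environment variables before main/app is imported, they outrank the regular .env file when Settings() is constructed.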