FastAPI running in conjunction with Alembic, but autogenerate does not detect the models

Question:

I am relatively new to FastAPI, but I decided to set up a project with Postgres and Alembic. I managed to get the migrations to create a new version every time I run an autogenerate, but for some reason none of my models are picked up, so the migrations stay blank. I am kind of lost as to what is going wrong.

Main.py

from fastapi import FastAPI
import os
app = FastAPI()


@app.get("/")
async def root():
    return {"message": os.getenv("SQLALCHEMY_DATABASE_URL")}


@app.get("/hello/{name}")
async def say_hello(name: str):
    return {"message": f"Hello {name}"}

Database.py

from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
import os

SQLALCHEMY_DATABASE_URL = os.getenv("SQLALCHEMY_DATABASE_URL")

engine = create_engine("postgresql://postgres:mysuperpassword@localhost/rodney")
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

Base = declarative_base()

def get_db():
    db = SessionLocal()
    try:
        yield db
    except:
        db.close()

My only model so far

from sqlalchemy import Integer, String
from sqlalchemy.sql.schema import Column
from ..db.database import Base


class CounterParty(Base):
    __tablename__ = "Counterparty"

    id = Column(Integer, primary_key=True)
    Name = Column(String, nullable=False)

env.py (alembic)

from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
from app.db.database import Base
target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline():
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well.  By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online():
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(
            connection=connection, target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()

Now Alembic creates empty migrations when I run "alembic revision --autogenerate -m 'initial setup'".

My folder structure (screenshot omitted).

If anyone has any idea, I would be very grateful. Cheers!

Asked By: Rodney Wormsbecher


Answers:

In my case I deployed a Transformer BERT model on FastAPI, but FastAPI was not able to recognise my model, nor was it picking up the model's inputs and outputs.
The code I used for my case:

from fastapi import FastAPI
from pydantic import BaseModel

# Imports required by the code below (assumed: a saved Keras/TF model and a
# Hugging Face tokenizer, which is what the calls suggest).
import pickle

import numpy as np
import tensorflow as tf
from tensorflow.keras.models import load_model
from transformers import DistilBertTokenizerFast

class Entities(BaseModel):
    text: str

class EntitesOut(BaseModel):
    headings: str
    Probability: str
    Prediction: str

model_load = load_model('BERT_HATESPEECH')
tokenizer = DistilBertTokenizerFast.from_pretrained('BERT_HATESPEECH_TOKENIZER')
file_to_read = open("label_encoder_bert_hatespeech.pkl", "rb")
label_encoder = pickle.load(file_to_read)

app = FastAPI()

@app.post('/predict', response_model=EntitesOut)
def prep_data(text:Entities):
    text = text.text
    tokens = tokenizer(text, max_length=150, truncation=True, 
                       padding='max_length', 
                       add_special_tokens=True, 
                       return_tensors='tf')
    tokens = {'input_ids': tf.cast(tokens['input_ids'], tf.float64), 'attention_mask': tf.cast(tokens['attention_mask'], tf.float64)}
    headings = '''Non-offensive', 'identity_hate', 'neither', 'obscene','offensive', 'sexism'''
    probs = model_load.predict(tokens)[0]
    pred = label_encoder.inverse_transform([np.argmax(probs)])
    return {"headings":headings,
            "Probability":str(np.round(probs,3)),
            "Prediction":str(pred)}

The code above uses BaseModel from pydantic: the Entities class takes text: str as the input, and the EntitesOut class returns headings, Probability, and Prediction as the output.
After that, the model was recognised and the endpoint returned a 200 status code with the output.
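
For illustration, a minimal usage sketch (assuming the code above is saved as main.py, a hypothetical module name) showing how the two pydantic models shape the request and response:

# Minimal usage sketch: calling the /predict endpoint with FastAPI's TestClient.
from fastapi.testclient import TestClient

from main import app  # hypothetical module holding the app defined above

client = TestClient(app)

# The request body must match the Entities model (a single "text" field);
# the response body matches the EntitesOut model.
response = client.post("/predict", json={"text": "some example input"})
print(response.status_code)  # 200 when the model and tokenizer load correctly
print(response.json())       # {"headings": ..., "Probability": ..., "Prediction": ...}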

Answered By: Sarim Sikander

The env.py file does not find the models because you haven't imported them. One solution is to import them right away in your env.py file:

from ..models import *

However, you need to have an __init__.py file in your models directory and import all your models there.

Another way (however, not recommended): if you have only one model, you can import it directly as:

from ..models.counterPartyModel import CounterParty
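
For reference, a minimal sketch of what that can look like, assuming the model lives in app/models/counterPartyModel.py (an absolute import is usually more reliable in env.py than a relative one, since env.py is not normally run as part of a package):

# app/models/__init__.py (hypothetical layout)
# Importing the model module here registers its table on Base.metadata
# as a side effect of the class definition.
from app.models.counterPartyModel import CounterParty  # noqa: F401

# alembic/env.py (excerpt)
from app.db.database import Base
import app.models  # make sure every model is imported before autogenerate runs

target_metadata = Base.metadata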

Answered By: Fjoralb

That's a bit late for a response, but I just hit the same issue; maybe my answer will help someone in the future 🙂
In my case it was due to the database state: it was already consistent with the models, which means Alembic had nothing to generate (it did not see any diff). As I was working with SQLite, I simply deleted the SQLite file (dropping the tables should work as well) and ran the revision once again. This time it worked as expected; the upgrade and downgrade functions were filled with autogenerated code.
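
As a rough sketch of that reset step (assuming the Base and engine from the question's Database.py; this drops data, so use it only on a throwaway development database):

# reset_dev_db.py -- hypothetical helper for a development database only
from sqlalchemy import text

from app.db.database import Base, engine

# Drop every table known to the metadata so autogenerate sees a diff again.
Base.metadata.drop_all(bind=engine)

# Alembic also records applied revisions in the alembic_version table;
# clear it so the next upgrade starts from scratch.
with engine.begin() as conn:
    conn.execute(text("DROP TABLE IF EXISTS alembic_version"))

# Then re-run on the command line:
#   alembic revision --autogenerate -m "initial setup"
#   alembic upgrade head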

Answered By: Stasiukevich Ilya