How to handle user-uploaded image files in a FastAPI app?

Question:

I am writing an API for an AI model. The model reads images from a folder named input, processes the image or images, and writes the result to another folder named output in CSV format, which is then converted to JSON and returned to the user. This API is going to be used by a website, which will have many users in the future.

This is the code that I have written so far, and I know it is not good practice. For example, what if 100 users want to use this feature concurrently?

The AI model is a script in main.py file.

from fastapi import FastAPI, File, UploadFile
import csv



IMAGEDIR = "inputs/"
app = FastAPI()


@app.post("/upload_image/")
async def upload_image(img: UploadFile = File(...)):
    img.filename = "new.jpg"
    contents = await img.read()

    with open(f"{IMAGEDIR}{img.filename}", "wb") as f:
        f.write(contents)
    
    try:
        # Importing main runs the model script as a side effect.
        import main
        return csv_to_json('outputs/Results/Results_1.csv')
    except Exception:
        return {'error': 'server error'}


def csv_to_json(csvFilePath):
    jsonArray = []
    #read csv file
    with open(csvFilePath, encoding='utf-8') as csvf: 
        #load csv file data using csv library's dictionary reader
        csvReader = csv.DictReader(csvf) 

        #convert each csv row into python dict
        for row in csvReader: 
            #add this python dict to json array
            jsonArray.append(row)
    return jsonArray

What is the correct way to solve this problem?

Asked By: Nima


Answers:

You are using blocking operations inside an async function, which blocks the event loop (created by FastAPI) and prevents it from handling other requests while the file is being written. Using a library like aiofiles for the file writes might improve performance, although I'm not sure how much you would gain, since disk I/O may still be the bottleneck.
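As a minimal sketch of the two fixes implied above (unique filenames so concurrent uploads don't overwrite each other, and offloading blocking work off the event loop), here is one possible shape. The names `save_upload`, `run_model_blocking`, and `handle_upload` are hypothetical stand-ins for your endpoint logic; the FastAPI routing and `UploadFile` handling are omitted so the snippet stays self-contained:

```python
# Sketch only: save_upload / run_model_blocking / handle_upload are
# hypothetical names standing in for the endpoint body in the question.
import asyncio
import tempfile
import uuid
from pathlib import Path

INPUT_DIR = Path(tempfile.mkdtemp())  # stand-in for the "inputs/" folder


def save_upload(contents: bytes) -> Path:
    # A unique name per request, so 100 concurrent users never
    # overwrite each other's "new.jpg".
    path = INPUT_DIR / f"{uuid.uuid4().hex}.jpg"
    path.write_bytes(contents)
    return path


def run_model_blocking(image_path: Path) -> list:
    # Placeholder for the CPU-bound AI model; deliberately blocking.
    return [{"image": image_path.name, "score": "0.9"}]


async def handle_upload(contents: bytes) -> list:
    # asyncio.to_thread (Python 3.9+) runs the blocking calls in a
    # worker thread, keeping the event loop free for other requests.
    image_path = await asyncio.to_thread(save_upload, contents)
    return await asyncio.to_thread(run_model_blocking, image_path)
```

Inside a FastAPI endpoint you would call `await handle_upload(await img.read())`; Starlette's `run_in_threadpool` serves the same purpose as `asyncio.to_thread` here.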

Answered By: mehran si