Prevent another process from writing to the same file

Question:

I am using Streamlit to show some results from my Python app. In this app, I need to update some files every 30 minutes. During an update, if two or more users load the page, the files being updated can be corrupted because they may be overwritten concurrently. So, how can I lock the update process in Python so that corruption never happens?

Here is the part of the code that I use to do the update:

from datetime import datetime, timezone

def f_update():  # the surrounding update routine (name illustrative)
    config = f_getConfigs()
    str_ftime = "%Y-%m-%d %H:%M:%S%Z:%z"

    dt_lastup = datetime.now(timezone.utc)
    if (dt_lastup - datetime.strptime(config["last_update"], str_ftime)).total_seconds() < 30 * 60:
        return  # files are fresh enough; use the available ones
    else:
        ...  # fetch updated files from the API
Asked By: Mahdi Amrollahi


Answers:

I’m guessing you are really trying to ask "if I have two or more processes which run this code, how can I prevent them from trampling each other?"

One common approach is to use a lock directory:

import logging
import os

import requests

def fetch_newer(datadir, new_data_file):
    lockdir = os.path.join(datadir, "fetch_newer")
    try:
        # os.mkdir is atomic: exactly one process can create the directory
        os.mkdir(lockdir)
    except FileExistsError:
        # maybe check if the directory is older than, say, 20 minutes
        # and if so, just remove it? or raise an error?
        logging.info("fetch already in progress; not starting another")
        return
    try:
        r = requests.get("https://location.example/data/" + new_data_file)
        with open(os.path.join(datadir, new_data_file), "w") as dest:
            dest.write(r.text)
    finally:
        # release the lock even if the download or write fails
        os.rmdir(lockdir)
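
To flesh out the comment about stale locks, here is a minimal sketch of an age check that could run before the os.mkdir attempt. The helper name break_stale_lock and the 20-minute threshold are illustrative, not part of the original answer:

import logging
import os
import time

def break_stale_lock(lockdir, max_age_seconds=20 * 60):
    # Remove the lock directory if it looks abandoned (illustrative helper).
    try:
        age = time.time() - os.stat(lockdir).st_mtime
    except FileNotFoundError:
        return  # no lock held, nothing to do
    if age > max_age_seconds:
        logging.warning("removing stale lock %s (age %.0f s)", lockdir, age)
        try:
            os.rmdir(lockdir)
        except OSError:
            pass  # another process may have removed it first

Calling break_stale_lock(lockdir) at the top of fetch_newer would keep a crashed fetch from blocking updates forever. Note that the check is itself racy (two processes could both decide the lock is stale at the same moment); if that matters, an OS-level lock such as fcntl.flock on a dedicated lock file is a stronger alternative on Unix.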
Answered By: tripleee