Objects created in a thread can only be used in that same thread
Question:
I can’t find the problem:
@app.route('/register', methods=['GET', 'POST'])
def register():
    form = RegisterForm(request.form)
    if request.method == 'POST' and form.validate():
        name = form.name.data
        email = form.email.data
        username = form.username.data
        password = sha256_crypt.encrypt(str(form.password.data))
        c.execute("INSERT INTO users(name,email,username,password) VALUES(?,?,?,?)",
                  (name, email, username, password))
        conn.commit()
        conn.close()
Error:
File "C:\Users\app.py", line 59, in register
    c.execute("INSERT INTO users(name,email,username,password) VALUES(?,?,?,?)",
              (name, email, username, password))
ProgrammingError: SQLite objects created in a thread can only be used in that
same thread. The object was created in thread id 23508 and this is thread id 22640
Does this mean I can’t use the name, email, username & password in an HTML file? How do I solve this?
Answers:
Your cursor ‘c’ was not created in the same thread; it was probably initialized when the Flask app was run.
You probably want to create the SQLite objects (the connection and the cursor) in the same method, such as:
@app.route('/')
def dostuff():
    with sql.connect("database.db") as con:
        name = "bob"
        cur = con.cursor()
        cur.execute("INSERT INTO students (name) VALUES (?)", (name,))
        con.commit()
        msg = "Done"
Where you make your connection to the database, add the following:
conn = sqlite3.connect('your.db', check_same_thread=False)
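Keep in mind that with check_same_thread=False, sqlite3 no longer protects you from concurrent writes, so you have to serialize them yourself. A minimal sketch of one way to do that with a lock (the in-memory database and users table here are made up for illustration):

```python
import sqlite3
import threading

# check_same_thread=False lets any thread use the connection,
# but we must serialize writes ourselves with a lock.
conn = sqlite3.connect(':memory:', check_same_thread=False)
conn.execute('CREATE TABLE users (name TEXT)')
db_lock = threading.Lock()

def insert_user(name):
    # Only one thread writes at a time.
    with db_lock:
        conn.execute('INSERT INTO users (name) VALUES (?)', (name,))
        conn.commit()

threads = [threading.Thread(target=insert_user, args=('user%d' % i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(conn.execute('SELECT COUNT(*) FROM users').fetchone()[0])  # 5
```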
In my case, I had the same issue with two Python files creating a SQLite engine and therefore possibly operating on different threads. Reading the SQLAlchemy documentation, it seems better to use the singleton technique in both files:
# maintain the same connection per thread
from sqlalchemy import create_engine
from sqlalchemy.pool import SingletonThreadPool

engine = create_engine('sqlite:///mydb.db', poolclass=SingletonThreadPool)
It does not solve all cases: I occasionally get the same error, but I can easily overcome it by refreshing the browser page. Since I’m only using this to debug my code, this is OK for me. For a more permanent solution, you should probably choose another database, like PostgreSQL.
engine = create_engine(
    'sqlite:///restaurantmenu.db',
    connect_args={'check_same_thread': False}
)
Works for me
You can try this:
engine = create_engine('sqlite:///data.db', echo=True, connect_args={"check_same_thread": False})
It worked for me
I had the same problem and I fixed it by closing my connection after every call:
results = session.query(something, something).all()
session.close()
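If you want that close-after-every-call pattern without repeating yourself, you can wrap it in a helper. A sketch of the same idea with plain sqlite3 and contextlib.closing (the fetch_all helper is just an illustration, not part of any library):

```python
import sqlite3
from contextlib import closing

def fetch_all(db_path, sql, params=()):
    # Open a fresh connection per call and guarantee it is closed,
    # even if the query raises; this mirrors "close after every call".
    with closing(sqlite3.connect(db_path)) as con:
        with closing(con.cursor()) as cur:
            cur.execute(sql, params)
            return cur.fetchall()

rows = fetch_all(':memory:', 'SELECT 1')
print(rows)  # [(1,)]
```

Because each call opens its own connection on the calling thread, the same-thread check never fires.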
As mentioned in https://docs.python.org/3/library/sqlite3.html and pointed out by @Snidhi Sofpro in a comment
By default, check_same_thread is True and only the creating thread may use the connection. If set False, the returned connection may be shared across multiple threads. When using multiple threads with the same connection writing operations should be serialized by the user to avoid data corruption.
One way to achieve serialization:
import threading
import sqlite3
import queue
import traceback
import time
import random

work_queue = queue.Queue()

def sqlite_worker():
    con = sqlite3.connect(':memory:', check_same_thread=False)
    cur = con.cursor()
    cur.execute('''
        CREATE TABLE IF NOT EXISTS test (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            text TEXT,
            source INTEGER,
            seq INTEGER
        )
    ''')
    while True:
        try:
            (sql, params), result_queue = work_queue.get()
            res = cur.execute(sql, params)
            con.commit()
            result_queue.put(res)
        except Exception:
            traceback.print_exc()

threading.Thread(target=sqlite_worker, daemon=True).start()

def execute_in_worker(sql, params):
    # you might not really need the results if you only use this
    # for writing unless you use something like https://www.sqlite.org/lang_returning.html
    result_queue = queue.Queue()
    work_queue.put(((sql, params), result_queue))
    return result_queue.get(timeout=5)

def insert_test_data(seq):
    time.sleep(random.randint(0, 100) / 100)
    execute_in_worker(
        'INSERT INTO test (text, source, seq) VALUES (?, ?, ?)',
        ['foo', threading.get_ident(), seq]
    )

threads = []
for i in range(10):
    thread = threading.Thread(target=insert_test_data, args=(i,))
    threads.append(thread)
    thread.start()
for thread in threads:
    thread.join()

for res in execute_in_worker('SELECT * FROM test', []):
    print(res)
# (1, 'foo', 139949462500928, 9)
# (2, 'foo', 139949496071744, 5)
# (3, 'foo', 139949479286336, 7)
# (4, 'foo', 139949487679040, 6)
# (5, 'foo', 139949854099008, 3)
# (6, 'foo', 139949470893632, 8)
# (7, 'foo', 139949862491712, 2)
# (8, 'foo', 139949845706304, 4)
# (9, 'foo', 139949879277120, 0)
# (10, 'foo', 139949870884416, 1)
As you can see, the data is inserted out of order, but it’s still all handled one at a time by the worker’s while loop.
The error doesn’t lie with the variables passed to your .execute() call, but rather with the object instances that SQLite uses to access the DB.
I assume that you have:
conn = sqlite3.connect('your_database.db')
c = conn.cursor()
somewhere at the top of the Flask script, and this would be initialized when you first run the script.
When the register function is called, a new thread, different from the initial script run, handles the process. Thus, in this new thread, you’re using object instances that belong to a different thread, which SQLite reports as an error: rightfully so, because this may lead to data corruption if you anticipate your DB being accessed by different threads during the app run.
So, as a different method, instead of disabling SQLite’s check-same-thread functionality, you could try initializing your DB connection and cursor within the HTTP method handlers being called.
With this, the SQLite objects and their use will be on the same thread at runtime.
The code would be redundant, but it might save you in situations where the data is being accessed asynchronously, and it will also prevent data corruption.
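A minimal sketch of that per-call pattern (the register_user helper and the in-memory path are illustrative; the point is that connect() and the cursor live entirely inside the function that runs on the request thread):

```python
import sqlite3

DB_PATH = ':memory:'  # stand-in for your real database file

def register_user(db_path, name, email, username, password):
    # Connection and cursor are created in the calling thread,
    # so the same-thread check is always satisfied.
    con = sqlite3.connect(db_path)
    try:
        con.execute('''CREATE TABLE IF NOT EXISTS users
                       (name TEXT, email TEXT, username TEXT, password TEXT)''')
        con.execute('INSERT INTO users (name, email, username, password) VALUES (?, ?, ?, ?)',
                    (name, email, username, password))
        con.commit()
        return con.execute('SELECT COUNT(*) FROM users').fetchone()[0]
    finally:
        con.close()

print(register_user(DB_PATH, 'bob', 'bob@example.com', 'bob', 'hash'))  # 1
```

In a Flask view you would call such a helper (or inline its body) inside the route function itself, rather than sharing a module-level connection.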
Create "database.py":
import sqlite3
def dbcon():
    return sqlite3.connect("your.db")
Then import:
from database import dbcon
db = dbcon()
db.execute("INSERT INTO users(name,email,username,password) VALUES(?,?,?,?)",
           (name, email, username, password))
db.commit()
You probably won’t need to close it because the thread will be killed right away.
I was having this problem and I just used the answer in this post, which I will repost here:
import sqlite3
import sqlalchemy

creator = lambda: sqlite3.connect('file::memory:?cache=shared', uri=True)
engine = sqlalchemy.create_engine('sqlite://', creator=creator)
engine.connect()
This bypasses the problem that you can’t pass the string "file::memory:?cache=shared" as a URL to SQLAlchemy. I’ve seen a lot of answers, but this solved all my problems with using a SQLite in-memory database shared among multiple threads. I initialize the database by creating two tables with two threads for speed. Before this, the only way I could do this was with a file-backed DB, but that was giving me latency issues in a cloud deployment.
Are these true?
- The app in @app.route is a Flask app
- You get this error while running a local development server
- You run the server by calling app.run()
If so, the issue is that the Flask development server is multithreaded by default. You can fix this by calling run with threaded=False:
app.run(threaded=False)