125

I can't find the problem:

@app.route('/register', methods=['GET', 'POST'])
def register():
    form = RegisterForm(request.form)
    if request.method=='POST' and form.validate():
        name =  form.name.data 
        email = form.email.data
        username = form.username.data
        password = sha256_crypt.encrypt(str(form.password.data))

        c.execute("INSERT INTO users(name,email,username,password) 
        VALUES(?,?,?,?)", (name, email, username, password))

        conn.commit

        conn.close()

Error:

File "C:\Users\app.py", line 59, in register c.execute("INSERT INTO users(name,email,username,password) VALUES(?,?,?,?)", (name, email, username, password)) ProgrammingError: SQLite objects created in a thread can only be used in that same thread.The object was created in thread id 23508 and this is thread id 22640

Does this mean I can't use the name, email, username & password in an HTML file? How do I solve this?

Tania
  • 2
    No, they're clearly python objects defined right above it. The error message talks about the connection and the cursor. – ndrix Jan 12 '18 at 01:18

11 Answers

169

Where you make your connection to the database, add the following:

conn = sqlite3.connect('your.db', check_same_thread=False)
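
If several request threads end up sharing that connection, the sqlite3 docs (quoted in the comments below) still expect you to serialize write operations yourself. A minimal sketch of doing that with a threading.Lock; the helper name and database file are illustrative, not part of the original answer:

import sqlite3
import threading

# One shared connection, usable from any thread because check_same_thread=False.
conn = sqlite3.connect('your.db', check_same_thread=False)
db_lock = threading.Lock()

def insert_user(name, email, username, password):
    # Serialize writes: only one request thread at a time touches the shared
    # connection. Assumes the users table from the question already exists.
    with db_lock:
        conn.execute(
            "INSERT INTO users(name, email, username, password) VALUES (?, ?, ?, ?)",
            (name, email, username, password),
        )
        conn.commit()
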
cmrussell
  • 64
    is this safe to use? – softmarshmallow Apr 04 '18 at 10:35
  • 4
    @uzu, I don't see why not, as long as you do your own synchronization to ensure only one thread uses the object at the same time. – merlin2011 Jun 21 '18 at 03:39
  • 29
    Some additional info for future readers of this thread. Per https://docs.python.org/3/library/sqlite3.html: By default, check_same_thread is True and only the creating thread may use the connection. If set False, the returned connection may be shared across multiple threads. When using multiple threads with the same connection writing operations should be serialized by the user to avoid data corruption. – Snidhi Sofpro Jun 26 '18 at 10:33
  • 5
    @merlin2011 could you expand on that? What are things not to do? I'm not sure I completely understand – jokoon Jun 07 '20 at 12:13
  • 4
    As long as only a single thread is writing through the connection at a given time, this is safe to use. – Ramesh-X Sep 13 '20 at 16:46
  • 1
    @SnidhiSofpro So, I take it that it is safe to use the connection from multiple threads without any locking as long as all threads are only reading, not writing? – mxmlnkn Jul 05 '21 at 21:52
  • 1
    @softmarshmallow from SQLModel docs https://sqlmodel.tiangolo.com/tutorial/fastapi/simple-hero-api/#sqlmodel-code-models-engine , it is safe to use check_same_thread = False when the same session object is not shared in more than one request. Some applications set it to True to prevent misuse. – mcgusty May 16 '22 at 10:46
68

Your cursor 'c' was not created in the same thread as the request handler; it was probably initialized when the Flask app was started.

You probably want to generate the SQLite objects (the connection and the cursor) in the same method, such as:

import sqlite3 as sql

@app.route('/')
def dostuff():
    with sql.connect("database.db") as con:
        name = "bob"
        cur = con.cursor()
        # the parameters must be a sequence, hence the trailing comma
        cur.execute("INSERT INTO students (name) VALUES (?)", (name,))
        con.commit()
        msg = "Done"
    return msg
ndrix
    I ran into this. I'm going through a udacity full stack dev course that appears to have some python2 stuff. So it's actually turning out to be a good learning opportunity. Either way worked for me. I had the following at the top of the app.py file and then I literally copied and pasted into that function (path) and voila problem solved. ```python DBSession = sessionmaker(bind=engine) session = DBSession() ``` – Joseph Drane Sep 20 '18 at 05:04
  • 6
    Does one also need a `con.close()` or does the `with` take care of this? – Cleb Jan 27 '19 at 20:04
  • 3
    but isn't connecting to the db all over again on each request unnecessarily expensive? – Marcos Pereira Jan 25 '23 at 23:07
51

from sqlalchemy import create_engine

engine = create_engine(
    'sqlite:///restaurantmenu.db',
    connect_args={'check_same_thread': False}
)

Works for me

J J
8

You can try this:

engine = create_engine('sqlite:///data.db', echo=True, connect_args={"check_same_thread": False})

It worked for me

Asgar
7

In my case, I had the same issue with two Python files creating a SQLite engine and therefore possibly operating on different threads. Reading the SQLAlchemy docs, it seems it is better to use the singleton technique in both files:

# maintain the same connection per thread
from sqlalchemy import create_engine
from sqlalchemy.pool import SingletonThreadPool

engine = create_engine('sqlite:///mydb.db',
                       poolclass=SingletonThreadPool)

It does not solve all cases, meaning I occasionally get the same error, but I can easily overcome it by refreshing the browser page. Since I'm only using this to debug my code, this is OK for me. For a more permanent solution, you should probably choose another database, like PostgreSQL.

ng10
5

As mentioned in https://docs.python.org/3/library/sqlite3.html and pointed out by @Snidhi Sofpro in a comment:

By default, check_same_thread is True and only the creating thread may use the connection. If set False, the returned connection may be shared across multiple threads. When using multiple threads with the same connection writing operations should be serialized by the user to avoid data corruption.

One way to achieve serialization:

import threading
import sqlite3
import queue
import traceback
import time
import random

work_queue = queue.Queue()

def sqlite_worker():
    con = sqlite3.connect(':memory:', check_same_thread=False)
    cur = con.cursor()
    cur.execute('''
        CREATE TABLE IF NOT EXISTS test (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            text TEXT,
            source INTEGER,
            seq INTEGER
        )
    ''')
    while True:
        try:
            (sql, params), result_queue = work_queue.get()
            res = cur.execute(sql, params)
            con.commit()
            result_queue.put(res)
        except Exception as e:
            traceback.print_exc()

threading.Thread(target=sqlite_worker, daemon=True).start()

def execute_in_worker(sql, params):
    # you might not really need the results if you only use this
    # for writing unless you use something like https://www.sqlite.org/lang_returning.html
    result_queue = queue.Queue()
    work_queue.put(((sql, params), result_queue))
    return result_queue.get(timeout=5)

def insert_test_data(seq):
    time.sleep(random.randint(0, 100) / 100)
    execute_in_worker(
        'INSERT INTO test (text, source, seq) VALUES (?, ?, ?)',
        ['foo', threading.get_ident(), seq]
    )

threads = []
for i in range(10):
    thread = threading.Thread(target=insert_test_data, args=(i,))
    threads.append(thread)
    thread.start()

for thread in threads:
    thread.join()

for res in execute_in_worker('SELECT * FROM test', []):
    print(res)

# (1, 'foo', 139949462500928, 9)
# (2, 'foo', 139949496071744, 5)
# (3, 'foo', 139949479286336, 7)
# (4, 'foo', 139949487679040, 6)
# (5, 'foo', 139949854099008, 3)
# (6, 'foo', 139949470893632, 8)
# (7, 'foo', 139949862491712, 2)
# (8, 'foo', 139949845706304, 4)
# (9, 'foo', 139949879277120, 0)
# (10, 'foo', 139949870884416, 1)

As you can see, the data is inserted out of order but it's still all handled one by one in a while loop.

siikamiika
3

I had the same problem and I fixed it by closing my connection after every call:

results = session.query(something, something).all()
session.close()
Azametzin
2

Are these true?

  • The app in @app.route is a Flask app
  • You get this error while running a local development server
  • You run the server by calling app.run()

If so, the issue is that the Flask development server is multithreaded by default. You can fix this by calling run with threaded=False:

app.run(threaded=False)
  • You don't want to work with Flask in single-threaded mode, as it creates a bottleneck during the connect phase: a new connection will have to wait until the previous one is released. – Andrey Jun 01 '23 at 17:32
1

The error doesn't lie in the variables passed to your .execute(), but rather in the object instances that SQLite uses to access the DB.
I assume that you have:

conn = sqlite3.connect('your_database.db')
c = conn.cursor()

somewhere at the top of the Flask script, & this would be initialized when you first run the script.
When the register function is called, a new thread, different from the one that ran the initial script, handles the process. Thus, in this new thread, you're using object instances that were created in a different thread, which SQLite flags as an error: rightfully so, because this may lead to data corruption if your DB is accessed by different threads during the app run.
So, as a different method, instead of disabling SQLite's check-same-thread functionality, you could try initializing your DB connection & cursor within the HTTP methods that are being called (see the sketch below).
With this, the SQLite objects & their use will stay on the same thread at runtime.

The code would be redundant, but it might save you in situations where the data is being accessed asynchronously, & it will also help prevent data corruption.
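
A minimal sketch of that per-request approach (illustrative only: the database filename is a placeholder, and the question's form validation and password hashing are left out for brevity):

import sqlite3
from flask import Flask, request

app = Flask(__name__)

@app.route('/register', methods=['GET', 'POST'])
def register():
    if request.method == 'POST':
        # Connection and cursor are created inside the request handler,
        # so every SQLite object lives on the thread that uses it.
        conn = sqlite3.connect('your_database.db')  # assumes the users table exists
        c = conn.cursor()
        c.execute(
            "INSERT INTO users(name, email, username, password) VALUES (?, ?, ?, ?)",
            (request.form['name'], request.form['email'],
             request.form['username'], request.form['password']),
        )
        conn.commit()
        conn.close()
        return "Registered"
    return "Send a POST request to register"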

0

I was having this problem and I just used the answer in this post, which I will repost here:

import sqlite3
import sqlalchemy

creator = lambda: sqlite3.connect('file::memory:?cache=shared', uri=True)
engine = sqlalchemy.create_engine('sqlite://', creator=creator)
engine.connect()

This bypasses the problem that you can't pass the string "file::memory:?cache=shared" as a URL to SQLAlchemy. I've seen a lot of answers, but this solved all my problems of using a SQLite in-memory database that is shared among multiple threads. I initialize the database by creating two tables with two threads for speed. Before this, the only way I could do this was with a file-backed DB. However, that was giving me latency issues in a cloud deployment.

Fabio
-3

Create "database.py":

import sqlite3

def dbcon():
    return sqlite3.connect("your.db")

Then import:

from database import dbcon

db = dbcon()

db.execute("INSERT INTO users(name,email,username,password) 
        VALUES(?,?,?,?)", (name, email, username, password))

You probably won't need to close it because the thread will be killed right away.