7

I watched an excellent presentation on the GIL and how, when running in the interpreter, only one thread can run at a time. It also seemed that Python is not very intelligent about switching between threads.

If I am threading some operation that runs only in the interpreter and is not particularly CPU heavy, and I use a thread lock so that only one thread at a time can run this relatively short interpreter-bound operation, will that lock actually make anything run slower, compared to the case where the lock isn't necessary and all threads could run concurrently?

If all but one thread is blocked waiting on a lock, will the Python interpreter know not to context switch?

Edit: by 'making things run slower' I mean that if Python is context switching to a bunch of locked threads, that might be a performance hit even though those threads don't actually run.
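
For concreteness, this is roughly the kind of setup I mean (the names and numbers here are just made up for illustration):

    import threading

    lock = threading.Lock()
    shared = []   # some shared state touched by a short, pure-Python operation

    def worker():
        for _ in range(10000):
            with lock:                      # only one thread at a time in here
                shared.append(len(shared))  # short, interpreter-bound, not CPU heavy

    threads = [threading.Thread(target=worker) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()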

e wagness

2 Answers

12

Larry Hastings (a CPython core developer) has a great talk covering this subject, called "Python's Infamous GIL". If you skip to around 11:40, he answers your question.

From the talk: the way Python threads work with the GIL is with a simple counter. After every 100 bytecodes executed, the currently running thread is supposed to release the GIL to give other threads a chance to run. This behavior is essentially broken in Python 2.7 because of the thread release/acquire mechanism; it has been fixed in Python 3.
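
For reference, you can inspect and tune this from sys; a quick sketch (Python 3 shown, the old Python 2 counter noted in the comments):

    import sys

    # Python 3.2+ uses a time-based "switch interval" instead of a bytecode counter.
    print(sys.getswitchinterval())   # 0.005 seconds by default
    sys.setswitchinterval(0.01)      # longer slices, fewer forced GIL handoffs

    # Python 2 exposed the 100-bytecode counter as sys.getcheckinterval() /
    # sys.setcheckinterval(); those functions were deprecated in 3.2 and
    # removed in 3.9.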

When you use a thread lock, Python will only execute the threads that are not waiting on the lock. So if several threads share one lock, only one of them will execute at a time. Python will not start executing a locked-out thread until it can acquire the lock. Locks exist so that you can have shared state between threads without introducing bugs.
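
A minimal sketch of that shared-state case (the names here are made up):

    import threading

    counter = 0
    lock = threading.Lock()

    def safe_increment(n):
        global counter
        for _ in range(n):
            with lock:         # only one thread at a time between acquire/release
                counter += 1   # read-modify-write is not atomic, so guard it

    threads = [threading.Thread(target=safe_increment, args=(100000,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter)  # 400000 every time; without the lock, updates can be lost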

If you have several threads and only one can run at a time because of a lock, then in theory your program will take longer to execute. In practice you should benchmark, because the results may surprise you.
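
A rough way to benchmark it yourself (the workload and numbers are arbitrary placeholders):

    import threading
    import time

    def run(workers, use_lock):
        lock = threading.Lock()

        def work():
            for _ in range(200000):
                if use_lock:
                    with lock:
                        sum(range(10))   # stand-in for a short interpreter-bound operation
                else:
                    sum(range(10))

        threads = [threading.Thread(target=work) for _ in range(workers)]
        start = time.perf_counter()
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return time.perf_counter() - start

    print("without lock:", run(4, False))
    print("with lock:   ", run(4, True))

On CPython you may well find the two timings are close, because the GIL already serializes this kind of pure-Python work; the lock mostly adds a small acquire/release overhead.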

tipsqueal
5

python is not very intelligent about switching between threads

Python threads work a certain way :-)

if I use a thread lock where only 1 thread can run at a time... will that lock actually make anything run slower

Err, no, because there is nothing else runnable, so nothing else could run slower.

If all but 1 threads are locked, will the python interpreter know not to context switch?

Yes. The kernel knows which threads are runnable. If no other threads can run, then logically speaking (as far as the thread is concerned) the Python interpreter won't context switch away from the only runnable thread. The thread doesn't know when it has been switched away from (how could it? it isn't running).
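
A tiny illustration of that (just a sketch): a thread blocked in acquire() is parked by the OS until the lock is released; the interpreter doesn't keep switching into it to check.

    import threading
    import time

    lock = threading.Lock()
    lock.acquire()                      # main thread holds the lock

    def blocked_worker():
        print("worker: waiting for the lock")
        with lock:                      # parked here; the OS does not schedule it
            print("worker: got the lock")

    t = threading.Thread(target=blocked_worker)
    t.start()
    time.sleep(1)                       # worker is blocked, consuming essentially no CPU
    print("main: releasing the lock")
    lock.release()
    t.join()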

  • So the Python interpreter will never context switch to a locked thread? I guess that makes sense, because otherwise locking would hurt performance any time you used it. I think I got a bit confused by how Python 2.6 handles signals and threading, specifically when you hit Ctrl-C while the main program is blocked on a thread join, which made me think Python does unintelligent context switching. Anyways, thanks for clearing it up. – e wagness Oct 26 '15 at 19:52
  • @ewagness (I'm a bit late to the party) and future dwellers: I suggest a fairly standard text on threads: Chapter 2 of Modern Operating Systems by Andrew S. Tanenbaum. It covers all the basics (including what you're asking about) and some real-world scenarios. Python threads are implemented in user space, of course, but many of the same problems (and solutions) apply. – rrrrrrrrrrrrrrrr Apr 12 '21 at 20:17