The simple answer, when asked how to use threads in Python, is: "Don't. Use processes instead." The multiprocessing module lets you create processes with syntax similar to creating threads, but I prefer its convenient Pool object.
We'll start with the code that David Beazley first used to show the dangers of running threads against the GIL, then rewrite it using multiprocessing.Pool:
from threading import Thread
import time

def countdown(n):
    while n > 0:
        n -= 1

COUNT = 10000000

t1 = Thread(target=countdown, args=(COUNT // 2,))
t2 = Thread(target=countdown, args=(COUNT // 2,))

start = time.time()
t1.start(); t2.start()
t1.join(); t2.join()
end = time.time()

print(end - start)
Re-written using multiprocessing.Pool:
import multiprocessing
import time

def countdown(n):
    while n > 0:
        n -= 1

COUNT = 10000000

if __name__ == '__main__':
    # The guard lets the worker processes import this module safely.
    start = time.time()
    with multiprocessing.Pool() as pool:
        pool.map(countdown, [COUNT // 2, COUNT // 2])
        pool.close()
        pool.join()
    end = time.time()
    print(end - start)
Instead of creating threads, this creates new processes. Since each process runs its own interpreter, there are no GIL collisions. By default, multiprocessing.Pool opens as many processes as there are cores on the machine, though in the example above it only needs two. In a real-world scenario, you want your list of work items to be at least as long as the number of processors on your machine. The Pool runs the function you give it once per argument, up to the number of processes it creates, and as soon as a process finishes one call, it picks up the next remaining argument from the list.
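To make that scheduling concrete, here is a minimal sketch. The pool of four processes and the list of eight work items are values I've chosen for illustration, not part of the original example; each worker picks up the next item from the list as soon as it finishes its current one:
import multiprocessing

def countdown(n):
    while n > 0:
        n -= 1

COUNT = 10000000

if __name__ == '__main__':
    # Eight work items for four processes: each process is handed a new
    # item from the list as soon as it finishes the previous one.
    work = [COUNT // 8] * 8
    with multiprocessing.Pool(processes=4) as pool:
        pool.map(countdown, work)
        pool.close()
        pool.join()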
I've found that, even when using the with statement, the worker processes can linger if you don't close and join the pool. To clean up resources, I always close and join my pools.
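If you skip the with statement entirely, the same cleanup can be written explicitly. This is only a sketch of that pattern; the try/finally wrapper is my own addition so the pool is closed and joined even if the work raises an exception:
import multiprocessing

def countdown(n):
    while n > 0:
        n -= 1

COUNT = 10000000

if __name__ == '__main__':
    pool = multiprocessing.Pool()
    try:
        pool.map(countdown, [COUNT // 2, COUNT // 2])
    finally:
        # close() stops the pool from accepting new work;
        # join() waits for the worker processes to exit.
        pool.close()
        pool.join()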