
I have a function that encrypts a number and stores it in a list:

encrypted = [[0]*10]*1000

def encrypt(i):
    encrypted[i] = bin(i)[2:].zfill(10).decode('hex')

The expression is much more complex than this. I am just stating an example.

Now I want to call the encrypt function inside a for loop, with the calls running in different processes or threads. However, because of the GIL, threads won't help for a CPU-bound task (correct me if I'm wrong).

for i in xrange(1000):
    encrypt(i)

The loop should not wait for the encryption of one value to finish before starting the next.

So when i=1 and 1 is being encrypted, the loop should move on and start encrypting 2, and then 3, simultaneously.

The results should be stored in the encrypted list (the order of the results is not important).

Abhinav
  • http://stackoverflow.com/questions/15143837/how-to-multi-thread-an-operation-within-a-loop-in-python – nitimalh Aug 02 '15 at 20:40
  • I think this example should help you batch up your values and have an optimal number of threads working in parallel – nitimalh Aug 02 '15 at 20:41

2 Answers


Alright, first some advice: depending on the number of threads you need to run, you should check out PyPy; this sounds like the kind of project that could benefit heavily from PyPy's features.

Here is an edited example from the Queue docs. If I understand what you need, then this should point you in the right direction.

This code assumes that you have a list of numbers to encrypt, and that your encrypt function handles adding the results to a list or storing them somehow.

from Queue import Queue     # "queue" in Python 3
from threading import Thread

num_worker_threads = 4

def worker():
    while True:
        number = q.get()
        encrypt(number)
        q.task_done()

q = Queue()
for i in range(num_worker_threads):
    t = Thread(target=worker)
    t.daemon = True
    t.start()

for number in numbers:
    q.put(number)

q.join()       # block until all tasks are done
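To see the pattern end to end, here is a self-contained sketch of the same worker/queue structure, using a trivial stand-in for encrypt (the asker's real expression is more complex) and Python 3 module names (`queue` instead of `Queue`, `range` instead of `xrange`). Each worker writes to a distinct index of a pre-sized list, so no extra locking is needed for the results.

```python
from queue import Queue
from threading import Thread

encrypted = [None] * 10          # pre-sized results list

def encrypt(i):
    # stand-in for the real, more complex expression
    encrypted[i] = bin(i)[2:].zfill(10)

def worker():
    while True:
        number = q.get()
        encrypt(number)
        q.task_done()

q = Queue()
num_worker_threads = 4
for _ in range(num_worker_threads):
    t = Thread(target=worker)
    t.daemon = True              # workers die with the main thread
    t.start()

for number in range(10):
    q.put(number)

q.join()                         # block until all tasks are done
print(encrypted)
```

`q.join()` returns once every queued item has had `task_done()` called for it, so by the time the print runs, every slot of `encrypted` has been filled.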
Fyrn
  • Thanks @fern - though I read somewhere that for CPU bound processes, due to Global interpreter lock , running multiple threads will be of no use. Is there any other way to achieve the same task using multi-processing – Abhinav Aug 03 '15 at 14:21
  • I don't have enough rep to comment on the selected answer but if Pool's processes is set to None it will use the value returned by the cpu_count(). – Fyrn Aug 05 '15 at 17:05
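The comment above can be checked directly: leaving `processes` at its default of `None` makes the pool size itself to the machine's core count. A minimal Python 3 sketch, using a trivial `square` function as a stand-in workload (in Python 3, `Pool` is also a context manager, which Python 2's is not):

```python
from multiprocessing import Pool, cpu_count

def square(n):
    return n * n

if __name__ == '__main__':
    # processes=None (the default) -> one worker per CPU core
    with Pool() as pool:
        results = pool.map(square, range(8))
    print(cpu_count(), results)
```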

You can use multiprocessing.Pool:

from multiprocessing import Pool

def encrypt(i):
    return bin(i)[2:].zfill(10).decode('hex')

if __name__ == '__main__':
    pool = Pool(processes=4)  # adjust to number of cores
    result = pool.map(encrypt, range(1000))
    print result
Daniel Hepper
  • is there any other way to call encrypt concurrently without using map? The result of encrypt goes into an if statement: if len(encrypt(i)) < 16: end all processes ... just stating an example; iterating over the results array for true or false would add unnecessary overhead. – Abhinav Aug 03 '15 at 15:40
  • got it.. i used imap_unordered instead of map Thanks ! – Abhinav Aug 03 '15 at 16:19
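For later readers, a sketch of the imap_unordered approach the asker settled on, with a hypothetical early-exit condition standing in for his length check: results are consumed as they finish (in no particular order), and the loop can break out and terminate the remaining workers without iterating over a full results list. Written for Python 3, so encrypt here drops the Python 2 `.decode('hex')` step.

```python
from multiprocessing import Pool

def encrypt(i):
    # stand-in for the real, more complex expression
    return bin(i)[2:].zfill(10)

if __name__ == '__main__':
    pool = Pool(processes=4)
    found = None
    # results arrive as workers finish, not in submission order
    for result in pool.imap_unordered(encrypt, range(1000)):
        if len(result) < 16:     # the asker's example condition
            found = result
            break                # stop consuming further results
    pool.terminate()             # end all worker processes
    pool.join()
    print(found)
```

`terminate()` stops the workers immediately; use `close()` plus `join()` instead if you want in-flight tasks to finish.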