
I am trying to add concurrency to some scripts. I have been able to do it for a script that works through a static item list, using a pool.map call as in the following code:

from concurrent.futures import ThreadPoolExecutor


def do_some_stuff(hop):

    print('Processing do_some_stuff function for hop', hop)


def task_test(hop):

    print("\n::[task_test] Device acting as seed: " + hop + " :")

    # processing each hop under some other stuff function
    do_some_stuff(hop)


def main():

    hoplist = ['1.1.1.1', '2.2.2.2', '3.3.3.3', '4.4.4.4']

    pool = ThreadPoolExecutor(max_workers=100)
    pool.map(task_test, hoplist)


if __name__ == "__main__":
    main()

Now I am trying to achieve the same multi-threaded behavior for another script that scans the network looking for connected neighbors (CDP and LLDP protocols).

The objective is to have the discovered devices processed (i.e. have other functions executed on them) concurrently, as soon as the list is updated (extended/appended):

from concurrent.futures import ThreadPoolExecutor
import time


# example of the function discovering the new hops
def discover_new_hops(hop, hoplist):

    hops_discovered = ['2.2.2.2', '3.3.3.3', '4.4.4.4']

    print('neighbors have been found for', hop, ':', hops_discovered)

    hoplist.remove(hop)  # update hoplist by removing the current hop

    return hops_discovered


def task_test(hoplist):

    while len(hoplist) > 0:

        seedlist = []  # stores the hops that have already been processed

        for hop in hoplist:

            print("\n::[task_test] Device acting as seed: " + hop + " :")

            # do some stuff and discover new hops

            seedlist.append(hop)  # track the hops already processed
            hoplist.extend(discover_new_hops(hop, hoplist))  # discover the neighbors connected to the current hop, if any
            hoplist = list(set(hoplist) - set(seedlist))  # remove duplicates and hops already processed

            time.sleep(1)


def main():

    hoplist = ['1.1.1.1']

    # calling the function directly, serially
    # task_test(hoplist)

    # trying to achieve multi-threading for newly discovered devices, without success
    with ThreadPoolExecutor(max_workers=100) as executor:
        executor.submit(task_test, hoplist)


if __name__ == "__main__":
    main()
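
One direction I have been looking at is to submit a new future for every hop a finished future returns, and keep waiting on whatever is still pending. This is only a rough sketch, not my real script: discover_new_hops is faked here with canned neighbor data, and the 100-worker pool size is just carried over from the test above.

from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait


def discover_new_hops(hop):

    # placeholder: the real function would query the CDP/LLDP neighbors of "hop"
    fake_neighbors = {'1.1.1.1': ['2.2.2.2', '3.3.3.3'], '3.3.3.3': ['4.4.4.4']}
    print('neighbors have been found for', hop, ':', fake_neighbors.get(hop, []))
    return fake_neighbors.get(hop, [])


def main():

    seen = {'1.1.1.1'}  # hops already submitted, to avoid processing twice

    with ThreadPoolExecutor(max_workers=100) as executor:
        pending = {executor.submit(discover_new_hops, '1.1.1.1')}

        while pending:
            # block until at least one discovery finishes, then submit its neighbors
            done, pending = wait(pending, return_when=FIRST_COMPLETED)
            for future in done:
                for neighbor in future.result():
                    if neighbor not in seen:
                        seen.add(neighbor)
                        pending.add(executor.submit(discover_new_hops, neighbor))

    print('hops processed:', sorted(seen))


if __name__ == '__main__':
    main()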

I have seen some approaches that add the discovered items to a Queue. Any other ideas or suggestions? Thank you!
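
For reference, this is roughly the shape I understand the Queue-based producer/consumer idea to take. Again only a sketch: discover_new_hops is a fake returning canned data, the worker/seen/hop_queue names are made up, and I use plain threading.Thread workers here instead of ThreadPoolExecutor to keep it short.

import queue
import threading
import time


def discover_new_hops(hop):

    # placeholder for the real CDP/LLDP neighbor lookup
    fake_neighbors = {'1.1.1.1': ['2.2.2.2', '3.3.3.3'], '2.2.2.2': ['4.4.4.4']}
    time.sleep(1)
    return fake_neighbors.get(hop, [])


def worker(hop_queue, seen, seen_lock):

    while True:
        hop = hop_queue.get()  # blocks until a hop is available
        try:
            print('Processing hop', hop)
            for neighbor in discover_new_hops(hop):
                with seen_lock:
                    if neighbor in seen:  # skip hops that were already queued
                        continue
                    seen.add(neighbor)
                hop_queue.put(neighbor)  # newly discovered hops feed the queue
        finally:
            hop_queue.task_done()  # lets hop_queue.join() track completion


def main():

    hop_queue = queue.Queue()
    seen = {'1.1.1.1'}
    seen_lock = threading.Lock()
    hop_queue.put('1.1.1.1')

    for _ in range(10):  # 10 concurrent discovery workers (arbitrary number)
        threading.Thread(target=worker, args=(hop_queue, seen, seen_lock), daemon=True).start()

    hop_queue.join()  # returns once every queued hop has been processed
    print('hops discovered:', sorted(seen))


if __name__ == '__main__':
    main()

I am not sure this is the cleanest shutdown (the daemon workers simply die when main() returns after join()), so pointers on the proper pattern would also be welcome.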

Fabiano Lima
  • Try it with a Queue and see how it works out. Are you reluctant to use a Queue? – wwii Dec 27 '19 at 18:03
  • @wwii, the static list (first code) works on both sequential and multi-thread methods. the discovered list (second code) works on a sequential manner and trying to achieve multi-thread ability. – Fabiano Lima Dec 27 '19 at 18:06
  • Sounds like a producer-consumer model. Related: https://stackoverflow.com/a/20794945/2823755 – wwii Dec 27 '19 at 18:07
  • others - https://stackoverflow.com/questions/16914665/how-to-use-queue-with-concurrent-future-threadpoolexecutor-in-python-3 … https://stackoverflow.com/a/57026514/2823755 .. and others with `python producer consumer` search terms – wwii Dec 27 '19 at 18:13
  • thank you @wwii! I have done some tests but didn't get it working yet; I'll keep trying. Thanks. – Fabiano Lima Dec 30 '19 at 13:17

0 Answers