I'm trying to set up a neural network that works on multiple cameras simultaneously (or nearly so). As a first step, I'm trying to stream from 2 cameras at once with OpenCV and Python's threading module.
I've come up with this code:
import cv2
import threading
import queue

def multistream(stream, q):
    ret, frame = stream.read()
    q.put(frame)

if __name__ == "__main__":
    camlink1 = "rtsp://......link1"
    camlink2 = "rtsp://......link2"
    stream1 = cv2.VideoCapture(camlink1)
    stream2 = cv2.VideoCapture(camlink2)
    print("stream is opened")

    while True:
        q = queue.Queue()
        cam1 = threading.Thread(target=multistream, args=(stream1, q))
        cam2 = threading.Thread(target=multistream, args=(stream2, q))
        cam1.start()
        cam2.start()
        cam1.join()
        cam2.join()
        while not q.empty():
            cv2.imshow("video", q.get())
The problem is that cv2.imshow shows an empty window instead of the frame. However, if I add print(q.get()) to the code, it prints the matrices the frame is made of, so the frame is correctly returned from the multistream function to the main thread. What's the correct fix for this?
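To rule out the threading/queue plumbing itself, I ran the same pattern with stdlib-only stand-ins for the captures (fake_reader below is a made-up helper that mimics stream.read(), not an OpenCV call), and the frames do arrive in the main thread as expected:

```python
import threading
import queue

def multistream(read_frame, q):
    # Same shape as the real worker: read one (ret, frame) pair
    # and push the frame onto the shared queue.
    ret, frame = read_frame()
    if ret:
        q.put(frame)

def fake_reader(name):
    # Hypothetical stand-in for cv2.VideoCapture.read():
    # always succeeds and returns a dummy "frame" string.
    def read():
        return True, f"frame-from-{name}"
    return read

q = queue.Queue()
cam1 = threading.Thread(target=multistream, args=(fake_reader("cam1"), q))
cam2 = threading.Thread(target=multistream, args=(fake_reader("cam2"), q))
cam1.start()
cam2.start()
cam1.join()
cam2.join()

frames = []
while not q.empty():
    frames.append(q.get())
print(sorted(frames))  # both dummy frames reached the main thread
```

So both threads deliver their frames; only the display step misbehaves.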