I've got client and server programs that work perfectly fine over localhost but become nearly useless when run over a real IP address. The server sends the video data and the client receives it flawlessly over localhost, but once it's put to a real-life test the client only gets about one usable frame every other second; the rest are heavily stretched vertically. Example of it at its worst: https://i.stack.imgur.com/WxoXg.jpg
Before the try/except was added to the client, it would play one frame of colorful vertical bars and then crash, complaining that the image it was given had no height or width.
cameraClient.py
import socket
import cv2
import numpy as np

ip = "insert ip"
port = 4001

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(3)
sock.connect((ip, port))

while True:
    try:
        # read whatever has arrived on the socket, up to 921664 bytes
        buff = sock.recv(921664)
        # interpret the received bytes as a JPEG and decode to a frame
        nparr = np.frombuffer(buff, np.uint8)
        newFrame = cv2.imdecode(nparr, cv2.IMREAD_COLOR)
        cv2.imshow("s", newFrame)
    except Exception:
        # skip frames that fail to decode or display
        pass
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cameraServer.py
import cv2
import socket

host = ''  # listen on all interfaces
port = 4001

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

print("Preparing webcam.")
vid = cv2.VideoCapture(0)
print("Webcam ready, waiting for client connection.")

sock.bind((host, port))
sock.listen(3)
conn, addr = sock.accept()

while True:
    if vid.isOpened():
        ret, frame = vid.read()
        # encode the frame as JPEG and push the raw bytes down the socket
        data = cv2.imencode('.jpg', frame)[1].tobytes()
        conn.send(data)
Surely this isn't an unsolvable issue. Even if I have to completely revamp the way I send and receive data, that's perfectly acceptable. If you can help me understand why this is happening, as well as how to fix it, that would be great.
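In case it helps frame an answer, here's the kind of revamp I've been considering: length-prefixed framing, so the client knows exactly how many bytes belong to each JPEG instead of taking whatever a single recv() returns. This is only a sketch based on my own guesswork; the send_frame/recv_exact/recv_frame helpers and the 4-byte big-endian header are my own invention, not anything in the code above.

import struct

def send_frame(conn, jpg_bytes):
    # prefix each JPEG with its size as a 4-byte big-endian integer
    header = struct.pack(">I", len(jpg_bytes))
    conn.sendall(header + jpg_bytes)

def recv_exact(sock, n):
    # loop until exactly n bytes have arrived; a single recv()
    # is allowed to return only part of what was sent
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

def recv_frame(sock):
    # read the 4-byte length header, then exactly that many JPEG bytes
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return recv_exact(sock, length)

The idea would be to call send_frame(conn, data) in the server in place of conn.send(data), and to replace the sock.recv(921664) in the client with recv_frame(sock) before decoding. Would that be the right direction, or is there a more standard way to stream frames?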