I have a Keras model running in Python, and I want to send frames from a Unity 3D camera to this model. I can already pass strings between the two using an external library, and this library delivers a byte array to Python, so I converted the Unity camera frames to a byte array. However, I do not know how to read the sent images (the byte array) on the Python side.
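I assume that on the Python side I will need to decode the received bytes into an array before passing them to the Keras model. This is a rough, untested sketch of what I have in mind (using PIL and NumPy; the 224x224 input size is just an example, not my model's real input shape):
import io
import numpy as np
from PIL import Image
def bytes_to_model_input(image_bytes, target_size=(224, 224)):
    # Decode the raw PNG/JPEG bytes into an RGB image
    image = Image.open(io.BytesIO(image_bytes)).convert("RGB")
    # Resize to whatever the model expects (224x224 is only an example)
    image = image.resize(target_size)
    # Scale to [0, 1] and add a batch dimension for model.predict()
    array = np.asarray(image, dtype=np.float32) / 255.0
    return np.expand_dims(array, axis=0)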
Before wiring this up to Unity, I tried to get a sender/receiver pair working using only Python, but it did not work. Here is the code:
Sender:
import zmq
import base64
context = zmq.Context()
socket = context.socket(zmq.REP)
socket.bind("tcp://*:5555")
f = open("sample.png", 'rb')
data = bytearray(f.read())
payload = base64.b64encode(data)
socket.send(payload)
f.close()
Receiver:
import zmq
import base64
context = zmq.Context()
socket = context.socket(zmq.REQ)
socket.connect("tcp://localhost:5555")
message = socket.recv()
f = open("sample.png", 'wb')
ba = bytearray(base64.b64decode(message))
f.write(ba)
f.close()
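In the end I want to skip writing the file and feed the received bytes straight into the model, roughly like this (my_model.h5 is just a placeholder for my trained model, and bytes_to_model_input is the helper sketched above):
from tensorflow.keras.models import load_model
model = load_model("my_model.h5")  # placeholder path for my trained model
# 'message' is the base64 payload received over the ZMQ socket above
frame_bytes = base64.b64decode(message)
prediction = model.predict(bytes_to_model_input(frame_bytes))
print(prediction)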
Is this a good way to send frames from Unity to Python, or is there a better approach? And what is wrong with the code above?