I am trying to build a browser-based remote desktop controller. My current approach is to take screenshots of the remote desktop and transmit them to the client periodically. This is highly inefficient, because the screen usually changes very little between frames, so I want to transmit only the difference between consecutive screenshots; that should also let me increase the frame rate.

Initially I converted each screenshot to a pixel array using Python's PIL library and compared the two arrays to find the pixels that differ, so that only those would be sent (a sketch of that approach is at the end of this post). However, computing the pixel array of a screenshot took a lot of time. Then I came across byte arrays, which can also represent an image, and converting an image to a byte array is much faster than converting it to an RGBA pixel array. But how do I interpret the byte array? What exactly is stored in the byte array of an image? I converted the byte array into an array of 8-bit integers using this piece of code:
def readimage(path):
    # read the raw bytes of the image file
    with open(path, "rb") as f:
        return f.read()

data = readimage("./scimg1.png")

i = 0
bytearr = []
while i < len(data):
    bytearr.append(data[i])  # each entry is an 8-bit integer (0-255)
    i += 1
How can I compare two images using this byte array and compute their difference? I found that the length of this array differs even for two images of the same dimensions (width and height).
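For reference, this is roughly what my earlier per-pixel comparison with PIL looked like (a minimal sketch; the file names and the format of the diff are just placeholders, my actual code is more involved):

from PIL import Image

def diff_pixels(path_a, path_b):
    # load both screenshots as RGBA and compare them pixel by pixel
    img_a = Image.open(path_a).convert("RGBA")
    img_b = Image.open(path_b).convert("RGBA")
    pixels_a = list(img_a.getdata())
    pixels_b = list(img_b.getdata())
    changed = []
    for i, (pa, pb) in enumerate(zip(pixels_a, pixels_b)):
        if pa != pb:
            # record the position and new colour of every changed pixel
            changed.append((i % img_a.width, i // img_a.width, pb))
    return changed

# placeholder file names for two consecutive screenshots
changed = diff_pixels("./scimg1.png", "./scimg2.png")
print(len(changed), "pixels changed")

This works, but building and scanning the full pixel lists for every frame is what takes too long, which is why I want to work with the raw byte array instead.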