I'm programming a little multi-protocol image streaming server (in Python), and all protocols work well enough, except for the multicast protocol, which makes my CPU usage go up to 150%!
Here's the multicast code:
delay = 1. / self.flux.ips
imgid = 0
lastSent = 0
while self.connected:
    #self.printLog("Getting ready to fragment {}".format(imgid))
    fragments = fragmentImage(self.flux.imageFiles[imgid], self.fragmentSize)
    #self.printLog("Fragmented {} ! ".format(imgid))
    # Checking if the delay has passed, to respect the framerate
    while (time.time() - lastSent) < delay:
        pass
    # Sending the fragments
    for fragmentid in range(len(fragments)):
        formatedFragment = formatFragment(fragments[fragmentid], fragmentid * self.fragmentSize, len(self.flux.imageFiles[imgid]), imgid)
        self.sendto(formatedFragment, (self.groupAddress, self.groupPort))
    lastSent = time.time()
    imgid = (imgid + 1) % len(self.flux.images)
The UDP protocol also sends images as fragments, and I don't have any CPU usage problems there. Note that the client also has some latency when receiving those images.
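To isolate the problem, here is a minimal standalone sketch of just the pacing part of the loop (no sockets, no fragmentation; the ips = 24 value is only a placeholder, my real value comes from self.flux.ips), in case the timing logic itself is the culprit:

import time

ips = 24            # placeholder framerate, stands in for self.flux.ips
delay = 1. / ips
lastSent = 0

# Same spin-wait pacing as in the multicast loop above,
# bounded to 100 "frames" so it terminates on its own.
for frame in range(100):
    while (time.time() - lastSent) < delay:
        pass        # spinning here until the frame delay has elapsed
    lastSent = time.time()
    # (the fragments would be sent here)

If that pass loop is indeed what is burning the CPU, would replacing it with time.sleep() hurt the framerate accuracy?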