I'm working on an image processing platform. My server currently accepts images as byte arrays from the client and handles them with Python's PIL.Image methods. On the front end I'm using Java to grab image frames from video with JCodec's FrameGrab utility, which returns them as BufferedImage objects. What I don't understand is how to convert such a BufferedImage into a jpg byte array, and I could use some help.
I've found an example of writing out a jpg file, but not of getting the jpg data as a byte array in memory.
Here is the relevant portion of my code:
BufferedImage frame;
for (int i = 1; i < 100; i++) {
    try {
        // grab a frame at 0.3-second intervals and convert it to a BufferedImage
        frame = AWTUtil.toBufferedImage(FrameGrab.getFrameAtSec(videoFile, i * .3));
        // this is where I need to turn 'frame' into a jpg byte array for the server
    } catch (IOException ex) {
        Logger.getLogger(FXMLDocumentController.class.getName()).log(Level.SEVERE, null, ex);
    } catch (JCodecException ex) {
        Logger.getLogger(FXMLDocumentController.class.getName()).log(Level.SEVERE, null, ex);
    }
}
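From reading around, my guess is that the conversion goes through ImageIO into an in-memory stream, something roughly like the sketch below (jpegBytesFromFrame is just a placeholder name of mine, and I'm not sure this is the right approach):

import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

// My guess at the conversion: encode the BufferedImage as jpg into an
// in-memory stream, then pull the raw bytes back out to send to the server.
public static byte[] jpegBytesFromFrame(BufferedImage frame) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    ImageIO.write(frame, "jpg", baos); // encode as jpeg straight into memory
    return baos.toByteArray();         // jpg bytes, no temp file on disk
}

Is something along those lines what I should be calling inside the loop above?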
My Python server backend currently just attempts to save the received bytes with the aforementioned library, like so:
import io
from PIL import Image
img = Image.open(io.BytesIO(data))
img.save('test.jpg')
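For context, this is roughly how I was planning to push the bytes from Java to the server; the host, port, and the 4-byte length prefix are just placeholders for whatever framing I end up using, not an agreed protocol:

import java.io.DataOutputStream;
import java.io.IOException;
import java.net.Socket;

// Sketch only: send one frame's jpg bytes to the Python server.
// "localhost", 5000, and the length prefix are my own placeholders.
static void sendFrame(byte[] jpgBytes) throws IOException {
    try (Socket sock = new Socket("localhost", 5000);
         DataOutputStream out = new DataOutputStream(sock.getOutputStream())) {
        out.writeInt(jpgBytes.length); // tell the server how many bytes follow
        out.write(jpgBytes);           // then the jpg data itself
    }
}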