With the introduction of image streaming in version 0.2.8 of the Flutter camera plugin, I've been trying to integrate it into my project to use alongside AWS.
Amazon requires the image to be in the following format (a sketch of how I read these constraints follows the list):
- Blob of image bytes up to 5 MBs.
- Type: Base64-encoded binary data object
- Length Constraints: Minimum length of 1. Maximum length of 5242880.
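For context, this is roughly how I read those constraints when the image comes from a file (a minimal sketch; the assumption that the 5242880 limit applies to the raw bytes before base64 encoding is mine):
import 'dart:convert';
import 'dart:io';

// Minimal sketch: load an already-encoded image (e.g. a JPEG) and
// base64-encode it, enforcing the length constraints listed above.
Future<String> loadImageForAws(String path) async {
  final List<int> bytes = await File(path).readAsBytes();
  if (bytes.isEmpty || bytes.length > 5242880) {
    throw ArgumentError('Image must be between 1 byte and 5 MB');
  }
  return base64Encode(bytes);
}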
Previously, I used the camera package to take a picture, load it from disk, and then convert it as Amazon requires, but using an image stream is much better suited to what I'd like to do. My previous approach was:
import 'dart:convert';
import 'dart:io';

// Take the picture and save it to the given path
await _cameraController.takePicture(path);
// Load it from the filesystem
File imageFile = File(path);
// Convert to Amazon's requirements: raw bytes, then base64
List<int> imageBytes = imageFile.readAsBytesSync();
String base64Image = base64Encode(imageBytes);
However, using an image stream, I cannot find an easy way to convert a CameraImage to the format that Amazon requires. My guess is that the file saved by takePicture was already an encoded JPEG, whereas the stream hands me raw image planes, but I don't have much experience with images, so I'm quite stuck.
I attempted to adapt the code used in the Firebase ML & camera stream demo:
// Concatenate the bytes of every plane into a single buffer
final int numBytes =
    image.planes.fold(0, (count, plane) => count + plane.bytes.length);
final Uint8List allBytes = Uint8List(numBytes);
int nextIndex = 0;
for (int i = 0; i < image.planes.length; i++) {
  allBytes.setRange(nextIndex, nextIndex + image.planes[i].bytes.length,
      image.planes[i].bytes);
  nextIndex += image.planes[i].bytes.length;
}
// Convert as done previously
String base64Image = base64Encode(allBytes);
However, AWS responded with an InvalidImageFormatException. If someone knows how to correctly encode the image, that would be awesome! Thanks.
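In case it helps anyone answer: my current guess is that Amazon wants an encoded image container (JPEG or PNG) rather than raw camera planes, so I'm considering converting the YUV420 frame to RGB and encoding a JPEG before base64-encoding it. Below is only a rough, untested sketch; convertYuv420ToBase64Jpeg is my own name, it assumes an Android-style three-plane YUV420 CameraImage, and it uses the image package (3.x-style API), which I haven't confirmed is the right tool here.
import 'dart:convert';
import 'package:camera/camera.dart';
import 'package:image/image.dart' as img;

// Rough sketch: convert a YUV420 CameraImage to RGB, encode it as a JPEG,
// and base64-encode the result. Assumes three planes (Y, U, V) as produced
// on Android; iOS streams BGRA8888 and would need a different branch.
String convertYuv420ToBase64Jpeg(CameraImage image) {
  final int width = image.width;
  final int height = image.height;
  final img.Image rgbImage = img.Image(width, height);

  final Plane yPlane = image.planes[0];
  final Plane uPlane = image.planes[1];
  final Plane vPlane = image.planes[2];
  final int uvRowStride = uPlane.bytesPerRow;
  final int uvPixelStride = uPlane.bytesPerPixel ?? 1;

  for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
      final int uvIndex = uvPixelStride * (x ~/ 2) + uvRowStride * (y ~/ 2);
      final int yp = yPlane.bytes[y * yPlane.bytesPerRow + x];
      final int up = uPlane.bytes[uvIndex];
      final int vp = vPlane.bytes[uvIndex];

      // Standard YUV -> RGB conversion, clamped to 0..255.
      final int r = (yp + 1.402 * (vp - 128)).round().clamp(0, 255).toInt();
      final int g = (yp - 0.344136 * (up - 128) - 0.714136 * (vp - 128))
          .round()
          .clamp(0, 255)
          .toInt();
      final int b = (yp + 1.772 * (up - 128)).round().clamp(0, 255).toInt();

      rgbImage.setPixelRgba(x, y, r, g, b);
    }
  }

  // Encode as JPEG, then base64-encode as the API requires. The per-pixel
  // loop is slow in Dart; this is only meant to illustrate the format.
  final List<int> jpegBytes = img.encodeJpg(rgbImage, quality: 90);
  return base64Encode(jpegBytes);
}
If that turns out to be the right direction, I'd still need to handle rotation and probably resize the frame before sending it, but I've left that out to keep the sketch short.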