I've got a fun little widget that invites users to choose from a list of 114 animals. The UI includes a small thumbnail of each species, all the same size (100x70). Together, the thumbnails come to about 2.1 MB.
I'm not eager to make 114 server requests, so I used ImageMagick to stitch them into one VERY long image (11400x70), which comes down to 960 KB. The challenge, then, is to chop that long image back into 114 small images on the client. (EDIT: I need each animal as its own image because the UI filters the list down, e.g. "show me only herbivores".)
I originally had a convoluted approach: I loaded a base64 string of the combined image, painted it to a canvas, pulled the ImageData for each animal, painted THAT to a second canvas, and then pulled the data URL:
var combined_canvas = document.createElement('canvas');
var combined_ctx = combined_canvas.getContext('2d');
var indv_canvas = document.createElement('canvas');
var indv_ctx = indv_canvas.getContext('2d');

combined_canvas.width = 11400; combined_canvas.height = 70;
indv_canvas.width = 100; indv_canvas.height = 70;

var image = new Image();
image.onload = function() {
    combined_ctx.drawImage(image, 0, 0); // paint combined image

    // loop through the list of 114 animals
    Object.keys(master_list).forEach((d, i) => {
        // get the image data just for this animal in sequence
        var imageData = combined_ctx.getImageData(i * 100, 0, 100, 70);

        // paint that image data to the second canvas
        indv_ctx.putImageData(imageData, 0, 0);

        var img = document.createElement('img');
        img.src = indv_canvas.toDataURL();
        document.getElementById("animal_thumbnails").appendChild(img);
    });
};
image.src = combined_images; // the base64 of the combined image
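(For what it's worth, the intermediate ImageData round trip isn't strictly necessary: drawImage() also has a 9-argument form that takes a source rectangle, so each animal can be cropped straight from the loaded image onto the small canvas. A sketch, reusing the same variables as above:)

var image = new Image();
image.onload = function() {
    Object.keys(master_list).forEach((d, i) => {
        // copy this animal's 100x70 slice of the big image
        // directly onto the small canvas:
        // drawImage(source, sx, sy, sw, sh, dx, dy, dw, dh)
        indv_ctx.drawImage(image, i * 100, 0, 100, 70, 0, 0, 100, 70);

        var img = document.createElement('img');
        img.src = indv_canvas.toDataURL();
        document.getElementById("animal_thumbnails").appendChild(img);
    });
};
image.src = combined_images;

That drops the combined canvas and the getImageData/putImageData step entirely, though it still re-encodes each slice through toDataURL().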
Naturally, it would be MUCH easier to make 114 CSS sprites, which would also avoid the small quality loss from round-tripping through canvas. But what I don't understand well is how browsers store the same image placed many times with different offsets.
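For concreteness, the sprite version I'm picturing is one element per animal, all sharing the combined image as a background shifted left by that animal's offset (a sketch; I'm assuming combined_images works as the url() value, which it should as a data URI):

Object.keys(master_list).forEach((d, i) => {
    var thumb = document.createElement('div');
    thumb.style.width = '100px';
    thumb.style.height = '70px';
    thumb.style.backgroundImage = 'url(' + combined_images + ')';
    // shift the shared image left so this animal's slice shows
    thumb.style.backgroundPosition = -(i * 100) + 'px 0';
    document.getElementById("animal_thumbnails").appendChild(thumb);
});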
In the event that the client duplicates the combined image in memory for every placement, I'm looking at roughly 100 MB for all these thumbnails (114 copies of the 960 KB file). I would HOPE even the worst browser is smarter than that, but that's a degree too far under the hood for me to know.
The reason I'm loading the image as base64 is that one other idea I had was to decode the base64 with atob() and chop it up with jpg-js, per this Stack Overflow suggestion. But jpg-js adds about 300 KB when minified, and I'm not confident it was ever really intended for the client.
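For completeness, the atob() half of that idea is standard stuff; it's the jpg-js call afterwards I'm unsure about, so this sketch stops at raw bytes (and assumes combined_images is a full data URL with a prefix to strip; if it's the bare base64 payload, skip the split):

// strip the "data:image/jpeg;base64," prefix, keep the payload
var payload = combined_images.split(',')[1];
// decode base64 to a binary string, then to raw bytes
var binary = atob(payload);
var bytes = new Uint8Array(binary.length);
for (var i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
}
// `bytes` now holds the raw JPEG, ready to hand to a decoder like jpg-js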