So far, the only way I've found to display an image in Rust using egui is to:
- Use the image crate to read in a DynamicImage from a file
- Convert that DynamicImage to an ImageBuffer
- Convert that ImageBuffer into a FlatSamples<&[u8]>
- Convert that FlatSamples to a ColorImage
- Convert that ColorImage into an instance of ImageData
- Create a TextureHandle from that ImageData
- Create an egui image widget from that TextureHandle
- Add the widget to my ui
This was a massive headache to figure out.
Here's the code for how I did it:
use egui::ImageData;
use std::path::Path;

// Note: ix, iy, iw, ih are currently unused.
pub fn get_image(filepath: &str, ix: u32, iy: u32, iw: u32, ih: u32) -> ImageData {
    let fp = Path::new(filepath);
    let color_image = load_image_from_path(fp).unwrap();
    ImageData::from(color_image)
}

fn load_image_from_path(path: &std::path::Path) -> Result<egui::ColorImage, image::ImageError> {
    // image crate: read the file and decode it into a DynamicImage.
    let image = image::io::Reader::open(path)?.decode()?;
    let size = [image.width() as _, image.height() as _];
    // DynamicImage -> RGBA8 ImageBuffer -> FlatSamples -> ColorImage.
    let image_buffer = image.to_rgba8();
    let pixels = image_buffer.as_flat_samples();
    Ok(egui::ColorImage::from_rgba_unmultiplied(
        size,
        pixels.as_slice(),
    ))
}
// inside my update function
let img = ui.ctx().load_texture(
    "my-image",
    get_image("./image.png", 0, 0, 100, 100),
    Default::default(),
);
ui.add(egui::Image::new(&img, img.size_vec2()));
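As an aside on that last snippet: since update runs every frame, calling load_texture there re-uploads the image on every frame. Caching the TextureHandle avoids that. Below is a minimal sketch of the caching, not necessarily the best way to do it; it assumes an eframe app struct with an Option<egui::TextureHandle> field (MyApp and texture are just illustrative names), plus the get_image helper above.

struct MyApp {
    // Cached GPU texture; None until the first frame uploads it.
    texture: Option<egui::TextureHandle>,
}

impl eframe::App for MyApp {
    fn update(&mut self, ctx: &egui::Context, _frame: &mut eframe::Frame) {
        egui::CentralPanel::default().show(ctx, |ui| {
            // Load and upload the image only once, then reuse the handle.
            let texture = self.texture.get_or_insert_with(|| {
                ui.ctx().load_texture(
                    "my-image",
                    get_image("./image.png", 0, 0, 100, 100),
                    Default::default(),
                )
            });
            ui.add(egui::Image::new(&*texture, texture.size_vec2()));
        });
    }
}

That caching doesn't change the question, though: the conversion chain itself is still the same.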
Is there any simpler way to do this?