I use a Gtk.Image backed by a GdkPixbuf.Pixbuf to display a photo. I scale the photo to fit the available space in response to the size-allocate signal of the parent widget (a simplified sketch of this handler follows the list below). The problem is that on HiDPI displays every dimension is in logical pixels rather than device pixels, so one of the following happens:
- If I scale the GdkPixbuf.Pixbuf to the dimensions I get from the size-allocate signal, the photo takes up the desired physical space but only uses half of the resolution of the display (assuming a 2:1 device pixel to logical pixel ratio).
- If I scale the GdkPixbuf.Pixbuf to get_scale_factor() times the dimensions I get from the size-allocate signal, the photo takes up twice the physical space (i.e., it does not fit in the window).
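To make the setup concrete, here is roughly how the scaling is wired up. This is a simplified sketch, not my actual code; the file name, window structure, and handler name are placeholders:

import gi
gi.require_version("Gtk", "3.0")
from gi.repository import Gtk, GdkPixbuf

class PhotoWindow(Gtk.Window):
    def __init__(self):
        super().__init__(title="Photo")
        # Keep the full-resolution pixbuf and scale copies of it on resize.
        self.original = GdkPixbuf.Pixbuf.new_from_file("photo.png")
        self.image = Gtk.Image()
        self.add(self.image)
        self.last_size = None
        self.connect("size-allocate", self.on_size_allocate)

    def on_size_allocate(self, widget, allocation):
        # allocation.width and allocation.height are in logical pixels.
        size = (allocation.width, allocation.height)
        if size == self.last_size:
            return  # avoid re-scaling again for the same allocation
        self.last_size = size
        # Scaling to the logical size fits the window but only uses half of
        # the display resolution on a 2:1 HiDPI screen; multiplying by
        # self.get_scale_factor() instead makes the photo twice too large.
        scaled = self.original.scale_simple(
            allocation.width, allocation.height,
            GdkPixbuf.InterpType.BILINEAR)
        self.image.set_from_pixbuf(scaled)

win = PhotoWindow()
win.connect("destroy", Gtk.main_quit)
win.show_all()
Gtk.main()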
How can I let GTK know that the Pixbuf I supply to the Image corresponds to device pixels and not logical pixels?
Update: This is what I have tried (these are simplified examples without any scaling on my part, just trying to display an image unscaled):
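All of the attempts below assume the usual PyGObject boilerplate for GTK 3, which I have omitted from each snippet:

import gi
gi.require_version("Gtk", "3.0")
from gi.repository import Gtk, Gdk, GdkPixbuf
import cairo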
Attempt 1:
image = Gtk.Image()
image.set_from_file("subpixel.png")
Attempt 2:
# Load the PNG directly into a cairo image surface and hand it to the image.
surface = cairo.ImageSurface.create_from_png("subpixel.png")
image = Gtk.Image()
image.set_from_surface(surface)
Attempt 3:
pb = GdkPixbuf.Pixbuf.new_from_file("subpixel.png")
image = Gtk.Image.new_from_pixbuf(pb)
self.add(image)
Attempt 4:
pb = GdkPixbuf.Pixbuf.new_from_file("subpixel.png")
# Create a cairo surface with the pixbuf's pixel dimensions, paint the
# pixbuf onto it at 1:1, then hand the surface to the image.
surface = cairo.ImageSurface(cairo.FORMAT_ARGB32, pb.get_width(), pb.get_height())
context = cairo.Context(surface)
Gdk.cairo_set_source_pixbuf(context, pb, 0, 0)
context.paint()
image = Gtk.Image()
image.set_from_surface(surface)