
I use a Gtk.Image backed by a GdkPixbuf.Pixbuf to display a photo. I scale the photo to fit the available space in response to the size-allocate signal of the parent widget. The problem is that on HiDPI displays, every dimension is in logical pixels rather than device pixels, so one of the following happens:

  • If I scale the GdkPixbuf.Pixbuf to the dimensions I get from the size-allocate signal, the photo takes up the desired physical space, but it only uses half the resolution of the display (assuming a 2:1 device-pixel to logical-pixel ratio).
  • If I scale the GdkPixbuf.Pixbuf to get_scale_factor() times the dimensions I get from the size-allocate signal, the photo takes up twice the physical space (i.e., it does not fit in the window).

How can I let GTK know that the Pixbuf I supply to the Image corresponds to device pixels and not logical pixels?
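
For reference, a minimal sketch of the handler described above (original_pixbuf and image are placeholder names, and scale_simple stands in for whatever scaling is actually used):

    from gi.repository import GdkPixbuf

    def on_size_allocate(widget, allocation):
        # allocation.width/height are in logical pixels.
        factor = widget.get_scale_factor()  # e.g. 2 on a 2:1 HiDPI display

        # Option 1: fits the window, but only uses half the display resolution.
        scaled = original_pixbuf.scale_simple(
            allocation.width, allocation.height, GdkPixbuf.InterpType.BILINEAR)

        # Option 2: full resolution, but twice the intended physical size.
        # scaled = original_pixbuf.scale_simple(
        #     allocation.width * factor, allocation.height * factor,
        #     GdkPixbuf.InterpType.BILINEAR)

        image.set_from_pixbuf(scaled)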

Update: This is what I have tried (simplified examples without any scaling on my part, just trying to display an image unscaled; the usual gi.require_version/from gi.repository imports are omitted):

Attempt 1:

    image = Gtk.Image()
    image.set_from_file("subpixel.png")

Attempt 2:

    surface = cairo.ImageSurface.create_from_png("subpixel.png")
    image = Gtk.Image()
    image.set_from_surface(surface)

Attempt 3:

    pb = GdkPixbuf.Pixbuf.new_from_file("subpixel.png")
    image = Gtk.Image.new_from_pixbuf(pb)
    self.add(image)

Attempt 4:

    pb = GdkPixbuf.Pixbuf.new_from_file("subpixel.png")
    surface = cairo.ImageSurface(cairo.FORMAT_ARGB32, pb.get_width(), pb.get_height())
    context = cairo.Context(surface)

    # Paint the pixbuf into a plain image surface (device scale is still 1).
    Gdk.cairo_set_source_pixbuf(context, pb, 0, 0)
    context.paint()

    image = Gtk.Image()
    image.set_from_surface(surface)
– Zoltan

1 Answer


This is typically done by creating a cairo surface from the pixel data and then using gtk_image_set_from_surface(). I don't speak Python, but in C I do it like this:

    GdkPixbuf *pb = gdk_pixbuf_new_from_file(file, NULL);

    if(pb) {
        /* A scale of 0 means: take the scale factor from the window. */
        cairo_surface_t *s = gdk_cairo_surface_create_from_pixbuf(pb, 0, gtk_widget_get_window(w));

        if(s) {
            gtk_image_set_from_surface(GTK_IMAGE(w), s);
            cairo_surface_destroy(s);
        }

        g_object_unref(pb);
    }

This works fine here. In contrast to gtk_image_set_from_file(), it uses device pixels rather than logical pixels, so no scaling is ever applied.
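
In Python, the same call should be exposed through GObject introspection as Gdk.cairo_surface_create_from_pixbuf; a rough equivalent of the C snippet, assuming GTK 3.10+ and that image is an already-realized Gtk.Image (so get_window() returns a valid Gdk.Window):

    import gi
    gi.require_version("Gdk", "3.0")
    from gi.repository import Gdk, GdkPixbuf

    pb = GdkPixbuf.Pixbuf.new_from_file("subpixel.png")
    # A scale of 0 means: take the scale factor from the window.
    surface = Gdk.cairo_surface_create_from_pixbuf(pb, 0, image.get_window())
    image.set_from_surface(surface)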

– Andreas
  • Wow, I did not expect an answer after more than one year. :) Unfortunately, I have not been able to get this approach to work in Python as I have not found an equivalent to gdk_cairo_surface_create_from_pixbuf. I tried a few similar things (added to my question) but they all already apply the scaling. – Zoltan Jul 07 '20 at 19:34
  • @Zoltan: You could check out how `gdk_cairo_surface_create_from_pixbuf` sets up the cairo surface and try to replicate that in Python (a sketch of that replication follows these comments). My code above definitely does the trick. I found your question because I had the same problem as you, just with the difference that I'm using C instead of Python. – Andreas Jul 08 '20 at 08:16
  • 1
    @Andreas I cannot thank you enough, this has been driving me absolutely crazy and when I found this issue a handful of weeks back it had no answers. This fixes my application flawlessly on HiDPI displays. – Scoopta Jul 18 '20 at 08:48
  • I haven't been able to try this answer myself but I'm going to accept it based on @Scoopta's comment. – Zoltan Jul 18 '20 at 10:19
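
Following up on the replication suggested in the comments: the piece missing from Attempt 4 appears to be cairo's device scale, which pycairo exposes as Surface.set_device_scale() (pycairo 1.14+). A sketch under that assumption, with image again standing for an existing Gtk.Image:

    import cairo
    import gi
    gi.require_version("Gdk", "3.0")
    from gi.repository import Gdk, GdkPixbuf

    pb = GdkPixbuf.Pixbuf.new_from_file("subpixel.png")

    # Paint the pixbuf into a surface sized in its real (device) pixels.
    surface = cairo.ImageSurface(cairo.FORMAT_ARGB32, pb.get_width(), pb.get_height())
    context = cairo.Context(surface)
    Gdk.cairo_set_source_pixbuf(context, pb, 0, 0)
    context.paint()

    # Declare those pixels to be device pixels, mirroring what
    # gdk_cairo_surface_create_from_pixbuf does internally.
    factor = image.get_scale_factor()
    surface.set_device_scale(factor, factor)

    image.set_from_surface(surface)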