HiDPI scaling finetuning #837

@elinorbgr

Description

Thanks to previous discussions #105 and some awesome work by @francesca64 in #548 winit has now a quite fleshed out dpi-handling story. However there is still some friction (see #796 notably), so I suggest putting everything on the table to figure out the details.

HiDPI handling on different platforms

I tried to gather information about how HiDPI is handled on different platforms; please correct me if I'm wrong.

Here, "physical pixels" represents the coordinate space of the actual pixels on the screen. "Logical pixels" represents the abstract coordinate space obtained from scaling the physical coordinates by the hidpi factor. Logical pixels generally give a better feeling of "how big the surface is on the screen".
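To make the distinction concrete, here is a minimal sketch of the two coordinate spaces. The types are illustrative, not winit's actual API, though winit's `LogicalSize`/`PhysicalSize` pair embodies the same idea:

```rust
// Illustrative types mirroring the logical/physical split described above.
#[derive(Debug, PartialEq)]
struct PhysicalSize { width: f64, height: f64 }

#[derive(Debug, PartialEq)]
struct LogicalSize { width: f64, height: f64 }

impl LogicalSize {
    // Scaling logical coordinates by the hidpi factor yields physical pixels.
    fn to_physical(&self, hidpi_factor: f64) -> PhysicalSize {
        PhysicalSize {
            width: self.width * hidpi_factor,
            height: self.height * hidpi_factor,
        }
    }
}

fn main() {
    // An 800x600 logical window on a factor-2 (retina-like) screen is
    // backed by a 1600x1200 buffer of physical pixels.
    let logical = LogicalSize { width: 800.0, height: 600.0 };
    let physical = logical.to_physical(2.0);
    println!("{} x {}", physical.width, physical.height); // 1600 x 1200
}
```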

Windows

On Windows an app needs to declare itself as dpi aware. If it does not, the OS will rescale everything to make it believe the screen just has a DPI factor of 1.

The user can configure DPI factors for their screens by increments of 0.25 (so 1, 1.25, 1.5, 1.75, 2 ...)

If the app declares itself as DPI aware, it then handles everything in physical pixels, and is responsible for drawing correctly and scaling its input.

macOS

On macOS, the app can request to be assigned an OpenGL surface of the best size on retina (HiDPI) displays. If it does not, its contents are scaled up when necessary.

DPI factor is generally either 2 for retina displays or 1 for regular displays.

Input handling is done in logical pixels.

Linux X11

The Xorg server doesn't really have any concept of "logical pixels". Everything is done directly in physical pixels, and apps need to handle everything themselves.

The app can implement HiDPI support by querying the DPI value for the screen it is displayed on and computing the HiDPI factor by dividing it by 96, the reference DPI for a regular non-HiDPI screen. This means the DPI factor can have essentially any float value.
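That computation can be sketched as follows (plain Rust; the function and the example DPI value are illustrative, not part of winit's API):

```rust
// X11-style hidpi factor: the monitor's reported DPI divided by the
// 96 DPI reference mentioned above.
const REFERENCE_DPI: f64 = 96.0;

fn hidpi_factor_from_dpi(dpi: f64) -> f64 {
    dpi / REFERENCE_DPI
}

fn main() {
    // A hypothetical monitor reporting 104.32 DPI yields exactly the
    // kind of non-round factor discussed later in this issue.
    println!("{}", hidpi_factor_from_dpi(104.32)); // 1.0866666...
    // A 192 DPI monitor yields a clean factor of 2.
    println!("{}", hidpi_factor_from_dpi(192.0)); // 2
}
```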

There are several potential sources for obtaining the DPI of the screen:

  • retrieving it from Xrandr, which computes it from the physical dimensions of the monitor meaning:
    • it'll almost never be a round number
    • it can be unreliable if the X server couldn't get the actual physical dimensions
  • retrieving it from a user configuration like Xft.dpi or the gnome service configuration, meaning:
    • it is a globally set value, not per-monitor, so it won't work well on mixed-dpi setups
    • if several config sources are available, which is to be used?

Linux Wayland

Similarly to macOS, most of the handling is done in logical pixels.

The app can check the HiDPI factor for each monitor (which is always an integer), and decide to draw its content with any (integer) dpi factor it chooses. If the chosen dpi factor does not match the one of the monitor displaying the window, the display server will upscale or downscale the contents accordingly.
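The Wayland-style decision can be sketched like this. The max-of-overlapping-scales heuristic shown here is a common choice for keeping output crisp, not something mandated by the protocol:

```rust
// Wayland reports an integer scale factor per output (monitor).
// A common strategy: draw at the maximum scale of all outputs the
// surface currently overlaps, and let the compositor downscale the
// contents on the lower-scale outputs.
fn choose_buffer_scale(overlapping_output_scales: &[u32]) -> u32 {
    overlapping_output_scales.iter().copied().max().unwrap_or(1)
}

fn main() {
    // Window straddling a factor-1 and a factor-2 monitor: draw at 2.
    println!("{}", choose_buffer_scale(&[1, 2])); // 2
    // No output information yet: fall back to 1.
    println!("{}", choose_buffer_scale(&[])); // 1
}
```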

Mobile platforms

I don't know about Android or iOS, but there should not be any difficulties coming from them: a device only has a single screen, and apps generally take up the whole screen.

Web platform

I don't know about it.

Current HiDPI model of winit

Currently winit follows a hidpi model similar to Wayland or macOS: almost everything is handled in logical pixels (marked by a specific type for handling it), and hidpi support is mandatory in the sense that the drawing surface always matches the physical dimensions of the app.

The LogicalSize and LogicalPosition types provide helper methods to convert to their physical counterparts given a hidpi factor.

The app is supposed to track HiDpiFactorChanged(..) events: whenever this event is emitted, winit has resized the drawing surface to match its new correct physical size, so that the logical size does not change.
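The invariant here is that the logical size stays constant while the physical surface size is recomputed from the new factor. A small sketch of that recomputation (plain Rust, illustrative rather than winit's actual internals):

```rust
// When the hidpi factor changes, the logical size is kept fixed and the
// physical surface size is recomputed from it.
fn physical_surface_size(logical: (f64, f64), hidpi_factor: f64) -> (u32, u32) {
    (
        (logical.0 * hidpi_factor).round() as u32,
        (logical.1 * hidpi_factor).round() as u32,
    )
}

fn main() {
    let logical = (800.0, 600.0);
    // Window dragged from a factor-1 monitor to a factor-2 monitor:
    // the logical size is unchanged, the physical backing size doubles.
    println!("{:?}", physical_surface_size(logical, 1.0)); // (800, 600)
    println!("{:?}", physical_surface_size(logical, 2.0)); // (1600, 1200)
}
```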

Reported friction

As can be expected from the previous description, most of the reported friction with winit's model comes from the X11 platform, for which hidpi awareness is very barebones.

I believe the main source of friction, however, comes from the fact that Linux developers are very used to the X11 model and their expectations do not map to the model winit has chosen: winit forces its users to mostly work in logical coordinates to handle hidpi, and only use physical coordinates when actually drawing. This causes a lot of impedance mismatch, which is a large source of frustration for everyone.

Solutions

There doesn't appear to be a silver bullet that can solve everything, mostly because the X11 model for hidpi handling is so different from the other platforms that unifying them in a satisfactory way seems really difficult.

So while this section is really the open question of this issue, there are some simple "solutions" that I can think of, in no particular order:

  • Winit could just keep its current model and work around X11 subtleties as best as it can, even if the dpi factor is 1.0866666666, and we can close this issue as WONTFIX or WORKS AS INTENDED
  • Winit could officially declare that X11 is a second-class citizen with respect to hidpi handling. As such, on X11 winit would always report a hidpi factor of 1 and have physical and logical coordinates always be the same. Winit would then expose some platform-specific methods like get_xrandr_dpi_for_monitor and get_xft_dpi_config, and let downstream crates deal with X11 hidpi themselves
  • Winit could shift its dpi model to be closer to X11: do everything in physical pixels and only provide the dpi factor as an information and let the downstream users handle everything themselves
  • Someone suddenly has a "Eureka!" moment and figures out a new revolutionary API that cleanly unifies everything

Now tagging the people who may be interested in this discussion: @Osspial @zegentzy @icefoxen @mitchmindtree @fschutt @chrisduerr
