Main changes:
1. Avoid invalid writes by not passing pointers to a GArray that
realloc()s its data
2. Use a hash table to store image defs, instead of an array. This
   requires a custom hash/equal function (see the sketch after this list)
3. Make image desc computation synchronous, so that setting a color state
   always succeeds or always fails and doesn't depend on timing.
4. Add a few debug messages in failure paths. For lack of a category,
they ended up in MISC.
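A rough sketch of what point 2 looks like with GLib, assuming a made-up
ImageDesc key type (the real image description struct and its fields differ):

  typedef struct {
    guint32 primaries;
    guint32 transfer;
  } ImageDesc;

  static guint
  image_desc_hash (gconstpointer key)
  {
    const ImageDesc *desc = key;
    return desc->primaries * 31 + desc->transfer;
  }

  static gboolean
  image_desc_equal (gconstpointer a, gconstpointer b)
  {
    const ImageDesc *da = a, *db = b;
    return da->primaries == db->primaries && da->transfer == db->transfer;
  }

  /* Unlike pointers into a GArray that realloc()s, entries looked up in
   * the hash table stay valid as the table grows. */
  GHashTable *image_descs =
    g_hash_table_new_full (image_desc_hash, image_desc_equal, g_free, NULL);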
This is a still-experimental protocol (thus the xx prefix).
We are using it to obtain information about the compositor's
preferred color state, and pass that on to our rendering machinery.
The currently supported color states are srgb, srgb-linear, rec2100-pq
and rec2100-linear. We don't have any support for ICC profiles.
Unlike other protocols, keep the support code for this protocol
fairly isolated behind wrapper objects, since the protocol is
still subject to change.
The settings portal is reporting enums as string values, so
we need to translate this setting back to what we need.
Fixes
(gtk4-demo:18902): GLib-CRITICAL **: 19:06:14.783: g_variant_get_int32: assertion 'g_variant_is_of_type (value, G_VARIANT_TYPE_INT32)' failed
that could be seen in recent nightly flatpaks.
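A minimal sketch of the translation, with made-up string values; the point
is just to read the variant as a string and map it back to the integer our
own setting machinery expects:

  static int
  setting_from_string (GVariant *value)
  {
    const char *s = g_variant_get_string (value, NULL);

    if (g_strcmp0 (s, "enabled") == 0)
      return 1;
    if (g_strcmp0 (s, "disabled") == 0)
      return 0;

    return -1; /* unknown value */
  }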
This protocol lifts some functionality from the gtk-shell protocol,
namely the ability to tag dialogs as modal. Make sure to use this
new protocol for that task when it is available, instead of the gtk-shell
protocol.
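Roughly, assuming the usual wayland-scanner generated names for
xdg-dialog-v1 (take the exact identifiers with a grain of salt):

  /* wm_dialog is the xdg_wm_dialog_v1 global bound from the registry */
  struct xdg_dialog_v1 *dialog =
    xdg_wm_dialog_v1_get_xdg_dialog (wm_dialog, xdg_toplevel);

  xdg_dialog_v1_set_modal (dialog);   /* instead of the gtk-shell request */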
Add a high-level setting that gives us more freedom to tweak
font rendering knobs according to our needs. It has a 'manual'
value that lets users continue to influence font rendering using
the low-level font-related settings as before.
Once the schemas have this, we can support setting this session-wide.
See https://gitlab.gnome.org/GNOME/gsettings-desktop-schemas/-/merge_requests/79
The initial implementation of 'automatic' font rendering is fairly
simplistic: if the monitor dpi is less than 200, prefer sharpness,
so turn on metrics hinting and slight hinting. If the monitor dpi
is at least 200, we turn both off.
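In pseudo-C, the heuristic amounts to something like this (variable and
value names are illustrative, not the actual GDK/cairo identifiers):

  if (monitor_dpi < 200)
    {
      /* low dpi: prefer sharpness */
      hint_metrics = TRUE;
      hint_style = HINT_SLIGHT;
    }
  else
    {
      /* high dpi: hinting buys little, turn both off */
      hint_metrics = FALSE;
      hint_style = HINT_NONE;
    }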
This attempts to improve the accuracy of the "presentation_time" of an
individual GdkFrameTimings. That information is currently filled in as soon
as we get a frame callback. However, if the presentation-time wayland
protocol is available, it will be used to supply a more accurate time, which
may improve future presentation-time predictions within GdkFrameClockIdle.
The protocol states that all related and sub-surfaces will receive the
same information, so it is safe to register this for more than just the
toplevel; the information is idempotent.
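The wiring looks roughly like this (the per-frame bookkeeping is simplified;
only the wp_presentation calls are the real protocol API):

  static void
  feedback_sync_output (void *data, struct wp_presentation_feedback *feedback,
                        struct wl_output *output)
  {
  }

  static void
  feedback_presented (void *data, struct wp_presentation_feedback *feedback,
                      uint32_t tv_sec_hi, uint32_t tv_sec_lo, uint32_t tv_nsec,
                      uint32_t refresh, uint32_t seq_hi, uint32_t seq_lo,
                      uint32_t flags)
  {
    gint64 *presentation_time = data;  /* placeholder for per-frame state */
    gint64 sec = ((gint64) tv_sec_hi << 32) | tv_sec_lo;

    *presentation_time = sec * G_USEC_PER_SEC + tv_nsec / 1000;
    wp_presentation_feedback_destroy (feedback);
  }

  static void
  feedback_discarded (void *data, struct wp_presentation_feedback *feedback)
  {
    wp_presentation_feedback_destroy (feedback);
  }

  static const struct wp_presentation_feedback_listener feedback_listener = {
    feedback_sync_output,
    feedback_presented,
    feedback_discarded,
  };

  /* On each frame submission: */
  struct wp_presentation_feedback *feedback =
    wp_presentation_feedback (presentation, surface);
  wp_presentation_feedback_add_listener (feedback, &feedback_listener,
                                         &frame_presentation_time);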
It turns out that the workaround in 7b380b2ffc was insufficient.
During initialization, we end up calling apply_monitor_changes()
while xdg_output is set, but xdg_output_geometry isn't. Be more
careful and prevent that from wreaking havoc with negative scales.
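Illustrative guard (not the actual GDK code):

  if (monitor->xdg_output != NULL &&
      (monitor->xdg_output_geometry.width <= 0 ||
       monitor->xdg_output_geometry.height <= 0))
    return;  /* wait for xdg_output.logical_size before applying changes */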
Fixes: #6472
The first time this function is called, has_xdg_output() returns
true, but we haven't yet received all the xdg-output events, so we wait
for that to be done. Otherwise, the logical size is 0, and nothing
useful comes from that.
This fixes a problem that is apparent in
https://bugzilla.mozilla.org/show_bug.cgi?id=1869724, but that also
reproduces on any GTK application as described in
https://bugzilla.mozilla.org/show_bug.cgi?id=1869724#c16.
xdg_output sizes might be physical if the compositor doesn't scale them,
it seems. So to report the correct logical geometry in GDK pixels, we
need to detect this case. We do this by checking whether the wl_output
size matches the xdg_output size.
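Sketch of that check, with illustrative field names:

  /* If xdg_output reports the same size as the wl_output mode, the
   * compositor gave us physical pixels and we scale them down ourselves. */
  gboolean xdg_output_is_physical =
    monitor->xdg_output_width == monitor->mode_width &&
    monitor->xdg_output_height == monitor->mode_height;

  if (xdg_output_is_physical)
    {
      logical_width = monitor->xdg_output_width / scale;
      logical_height = monitor->xdg_output_height / scale;
    }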
Whether or not switches include shapes to indicate their ON/OFF
state is currently controlled by the stylesheet (in particular
the HighContrast style).
However there are use cases for both using the HighContrast style
without shapes, and for using shapes with the regular stylesheet,
so follow the newly added "show-status-shapes" setting instead.
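For reference, a widget can query the setting along these lines (assuming
the GtkSettings property is called "gtk-show-status-shapes"):

  gboolean show_shapes;

  g_object_get (gtk_settings_get_default (),
                "gtk-show-status-shapes", &show_shapes,
                NULL);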
https://gitlab.gnome.org/GNOME/gtk/-/issues/5354
It started out as busywork, but it does many separate things. If I could
start over, I'd take them apart into multiple commits:
1. Remove G_ENABLE_DEBUG around GDK_DEBUG_*() calls
This is not needed at all, the calls themselves take care of it.
2. Remove G_ENABLE_DEBUG around profiling code
This now enables profiling support in release builds.
3. Stop poking _gdk_debug_flags and use GDK_DEBUG_CHECK()
This was old code that was never updated.
4. Make !G_ENABLE_DEBUG turn off GDK_DEBUG_CHECK()
The code used to
#define GDK_DEBUG_CHECK(...) false
#define GDK_DEBUG(...)
which would compile away all the code inside those macros. This
means a lot of variable definitions and debug utility functions
would suddenly no longer be used and cause compiler errors.
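Simplified illustration of the old scheme and why it breaks (not the exact
GDK definitions):

  #ifdef G_ENABLE_DEBUG
  #define GDK_DEBUG_CHECK(type) ((_gdk_debug_flags & GDK_DEBUG_##type) != 0)
  #define GDK_DEBUG(type, ...) \
    G_STMT_START { \
      if (GDK_DEBUG_CHECK (type)) \
        g_message (__VA_ARGS__); \
    } G_STMT_END
  #else
  #define GDK_DEBUG_CHECK(...) false
  #define GDK_DEBUG(...)
  #endif

  /* In a release build, anything referenced only from GDK_DEBUG()
   * disappears from the compiled code and trips -Wunused warnings. */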
The protocol spec isn't clear about the relationship
between the capability enum and the uint in the capability
event.
Fix things to use the same relationship as mutter.
Instead of setting the buffer scale via the buffer-scale command, set it
via the viewport.
This technically allows setting fractional scales, but we're not doing
that.
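Something along these lines, using the wp_viewporter protocol (variable
names are illustrative):

  struct wp_viewport *viewport =
    wp_viewporter_get_viewport (viewporter, wl_surface);

  /* The attached buffer is buffer_width x buffer_height pixels; present
   * it at the surface's logical size. Nothing here requires the ratio
   * to be an integer scale. */
  wp_viewport_set_source (viewport,
                          wl_fixed_from_int (0), wl_fixed_from_int (0),
                          wl_fixed_from_int (buffer_width),
                          wl_fixed_from_int (buffer_height));
  wp_viewport_set_destination (viewport, logical_width, logical_height);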
April fools!
No, really.
The fractional scale protocol is just a way to track the surface scale,
but not a way to draw fractional content.
This commit uses it for that, so that we don't rely on tracking outputs.
This also allows magnifiers etc. to send us a larger (integer) scale
that is not represented by the outputs, if they would like that.
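Sketch of listening for the preferred scale (the protocol sends it in
120ths):

  static void
  preferred_scale (void *data, struct wp_fractional_scale_v1 *obj,
                   uint32_t scale)
  {
    double *surface_scale = data;    /* placeholder for per-surface state */

    *surface_scale = scale / 120.0;  /* e.g. 150 -> 1.25, 240 -> 2.0 */
  }

  static const struct wp_fractional_scale_v1_listener fractional_scale_listener = {
    preferred_scale,
  };

  struct wp_fractional_scale_v1 *fractional =
    wp_fractional_scale_manager_v1_get_fractional_scale (manager, wl_surface);
  wp_fractional_scale_v1_add_listener (fractional, &fractional_scale_listener,
                                       &surface_scale);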
Since Wayland 1.15, it is now possible to use absolute paths in
"WAYLAND_DISPLAY".
In that scenario, having a valid "XDG_RUNTIME_DIR" is not a requirement
anymore.
For this reason we remove the "XDG_RUNTIME_DIR" check and we let
`wl_display_connect()` decide if our environment is correct.
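In other words, roughly (error handling shown for illustration):

  /* wl_display_connect() validates the environment itself, including an
   * absolute path in WAYLAND_DISPLAY, so no separate XDG_RUNTIME_DIR
   * check is needed. */
  struct wl_display *display = wl_display_connect (NULL);

  if (display == NULL)
    {
      g_debug ("Failed to connect to Wayland display");
      return NULL;
    }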
Signed-off-by: Ludovico de Nittis <ludovico.denittis@collabora.com>
The cursor-theme-size setting is documented as
'0 means the default size'. Make it so by using
size 24 if we see a 0. It's better than crashing.
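Sketch of the fallback when loading the cursor theme:

  if (size == 0)
    size = 24;  /* documented default */

  theme = wl_cursor_theme_load (name, size, shm);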
Fixes: #5700