With the vendor-provided Nvidia driver, there is a small window of time
after drawing to a GL surface before the updates to that surface
can be used by the compositor.
Drawing is already coordinated with the compositor through the frame
synchronization protocol detailed here:
https://fishsoup.net/misc/wm-spec-synchronization.html
Unfortunately, at the moment, GdkX11Surface tells the compositor the
frame is ready immediately after drawing to the surface, not later,
when it's consumable by the compositor.
This commit defers announcing the frame as ready until it's consumable
by the compositor. It does this by listening for the X server to announce
damage events associated with the frame drawing. It tries to find the
right damage event by waiting until a fence placed at buffer swap time
signals.
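Roughly, the idea looks like this (a sketch only: the helper names are
made up, and only the GL and XDamage calls are real API):

    #include <glib.h>
    #include <epoxy/gl.h>
    #include <X11/extensions/Xdamage.h>

    /* Placed right after swapping buffers: the fence signals once the
     * GPU has finished this frame's drawing commands. */
    static GLsync
    place_end_of_frame_fence (void)
    {
      GLsync fence = glFenceSync (GL_SYNC_GPU_COMMANDS_COMPLETE, 0);

      glFlush ();

      return fence;
    }

    /* Called for each XDamageNotify event received for the surface.
     * Once the fence has signaled, the damage being reported covers
     * this frame's drawing, so this is the point where the compositor
     * can finally be told that the frame is ready. */
    static gboolean
    damage_completes_frame (XDamageNotifyEvent *damage_event,
                            GLsync              fence)
    {
      GLenum status = glClientWaitSync (fence, 0, 0);

      return status == GL_ALREADY_SIGNALED ||
             status == GL_CONDITION_SATISFIED;
    }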
This commit moves some of the end frame sync counter handling
code to subroutines.
It's a minor readability win, but the main motivation is to
make it easier in a subsequent commit to defer updating the
sync counter until a more appropriate time.
On X11, shortcuts inhibition is emulated using a grab on the keyboard.
So if another widget ungrabs the keyboard behind our back (for example
when a popup window is dismissed), that effectively disables the
shortcut inhibition on the surface, and we need to update the
shortcut inhibition status accordingly.
Check for "grab-broken" events on the surface and clear existing
shortcuts inhibition for the matching seat, so that the client can be
notified and may decide to re-enable shortcut inhibition if desired.
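In sketch form (clear_shortcut_inhibition() is a hypothetical helper;
the event and device getters are public GDK API):

    #include <gdk/gdk.h>

    static void
    clear_shortcut_inhibition (GdkSurface *surface,
                               GdkSeat    *seat)
    {
      /* Hypothetical: forget the inhibition recorded for this seat so
       * the client can be notified and may re-inhibit if desired. */
    }

    /* Run for events delivered to the surface: a broken keyboard grab
     * means the emulated shortcut inhibition is gone as well. */
    static void
    maybe_clear_inhibition (GdkSurface *surface,
                            GdkEvent   *event)
    {
      GdkDevice *device;

      if (gdk_event_get_event_type (event) != GDK_GRAB_BROKEN)
        return;

      device = gdk_event_get_device (event);
      if (gdk_device_get_source (device) != GDK_SOURCE_KEYBOARD)
        return;

      clear_shortcut_inhibition (surface, gdk_device_get_seat (device));
    }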
GdkEvent has been an "I-can't-believe-this-is-not-OOP" type for ages,
using a union of sub-types. This has always been problematic when it
comes to implementing accessor functions: either you get generic API
that takes a GdkEvent and uses a massive switch() to determine which
event types have the data you're looking for; or you create namespaced
accessors, but break language bindings horribly, as boxed types cannot
have derived types.
The recent conversion of GskRenderNode (which had similar issues) to
GTypeInstance, and the fact that GdkEvent is now a completely opaque
type, provide us with the chance of moving GdkEvent to GTypeInstance,
and having sub-types for GdkEvent.
The change from boxed type to GTypeInstance is pretty small, all things
considered, but ends up cascading to a larger commit, as we still have
backends and code in GTK trying to access GdkEvent structures directly.
Additionally, the naming of the public getter functions requires
renaming all the data structures to conform to the namespace/type-name
pattern.
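For example, reading the keyval out of a key event changes from poking
at the union to calling a per-type accessor (a sketch;
gdk_key_event_get_keyval() is the shape the new accessors take):

    #include <gdk/gdk.h>

    static guint
    event_keyval (GdkEvent *event)
    {
      /* Old boxed-union style:
       *   if (event->type == GDK_KEY_PRESS)
       *     return event->key.keyval;
       * With GdkEvent as a GTypeInstance, GdkKeyEvent is a proper
       * sub-type and gets an accessor named after it: */
      if (gdk_event_get_event_type (event) == GDK_KEY_PRESS)
        return gdk_key_event_get_keyval (event);

      return 0;
    }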
For the X11 backend, keep a list of monitors for which the surface
intersects the monitor area.
Whenever the X11 surface is configured, check against the list of
monitors to determine whether it has entered a new monitor or left
one, and emit the corresponding ::enter/leave-monitor signals just
like a Wayland compositor would.
As monitors can be added, removed or reconfigured at any time, redo
those checks whenever any of these events occur.
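A sketch of that check (helper and parameter names are made up; the
monitor list here is a GListModel as in current GDK, which may not
match this point in the history):

    #include <gdk/gdk.h>

    static void
    update_surface_monitors (GdkSurface         *surface,
                             GPtrArray          *tracked,  /* GdkMonitor * */
                             GListModel         *monitors,
                             const GdkRectangle *surface_rect)
    {
      guint n = g_list_model_get_n_items (monitors);

      for (guint i = 0; i < n; i++)
        {
          GdkMonitor *monitor = g_list_model_get_item (monitors, i);
          GdkRectangle geometry;
          gboolean overlaps, was_tracked;

          gdk_monitor_get_geometry (monitor, &geometry);
          overlaps = gdk_rectangle_intersect (surface_rect, &geometry, NULL);
          was_tracked = g_ptr_array_find (tracked, monitor, NULL);

          if (overlaps && !was_tracked)
            {
              g_ptr_array_add (tracked, monitor);
              g_signal_emit_by_name (surface, "enter-monitor", monitor);
            }
          else if (!overlaps && was_tracked)
            {
              g_ptr_array_remove (tracked, monitor);
              g_signal_emit_by_name (surface, "leave-monitor", monitor);
            }

          g_object_unref (monitor);
        }
    }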
On X11, there is no equivalent to the shortcut inhibition protocol
found on Wayland.
To implement the inhibit_system_shortcuts API on X11, we emulate the
same behavior using grabs on the keyboard.
To avoid keeping active grabs on the keyboard that would affect other
X11 applications even when the surface isn't focused, the X11
implementation takes care of releasing the grabs as soon as the toplevel
loses focus.
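In sketch form, with plain Xlib calls (the helper names are made up):

    #include <X11/Xlib.h>

    /* Emulate shortcut inhibition with an active keyboard grab;
     * owner_events = True keeps delivering key events to ourselves. */
    static Bool
    grab_keyboard_for_inhibition (Display *xdisplay,
                                  Window   xwindow,
                                  Time     timestamp)
    {
      return XGrabKeyboard (xdisplay, xwindow,
                            True,           /* owner_events */
                            GrabModeAsync,  /* pointer_mode */
                            GrabModeAsync,  /* keyboard_mode */
                            timestamp) == GrabSuccess;
    }

    /* Called when shortcuts are restored, and also whenever the
     * toplevel loses focus, so no grab lingers and affects other
     * X11 clients. */
    static void
    ungrab_keyboard_for_inhibition (Display *xdisplay,
                                    Time     timestamp)
    {
      XUngrabKeyboard (xdisplay, timestamp);
    }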
Without this, back buffers of the wrong size
keep being used, causing flickery misdraws, as
seen when expanding the expander in the popover
in widget-factory.
There is no shape combining going on anymore, so
call this just gdk_surface_set_input_region, and
remove the offset arguments too. All callers pass
0 anyway.
Update all callers and implementations.
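A typical caller after the rename looks like this (a sketch; the
region is arbitrary):

    #include <gdk/gdk.h>

    static void
    set_corner_input_region (GdkSurface *surface)
    {
      /* Only the top-left 32x32 corner accepts input; the region is
       * in surface coordinates, no offsets involved. */
      cairo_rectangle_int_t rect = { 0, 0, 32, 32 };
      cairo_region_t *region = cairo_region_create_rectangle (&rect);

      gdk_surface_set_input_region (surface, region);
      cairo_region_destroy (region);
    }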
Sprinkle various g_assert() calls around the code where gcc cannot
figure out on its own that a variable is not NULL and where too much
refactoring would be needed to make it do that.
Also change uses of g_assert_nonnull(x) to g_assert(x), because the
former is not marked as G_GNUC_NORETURN, since of course GTester
supports not aborting on aborts.
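For example (a sketch; the hash table lookup is hypothetical):

    #include <glib.h>

    static GHashTable *widgets;

    static gpointer
    get_widget (const char *name)
    {
      gpointer widget = g_hash_table_lookup (widgets, name);

      /* The surrounding code guarantees the key is present, but gcc
       * cannot see that across the call.  g_assert()'s failure path
       * is marked as not returning, so past this point the compiler
       * may assume widget != NULL; g_assert_nonnull() does not give
       * it that guarantee. */
      g_assert (widget != NULL);

      return widget;
    }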
Replace all uses with const char * (non-interned).
Also remove a lot of juggling from atom to GdkAtom to string and back.
The X Atom hash table now maps to (again, non-interned) strings.
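For example, X11-specific code now converts only at the X boundary
(a sketch; both gdk_x11_get_xatom_* functions are existing GDK X11
API):

    #include <gdk/x11/gdkx.h>

    static Atom
    target_to_xatom (GdkDisplay *display,
                     const char *target)
    {
      /* Plain string in, X Atom out; no GdkAtom in between. */
      return gdk_x11_get_xatom_by_name_for_display (display, target);
    }

    static const char *
    xatom_to_target (GdkDisplay *display,
                     Atom        xatom)
    {
      /* The returned name comes from the per-display X Atom hash
       * table, which now stores plain (non-interned) strings. */
      return gdk_x11_get_xatom_name_for_display (display, xatom);
    }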
Restructure the getters for event fields to
be more targeted at particular event types.
Update all callers, and replace all direct
event struct access with getters.
As a side-effect, this drops some unused getters.
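For example (a sketch; gdk_scroll_event_get_deltas() is the shape the
targeted getters take):

    #include <gdk/gdk.h>

    static void
    get_scroll_deltas (GdkEvent *event,
                       double   *dx,
                       double   *dy)
    {
      *dx = *dy = 0;

      /* Previously: dx = event->scroll.delta_x; and so on.
       * Now a getter that only makes sense for scroll events: */
      if (gdk_event_get_event_type (event) == GDK_SCROLL)
        gdk_scroll_event_get_deltas (event, dx, dy);
    }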
Replace the gdk_surface_move_to_rect() API with a new GdkSurface
method called gdk_surface_present_popup() taking a new GdkPopupLayout
object describing how the popup should be laid out on screen.
The layout properties provided are the same as the ones used with
gdk_surface_move_to_rect(), except they are now set up using
GdkPopupLayout.
Calling gdk_surface_present_popup() will either show the popup at the
position described by the popup layout object, with a new unconstrained
size, or reposition it accordingly.
In some situations, such as when a popup is set to autohide, presenting
may immediately fail if the grab was not granted by the display
server.
After a successful present, the result of the layout can be queried
using the following methods:
* gdk_surface_get_position() - to get the position relative to its
parent
* gdk_surface_get_width() - to get the current width
* gdk_surface_get_height() - to get the current height
* gdk_surface_get_rect_anchor() - to get the anchor point on the anchor
rectangle the popup was effectively positioned against, given the
constraints defined by the environment and the layout rules provided
via GdkPopupLayout.
* gdk_surface_get_surface_anchor() - the same as the one above but for
the surface anchor.
A new signal replaces the old "moved-to-rect" one -
"popup-layout-changed". However, it is only intended to be emitted when
the layout is changed implicitly by the windowing system, for example if
the monitor resolution changed, or the parent window moved.
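As a sketch of the intended usage (the exact signatures at this point
are assumptions based on the description above; in released GTK 4 the
entry point ended up as gdk_popup_present()):

    #include <gdk/gdk.h>

    static void
    show_popup_below (GdkSurface *popup,
                      int         x,
                      int         y,
                      int         width,
                      int         height)
    {
      /* Anchor a 1x1 rectangle at (x, y) in the parent and hang the
       * popup's top-left corner off its bottom-right corner. */
      GdkRectangle anchor_rect = { x, y, 1, 1 };
      GdkPopupLayout *layout =
        gdk_popup_layout_new (&anchor_rect,
                              GDK_GRAVITY_SOUTH_EAST,
                              GDK_GRAVITY_NORTH_WEST);

      if (gdk_surface_present_popup (popup, width, height, layout))
        {
          /* Query the result of the layout negotiation. */
          int px, py;

          gdk_surface_get_position (popup, &px, &py);
          g_print ("popup at %d,%d size %dx%d\n", px, py,
                   gdk_surface_get_width (popup),
                   gdk_surface_get_height (popup));
        }

      gdk_popup_layout_unref (layout);
    }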
The "iconified" state is mostly an X11-ism; every other platform calls
this state "minimized" because it may not involve turning a window into
an icon at all.
Windows/surfaces aren't supposed to be explicitly moved by any external
party, so don't provide API for doing so. Usage throughout Gdk is
replaced by the corresponding backend variants.
The generic layer still does the heavy lifting, leaving the backends
to more or less just act as thin wrappers, dealing a bit with global
coordinate transformations. The end goal is to remove explicit surface
moving from the generic gdk layer.