Events of type GDK_SCROLL will be received if the client-side window
event mask has either GDK_SCROLL_MASK or GDK_SMOOTH_SCROLL_MASK set.
GDK_BUTTON_PRESS_MASK has been removed from type_masks[GDK_SCROLL],
as that bit is often set for purposes other than scrolling, yet it
still caused the window to receive scroll events. In GTK+, this forces
non-smooth events to bubble up, even if the widgets above want smooth
events and legitimately set GDK_[SMOOTH_]SCROLL_MASK.
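A minimal sketch of the application side, assuming a plain GtkWidget
(the function name is illustrative):

    #include <gtk/gtk.h>

    /* Select both scroll masks so the widget's client-side window
     * receives GDK_SCROLL events without relying on
     * GDK_BUTTON_PRESS_MASK being set. */
    static void
    select_scroll_events (GtkWidget *widget)
    {
      gtk_widget_add_events (widget,
                             GDK_SCROLL_MASK | GDK_SMOOTH_SCROLL_MASK);
    }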
If a device provides both smooth and non-smooth events, the latter will be
flagged with _gdk_event_set_pointer_emulated() so the client-side window
receives one or the other. If a device is only able to deliver non-smooth
events, those will be sent, so both the direction and the deltas may need
to be handled.
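A sketch of a ::scroll-event handler coping with both cases; the
handler name is made up:

    #include <gtk/gtk.h>

    static gboolean
    on_scroll_event (GtkWidget      *widget,
                     GdkEventScroll *event,
                     gpointer        user_data)
    {
      gdouble dx = 0, dy = 0;

      if (event->direction == GDK_SCROLL_SMOOTH &&
          gdk_event_get_scroll_deltas ((GdkEvent *) event, &dx, &dy))
        {
          /* Smooth device: use the per-axis deltas. */
        }
      else
        {
          /* Non-smooth device: map GDK_SCROLL_UP/DOWN/LEFT/RIGHT
           * to a fixed step. */
        }

      return FALSE;
    }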
get_event_window() just checked for GDK_TOUCH_MASK, including for emulated
pointer events, so at the very least those should also match event masks
that don't select for touch events at all.
If an active grab kicks in on a different window, _gdk_display_has_device_grab()
would still find the former implicit grab for the window below the pointer, thus
sending events to an unrelated place.
If a grab with GDK_TOUCH_MASK kicks in due to a touch sequence emulating pointer
events, don't mutate the sequence into emitting touch events right away.
Create the backing GdkTouchGrabInfo for touches even if the
pointer-emulating touch sequence is already holding an implicit grab
on a window that didn't select for touch events.
The backing GdkTouchGrabInfo will be needed if the overriding device
grab finishes before the touch does, in order to send events back to
the implicit grab window. Instead, wait until the touch is physically
finished before removing the matching GdkTouchGrabInfo.
GDK will only receive touch events when dealing with a multitouch
device, so these must be transformed into pointer events if the
client-side window receiving the event doesn't listen for touch
events and the touch sequence the event belongs to emulates the
pointer.
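A sketch of the widget-side choice (names are illustrative): selecting
GDK_TOUCH_MASK and connecting to ::touch-event yields the raw touch
events, otherwise the emulated button/motion events are delivered:

    #include <gtk/gtk.h>

    static gboolean
    on_touch (GtkWidget *widget,
              GdkEvent  *event,
              gpointer   user_data)
    {
      /* event->touch.sequence identifies the touch sequence. */
      return TRUE;
    }

    static void
    enable_touch_handling (GtkWidget *widget)
    {
      gtk_widget_add_events (widget, GDK_TOUCH_MASK);
      g_signal_connect (widget, "touch-event",
                        G_CALLBACK (on_touch), NULL);
    }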
If a sequence emulates pointer events, it will result in a
button-press event, N motion events with GDK_BUTTON1_MASK set, and a
button-release event, and it will deliver crossing events
as specified by the current device grab.
These are equivalent to an implicit grab (with !owner_events), so
if the touch leaves or enters the grab window, the other window
won't receive the corresponding counter-event.
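A sketch of how a ::button-press-event handler can tell such an
emulated press apart from a regular pointer press, by looking at the
source device (the handler name is made up):

    #include <gtk/gtk.h>

    static gboolean
    on_button_press (GtkWidget      *widget,
                     GdkEventButton *event,
                     gpointer        user_data)
    {
      GdkDevice *source = gdk_event_get_source_device ((GdkEvent *) event);

      if (source != NULL &&
          gdk_device_get_source (source) == GDK_SOURCE_TOUCHSCREEN)
        {
          /* Press synthesized from a pointer-emulating touch sequence. */
        }

      return FALSE;
    }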
If the touch sequence happens on a window with GDK_TOUCH_MASK set,
a GdkTouchGrabInfo is created to back it up. Otherwise, a device grab
is only created if the sequence emulates the pointer.
If both a device grab and a touch grab are present on a window, the
later of the two is obeyed. Any grab on the device happening after a
touch grab generates grab-broken events on all the windows where an
implicit touch grab was in progress.
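A sketch of how a widget can notice that its implicit grab was
overridden, via ::grab-broken-event (names are illustrative):

    #include <gtk/gtk.h>

    static gboolean
    on_grab_broken (GtkWidget          *widget,
                    GdkEventGrabBroken *event,
                    gpointer            user_data)
    {
      if (event->implicit)
        {
          /* The implicit grab was overridden by a later grab;
           * reset any in-progress press/drag state here. */
        }

      return FALSE;
    }

    static void
    watch_grab_broken (GtkWidget *widget)
    {
      g_signal_connect (widget, "grab-broken-event",
                        G_CALLBACK (on_grab_broken), NULL);
    }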
Touch events don't generate crossing events themselves, so
do not rely on these to determine whether the button release
happened within the event window.
This widget is narrow enough to make touch interaction tricky already,
so don't add the penalty of having the slider run farther from the
touch coordinates if the touch happens to miss the slider.
This is so submenus stay open as the parent menu item is
pressed/released, since the user would typically lift the
finger in order to select a submenu item.
This makes kinetic scrolling work with viewports where the
content does not otherwise select for button or touch events,
such as testscrolledwindow's label.
Kinetic scrolling is only done on touch devices, since it is
rather meaningless on pointer devices; besides, it implies different
input event handling on child widgets that is unnecessary there.
If the scrolling doesn't start after a long press, the scrolling is
cancelled and events are handled by child widgets normally.
When the user clicks again close to the previous button-press location
(assuming it had ~0 movement), the scrolled window will let
the child handle the events immediately.
This is so the user doesn't have to wait for the press-and-hold
timeout in order to operate on the scrolled window's child.
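A sketch of the knobs this behaviour hangs off on the application
side, using the GtkScrolledWindow kinetic-scrolling and
capture-button-press setters:

    #include <gtk/gtk.h>

    static void
    setup_kinetic_scrolling (GtkScrolledWindow *scrolled_window)
    {
      /* Enable kinetic scrolling for touch devices. */
      gtk_scrolled_window_set_kinetic_scrolling (scrolled_window, TRUE);

      /* Let the scrolled window capture button presses before the
       * child sees them, so the press-and-hold logic above applies. */
      gtk_scrolled_window_set_capture_button_press (scrolled_window, TRUE);
    }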
The innermost scrolled window always gets to capture the events; all
scrolled windows above it just let the event go through. Ideally,
reaching a limit on the innermost scrolled window would propagate
the dragging up the hierarchy in order to keep following the touch
coords, although that would involve rather evil hacks just to cater
for broken UIs.
Any time a touch device interacts, crossing event generation
switches to a touch mode where only events with mode
GDK_CROSSING_TOUCH_BEGIN/END are handled, and those are sent
around touch begin/end. These are virtual, as the master
device may still stay on the window.
Whenever there is a switch of slave device (the user starts
using another, non-touch device), a crossing event with mode
GDK_CROSSING_DEVICE_SWITCH may be generated if needed, and normal
crossing event handling is resumed.
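A sketch of an ::enter-notify-event handler distinguishing the
touch-related crossing modes from regular pointer crossings (the
handler name is made up):

    #include <gtk/gtk.h>

    static gboolean
    on_enter_notify (GtkWidget        *widget,
                     GdkEventCrossing *event,
                     gpointer          user_data)
    {
      switch (event->mode)
        {
        case GDK_CROSSING_TOUCH_BEGIN:
        case GDK_CROSSING_TOUCH_END:
          /* Virtual crossings sent around touch begin/end. */
          break;
        case GDK_CROSSING_DEVICE_SWITCH:
          /* The user switched to a different slave device. */
          break;
        default:
          /* Normal pointer crossing. */
          break;
        }

      return FALSE;
    }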
This patch adds a capture phase to GTK+'s event propagation
model. Events are first propagated from the toplevel (or the
grab widget, if a grab is in place) down to the target widget
and then back up. The second phase uses the existing
::event signal; the new capture phase uses a private
API instead of a public signal for now.
This mechanism can be used in many places where we currently
have to prevent child widgets from getting events by putting
an input-only window over them. It will also be used to implement
kinetic scrolling in subsequent patches.
http://bugzilla.gnome.org/show_bug.cgi?id=641836
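Illustrative sketch only, not the actual GTK+ code: the shape of the
two-phase propagation. The "captured-event-handler" object data key
and the CapturedEventFunc type are hypothetical stand-ins for the
private capture API:

    #include <gtk/gtk.h>

    typedef gboolean (* CapturedEventFunc) (GtkWidget *widget,
                                            GdkEvent  *event);

    static gboolean
    propagate_with_capture (GtkWidget *target,
                            GdkEvent  *event)
    {
      GList *chain = NULL, *l;
      GtkWidget *w;
      gboolean handled = FALSE;

      /* Build the widget chain from the toplevel down to the target. */
      for (w = target; w != NULL; w = gtk_widget_get_parent (w))
        chain = g_list_prepend (chain, w);

      /* Capture phase, top-down: a handler returning TRUE stops
       * further propagation. */
      for (l = chain; l != NULL && !handled; l = l->next)
        {
          CapturedEventFunc func = (CapturedEventFunc)
            g_object_get_data (G_OBJECT (l->data), "captured-event-handler");

          if (func != NULL)
            handled = func (GTK_WIDGET (l->data), event);
        }

      /* Bubble phase, bottom-up, via the existing ::event machinery. */
      for (l = g_list_last (chain); l != NULL && !handled; l = l->prev)
        handled = gtk_widget_event (GTK_WIDGET (l->data), event);

      g_list_free (chain);

      return handled;
    }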
We automatically request more motion events on behalf of
the original widget if it listens to motion hints, so
the capturing widget doesn't need to handle such
implementation details.
We are not making event capture part of the public API for 3.4,
which is why there is no ::captured-event signal.
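For reference, this is the kind of motion-hint bookkeeping that a
target widget would otherwise do in its own ::motion-notify-event
handler, and that the capture code now performs on its behalf (the
handler name is made up):

    #include <gtk/gtk.h>

    static gboolean
    on_motion_notify (GtkWidget      *widget,
                      GdkEventMotion *event,
                      gpointer        user_data)
    {
      /* With GDK_POINTER_MOTION_HINT_MASK, ask for the next motion. */
      if (event->is_hint)
        gdk_event_request_motions (event);

      return FALSE;
    }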
We don't want to fall back for 'random' touch sequences, since
that could lead to all kinds of pairedness and other violations.
Since the X server already tells us which touch events it would
have used for emulating pointer events, we just use that information
here.
Translate XI_TouchBegin/Update/End to GDK_TOUCH_BEGIN/UPDATE/END
events.
At the same time,
set pointer-emulated flags on button events with XIPointerEmulated
and on touch events emulating the pointer.
This commit introduces GDK_TOUCH_BEGIN/UPDATE/END/CANCEL
and a separate GdkEventTouch struct that they use. This
is closer to the touch event API of other platforms and
matches the xi2 events closely, too.
We introduce GDK_SOURCE_TOUCHSCREEN and GDK_SOURCE_TOUCHPAD
for direct and indirect touch devices, respectively. These
correspond to XIDirectTouch and XIDependentTouch in XI2.
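A sketch of a ::touch-event handler using the new event types and the
GdkEventTouch fields (the handler name is made up):

    #include <gtk/gtk.h>

    static gboolean
    on_touch_event (GtkWidget *widget,
                    GdkEvent  *event,
                    gpointer   user_data)
    {
      GdkEventTouch *touch = (GdkEventTouch *) event;

      switch (event->type)
        {
        case GDK_TOUCH_BEGIN:
          if (touch->emulating_pointer)
            {
              /* This sequence also drives the emulated pointer events. */
            }
          break;
        case GDK_TOUCH_UPDATE:
          /* Track touch->x / touch->y per touch->sequence. */
          break;
        case GDK_TOUCH_END:
        case GDK_TOUCH_CANCEL:
          /* Drop any state kept for touch->sequence. */
          break;
        default:
          break;
        }

      return TRUE;
    }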