<?xml version="1.0"?>
<!DOCTYPE refentry PUBLIC "-//OASIS//DTD DocBook XML V4.3//EN"
               "http://www.oasis-open.org/docbook/xml/4.3/docbookx.dtd" [
]>
<refentry id="chap-input-handling">
<refmeta>
<refentrytitle>The GTK+ Input and Event Handling Model</refentrytitle>
<manvolnum>3</manvolnum>
<refmiscinfo>GTK Library</refmiscinfo>
</refmeta>

<refnamediv>
<refname>The GTK+ Input and Event Handling Model</refname>
<refpurpose>
GTK+ input and event handling in detail
</refpurpose>
</refnamediv>

<refsect1 id="input-overview">
  <title>Overview of GTK+ input and event handling</title>

  <para>
    This chapter describes in detail how GTK+ handles input. If you are
    interested in how a key press or mouse motion of the user gets
    translated into a change in a GTK+ widget, you should read this
    chapter. This knowledge will also be useful if you decide to
    implement your own widgets.
  </para>

  <refsect2>
    <title>Devices and events</title>

    <!-- input devices: master/slave, keyboard/pointer/touch -->
    <para>
      The most basic input devices that every computer user has interacted
      with are keyboards and mice; beyond these, GTK+ supports touchpads,
      touchscreens and more exotic input devices such as graphics tablets.
      Inside GTK+, every such input device is represented by a #GdkDevice
      object.
    </para>

    <para>
      To simplify dealing with the variability between these input devices,
      GTK+ has a concept of master and slave devices. The concrete physical
      devices that have many different characteristics (mice may have 2, 3
      or 8 buttons, keyboards have different layouts and may or may not have
      a separate number block, etc.) are represented as slave devices. Each
      slave device is associated with a virtual master device. Master
      devices always come in pointer/keyboard pairs - you can think of such
      a pair as a 'seat'.
    </para>

    <para>
      GTK+ widgets generally deal with the master devices, and thus can be
      used with any pointing device or keyboard.
    </para>
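
    <para>
      As a concrete illustration, the following sketch lists the master
      devices known to a widget’s display using the #GdkDeviceManager API;
      the helper name is made up, and the widget is assumed to be realized.
    </para>
    <informalexample><programlisting><![CDATA[
/* Sketch only: list_master_devices() is a hypothetical helper. */
static void
list_master_devices (GtkWidget *widget)
{
  GdkDisplay *display = gtk_widget_get_display (widget);
  GdkDeviceManager *manager = gdk_display_get_device_manager (display);
  GList *devices, *l;

  devices = gdk_device_manager_list_devices (manager, GDK_DEVICE_TYPE_MASTER);

  for (l = devices; l != NULL; l = l->next)
    {
      GdkDevice *device = l->data;

      /* Master devices come in pointer/keyboard pairs. */
      g_print ("%s (source %d)\n",
               gdk_device_get_name (device),
               (int) gdk_device_get_source (device));
    }

  g_list_free (devices);
}
]]></programlisting></informalexample>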

    <!-- input events: button, touch, key, motion, etc -->
    <para>
      When a user interacts with an input device (e.g. moves a mouse or
      presses a key on the keyboard), GTK+ receives events from the
      windowing system. These are typically directed at a specific window -
      for pointer events, the window under the pointer (grabs complicate
      this); for keyboard events, the window with the keyboard focus.
    </para>

    <para>
      GDK translates these raw windowing system events into #GdkEvents.
      Typical input events are:
      <simplelist>
        <member>#GdkEventButton</member>
        <member>#GdkEventMotion</member>
        <member>#GdkEventCrossing</member>
        <member>#GdkEventKey</member>
        <member>#GdkEventFocus</member>
        <member>#GdkEventTouch</member>
      </simplelist>
    </para>

    <para>
      Additionally, GDK/GTK synthesizes other signals to let widgets know
      whether grabs (system-wide or in-app) are taking input away:
      <simplelist>
        <member>#GdkEventGrabBroken</member>
        <member>#GtkWidget::grab-notify</member>
      </simplelist>
    </para>

    <para>
      When GTK+ is initialized, it sets up an event handler function with
      gdk_event_handler_set(), which receives all of these input events
      (as well as others, for instance window management related events).
    </para>
  </refsect2>

  <refsect2 id="event-propagation">
    <title>Event propagation</title>

    <para>
      For widgets which have a #GdkWindow set, events are received from the
      windowing system and passed to gtk_main_do_event(). See its
      documentation for details of what it does: compression of enter/leave
      events, identification of the widget receiving the event, pushing the
      event onto a stack for gtk_get_current_event(), and propagating the
      event to the widget.
    </para>

    <para>
      When a GDK backend produces an input event, it is tied to a #GdkDevice
      and a #GdkWindow, which in turn represents a windowing system surface
      in the backend. If a widget has grabbed the current input device, or
      all input devices, the event is propagated to that #GtkWidget.
      Otherwise, it is propagated to the #GtkWidget which called
      gtk_widget_register_window() on the #GdkWindow receiving the event.
    </para>
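
    <para>
      As an illustration, the sketch below shows roughly how a custom widget
      might create its own #GdkWindow in its realize implementation and tie
      it to the widget with gtk_widget_register_window(); the
      my_widget_realize() name and the chosen event mask are assumptions,
      not a complete realize implementation.
    </para>
    <informalexample><programlisting><![CDATA[
/* Sketch of a custom widget's realize implementation; "MyWidget" and
 * the chosen event mask are illustrative. */
static void
my_widget_realize (GtkWidget *widget)
{
  GtkAllocation allocation;
  GdkWindowAttr attributes = { 0 };
  GdkWindow *window;

  gtk_widget_set_realized (widget, TRUE);
  gtk_widget_get_allocation (widget, &allocation);

  attributes.window_type = GDK_WINDOW_CHILD;
  attributes.x = allocation.x;
  attributes.y = allocation.y;
  attributes.width = allocation.width;
  attributes.height = allocation.height;
  attributes.wclass = GDK_INPUT_OUTPUT;
  attributes.event_mask = gtk_widget_get_events (widget)
                          | GDK_BUTTON_PRESS_MASK
                          | GDK_BUTTON_RELEASE_MASK;

  window = gdk_window_new (gtk_widget_get_parent_window (widget),
                           &attributes, GDK_WA_X | GDK_WA_Y);

  /* Events for this GdkWindow will now be delivered to this widget. */
  gtk_widget_register_window (widget, window);
  gtk_widget_set_window (widget, window);
}
]]></programlisting></informalexample>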

    <para>
      Grabs are implemented for each input device, and globally. A grab for
      a specific input device (gtk_device_grab_add()) is sent events in
      preference to a global grab (gtk_grab_add()). Input grabs only have
      effect within the #GtkWindowGroup containing the #GtkWidget which
      registered the event’s #GdkWindow. If this #GtkWidget is a child of
      the grab widget, the event is propagated to the child — this is the
      basis for propagating events within modal dialogs.
    </para>

    <para>
      An event is propagated to a widget using gtk_propagate_event().
      Propagation differs between event types: key events (%GDK_KEY_PRESS,
      %GDK_KEY_RELEASE) are delivered to the top-level #GtkWindow; other
      events are propagated down and up the widget hierarchy in three
      phases (see #GtkPropagationPhase).
    </para>

    <para>
      For key events, the top-level window’s default
      #GtkWindow::key-press-event and #GtkWindow::key-release-event signal
      handlers handle mnemonics and accelerators first. Other key presses
      are then passed to gtk_window_propagate_key_event(), which propagates
      the event upwards from the window’s current focus widget
      (gtk_window_get_focus()) to the top-level.
    </para>
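
    <para>
      For example, an application can connect its own handler to the
      top-level window’s #GtkWidget::key-press-event; the callback name
      below is illustrative, and the handler consumes only the Escape key.
    </para>
    <informalexample><programlisting><![CDATA[
/* Sketch: close the window on Escape, let everything else propagate
 * to the default mnemonic/accelerator/focus handling. */
static gboolean
on_key_press (GtkWidget   *window,
              GdkEventKey *event,
              gpointer     user_data)
{
  if (event->keyval == GDK_KEY_Escape)
    {
      gtk_widget_destroy (window);
      return GDK_EVENT_STOP;
    }

  return GDK_EVENT_PROPAGATE;
}

/* ... somewhere after creating the top-level "window" ... */
g_signal_connect (window, "key-press-event",
                  G_CALLBACK (on_key_press), NULL);
]]></programlisting></informalexample>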

    <para>
      For other events, in the first phase (the “capture” phase) the event
      is delivered to each widget from the top-most (for example, the
      top-level #GtkWindow or grab widget) down to the target #GtkWidget.
      <link linkend="event-controllers-and-gestures">Gestures</link> that
      are attached with %GTK_PHASE_CAPTURE get a chance to react to the
      event.
    </para>

    <para>
      After the “capture” phase, the widget that was intended to be the
      destination of the event will run gestures attached to it with
      %GTK_PHASE_TARGET. This is known as the “target” phase, and only
      happens on that widget.
    </para>

    <para>
      Next, the #GtkWidget::event signal is emitted. Handling these signals
      was the primary way to handle input in GTK+ widgets before gestures
      were introduced. The signal is emitted from the target widget up to
      the top-level, as part of the “bubble” phase.
    </para>

    <para>
      The default handlers for the event signals send the event to gestures
      that are attached with %GTK_PHASE_BUBBLE. Therefore, gestures in the
      “bubble” phase are only used if the widget does not have its own event
      handlers, or takes care to chain up to the default #GtkWidget
      handlers.
    </para>

    <para>
      Events are not delivered to a widget which is insensitive or unmapped.
    </para>

    <para>
      Any time during the propagation phase, a widget may indicate that a
      received event was consumed and propagation should therefore be
      stopped. In traditional event handlers, this is indicated by returning
      %GDK_EVENT_STOP. If gestures are used, this may happen when the widget
      tells the gesture to claim the event touch sequence (or the pointer
      events) for its own. See the "gesture states" section below for more
      on the latter.
    </para>
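
    <para>
      In a traditional event handler, consuming an event looks roughly like
      the sketch below (the callback name is illustrative): returning
      %GDK_EVENT_STOP ends propagation, returning %GDK_EVENT_PROPAGATE lets
      the event continue to bubble.
    </para>
    <informalexample><programlisting><![CDATA[
static gboolean
on_button_press (GtkWidget      *widget,
                 GdkEventButton *event,
                 gpointer        user_data)
{
  if (event->button == GDK_BUTTON_PRIMARY)
    {
      /* React to the primary button press... */
      return GDK_EVENT_STOP;        /* consumed, stop propagation */
    }

  return GDK_EVENT_PROPAGATE;       /* not handled, keep bubbling */
}

/* ... */
g_signal_connect (widget, "button-press-event",
                  G_CALLBACK (on_button_press), NULL);
]]></programlisting></informalexample>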
  </refsect2>

  <refsect2 id="event-masks">
    <title>Event masks</title>

    <para>
      Each widget instance has a basic event mask and another per input
      device, which determine the types of input event it receives. Each
      event mask set on a widget is added to the corresponding (basic or
      per-device) event mask for the widget’s #GdkWindow, and all child
      #GdkWindows.
    </para>

    <para>
      Filtering events against event masks happens inside #GdkWindow, which
      exposes event masks to the windowing system to reduce the number of
      events GDK receives from it. On receiving an event, it is filtered
      against the #GdkWindow’s mask for the input device, if set. Otherwise,
      it is filtered against the #GdkWindow’s basic event mask.
    </para>

    <para>
      This means that widgets must add to the event mask for each event type
      they expect to receive, using gtk_widget_set_events() or
      gtk_widget_add_events() to preserve the existing mask. Widgets which
      are aware of floating devices should use gtk_widget_set_device_events()
      or gtk_widget_add_device_events(), and must explicitly enable the
      device using gtk_widget_set_device_enabled(). See the
      #GdkDeviceManager documentation for more information.
    </para>
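
    <para>
      For instance, a widget implementation that wants button and motion
      events could extend its mask at instance-init time; this is only a
      sketch, and MyWidget is a placeholder type.
    </para>
    <informalexample><programlisting><![CDATA[
static void
my_widget_init (MyWidget *self)   /* MyWidget is hypothetical */
{
  /* Add to the existing event mask rather than replacing it. */
  gtk_widget_add_events (GTK_WIDGET (self),
                         GDK_BUTTON_PRESS_MASK |
                         GDK_BUTTON_RELEASE_MASK |
                         GDK_POINTER_MOTION_MASK);
}
]]></programlisting></informalexample>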

    <para>
      All standard widgets set the event mask for all events they expect to
      receive, and it is not necessary to modify this. Masks should be set
      when implementing a new widget.
    </para>
  </refsect2>

  <refsect2>
    <title>Touch events</title>

    <para>
      Touch events are emitted as events of type %GDK_TOUCH_BEGIN,
      %GDK_TOUCH_UPDATE or %GDK_TOUCH_END. These events contain an “event
      sequence” that uniquely identifies the physical touch until it is
      lifted from the device.
    </para>

    <para>
      On some windowing platforms, multitouch devices perform pointer
      emulation. This works by granting a “pointer emulating” hint to one of
      the currently interacting touch sequences, which will be reported on
      every #GdkEventTouch event from that sequence. By default, if a widget
      didn't request touch events by setting %GDK_TOUCH_MASK on its event
      mask and didn't override #GtkWidget::touch-event, GTK+ will transform
      these “pointer emulating” events into semantically similar
      #GdkEventButton and #GdkEventMotion events. Depending on whether
      %GDK_TOUCH_MASK is in the event mask or not, non-pointer-emulating
      sequences could still trigger gestures or just get filtered out, even
      though the widget does not handle those directly.
    </para>

    <para>
      If the widget sets %GDK_TOUCH_MASK on its event mask and doesn't chain
      up on #GtkWidget::touch-event, only touch events will be received, and
      no pointer emulation will be performed.
    </para>
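
    <para>
      A widget that wants the raw touch events therefore needs both the mask
      and a handler, roughly as in the sketch below (the callback name is
      illustrative).
    </para>
    <informalexample><programlisting><![CDATA[
/* Sketch: react to raw touch events on "widget". */
static gboolean
on_touch_event (GtkWidget *widget,
                GdkEvent  *event,
                gpointer   user_data)
{
  GdkEventSequence *sequence = gdk_event_get_event_sequence (event);

  switch (event->type)
    {
    case GDK_TOUCH_BEGIN:
      g_print ("touch sequence %p began\n", sequence);
      break;
    case GDK_TOUCH_UPDATE:
      /* Track the touch point... */
      break;
    case GDK_TOUCH_END:
      g_print ("touch sequence %p ended\n", sequence);
      break;
    default:
      return GDK_EVENT_PROPAGATE;
    }

  return GDK_EVENT_STOP;
}

/* ... */
gtk_widget_add_events (widget, GDK_TOUCH_MASK);
g_signal_connect (widget, "touch-event",
                  G_CALLBACK (on_touch_event), NULL);
]]></programlisting></informalexample>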
  </refsect2>

  <refsect2>
    <title>Grabs</title>

    <para>
      Grabs are a method to claim all input events from a device. They
      happen either implicitly on pointer and touch devices, or explicitly.
      Implicit grabs happen on user interaction: when a button press event
      (%GDK_BUTTON_PRESS) happens, all events from then on, until after the
      corresponding button release (%GDK_BUTTON_RELEASE), will be reported
      to the widget that got the first event. Likewise, on touch events,
      every #GdkEventSequence will deliver events only to the widget that
      received its %GDK_TOUCH_BEGIN event.
    </para>

    <para>
      Explicit grabs happen programmatically (both activation and
      deactivation), and can be either system-wide (GDK grabs) or
      application-wide (GTK grabs). On the windowing platforms that support
      it, GDK grabs will prevent any interaction with any other
      application/window/widget than the grabbing one, whereas GTK grabs
      will be effective only within the application (across all its
      windows), still allowing for interaction with other applications.
    </para>
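
    <para>
      A GTK grab is activated and deactivated programmatically; in this
      minimal sketch, popup is assumed to be some visible widget in the
      application.
    </para>
    <informalexample><programlisting><![CDATA[
/* While the grab is active, input events within this application are
 * redirected to "popup" and its children. */
gtk_grab_add (popup);

/* ... interaction ... */

gtk_grab_remove (popup);
]]></programlisting></informalexample>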

    <para>
      One important aspect of grabs is that they may potentially happen at
      any point somewhere else, even while the pointer/touch device is
      already grabbed. This makes it necessary for widgets to handle the
      cancellation of any ongoing interaction. Depending on whether a GTK or
      GDK grab is causing this, the widget will respectively receive a
      #GtkWidget::grab-notify signal or a #GdkEventGrabBroken event.
    </para>
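
    <para>
      A widget that tracks an ongoing interaction by hand can watch for both
      notifications, for example as in this sketch (callback names are
      illustrative).
    </para>
    <informalexample><programlisting><![CDATA[
static void
on_grab_notify (GtkWidget *widget,
                gboolean   was_grabbed,
                gpointer   user_data)
{
  if (!was_grabbed)
    {
      /* A GTK grab elsewhere is shadowing this widget:
       * cancel any ongoing interaction. */
    }
}

static gboolean
on_grab_broken (GtkWidget *widget,
                GdkEvent  *event,   /* a GdkEventGrabBroken */
                gpointer   user_data)
{
  /* A GDK grab was broken, e.g. by a grab elsewhere:
   * cancel any ongoing interaction here as well. */
  return GDK_EVENT_PROPAGATE;
}

/* ... */
g_signal_connect (widget, "grab-notify",
                  G_CALLBACK (on_grab_notify), NULL);
g_signal_connect (widget, "grab-broken-event",
                  G_CALLBACK (on_grab_broken), NULL);
]]></programlisting></informalexample>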

    <para>
      For gestures, these signals are handled automatically, causing the
      gesture to cancel all tracked pointer/touch events, and signal the end
      of recognition.
    </para>
  </refsect2>

  <refsect2>
    <title>Keyboard input</title>

    <!-- focus, tab, directional navigation -->
    <!-- mnemonics, accelerators, bindings -->
  </refsect2>

  <refsect2 id="event-controllers-and-gestures">
    <title>Event controllers and gestures</title>

    <para>
      Event controllers are standalone objects that can perform specific
      actions upon received #GdkEvents. These are tied to a #GtkWidget, and
      can be told the event propagation phase at which they will manage the
      events.
    </para>
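
    <para>
      For example, a multi-press gesture can be attached to a widget and
      told at which phase it should run; the helper and callback names below
      are illustrative, and the caller is responsible for keeping the
      returned gesture alive (for instance by unreffing it in dispose()).
    </para>
    <informalexample><programlisting><![CDATA[
static void
on_pressed (GtkGestureMultiPress *gesture,
            gint                  n_press,
            gdouble               x,
            gdouble               y,
            gpointer              user_data)
{
  g_print ("press %d at %.0f,%.0f\n", n_press, x, y);
}

/* Sketch: attach a multi-press gesture running in the bubble phase. */
static GtkGesture *
attach_press_gesture (GtkWidget *widget)
{
  GtkGesture *press = gtk_gesture_multi_press_new (widget);

  gtk_event_controller_set_propagation_phase (GTK_EVENT_CONTROLLER (press),
                                              GTK_PHASE_BUBBLE);
  g_signal_connect (press, "pressed",
                    G_CALLBACK (on_pressed), NULL);

  return press;
}
]]></programlisting></informalexample>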

    <para>
      Gestures are a set of specific controllers that are prepared to handle
      pointer and/or touch events. Each gesture implementation attempts to
      recognize specific actions out of the received events, notifying of
      the state/progress accordingly to let the widget react to those. On
      multi-touch gestures, every interacting touch sequence will be tracked
      independently.
    </para>

    <para>
      Since gestures are “simple” units, it is not uncommon to tie several
      together to perform higher level actions. Grouped gestures handle the
      same event sequences simultaneously, and those sequences share the
      same state across all grouped gestures. Some examples of grouping are:

      <simplelist>
        <member>
          A “drag” and a “swipe” gesture may want grouping. The former will
          report events as the dragging happens, the latter will tell the
          swipe X/Y velocities only after the gesture has finished.
        </member>
        <member>
          Grouping a “drag” gesture with a “pan” gesture will only
          effectively allow dragging in the panning orientation, as both
          gestures share state.
        </member>
        <member>
          If “press” and “long press” are wanted simultaneously, those would
          need grouping.
        </member>
      </simplelist>
    </para>
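
    <para>
      In code, grouping is done with gtk_gesture_group(); the sketch below
      groups a drag and a swipe gesture created on the same (assumed)
      widget, with the caller keeping both references alive.
    </para>
    <informalexample><programlisting><![CDATA[
/* "widget" is assumed to be the widget that should recognize both
 * gestures; the caller keeps the references alive. */
GtkGesture *drag = gtk_gesture_drag_new (widget);
GtkGesture *swipe = gtk_gesture_swipe_new (widget);

/* Both gestures now see the same event sequences and share their state. */
gtk_gesture_group (drag, swipe);
]]></programlisting></informalexample>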
  </refsect2>

  <refsect2>
    <title>Gesture states</title>

    <para>
      Gestures have a notion of “state” for each individual touch sequence.
      When events from a touch sequence are first received, the touch
      sequence will have the “none” state. This means the touch sequence is
      being handled by the gesture to possibly trigger actions, but the
      event propagation will not be stopped.
    </para>

    <para>
      When the gesture enters recognition, or at a later point in time, the
      widget may choose to claim the touch sequences (individually or as a
      group), hence stopping event propagation after the event is run
      through every gesture in that widget and propagation phase. Anytime
      this happens, the touch sequences are cancelled down the propagation
      chain, to let these know that no further events will be sent.
    </para>
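
    <para>
      A widget typically claims a sequence from one of the gesture’s signal
      handlers once it is sure the interaction is its own, as in this sketch
      for a drag gesture (the callback name is illustrative).
    </para>
    <informalexample><programlisting><![CDATA[
static void
on_drag_begin (GtkGestureDrag *gesture,
               gdouble         start_x,
               gdouble         start_y,
               gpointer        user_data)
{
  /* The drag is ours: claim the sequence, so that propagation stops and
   * widgets further down the chain get their sequences cancelled. */
  gtk_gesture_set_state (GTK_GESTURE (gesture),
                         GTK_EVENT_SEQUENCE_CLAIMED);
}
]]></programlisting></informalexample>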

    <para>
      Alternatively, or at a later point in time, the widget may choose to
      deny the touch sequences, thus letting those go through again in event
      propagation. When this happens in the capture phase, and if there are
      no other claiming gestures in the widget, a
      %GDK_TOUCH_BEGIN/%GDK_BUTTON_PRESS event will be emulated and
      propagated downwards, in order to preserve consistency.
    </para>

    <para>
      Grouped gestures always share the same state for a given touch
      sequence, so setting the state on one transfers the state to the
      others. They are also mutually exclusive; within a widget there may be
      only one gesture group claiming a given sequence. If another gesture
      group later claims that same sequence, the first group will deny the
      sequence.
    </para>
  </refsect2>
</refsect1>
</refentry>