<?xml version="1.0"?>
<!DOCTYPE refentry PUBLIC "-//OASIS//DTD DocBook XML V4.3//EN"
               "http://www.oasis-open.org/docbook/xml/4.3/docbookx.dtd" [
]>
<refentry id="chap-input-handling">
<refmeta>
<refentrytitle>The GTK+ Input Handling Model</refentrytitle>
<manvolnum>3</manvolnum>
<refmiscinfo>GTK Library</refmiscinfo>
</refmeta>

<refnamediv>
<refname>The GTK+ Input Handling Model</refname>
<refpurpose>
GTK+ input handling in detail
</refpurpose>
</refnamediv>

<refsect1 id="input-overview">
  <title>Overview of GTK+ input handling</title>

  <para>
    This chapter describes in detail how GTK+ handles input. If you are interested
    in what happens to translate a key press or mouse motion of the user into a
    change of a GTK+ widget, you should read this chapter. This knowledge will also
    be useful if you decide to implement your own widgets.
  </para>

  <refsect2>
    <title>Devices and events</title>

    <!-- input devices: master/slave, keyboard/pointer/touch -->
    <para>
      The most basic input devices that every computer user has interacted with are
      keyboards and mice; beyond these, GTK+ supports touchpads, touchscreens and
      more exotic input devices such as graphics tablets. Inside GTK+, every such
      input device is represented by a #GdkDevice object.
    </para>

    <para>
      To simplify dealing with the variability between these input devices, GTK+
      has a concept of master and slave devices. The concrete physical devices that
      have many different characteristics (mice may have 2 or 3 or 8 buttons,
      keyboards have different layouts and may or may not have a separate number
      block, etc) are represented as slave devices. Each slave device is
      associated with a virtual master device. Master devices always come in
      pointer/keyboard pairs - you can think of such a pair as a 'seat'.
    </para>
    <para>
      GTK+ widgets generally deal with the master devices, and thus can be used
      with any pointing device or keyboard.
    </para>
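    <para>
      As a small illustration of the device hierarchy described above, the
      following sketch lists each master device of the default display together
      with the slave devices attached to it. The function name list_devices is
      only an example, and it assumes gtk_init() has already been called:
    </para>
    <informalexample><programlisting language="C"><![CDATA[
#include <gtk/gtk.h>

/* Print every virtual master device and the physical slave
 * devices that feed into it. */
static void
list_devices (void)
{
  GdkDisplay *display = gdk_display_get_default ();
  GdkDeviceManager *manager = gdk_display_get_device_manager (display);
  GList *masters, *l;

  masters = gdk_device_manager_list_devices (manager, GDK_DEVICE_TYPE_MASTER);
  for (l = masters; l != NULL; l = l->next)
    {
      GdkDevice *master = l->data;
      GList *slaves, *s;

      g_print ("master: %s\n", gdk_device_get_name (master));

      slaves = gdk_device_list_slave_devices (master);
      for (s = slaves; s != NULL; s = s->next)
        g_print ("  slave: %s\n", gdk_device_get_name (s->data));
      g_list_free (slaves);
    }
  g_list_free (masters);
}
]]></programlisting></informalexample>
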
    <!-- input events: button, touch, key, motion, etc -->
    <para>
      When a user interacts with an input device (e.g. moves a mouse or presses
      a key on the keyboard), GTK+ receives events from the windowing system.
      These are typically directed at a specific window - for pointer events,
      the window under the pointer (grabs complicate this), for keyboard events,
      the window with the keyboard focus.
    </para>
    <para>
      GDK translates these raw windowing system events into #GdkEvents.
      Typical input events are:
      <simplelist>
        <member>GdkEventButton</member>
        <member>GdkEventMotion</member>
        <member>GdkEventCrossing</member>
        <member>GdkEventKey</member>
        <member>GdkEventFocus</member>
        <member>GdkEventTouch</member>
      </simplelist>
    </para>
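    <para>
      All of these structs are members of the #GdkEvent union, discriminated by
      their type field. As a rough, purely illustrative sketch (the helper name
      inspect_event is made up for this example), code that receives a
      #GdkEvent can examine a few common event types like this:
    </para>
    <informalexample><programlisting language="C"><![CDATA[
#include <gdk/gdk.h>

/* Print a few fields of some common input events. */
static void
inspect_event (const GdkEvent *event)
{
  switch (event->type)
    {
    case GDK_BUTTON_PRESS:
      g_print ("button %u pressed at %.0f,%.0f\n",
               event->button.button, event->button.x, event->button.y);
      break;
    case GDK_MOTION_NOTIFY:
      g_print ("pointer at %.0f,%.0f\n",
               event->motion.x, event->motion.y);
      break;
    case GDK_KEY_PRESS:
      g_print ("key with keyval 0x%x pressed\n", event->key.keyval);
      break;
    default:
      break;
    }
}
]]></programlisting></informalexample>
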
    <para>
      When GTK+ is initialized, it sets up an event handler function with
      gdk_event_handler_set(), which receives all of these input events
      (as well as others, for instance window management related events).
    </para>
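    <para>
      In outline, such a handler can also be installed at the application level,
      for instance to observe events before GTK+ processes them. The sketch
      below is only an example (the callback name snoop_events is made up);
      note that gdk_event_handler_set() replaces the handler that GTK+
      installed, so a custom handler has to hand each event back to GTK+ with
      gtk_main_do_event() to preserve normal widget behaviour:
    </para>
    <informalexample><programlisting language="C"><![CDATA[
#include <gtk/gtk.h>

/* See every event before GTK+ does, then forward it. */
static void
snoop_events (GdkEvent *event, gpointer data)
{
  if (event->type == GDK_KEY_PRESS)
    g_print ("key press, keyval 0x%x\n", event->key.keyval);

  /* Hand the event on to GTK+'s regular processing; without this,
   * widgets would never receive any events. */
  gtk_main_do_event (event);
}

int
main (int argc, char *argv[])
{
  GtkWidget *window;

  gtk_init (&argc, &argv);

  /* Replace the event handler that gtk_init() installed. */
  gdk_event_handler_set (snoop_events, NULL, NULL);

  window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
  g_signal_connect (window, "destroy", G_CALLBACK (gtk_main_quit), NULL);
  gtk_widget_show_all (window);

  gtk_main ();
  return 0;
}
]]></programlisting></informalexample>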
  </refsect2>

  <refsect2>
    <title>Event propagation</title>

    <para>
      When GTK+ receives an event, it determines the target widget that
      it is directed to. Unless grabs are involved, this is done by finding
      the widget to which the window of the event belongs.
    </para>

    <para>
      The event is then propagated from the toplevel window down to the
      target widget. In this phase, which is known as the “capture” phase,
      gestures that are attached with GTK_PHASE_CAPTURE get a chance
      to react to the event.
    </para>
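    <para>
      As a minimal sketch, a click gesture can be placed in the capture phase
      like this; the container widget box and the callback name on_pressed are
      assumptions made up for the example:
    </para>
    <informalexample><programlisting language="C"><![CDATA[
#include <gtk/gtk.h>

static void
on_pressed (GtkGestureMultiPress *gesture,
            gint                  n_press,
            gdouble               x,
            gdouble               y,
            gpointer              user_data)
{
  g_print ("captured a click at %.0f,%.0f\n", x, y);
}

static void
attach_capture_gesture (GtkWidget *box)
{
  /* The gesture is an event controller attached to the widget; keep the
   * returned reference around for as long as the gesture is needed. */
  GtkGesture *gesture = gtk_gesture_multi_press_new (box);

  gtk_event_controller_set_propagation_phase (GTK_EVENT_CONTROLLER (gesture),
                                              GTK_PHASE_CAPTURE);
  g_signal_connect (gesture, "pressed",
                    G_CALLBACK (on_pressed), NULL);
}
]]></programlisting></informalexample>
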
    <para>
      Next, the appropriate event signal is emitted for the event in question,
      e.g. “motion-notify-event”. Handling these signals was the primary
      way to handle input in GTK+ widgets before gestures were introduced.
      The signals are emitted from the target widget up to the toplevel,
      until a signal handler indicates that it has handled the event, by
      returning GDK_EVENT_STOP.
    </para>
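    <para>
      A sketch of such a traditional signal-based handler is shown below; the
      widget pointer area and the callback name on_motion are chosen just for
      this example:
    </para>
    <informalexample><programlisting language="C"><![CDATA[
#include <gtk/gtk.h>

static gboolean
on_motion (GtkWidget *widget, GdkEventMotion *event, gpointer user_data)
{
  g_print ("pointer at %.0f,%.0f\n", event->x, event->y);

  /* GDK_EVENT_STOP ends propagation here; returning
   * GDK_EVENT_PROPAGATE would let ancestor widgets see the event too. */
  return GDK_EVENT_STOP;
}

static void
watch_motion (GtkWidget *area)
{
  /* The widget must select motion events, otherwise the
   * "motion-notify-event" signal is never emitted for it. */
  gtk_widget_add_events (area, GDK_POINTER_MOTION_MASK);
  g_signal_connect (area, "motion-notify-event",
                    G_CALLBACK (on_motion), NULL);
}
]]></programlisting></informalexample>
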
    <para>
      The default handlers for the event signals send the event
      to gestures that are attached with GTK_PHASE_TARGET. Therefore,
      gestures in the “target” phase are only used if the widget does
      not have its own event handlers, or takes care to chain up to the
      default handlers.
    </para>
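    <para>
      For example, a widget implementation that overrides one of the default
      handlers could chain up roughly like this; the type name MyArea is purely
      hypothetical, and the sketch only illustrates the chain-up pattern:
    </para>
    <informalexample><programlisting language="C"><![CDATA[
#include <gtk/gtk.h>

typedef struct _MyArea      MyArea;
typedef struct _MyAreaClass MyAreaClass;

struct _MyArea      { GtkDrawingArea parent; };
struct _MyAreaClass { GtkDrawingAreaClass parent_class; };

G_DEFINE_TYPE (MyArea, my_area, GTK_TYPE_DRAWING_AREA)

static gboolean
my_area_button_press (GtkWidget *widget, GdkEventButton *event)
{
  g_print ("MyArea saw a button press\n");

  /* Chain up to the default handler, which passes the event on to
   * gestures attached with GTK_PHASE_TARGET. */
  if (GTK_WIDGET_CLASS (my_area_parent_class)->button_press_event)
    return GTK_WIDGET_CLASS (my_area_parent_class)->button_press_event (widget, event);

  return GDK_EVENT_PROPAGATE;
}

static void
my_area_class_init (MyAreaClass *klass)
{
  GTK_WIDGET_CLASS (klass)->button_press_event = my_area_button_press;
}

static void
my_area_init (MyArea *area)
{
  gtk_widget_add_events (GTK_WIDGET (area), GDK_BUTTON_PRESS_MASK);
}
]]></programlisting></informalexample>
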
    <para>
      After calling the event handlers, in the so-called “bubble” phase,
      gestures that are attached with GTK_PHASE_BUBBLE get a chance
      to react to the event.
    </para>

    <!-- grabs -->
  </refsect2>

  <refsect2>
    <title>Keyboard input</title>

    <!-- focus, tab, directional navigation -->
    <!-- mnemonics, accelerators, bindings -->
  </refsect2>

  <refsect2>
    <title>Gestures</title>

    <!-- touch sequences, states, anything else -->
  </refsect2>

</refsect1>
</refentry>