[gtk] docs: Some updates to the input overview



commit bceca277ea8e18037cf1792e8a0cd86875bf7d45
Author: Matthias Clasen <mclasen redhat com>
Date:   Fri May 3 04:22:07 2019 +0000

    docs: Some updates to the input overview
    
    Remove references to no longer existing apis,
    and reword some things. Say surface instead of
    window. Start filling out the keyboard section.

 docs/reference/gtk/input-handling.xml | 256 ++++++++++++++++++----------------
 1 file changed, 133 insertions(+), 123 deletions(-)
---
diff --git a/docs/reference/gtk/input-handling.xml b/docs/reference/gtk/input-handling.xml
index d3d926e406..3895e4a269 100644
--- a/docs/reference/gtk/input-handling.xml
+++ b/docs/reference/gtk/input-handling.xml
@@ -4,15 +4,15 @@
 ]>
 <refentry id="chap-input-handling">
 <refmeta>
-<refentrytitle>The GTK Input and Event Handling Model</refentrytitle>
+<refentrytitle>The GTK Input Model</refentrytitle>
 <manvolnum>3</manvolnum>
 <refmiscinfo>GTK Library</refmiscinfo>
 </refmeta>
 
 <refnamediv>
-<refname>The GTK Input and Event Handling Model</refname>
+<refname>The GTK Input Model</refname>
 <refpurpose>
-    GTK input and event handling in detail
+    input and event handling in detail
 </refpurpose>
 </refnamediv>
 
@@ -52,13 +52,12 @@
      with any pointing device or keyboard.
     </para>
 
-    <!-- input events: button, touch, key, motion, etc -->
     <para>
      When a user interacts with an input device (e.g. moves a mouse or presses
      a key on the keyboard), GTK receives events from the windowing system.
-     These are typically directed at a specific window - for pointer events,
-     the window under the pointer (grabs complicate this), for keyboard events,
-     the window with the keyboard focus.
+     These are typically directed at a specific surface - for pointer events,
+     the surface under the pointer (grabs complicate this), for keyboard events,
+     the surface with the keyboard focus.
     </para>
     <para>
      GDK translates these raw windowing system events into #GdkEvents.
@@ -81,9 +80,10 @@
       </simplelist>
     </para>
     <para>
-      When GTK is initialized, it sets up an event handler function with
-      gdk_event_handler_set(), which receives all of these input events
-      (as well as others, for instance window management related events).
+      When GTK creates a GdkSurface, it connects to the ::event signal
+      on it, which receives all of these input events. Surfaces have
+      signals and properties, e.g. to deal with window management
+      related events.
     </para>
   </refsect2>
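
As a concrete illustration of the ::event connection described in this hunk, here is a minimal sketch of a GdkSurface ::event handler. GTK installs its own handler when it creates the surface, so applications rarely need one; the exact callback signature and the existing `surface` variable are assumptions based on the text above, not something this commit defines.

    #include <gdk/gdk.h>

    /* Sketch of a GdkSurface ::event handler; GTK itself connects a
     * handler like this when it creates the surface. The signature is
     * an assumption based on the description above. */
    static gboolean
    on_surface_event (GdkSurface *surface,
                      GdkEvent   *event,
                      gpointer    user_data)
    {
      switch (gdk_event_get_event_type (event))
        {
        case GDK_KEY_PRESS:
        case GDK_BUTTON_PRESS:
        case GDK_TOUCH_BEGIN:
          g_print ("input event received\n");
          break;
        default:
          break;
        }

      return GDK_EVENT_PROPAGATE; /* do not consume; let GTK handle it */
    }

    /* Assuming 'surface' is an existing GdkSurface:
     * g_signal_connect (surface, "event", G_CALLBACK (on_surface_event), NULL);
     */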
 
@@ -91,8 +91,8 @@
     <title>Event propagation</title>
 
     <para>
-      For widgets which have a #GdkSurface set, events are received from the
-      windowing system and passed to gtk_main_do_event(). See its documentation
+      The function which initially receives input events on the GTK
+      side is gtk_main_do_event(). See its documentation
       for details of what it does: compression of enter/leave events,
       identification of the widget receiving the event, pushing the event onto a
       stack for gtk_get_current_event(), and propagating the event to the
@@ -120,62 +120,55 @@
 
     <para>
       An event is propagated to a widget using gtk_propagate_event().
-      Propagation differs between event types: key events (%GDK_KEY_PRESS,
-      %GDK_KEY_RELEASE) are delivered to the top-level #GtkWindow; other events
-      are propagated down and up the widget hierarchy in three phases (see
-      #GtkPropagationPhase).
+      Propagation goes down and up the widget hierarchy in three phases
+      (see #GtkPropagationPhase) towards a target widget.
     </para>
 
     <para>
-      For key events, the top-level window’s default #GtkWindow::key-press-event
-      and #GtkWindow::key-release-event signal handlers handle mnemonics and
-      accelerators first. Other key presses are then passed to
-      gtk_window_propagate_key_event() which propagates the event upwards from
-      the window’s current focus widget (gtk_window_get_focus()) to the
-      top-level.
+      For key events, the top-level window gets a first shot at activating
+      mnemonics and accelerators. If that does not consume the event,
+      the target widget for event propagation is the window's current
+      focus widget (see gtk_window_get_focus()).
     </para>
 
     <para>
-      For other events, in the first phase (the “capture” phase) the event is
-      delivered to each widget from the top-most (for example, the top-level
-      #GtkWindow or grab widget) down to the target #GtkWidget.
-      <link linkend="event-controllers-and-gestures">Gestures</link> that are
-      attached with %GTK_PHASE_CAPTURE get a chance to react to the event.
+      For pointer events, the target widget is determined by picking
+      the widget at the event's coordinates (see gtk_window_pick()).
     </para>
 
-    <para>
-      After the “capture” phase, the widget that was intended to be the
-      destination of the event will run gestures attached to it with
-      %GTK_PHASE_TARGET. This is known as the “target” phase, and only
-      happens on that widget.
+    <para>In the first phase (the “capture” phase) the event is
+      delivered to each widget from the top-most (the top-level
+      #GtkWindow or grab widget) down to the target #GtkWidget.
+      <link linkend="event-controllers-and-gestures">Event
+      controllers</link> that are attached with %GTK_PHASE_CAPTURE
+      get a chance to react to the event.
     </para>
 
     <para>
-      Next, the #GtkWidget::event signal is emitted.
-      Handling these signals was the primary way to handle input in GTK widgets
-      before gestures were introduced. The signal is emitted from
-      the target widget up to the top-level, as part of the “bubble” phase.
+      After the “capture” phase, the widget that was intended to be the
+      destination of the event will run event controllers attached to
+      it with %GTK_PHASE_TARGET. This is known as the “target” phase,
+      and only happens on that widget.
     </para>
 
     <para>
-      The default handlers for the event signals send the event
-      to gestures that are attached with %GTK_PHASE_BUBBLE. Therefore,
-      gestures in the “bubble” phase are only used if the widget does
-      not have its own event handlers, or takes care to chain up to the
-      default #GtkWidget handlers.
+      In the last phase (the “bubble” phase), the event is delivered
+      to each widget from the target to the top-most, and event
+      controllers attached with %GTK_PHASE_BUBBLE are run.
     </para>
 
     <para>
-      Events are not delivered to a widget which is insensitive or unmapped.
+      Events are not delivered to a widget which is insensitive or
+      unmapped.
     </para>
 
     <para>
-      Any time during the propagation phase, a widget may indicate that a
-      received event was consumed and propagation should therefore be stopped.
-      In traditional event handlers, this is hinted by returning %GDK_EVENT_STOP.
-      If gestures are used, this may happen when the widget tells the gesture
-      to claim the event touch sequence (or the pointer events) for its own. See the
-      "gesture states" section below to know more of the latter.
+      Any time during the propagation phase, a controller may indicate
+      that a received event was consumed and propagation should
+      therefore be stopped. If gestures are used, this may happen
+      when the gesture claims the event touch sequence (or the
+      pointer events) for its own. See the “gesture states” section
+      below to learn more about gestures and sequences.
     </para>
   </refsect2>
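
The propagation phase a controller is attached with decides in which of the three phases it runs. Below is a minimal sketch of a click gesture that runs in the capture phase; it assumes GTK4-style names (gtk_gesture_click_new(), gtk_widget_add_controller()), which may differ from the snapshot this commit targets (earlier snapshots used GtkGestureMultiPress and widget-bound constructors).

    #include <gtk/gtk.h>

    /* Print the coordinates of presses seen during the capture phase. */
    static void
    on_pressed (GtkGestureClick *gesture,
                int              n_press,
                double           x,
                double           y,
                gpointer         user_data)
    {
      g_print ("capture-phase press at %.0f,%.0f\n", x, y);
    }

    /* Attach a click gesture to 'widget' so that it runs in the
     * “capture” phase, i.e. before the event reaches the target widget. */
    static void
    attach_capture_click (GtkWidget *widget)
    {
      GtkGesture *click = gtk_gesture_click_new ();

      gtk_event_controller_set_propagation_phase (GTK_EVENT_CONTROLLER (click),
                                                  GTK_PHASE_CAPTURE);
      g_signal_connect (click, "pressed", G_CALLBACK (on_pressed), NULL);
      gtk_widget_add_controller (widget, GTK_EVENT_CONTROLLER (click));
    }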
 
@@ -183,27 +176,10 @@
     <title>Touch events</title>
 
     <para>
-      Touch events are emitted as events of type %GDK_TOUCH_BEGIN, %GDK_TOUCH_UPDATE or
-      %GDK_TOUCH_END, those events contain an “event sequence” that univocally identifies
-      the physical touch until it is lifted from the device.
-    </para>
-
-    <para>
-      On some windowing platforms, multitouch devices perform pointer emulation, this works
-      by granting a “pointer emulating” hint to one of the currently interacting touch
-      sequences, which will be reported on every #GdkEventTouch event from that sequence. By
-      default, if a widget didn't request touch events by setting %GDK_TOUCH_MASK on its
-      event mask and didn't override #GtkWidget::touch-event, GTK will transform these
-      “pointer emulating” events into semantically similar #GdkEventButton and #GdkEventMotion
-      events. Depending on %GDK_TOUCH_MASK being in the event mask or not, non-pointer-emulating
-      sequences could still trigger gestures or just get filtered out, regardless of the widget
-      not handling those directly.
-    </para>
-
-    <para>
-      If the widget sets %GDK_TOUCH_MASK on its event mask and doesn't chain up on
-      #GtkWidget::touch-event, only touch events will be received, and no pointer emulation
-      will be performed.
+      Touch events are emitted as events of type %GDK_TOUCH_BEGIN,
+      %GDK_TOUCH_UPDATE or %GDK_TOUCH_END. These events contain an
+      “event sequence” that uniquely identifies the physical touch
+      until it is lifted from the device.
     </para>
   </refsect2>
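
To see the event sequence carried by touch events, one can watch raw events with a legacy event controller. This is only a sketch, assuming the GTK4-style GtkEventControllerLegacy API; in practice gestures (described below) track sequences for you.

    #include <gtk/gtk.h>

    /* Log the event sequence of raw touch events. The sequence pointer
     * stays stable for the lifetime of the physical touch. */
    static gboolean
    on_legacy_event (GtkEventControllerLegacy *controller,
                     GdkEvent                 *event,
                     gpointer                  user_data)
    {
      GdkEventType type = gdk_event_get_event_type (event);

      if (type == GDK_TOUCH_BEGIN || type == GDK_TOUCH_UPDATE || type == GDK_TOUCH_END)
        {
          GdkEventSequence *sequence = gdk_event_get_event_sequence (event);
          g_print ("touch event for sequence %p\n", (void *) sequence);
        }

      return FALSE; /* do not consume; let propagation continue */
    }

    static void
    watch_touches (GtkWidget *widget)
    {
      GtkEventController *legacy = gtk_event_controller_legacy_new ();

      g_signal_connect (legacy, "event", G_CALLBACK (on_legacy_event), NULL);
      gtk_widget_add_controller (widget, legacy);
    }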
 
@@ -211,43 +187,66 @@
     <title>Grabs</title>
 
     <para>
-      Grabs are a method to claim all input events from a device, they happen
-      either implicitly on pointer and touch devices, or explicitly. Implicit grabs
-      happen on user interaction, when a #GdkEventButtonPress happens, all events from
-      then on, until after the corresponding #GdkEventButtonRelease, will be reported
-      to the widget that got the first event. Likewise, on touch events, every
-      #GdkEventSequence will deliver only events to the widget that received its
-      %GDK_TOUCH_BEGIN event.
+      Grabs are a method to claim all input events from a device;
+      they happen either implicitly on pointer and touch devices,
+      or explicitly. Implicit grabs happen on user interaction: when
+      a #GdkEventButtonPress happens, all events from then on, until
+      after the corresponding #GdkEventButtonRelease, will be reported
+      to the widget that got the first event. Likewise, on touch events,
+      every #GdkEventSequence will deliver events only to the widget
+      that received its %GDK_TOUCH_BEGIN event.
     </para>
 
     <para>
-      Explicit grabs happen programatically (both activation and deactivation),
-      and can be either system-wide (GDK grabs) or application-wide (GTK grabs).
-      On the windowing platforms that support it, GDK grabs will prevent any
-      interaction with any other application/window/widget than the grabbing one,
-      whereas GTK grabs will be effective only within the application (across all
-      its windows), still allowing for interaction with other applications.
+      Explicit grabs happen programmatically (both activation and
+      deactivation), and can be either system-wide (GDK grabs) or
+      application-wide (GTK grabs). On windowing platforms that support
+      it, a GDK grab prevents any interaction with any application,
+      window or widget other than the grabbing one, whereas a GTK grab
+      is effective only within the application (across all its windows),
+      still allowing for interaction with other applications.
     </para>
 
     <para>
-      But one important aspect of grabs is that they may potentially happen at any
-      point somewhere else, even while the pointer/touch device is already grabbed.
-      This makes it necessary for widgets to handle the cancellation of any ongoing
-      interaction. Depending on whether a GTK or GDK grab is causing this, the
-      widget will respectively receive a #GtkWidget::grab-notify signal, or a
+      One important aspect of grabs is that they may be initiated
+      from somewhere else at any point, even while the pointer/touch
+      device is already grabbed. This makes it necessary for widgets
+      to handle the cancellation of any ongoing interaction. Depending
+      on whether a GTK or GDK grab is causing this, the widget will
+      respectively receive a #GtkWidget::grab-notify signal, or a
       #GdkEventGrabBroken event.
     </para>
 
     <para>
-      On gestures, these signals are handled automatically, causing the gesture
-      to cancel all tracked pointer/touch events, and signal the end of recognition.
+      For gestures, these signals are handled automatically, causing
+      the gesture to cancel all tracked pointer/touch events, and
+      signal the end of recognition.
     </para>
   </refsect2>
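
A widget that tracks an ongoing interaction by hand needs to react to the #GtkWidget::grab-notify signal mentioned above. A minimal sketch follows; the reset_interaction() helper is hypothetical, standing in for whatever application-specific cleanup is needed.

    #include <gtk/gtk.h>

    /* Hypothetical helper that abandons whatever press/drag the widget
     * was in the middle of tracking. */
    static void
    reset_interaction (GtkWidget *widget)
    {
      /* application-specific cleanup would go here */
    }

    /* ::grab-notify tells a widget that a GTK grab on another widget
     * started or ended; was_grabbed == FALSE means this widget just
     * became shadowed by the grab, so any ongoing interaction should
     * be cancelled. */
    static void
    on_grab_notify (GtkWidget *widget,
                    gboolean   was_grabbed,
                    gpointer   user_data)
    {
      if (!was_grabbed)
        reset_interaction (widget);
    }

    /* g_signal_connect (widget, "grab-notify", G_CALLBACK (on_grab_notify), NULL); */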
 
   <refsect2>
     <title>Keyboard input</title>
 
-    <!-- focus, tab, directional navigation -->
+    <para>
+      Every #GtkWindow maintains a single focus location (in
+      the ::focus-widget property). The focus widget is the
+      target widget for key events sent to the window. Only
+      widgets which have ::can-focus set to %TRUE can become
+      the focus. Typically these are input controls such as
+      entries or text fields, but e.g. buttons can take the
+      focus too.
+    </para>
+
+    <para>
+      Input widgets can be given the focus by clicking on them,
+      but focus can also be moved around with certain key
+      events (this is known as “keyboard navigation”). GTK
+      reserves the Tab key to move the focus to the next location,
+      and Shift-Tab to move it back to the previous one. In addition,
+      many containers allow “directional navigation” with the
+      arrow keys.
+    </para>
+
     <!-- mnemonics, accelerators, bindings -->
   </refsect2>
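
A short sketch of the focus handling described above, using standard GtkWidget API (gtk_widget_set_can_focus(), gtk_widget_grab_focus()); the widget variables are placeholders.

    #include <gtk/gtk.h>

    static void
    setup_focus (GtkWidget *entry, GtkWidget *custom)
    {
      /* Only widgets with ::can-focus set to TRUE can become the focus */
      gtk_widget_set_can_focus (custom, TRUE);

      /* Equivalent to the user clicking the entry: it becomes the
       * window's focus widget and receives subsequent key events */
      gtk_widget_grab_focus (entry);
    }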
 
@@ -255,37 +254,43 @@
     <title>Event controllers and gestures</title>
 
     <para>
-      Event controllers are standalone objects that can perform specific actions
-      upon received #GdkEvents. These are tied to a #GtkWidget, and can be told of
-      the event propagation phase at which they will manage the events.
+      Event controllers are standalone objects that can perform
+      specific actions upon received #GdkEvents. These are tied
+      to a #GtkWidget, and can be told of the event propagation
+      phase at which they will manage the events.
     </para>
 
     <para>
-      Gestures are a set of specific controllers that are prepared to handle pointer
-      and/or touch events, each gestures implementation attempts to recognize specific
-      actions out the received events, notifying of the state/progress accordingly to
-      let the widget react to those. On multi-touch gestures, every interacting touch
-      sequence will be tracked independently.
+      Gestures are a set of specific controllers that are prepared
+      to handle pointer and/or touch events. Each gesture
+      implementation attempts to recognize specific actions out of
+      the received events, notifying of the state/progress accordingly
+      to let the widget react to those. On multi-touch gestures, every
+      interacting touch sequence will be tracked independently.
     </para>
 
     <para>
-      Being gestures “simple” units, it is not uncommon to tie several together to
-      perform higher level actions, grouped gestures handle the same event sequences
-      simultaneously, and those sequences share a same state across all grouped
+      Since gestures are “simple” units, it is not uncommon to tie
+      several together to perform higher level actions. Grouped
+      gestures handle the same event sequences simultaneously, and
+      those sequences share the same state across all grouped
       gestures. Some examples of grouping may be:
 
       <simplelist>
        <member>
-         A “drag” and a “swipe” gestures may want grouping. The former will report
-         events as the dragging happens, the latter will tell the swipe X/Y velocities
-         only after gesture has finished.
+         A “drag” and a “swipe” gesture may want grouping.
+          The former will report events as the dragging happens,
+          the latter will tell the swipe X/Y velocities only after
+          recognition has finished.
        </member>
        <member>
-         Grouping a “drag” gesture with a “pan” gesture will only effectively allow
-         dragging in the panning orientation, as both gestures share state.
+         Grouping a “drag” gesture with a “pan” gesture will only
+          effectively allow dragging in the panning orientation, as
+          both gestures share state.
        </member>
        <member>
-         If “press” and “long press” are wanted simultaneously, those would need grouping.
+         If “press” and “long press” are wanted simultaneously,
+          those would need grouping.
        </member>
       </simplelist>
     </para>
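
The first grouping example in the list above could look roughly like this; the no-argument constructors and gtk_widget_add_controller() are GTK4-style assumptions, while gtk_gesture_group() is the documented grouping call.

    #include <gtk/gtk.h>

    /* Group a drag and a swipe gesture on the same widget: the drag
     * reports motion while it happens, the swipe reports velocities
     * once recognition has finished. Grouped gestures handle the same
     * event sequences and share their state. */
    static void
    add_drag_and_swipe (GtkWidget *widget)
    {
      GtkGesture *drag = gtk_gesture_drag_new ();
      GtkGesture *swipe = gtk_gesture_swipe_new ();

      gtk_widget_add_controller (widget, GTK_EVENT_CONTROLLER (drag));
      gtk_widget_add_controller (widget, GTK_EVENT_CONTROLLER (swipe));

      gtk_gesture_group (drag, swipe);
    }
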
@@ -294,34 +299,39 @@
   <refsect2>
     <title>Gesture states</title>
     <para>
-      Gestures have a notion of “state” for each individual touch sequence. When events
-      from a touch sequence are first received, the touch sequence will have “none” state,
-      this means the touch sequence is being handled by the gesture to possibly trigger
+      Gestures have a notion of “state” for each individual touch
+      sequence. When events from a touch sequence are first received,
+      the touch sequence will have “none” state. This means the touch
+      sequence is being handled by the gesture to possibly trigger
       actions, but the event propagation will not be stopped.
     </para>
 
     <para>
-      When the gesture enters recognition, or at a later point in time, the widget may
-      choose to claim the touch sequences (individually or as a group), hence stopping
-      event propagation after the event is run through every gesture in that widget and
-      propagation phase. Anytime this happens, the touch sequences are cancelled downwards
-      the propagation chain, to let these know that no further events will be sent.
+      When the gesture enters recognition, or at a later point in time,
+      the widget may choose to claim the touch sequences (individually
+      or as a group), hence stopping event propagation after the event
+      is run through every gesture in that widget and propagation phase.
+      Anytime this happens, the touch sequences are cancelled downwards
+      in the propagation chain, to let the affected widgets know that
+      no further events will be sent.
     </para>
 
     <para>
-      Alternatively, or at a later point in time, the widget may choose to deny the touch
-      sequences, thus letting those go through again in event propagation. When this happens
-      in the capture phase, and if there are no other claiming gestures in the widget,
+      Alternatively, or at a later point in time, the widget may choose
+      to deny the touch sequences, thus letting those go through again
+      in event propagation. When this happens in the capture phase, and
+      if there are no other claiming gestures in the widget,
       a %GDK_TOUCH_BEGIN/%GDK_BUTTON_PRESS event will be emulated and
       propagated downwards, in order to preserve consistency.
     </para>
 
     <para>
-      Grouped gestures always share the same state for a given touch sequence, so setting
-      the state on one does transfer the state to the others. They also are mutually exclusive,
-      within a widget there may be only one gesture group claiming a given sequence. If
-      another gesture group claims later that same sequence, the first group will deny the
-      sequence.
+      Grouped gestures always share the same state for a given touch
+      sequence, so setting the state on one transfers the state to
+      the others. They are also mutually exclusive; within a widget
+      there may be only one gesture group claiming a given sequence.
+      If another gesture group later claims that same sequence, the
+      first group will deny the sequence.
     </para>
   </refsect2>
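
Claiming a sequence typically happens from a gesture signal handler, for example once a drag starts to be recognized. A minimal sketch using gtk_gesture_set_state(), which sets the state of all sequences the gesture is currently tracking:

    #include <gtk/gtk.h>

    /* Once the drag starts recognizing, claim its sequences so that
     * event propagation stops and other widgets see the sequences as
     * cancelled. */
    static void
    on_drag_begin (GtkGestureDrag *gesture,
                   double          start_x,
                   double          start_y,
                   gpointer        user_data)
    {
      gtk_gesture_set_state (GTK_GESTURE (gesture), GTK_EVENT_SEQUENCE_CLAIMED);
    }

    /* g_signal_connect (drag, "drag-begin", G_CALLBACK (on_drag_begin), NULL); */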
 

