Multitouch review 6: gestures

  GtkWidget::gesture signal

There's a simple API here that consists of a number of predefined
gestures - left/right/up/down swipes as well as circular motions.
These can be enabled for a widget with gtk_widget_enable_gesture. GTK+
then creates a gesture interpreter behind the scenes, and feeds all
touch events that are targeted to the widget to the interpreter. When
an enabled gesture is recognized, the ::gesture signal is emitted.
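As a sketch of how this might look from the application side - note that gtk_widget_enable_gesture and the ::gesture signal are proposed API from the branch, so the signature of the signal handler and the gesture id constant used here are assumptions, not shipping GTK+ API:

```c
/* Sketch only: gtk_widget_enable_gesture and ::gesture are proposed
 * branch API; the handler signature and GTK_GESTURE_SWIPE_LEFT id
 * are assumptions based on the description above. */
static void
on_gesture (GtkWidget *widget,
            guint      gesture_id,
            gpointer   user_data)
{
  if (gesture_id == GTK_GESTURE_SWIPE_LEFT)
    g_print ("left swipe recognized\n");
}

static void
setup_gestures (GtkWidget *widget)
{
  /* Enable recognition of a predefined gesture for this widget;
   * GTK+ creates the gesture interpreter behind the scenes. */
  gtk_widget_enable_gesture (widget, GTK_GESTURE_SWIPE_LEFT);

  g_signal_connect (widget, "gesture",
                    G_CALLBACK (on_gesture), NULL);
}
```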

It is possible to register new gestures by creating one or more
GtkGestureStroke objects, combining them into a GtkGesture object, and
then calling gtk_gesture_register_static. The resulting integer id can
be passed to gtk_widget_enable_gesture to enable recognition of the
custom gesture for a widget.
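The registration flow described above might look roughly like this - again a sketch, since GtkGestureStroke, GtkGesture and gtk_gesture_register_static are proposed API, and the constructors and stroke-building calls shown are assumptions:

```c
/* Sketch only: constructors and the stroke-building call are
 * assumptions; only the register/enable flow is from the review. */
GtkGestureStroke *stroke;
GtkGesture *gesture;
guint gesture_id;

stroke = gtk_gesture_stroke_new ();
/* ... describe the stroke geometry here ... */

gesture = gtk_gesture_new ();
gtk_gesture_add_stroke (gesture, stroke);

/* The resulting integer id can then be passed to
 * gtk_widget_enable_gesture to enable the custom gesture. */
gesture_id = gtk_gesture_register_static (gesture);
gtk_widget_enable_gesture (widget, gesture_id);
```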

The GtkGestureInterpreter object is also available as public API, and
it is possible to create one and do all the gesture recognition and
event feeding manually - in that case, ::gesture is naturally not
emitted.
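Manual use of the interpreter could look like the following sketch; GtkGestureInterpreter is proposed public API, and the function names for creating the interpreter, adding gestures and feeding events are assumptions based on the description above:

```c
/* Sketch only: gtk_gesture_interpreter_new/add_gesture/feed_event
 * are assumed names for the proposed API. */
static GtkGestureInterpreter *interpreter;

static gboolean
touch_event_cb (GtkWidget *widget,
                GdkEvent  *event,
                gpointer   user_data)
{
  /* Feed touch events to the interpreter by hand; recognition
   * results are then reported by the interpreter itself, not via
   * the ::gesture signal. */
  gtk_gesture_interpreter_feed_event (interpreter, event);

  return GDK_EVENT_PROPAGATE;
}

static void
setup_manual (GtkWidget *widget, guint gesture_id)
{
  interpreter = gtk_gesture_interpreter_new ();
  gtk_gesture_interpreter_add_gesture (interpreter, gesture_id);

  g_signal_connect (widget, "touch-event",
                    G_CALLBACK (touch_event_cb), NULL);
}
```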

Only touch events are looked at for gesture recognition.

Questions and comments:

- Is the GtkGestureInterpreter really something that we need to expose
in public API? It seems that would only encourage bad ideas - or is
there a concrete use case for this?

- It seems to me that we feed touch events to the gesture interpreting
machinery, but then proceed to handle them as normal - unlike the
capture/release mechanism that we were talking about earlier. I guess
a widget that is waiting for gestures is unlikely to have other uses
for touch events.

- The kinetic scrolling is looking for swipes - but is not using this
gesture interpreter mechanism. Should it ?
