Re: RFC: gestures



Hey Bastien,

On Tue, 2014-03-04 at 02:28 +0100, Bastien Nocera wrote:
On Tue, 2014-03-04 at 00:55 +0100, Carlos Garnacho wrote:
Hey everyone,

In the past days I've been hacking again on the gestures branch, and
it's reaching a state where I feel it's getting quite solid, so I would
like to get discussion started, tentatively aiming to get this included
early in 3.13.

Overview
========

The two object types this relies on are GtkEventController and
GtkGesture. GtkEventController is a very low-level abstraction for
something that just "handles events". GtkGesture is a subclass very much
centered around handling single or multiple sequences of
press/update.../release events; by default it's restricted to handling
touch events, although it can be made to listen to mouse events, either
through API or through the GTK_TEST_TOUCHSCREEN envvar (a NULL
GdkEventSequence is used in those cases).
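
For illustration, a rough sketch of how attaching one of these to a
widget could look (the constructor/setter names here are as in the
branch right now and may still change; the widget is expected to hold
on to the returned gesture):

#include <gtk/gtk.h>

static void
swipe_cb (GtkGestureSwipe *gesture,
          gdouble          velocity_x,
          gdouble          velocity_y,
          gpointer         user_data)
{
  g_print ("swipe ended with velocity %f,%f\n", velocity_x, velocity_y);
}

static void
attach_swipe (GtkWidget *widget)
{
  GtkGesture *gesture;

  gesture = gtk_gesture_swipe_new (widget);
  g_signal_connect (gesture, "swipe", G_CALLBACK (swipe_cb), NULL);

  /* Touch events are handled by default; to also get mouse events
   * (reported with a NULL GdkEventSequence), flip the touch-only
   * setting here, or run with GTK_TEST_TOUCHSCREEN=1. */
  gtk_gesture_single_set_touch_only (GTK_GESTURE_SINGLE (gesture), FALSE);
}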

Multiple GtkGesture implementations are offered in the branch (a quick
usage sketch follows the list):

      * Drag: keeps track of drags, reporting the offset from the drag
        start point.
      * Swipe: reports x/y velocity at the end of a begin/update/end
        sequence.
      * LongPress: reports long presses, or their cancellation when the
        threshold/timeout is exceeded.
      * MultiPress: reports multiple presses, as long as they happen
        within the double-click threshold/timeout.
      * Rotate: reports angle changes between two touch sequences.
      * Zoom: reports distance changes between two touch sequences, as a
        factor of the initial distance.
What about the single tap/press? 

Maybe "MultiPress" is a bit of a misnomer, but single press is a case
observed there too, you just get a n_presses counter on the ::pressed
signal, and another signal when the n_presses counter is reset (eg. no
more presses, or a too far next click)
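
Something along these lines (a sketch; the exact ::pressed parameters
may still change before the merge):

#include <gtk/gtk.h>

static void
pressed_cb (GtkGestureMultiPress *gesture,
            gint                  n_press,
            gdouble               x,
            gdouble               y,
            gpointer              user_data)
{
  /* n_press is 1 on a single press, 2 on a double press, and so on */
  g_print ("press #%d at %f,%f\n", n_press, x, y);
}

static void
stopped_cb (GtkGestureMultiPress *gesture,
            gpointer              user_data)
{
  /* Emitted when the counter resets: timeout expired, or the next
   * press landed too far away */
  g_print ("press counter reset\n");
}

static void
attach_multipress (GtkWidget *widget)
{
  GtkGesture *gesture = gtk_gesture_multi_press_new (widget);

  g_signal_connect (gesture, "pressed", G_CALLBACK (pressed_cb), NULL);
  g_signal_connect (gesture, "stopped", G_CALLBACK (stopped_cb), NULL);
}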

Do the gestures for which it makes sense also give back the center of the operation? 

In the signals where this is not offered, it can be obtained through the
generic GtkGesture API; there are gtk_gesture_get_bounding_box[_center]()
and gtk_gesture_get_point(), among other helpers.
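
For example, something like this could be called from any gesture signal
handler (sketch; assuming the helper signatures as they currently are in
the branch):

#include <gtk/gtk.h>

static void
print_gesture_center (GtkGesture *gesture)
{
  gdouble x, y;

  /* Center of the bounding box of all sequences the gesture handles */
  if (gtk_gesture_get_bounding_box_center (gesture, &x, &y))
    g_print ("gesture centered at %f,%f\n", x, y);

  /* Coordinates of a single sequence; NULL stands for the pointer
   * "sequence" when mouse events are being handled */
  if (gtk_gesture_get_point (gesture, NULL, &x, &y))
    g_print ("pointer at %f,%f\n", x, y);
}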

Do the gestures know
about each other? (like, if there's no long-press in my widget, will it
understand that I'm starting a drag or I'm slow at tapping?)

The inter-gesture knowledge is something delegated to the caller
GtkWidget, as the exclusive/cooperative behavior is highly dependent on
it. Widgets are meant to set the sequence state accordingly (directly or
by overriding GtkWidget::sequence-state-changed), and/or trigger one
state or another based on which gestures are active/recognized.
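
For instance, a widget could claim a sequence for itself once its drag
gesture kicks in, so other parties interested in that sequence are told
to let go (a sketch; the state setter and enum names are as currently
in the branch and may still change):

#include <gtk/gtk.h>

static void
drag_begin_cb (GtkGestureDrag *gesture,
               gdouble         start_x,
               gdouble         start_y,
               gpointer        user_data)
{
  GdkEventSequence *sequence;

  /* Claim the triggering sequence for this gesture/widget */
  sequence = gtk_gesture_get_last_updated_sequence (GTK_GESTURE (gesture));
  gtk_gesture_set_sequence_state (GTK_GESTURE (gesture), sequence,
                                  GTK_EVENT_SEQUENCE_CLAIMED);
}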

By default, all gestures may be triggered at once, so a single touch
press could start recognition on the drag/swipe/longpress/multipress
gestures. In the case of a single button/touch press you could get:

GtkGestureDrag::drag-begin
GtkGestureMultiPress::pressed (n_presses = 1)
... (double click timeout passes by)
GtkGestureMultiPress::stopped
... (long press timeout passes by)
GtkGestureLongPress::pressed
... (if a motion happens anytime after)
GtkGestureDrag::drag-update
... (when a release happens)
GtkGestureDrag::drag-end
GtkGestureSwipe::swipe

IMO, in many scenarios this within-widget parallel interpretation makes
sense, as some gestures are naturally more cooperative with one another
(e.g. rotate and zoom, drag and swipe...). As an example,
GtkScrolledWindow consists of drag+swipe in cooperation, while
long-press cancels the three when triggered, so these stop taking effect
and the child receives events again.

There could also be more complex cases, e.g. canvas-like widgets
offering multiple manipulable elements; depending on the complexity,
that might be another case where selective event delivery through
gtk_event_controller_handle_event() could be handy.
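
Roughly, such a widget could pick the right controller itself and feed
it events manually (sketch; canvas_find_controller_for_event() is a
made-up helper standing for whatever item lookup the widget does):

#include <gtk/gtk.h>

/* Hypothetical helper implemented by the canvas widget: returns the
 * controller of the item under the event coordinates, or NULL */
static GtkEventController * canvas_find_controller_for_event (GtkWidget *widget,
                                                              GdkEvent  *event);

static gboolean
canvas_event (GtkWidget *widget, GdkEvent *event)
{
  GtkEventController *controller;

  controller = canvas_find_controller_for_event (widget, event);

  if (controller &&
      gtk_event_controller_handle_event (controller, event))
    return GDK_EVENT_STOP;

  return GDK_EVENT_PROPAGATE;
}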

Cheers,
  Carlos


