Re: touch events



On Tue, Feb 07, 2012 at 08:39:23PM +0100, Chase Douglas wrote:
> On 02/07/2012 07:58 PM, Benjamin Otte wrote:
> > So, here's a few interesting things I learned in the last few days while
> > talking to people about touch (mostly Peter Hutterer and Chase
> > Douglas, more power to them for getting my X up and running with touch
> > events).
> > 
> > 1) There is a huge difference between touchscreens and touchpads.
> > 
> > No platform exists that caters to both touchscreens and touchpads
> > equally. OS X is a touchpad world, iOS is a touchscreen world. Windows
> > is a touchscreen world. (I have no idea if and how Qt caters to both,
> > but my guess is it doesn't.) So XInput 2.2 seems to be the first
> > platform that tries.
> > Now, if you get a touch event sequence, those events have X and Y
> > coordinates. Unfortunately, these coordinates refer to the position of
> > the mouse pointer and not the position of the touch. And for
> > touchpads, the mouse pointer doesn't move. So you get the same x/y
> > coordinate all the time, for every event. For touchscreens of course,
> > that is entirely different as touchscreens move the mouse pointer to
> > wherever you touched last.
> > Of course, on touchpads you also get the position of the touch on the
> > touchpad. These coordinates refer to the physical touchpad and are not
> > related to the screen at all. What does that mean? Exactly, it means
> > that if you start doing math based on these coordinates, you are
> > wrong. How many units does one have to move to get a noticeable effect?
> > Who knows, it's not related to pixels anyway! Unless you're thinking
> > of touchscreens again, where things match the screen perfectly fine
> > for these axes.
> > Last but not least, touchpads don't emit touch events for single
> > finger touches at all. For those touches, you use regular mouse events
> > - GdkEventButton and GdkEventMotion are your friend.
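> > 
> > For reference, here's a very rough sketch (raw XI 2.2, not GDK API) of
> > how a toolkit can tell the two apart - the touch class of a device has
> > a mode field that is either XIDirectTouch or XIDependentTouch:
> > 
> > =====
> > #include <X11/Xlib.h>
> > #include <X11/extensions/XInput2.h>
> > 
> > /* Returns XIDirectTouch (touchscreen), XIDependentTouch (touchpad),
> >  * or 0 if the device has no touch class at all. */
> > static int
> > touch_mode_for_device (Display *dpy, int deviceid)
> > {
> >   int ndevices = 0;
> >   int mode = 0;
> >   XIDeviceInfo *info = XIQueryDevice (dpy, deviceid, &ndevices);
> > 
> >   if (info == NULL)
> >     return 0;
> > 
> >   for (int i = 0; i < info->num_classes; i++)
> >     {
> >       if (info->classes[i]->type == XITouchClass)
> >         mode = ((XITouchClassInfo *) info->classes[i])->mode;
> >     }
> > 
> >   XIFreeDeviceInfo (info);
> >   return mode;
> > }
> > =====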
> > 
> > Of course, developers usually only have either a touchpad or a
> > touchscreen. So when they're developing code, they will only
> > check one of the two. And I guess this usually means that the other
> > type of device will be broken. In fact, I have been using a test
> > program written by Peter himself, and even that one is pretty much
> > useless for touchpads.
> > So my idea so far - though I'm not sure about this yet, which is why
> > I'm typing all this - is to make this difference very prominent inside
> > GTK and to use different event types. So for touchscreens, you'd get a
> > GDK_DIRECT_TOUCH_BEGIN/UPDATE/END and for your pads, you'd get
> > GDK_INDIRECT_TOUCH_BEGIN/UPDATE/END events. And we might even end up
> > giving you different events - a struct GdkEventDirectTouch vs a struct
> > GdkEventIndirectTouch - so we have the useful information in there,
> > and don't encourage people to write event handlers that look at x/y.
> 
> The approach seems good in theory. Note that there's still at least one
> more type of device that has not been accounted for yet: independent
> devices. Examples include traditional mice with a touch surface on top
> for scrolling or other gestures, like the Apple Magic Mouse or the
> Microsoft Arc Mouse.
> 
> XI 2.2 doesn't specifically cater to these devices. They would be
> handled as indirect devices for now, and X server 1.12 should work
> just fine for event handling, but there isn't a standard yet for how to
> denote an independent device. We could just annotate an indirect device
> with a read-only device property (use xinput list-props <device
> name|device number> to see device properties). Peter suggested this
> approach a while back since the server doesn't really need to know if
> the device is dependent (touchpad) or independent. Alternatively, we
> could introduce a new device type in XI 2.3.
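> 
> As a rough illustration only - the property name below is made up, since
> nothing is standardized yet - a client would read such an annotation via
> XIGetProperty along these lines:
> 
> =====
> #include <X11/Xlib.h>
> #include <X11/Xatom.h>
> #include <X11/extensions/XInput2.h>
> 
> /* Check a hypothetical read-only boolean property on the device. */
> static Bool
> device_is_independent (Display *dpy, int deviceid)
> {
>   Atom prop = XInternAtom (dpy, "Touch Device Independent", True);
>   Atom type;
>   int format;
>   unsigned long nitems, bytes_after;
>   unsigned char *data = NULL;
>   Bool result = False;
> 
>   if (prop == None)
>     return False;
> 
>   if (XIGetProperty (dpy, deviceid, prop, 0, 1, False, XA_INTEGER,
>                      &type, &format, &nitems, &bytes_after,
>                      &data) == Success && data != NULL && nitems > 0)
>     result = (data[0] != 0);
> 
>   if (data != NULL)
>     XFree (data);
>   return result;
> }
> =====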
> 
> The main thesis I'm advocating here is that whatever scheme is
> developed, keep in mind that there is at least one more device type out
> there, and who knows if there will be more in the future. I don't think
> there are any issues with your approach of splitting direct and indirect
> events; we'll have to decide in the future if we should create a new set
> of events for independent devices for the exact same reasons you
> outlined above.
> 
> > 2) system-level gestures?
> > 
> > If you've read http://who-t.blogspot.com/2012/01/multitouch-in-x-touch-grab-handling.html
> > (if not, it may help to read it to understand this paragraph), you
> > know that touch events are only forwarded to the application when all
> > grabs have rejected the touch event.
> > Now, the WM will probably want to do a touch grab, so that the usual
> > window management gestures (OS X users will recognize 4 finger swipes
> > for expose for example) can be recognized. And this means that until
> > the WM rejects the touch event (because it realized it's not a
> > relevant gesture), the application doesn't get those touch events
> > delivered. And that can cause lag.
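> > 
> > To make the mechanism concrete, here's a very rough sketch (raw XI 2.2,
> > nothing GDK about it) of the WM side: a passive touch grab on the root
> > window, plus an explicit reject once a touch turns out not to be part
> > of a WM gesture:
> > 
> > =====
> > #include <string.h>
> > #include <X11/Xlib.h>
> > #include <X11/extensions/XInput2.h>
> > 
> > /* deviceid would typically be a master pointer or XIAllMasterDevices. */
> > static void
> > wm_grab_touches (Display *dpy, int deviceid, Window root)
> > {
> >   unsigned char m[XIMaskLen (XI_LASTEVENT)];
> >   XIEventMask mask;
> >   XIGrabModifiers mods;
> > 
> >   memset (m, 0, sizeof (m));
> >   XISetMask (m, XI_TouchBegin);
> >   XISetMask (m, XI_TouchUpdate);
> >   XISetMask (m, XI_TouchEnd);
> > 
> >   mask.deviceid = deviceid;
> >   mask.mask_len = sizeof (m);
> >   mask.mask = m;
> > 
> >   mods.modifiers = XIAnyModifier;
> >   mods.status = 0;
> > 
> >   XIGrabTouchBegin (dpy, deviceid, root, False, &mask, 1, &mods);
> > }
> > 
> > /* Called as soon as the WM knows the touch is not one of its gestures;
> >  * only then does the next grab (or the client) become the owner. */
> > static void
> > wm_reject_touch (Display *dpy, int deviceid, unsigned int touchid,
> >                  Window root)
> > {
> >   XIAllowTouchEvents (dpy, deviceid, touchid, root, XIRejectTouch);
> > }
> > =====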
> > Now there's multiple ways to cope with this:
> > - Make the WM reject touch events quickly
> > This requires a well-written WM (Ce Oh Em Pe I Zeeeee!!!) that rejects
> > touches quickly. So quickly in fact that it's not noticeable for a
> > regular person that the touches weren't sent through to the app
> > immediately. Even when they use it to draw squiggly lines on the
> > screen. I'm not sure if that is even possible.
> 
> No, it's not really possible to guarantee the WM will be able to reject
> within a bounded timeframe. Imagine the WM wants to recognize stroke
> based gestures. With multiple strokes, the gesture recognition could
> occur over an unbounded amount of time, only to be rejected at the end
> when a stroke doesn't match. Imagine a stroke gesture for the letter M.
> If you make an N and stop, you might have consumed 1 second before the
> WM can reject the touch.
> 
> In Unity, we use tight thresholds and timeouts to bound our gestures,
> but GTK+ can't assume that all WMs will do the same.
> 
> > - Use a different model in the WM
> > The WM could skip the grab and just listen for touch events on the
> > root window. In that case it'd only get touch events for all the
> > touches that applications haven't accepted. But applications accept
> > touches by default. So unless applications are well-written and
> > carefully reject all touches they don't care about, your system-level
> > gestures won't trigger if apps listen for touch events...
> 
> I think it's better to assume the WM will be written correctly than
> individual apps :). The Unity gesture guidelines also state that WM
> gestures take precedence over application multitouch and gesture
> interactions. That's not to say that other WMs must behave the same way,
> but GTK+ shouldn't assume clients take precedence.
> 
> > - Use XI_TouchOwnership
> > This way the app would get pending touch events even when it's not the
> > owner and could already do things. But it would require a very
> > prominent GDK_TOUCH_EVENT_CANCEL so that when the WM accepts the
> > touch, the app can revert everything it already did for the current
> > touch.
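> > 
> > As a sketch of the X side of this (GDK_TOUCH_EVENT_CANCEL is of course
> > only a proposed name), the client selects for XI_TouchOwnership on top
> > of the touch events; roughly speaking, a TouchEnd that arrives before
> > we were ever made owner is the "cancel, revert everything" case:
> > 
> > =====
> > #include <string.h>
> > #include <X11/Xlib.h>
> > #include <X11/extensions/XInput2.h>
> > 
> > static void
> > select_touch_events_with_ownership (Display *dpy, Window win)
> > {
> >   unsigned char m[XIMaskLen (XI_LASTEVENT)];
> >   XIEventMask mask;
> > 
> >   memset (m, 0, sizeof (m));
> >   XISetMask (m, XI_TouchBegin);
> >   XISetMask (m, XI_TouchUpdate);
> >   XISetMask (m, XI_TouchEnd);
> >   /* Deliver touch events even while someone else still owns the touch,
> >    * plus an XI_TouchOwnership event if/when we become the owner. */
> >   XISetMask (m, XI_TouchOwnership);
> > 
> >   mask.deviceid = XIAllMasterDevices;
> >   mask.mask_len = sizeof (m);
> >   mask.mask = m;
> >   XISelectEvents (dpy, win, &mask, 1);
> > }
> > =====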
> 
> Just as this is an optional feature in X, I suggest making it optional
> in GTK+ too. I don't think many applications will need this
> functionality. However, I do think this is the best way forward when an
> application must have low-latency. I'm biased, however, since I proposed
> the touch ownership mechanism for XI 2.2.
> 
> > - something else?
> > Did I miss something here? Or is getting system-level gestures right
> > really complicated?
> 
> I don't think you've missed anything, and yes, system-level gestures are
> a bit complicated :).
> 
> > 3) Drivers still do gesture recognition
> > 
> > You all know and love this feature of the synaptics driver. 2 finger
> > swipes cause scrolling, 3 finger taps cause a middle mouse click. And
> > that's not gonna change: If you tap with 3 fingers in the future, you
> > get a bunch of touch events _and_ a button press/release event combo.
> > Why is that bad? Because if you implement a 3-finger tap gesture in
> > your widget that does a clipboard paste, you will now suddenly be
> > doing that paste twice.
> > Can't you just disable one of those? No, not really, because some
> > drivers (non-synaptics ones, for example) may not do gesture
> > recognition, so a 3-finger tap would then do nothing; likewise, some
> > old widgets (like all of GTK 2) may not be listening to touch events.
> > Couldn't the X server do it for you then? No, not really, because if
> > you have an old widget (for example a GtkHTML widget) and a new widget
> > and put them in the same GtkWindow, there'll only be one X window
> > because of client-side windows. And the X server wouldn't know which
> > event to send.
> > So how do we fix this? I have no idea, but I suppose it'll cause lots
> > of madness.
> 
> This is still a bit up in the air, but it matches what I think should
> happen. For example, it would require the following logic in a
> touch-aware toolkit that wants to perform kinetic smooth scrolling:
> 
> =====
> 
> device addition event
> if (device is touch capable)
>   subscribe to two-touch drag events
> 
> X scroll event (either X button event for buttons 4,5,6,7 or XI 2.1
>                 smooth scrolling event)
> if (device is *not* touch capable or scroll is not derived from touches)
>   scroll as normal
> 
> two-touch drag events (from a gesture system like utouch)
>   scroll as normal
> 
> =====
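> 
> The first check maps fairly directly onto XI 2.x. A rough sketch, looking
> at the device's classes for touch and smooth-scroll capability (there is,
> today, no equivalent way to tell whether a given scroll event was derived
> from touches - see below):
> 
> =====
> #include <X11/Xlib.h>
> #include <X11/extensions/XInput2.h>
> 
> static void
> query_scroll_related_caps (Display *dpy, int deviceid,
>                            Bool *has_touch, Bool *has_smooth_scroll)
> {
>   int ndevices = 0;
>   XIDeviceInfo *info = XIQueryDevice (dpy, deviceid, &ndevices);
> 
>   *has_touch = False;
>   *has_smooth_scroll = False;
> 
>   if (info == NULL)
>     return;
> 
>   for (int i = 0; i < info->num_classes; i++)
>     {
>       if (info->classes[i]->type == XITouchClass)
>         *has_touch = True;
>       else if (info->classes[i]->type == XIScrollClass)
>         *has_smooth_scroll = True;
>     }
> 
>   XIFreeDeviceInfo (info);
> }
> =====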

Here's one ambiguous case if we send both converted events and the touch
events:

For in-driver tapping, you may get the following events:
Button press event
Touch begin/update event
Button release event

Now: was the physical button pressed or not? Because if it was, the toolkit
cannot ignore the button press event - the tap was unrelated to it (e.g.
a ClickFinger-like gesture).

AFAICT you can't know without looking into the driver - which the toolkit
can't and shouldn't. The event ordering is not guaranteed by the protocol,
so the above sequence is valid - even if the current implementation does not
send it that way. You can guess based on timestamps, event coordinates, etc.
But it's still guesswork, something I'm not comfortable with.

Cheers,
  Peter

> 
> Determining if a device is touch capable is not very hard. However,
> determining if an X scroll event is not derived from touches is more
> difficult. X core and XI 1.x button events do not have "flags" that
> could carry this information to the client. We could add a flag for XI
> 2.x button and smooth scrolling events for when they are derived from
> touches. However, this still isn't a 100% solution to the problem. There
> are other corner cases that are harder to show, just take my word for it
> :). Note that this is only a problem for trackpads, and maybe
> independent devices.
> 
> I feel a real solution to this issue is so difficult that it may not be
> feasible to implement, much less maintain. I also feel that assuming an
> indirect, touch-capable device has no non-touch scrolling functionality
> is good enough for now. This heuristic breaks down when you start
> talking about a drawing tablet like a Wacom Intuos 4 if you want to use
> the jog-wheel for scrolling, but professional drawing tools have always
> required special configuration. Yes, it's not ideal, but I see it as a
> price we pay for extending a 25-year-old window system.
> 
> Peter is not convinced that this is how we should handle things. He is
> still mulling things over. I'll let him reply with his thoughts.
> 
> Thanks!
> 
> -- Chase

