Re: touch events



So, here are a few interesting things I learned in the last few days while
talking to people about touch (mostly Peter Hutterer and Chase
Douglas, more power to them for getting my X up and running with touch
events).

1) There is a huge difference between touchscreens and touchpads.

No platform exists that caters to both touchscreens and touchpads
equally. OS X is a touchpad world, iOS is a touchscreen world. Windows
is a touchscreen world. (I have no idea if and how Qt caters to both,
but my guess is it doesn't.) So XInput 2.2 seems to be the first
platform that tries.
Now, if you get a touch event sequence, those events have X and Y
coordinates. Unfortunately, these coordinates refer to the position of
the mouse pointer and not the position of the touch. And on touchpads,
the mouse pointer doesn't move while you touch. So you get the same x/y
coordinates all the time, for every event. For touchscreens of course,
that is entirely different as touchscreens move the mouse pointer to
wherever you touched last.
Of course, on touchpads you also get the position of the touch on the
touchpad. These coordinates refer to the physical touchpad and are not
related to the screen at all. What does that mean? Exactly, it means
that if you start doing screen-based math with these coordinates, you
are doing it wrong. How many units does one have to move to get a
noticeable effect? Who knows, it's not related to pixels anyway! Unless
you're thinking of touchscreens again, where these axes match the
screen perfectly fine.
Last but not least, touchpads don't emit touch events for single-finger
touches at all. For those touches, you use regular mouse events -
GdkEventButton and GdkEventMotion are your friends.
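
So a widget that wants to handle single-finger input on a touchpad just
keeps the handlers it always had - a quick sketch:

#include <gtk/gtk.h>

/* Sketch: single-finger touches on a touchpad arrive as plain pointer
 * events, so the usual button/motion handlers cover them. */
static gboolean
on_button_press (GtkWidget *widget, GdkEventButton *event, gpointer data)
{
  g_print ("single-finger press at %.0f,%.0f (button %u)\n",
           event->x, event->y, event->button);
  return FALSE; /* keep propagating */
}

static void
hook_up_pointer_events (GtkWidget *widget)
{
  gtk_widget_add_events (widget,
                         GDK_BUTTON_PRESS_MASK | GDK_POINTER_MOTION_MASK);
  g_signal_connect (widget, "button-press-event",
                    G_CALLBACK (on_button_press), NULL);
}
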

Of course, developers usually only have either a touchpad or a
touchscreen. So when they develop code, they will only test one of the
two. And I guess this usually means that the other type of device will
be broken. In fact, I have been using a test program written by Peter
himself, and even that one is pretty useless for touchpads.
So my idea so far - though I'm not sure about this yet, which is why
I'm typing all this - is to make this difference very prominent inside
GTK and to use different event types. So for touchscreens, you'd get a
GDK_DIRECT_TOUCH_BEGIN/UPDATE/END and for your pads, you'd get
GDK_INDIRECT_TOUCH_BEGIN/UPDATE/END events. And we might even end up
giving you different event structs - a struct GdkEventDirectTouch vs a
struct GdkEventIndirectTouch - so we can put the useful information in
there and don't tempt people into writing handlers that just look at
x/y.
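
Just to make the idea concrete, the split could look something like
this - none of it exists, the names come from the proposal above and
the fields are pure guesses:

#include <gdk/gdk.h>

/* Hypothetical sketch of the proposed split - nothing here is real GDK
 * API, the fields are just the shape I have in mind. */
typedef struct {
  GdkEventType type;     /* GDK_DIRECT_TOUCH_BEGIN/UPDATE/END (proposed) */
  GdkWindow *window;
  guint32 time;
  guint touch_id;        /* which touch sequence this event belongs to */
  gdouble x, y;          /* meaningful: a touchscreen maps onto the screen */
} GdkEventDirectTouch;

typedef struct {
  GdkEventType type;     /* GDK_INDIRECT_TOUCH_BEGIN/UPDATE/END (proposed) */
  GdkWindow *window;
  guint32 time;
  guint touch_id;
  gdouble device_x, device_y;  /* raw touchpad units, deliberately not x/y */
} GdkEventIndirectTouch;
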

2) system-level gestures?

If you've read http://who-t.blogspot.com/2012/01/multitouch-in-x-touch-grab-handling.html
(if not, it may help to read it to understand this paragraph), you
know that touch events are only forwarded to the application when all
grabs have rejected the touch event.
Now, the WM will probably want to do a touch grab, so that the usual
window management gestures (OS X users will recognize 4 finger swipes
for Exposé, for example) can be recognized. And this means that until
the WM rejects the touch event (because it realized it's not a
relevant gesture), the application doesn't get those touch events
delivered at all. And that can cause noticeable lag.
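
For reference, the grab I mean is roughly this (untested sketch using
the raw XI 2.2 calls, on the root window for all master devices):

#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>

/* Untested sketch: a passive touch grab on the root window.  With this
 * in place, every touch sequence goes to the WM first, until the WM
 * accepts or rejects it. */
static void
wm_grab_touches (Display *dpy, Window root)
{
  unsigned char bits[XIMaskLen (XI_LASTEVENT)] = { 0 };
  XIEventMask mask;
  XIGrabModifiers mods = { XIAnyModifier, 0 };

  XISetMask (bits, XI_TouchBegin);
  XISetMask (bits, XI_TouchUpdate);
  XISetMask (bits, XI_TouchEnd);

  mask.deviceid = XIAllMasterDevices;
  mask.mask_len = sizeof (bits);
  mask.mask = bits;

  XIGrabTouchBegin (dpy, XIAllMasterDevices, root,
                    False, /* owner_events */
                    &mask, 1, &mods);
}
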
Now there are multiple ways to cope with this:
- Make the WM reject touch events quickly
This requires a well-written WM (Ce Oh Em Pe I Zeeeee!!!) that rejects
touches quickly. So quickly, in fact, that it's not noticeable for a
regular person that the touches weren't sent through to the app
immediately - even when they use it to draw squiggly lines on the
screen. I'm not sure if that is even possible. (The rejection call
itself is sketched below, after this list.)
- Use a different model in the WM
The WM could, instead of grabbing, just listen for touch events on the
root window. In that case it would only get the touches that
applications haven't accepted. But applications accept touches by
default. So unless applications are well-written and carefully reject
all touches they don't care about, your system-level gestures won't
trigger whenever an app listens for touch events...
- Use XI_TouchOwnership
This way the app would get pending touch events even when it's not the
owner and could already do things. But it would require a very
prominent GDK_TOUCH_EVENT_CANCEL so that when the WM accepts the
touch, the app can revert everything it already did for the current
touch.
- something else?
Did I miss something here? Or is getting system-level gestures right
really complicated?
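
For reference, "rejecting" in the first option above boils down to one
call per unwanted touch - an untested sketch of what the WM would do:

#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>

/* Untested sketch: once the grabbing WM decides a touch is not part of
 * a system gesture, it rejects it so the touch is replayed to the next
 * client in line; XIAcceptTouch would instead claim it for good. */
static void
wm_reject_touch (Display *dpy, XIDeviceEvent *touch_begin, Window grab_window)
{
  XIAllowTouchEvents (dpy,
                      touch_begin->deviceid,
                      touch_begin->detail, /* the touch ID */
                      grab_window,
                      XIRejectTouch);
  XFlush (dpy);
}
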

3) Drivers still do gesture recognition

You all know and love this feature of the synaptics driver. 2 finger
swipes cause scrolling, 3 finger taps cause a middle mouse click. And
that's not gonna change: If you tap with 3 fingers in the future, you
get a bunch of touch events _and_ a button press/release event combo.
Why is that bad? Because if you implement a 3-finger tap gesture in
your widget that does a clipboard paste, you will now suddenly be
doing that paste twice.
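
To spell the double paste out (the touch-side recognizer is
hypothetical, since that GDK API doesn't exist yet, and do_paste() is
just a stand-in):

#include <gtk/gtk.h>

/* Stand-in for the widget's real paste action. */
static void
do_paste (GtkWidget *widget)
{
  g_print ("pasting into %s\n", gtk_widget_get_name (widget));
}

/* The widget already pastes on button 2 - and the synaptics driver
 * turns a 3-finger tap into exactly such a button 2 press/release. */
static gboolean
on_button_press (GtkWidget *widget, GdkEventButton *event, gpointer data)
{
  if (event->button == 2)
    do_paste (widget);
  return FALSE;
}

/* Meanwhile the same physical tap is also delivered as touch events, so
 * a (hypothetical) touch-based 3-finger-tap recognizer pastes a second
 * time. */
static void
on_three_finger_tap (GtkWidget *widget, gpointer data)
{
  do_paste (widget);
}
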
Can't you just disable one of those? No, not really: you can't rely on
the button events alone, because other (non-synaptics) drivers may not
do gesture recognition at all, so a 3-finger tap would do nothing
there; and you can't rely on the touch events alone, because old
widgets (like all of GTK 2) will never listen for them.
Couldn't the X server do it for you then? No, not really, because if
you have an old widget (for example a GtkHTML widget) and a new widget
and put them in the same GtkWindow, there'll only be one X window
because of client-side windows. And the X server wouldn't know which
event to send.
So how do we fix this? I have no idea, but I suppose it'll cause lots
of madness.

That's it for today's episode of "lots of fun with X"

Benjamin

