Re: touch events

On 02/08/2012 10:33 AM, Peter Hutterer wrote:
> On Tue, Feb 07, 2012 at 07:58:23PM +0100, Benjamin Otte wrote:
>> 3) Drivers still do gesture recognition
>> You all know and love this feature of the synaptics driver. 2 finger
>> swipes cause scrolling, 3 finger taps cause a middle mouse click. And
>> that's not gonna change: If you tap with 3 fingers in the future, you
>> get a bunch of touch events _and_ a button press/release event combo.
>> Why is that bad? Because if you implement a 3-finger tap gesture in
>> your widget that does a clipboard paste, you will now suddenly be
>> doing that paste twice.
>> Can't you just disable one of those? No, not really, because some
>> drivers (like non-synaptics) may not be doing gestures, so a 3-finger
>> tap will do nothing; likewise, some old widgets (like all of GTK 2)
>> may not be listening to touch events.
>> Couldn't the X server do it for you then? No, not really, because if
>> you have an old widget (for example a GtkHTML widget) and a new widget
>> and put them in the same GtkWindow, there'll only be one X window
>> because of client-side windows. And the X server wouldn't know which
>> event to send.
>> So how do we fix this? I have no idea, but I suppose it'll cause lots
>> of madness.
> This situation is the main source for headaches right now. More
> specifically: we have the choice between accepting that some gestures just
> won't work as touch gestures (if interpreted by the driver) or that we lose
> these same gestures on legacy apps (if interpreted by the toolkit).
> To expand: if we interpret a gesture in the driver and convert it to any
> other event E, we have no mechanism to tell a client that the new event E
> corresponds to some other touch event and should be ignored. This will
> likely lead to mishandling of some events. This is why I'm advocating an
> either-or approach: if you enable in-driver gestures you're stuck with those
> and you won't get touch events for them. If you want toolkit gestures, well,
> you'll need all apps to support them.
> This isn't a nice solution but it is at least predictable.

In #xorg-devel on freenode, Peter, Daniel Stone, and I had a chat about
how to handle this. We tried to find ways of flagging events so the
various potential clients would know what to do, but there was no
airtight solution: every scheme depended on each client making
assumptions about what all other clients would do.

Instead, we decided to go with the either-or approach as Peter
described. If you enable two-touch scrolling in the X synaptics input
module, you won't get touch events while a scroll is occurring. Same
thing for the various tapping and clicking actions that synaptics supports.
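To illustrate the either-or choice concretely, here is what opting out of in-driver scrolling might look like in an xorg.conf.d snippet. The option names come from the xf86-input-synaptics driver; the Identifier string is made up, and whether touch events are then delivered depends on the XI 2.2 behavior described above:

```
Section "InputClass"
    Identifier "touchpad: let the toolkit handle scrolling"
    MatchIsTouchpad "on"
    Driver "synaptics"
    # Disable in-driver two-finger scrolling so two-touch
    # sequences reach clients as touch events instead.
    Option "VertTwoFingerScroll" "off"
    Option "HorizTwoFingerScroll" "off"
EndSection
```

With these options on (the default in many distributions), the driver consumes the two-touch sequence and clients see only scroll events.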

The major reason we want touch events when there are two touches is for
smooth, kinetic scrolling. XI 2.1 provides for smooth scrolling events
generated by the input module, but those events carry no indication of
when a physical scroll action begins and ends, so they can't be used for
kinetic scrolling. We will be looking into adding this information for
XI 2.3.
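The begin/end information matters because kinetic scrolling needs the velocity at the moment the fingers lift to seed a decaying animation. A minimal sketch of that idea in plain Python (all names are hypothetical; this is not XI protocol code):

```python
import math

def release_velocity(samples):
    """Estimate scroll velocity (px/s) from the last two
    (timestamp_seconds, position_px) samples before finger lift."""
    (t0, p0), (t1, p1) = samples[-2], samples[-1]
    return (p1 - p0) / (t1 - t0)

def kinetic_offsets(v0, friction=5.0, dt=1 / 60, min_speed=1.0):
    """Yield per-frame scroll offsets that decay exponentially.

    Without knowing when the physical scroll ended (the gap in
    XI 2.1 smooth scrolling), a client cannot tell when to start
    this animation.
    """
    v = v0
    while abs(v) > min_speed:
        yield v * dt
        v *= math.exp(-friction * dt)
```

The friction constant and frame interval are arbitrary choices here; the point is only that the animation is seeded entirely by the lift velocity, which is why the end of the physical scroll must be visible to the client.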

I think the end-game will require each toolkit/application to perform
its own scrolling based on touch events. Performing gesture recognition
in the X server is a non-starter, and hacking around specific corner
cases like scrolling just complicates the whole issue. Speaking as an
Ubuntu developer, if we could get gtk, qt, firefox, chrome, and
libreoffice interpreting two-touch scroll gestures in a uniform manner,
we could then disable two-touch scrolling in the synaptics module by
default and move past this mess.
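For a sense of what a uniform toolkit-side interpretation might look like, here is a minimal recognizer sketch in plain Python (all names hypothetical; a real implementation would feed it from XI 2.2 TouchBegin/TouchUpdate/TouchEnd events):

```python
class TwoTouchScroller:
    """Recognize a two-touch drag as a vertical scroll gesture."""

    def __init__(self):
        self.touches = {}  # touch_id -> last known y position

    def begin(self, touch_id, y):
        self.touches[touch_id] = y

    def update(self, touch_id, y):
        """Return a scroll delta, or None if not a two-touch drag."""
        if touch_id not in self.touches:
            return None
        prev = self.touches[touch_id]
        self.touches[touch_id] = y
        if len(self.touches) != 2:
            return None  # scroll only with exactly two touches down
        return y - prev

    def end(self, touch_id):
        self.touches.pop(touch_id, None)
```

If every toolkit agreed on this kind of interpretation (two touches down, deltas of each touch treated as scroll motion), the in-driver scrolling could be switched off by default without regressing anyone.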

-- Chase
