Re: touch events

2012/2/2 Benjamin Otte <otte@gnome.org>:
> So from reading Matthias' mails it seems to me that this is the first
> step, so we should get this right first. We should have a good idea of
> where we want to go, but we don't need to merge the full multitouch
> branch to get touch events. So let's start small.
> Note that I do not have any touch-capable devices, so I have no way to
> test the things I'm suggesting here. All I can and will do for now is
> API review.

Just FYI, if anyone wants a device to test this: the Apple Magic Trackpad works on Linux, and it is really neat for testing.

Chase Douglas and the other touch engineers in the systems team use it for the gesture-support development in Ubuntu. It is a good workaround if you don't want to buy a whole multitouch-capable tablet (PC).
> For now I'll only care about event delivery of touch events and ignore
> crossing events, pointer emulation and those. I do however care about
> portability to Windows and OS X. (I'll ignore Wayland because I assume
> it'll be similar/identical to X11.)

> As guidance, I've looked at the following documents:
> - X11
> - Windows
> - OS X
> - Qt (for reference)
>
> As far as I understand, none of these are concerned with multitouch;
> multitouch is something that is implemented on top of this basic
> system by the machinery commonly known as "gestures".

> In all of these, touch is implemented via touch "sequences". A
> sequence is a list of events, always delivered like this:
>   1 BEGIN event
>   0 or more UPDATE events
>   1 END or CANCEL event
> Each of those events carries a designated ID that identifies the sequence.
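As an aside from the review: the BEGIN / UPDATE* / END-or-CANCEL contract above can be sketched as a tiny state machine in C. All type and function names here are made up for illustration, not existing GDK API:

```c
#include <assert.h>

/* Hypothetical event types mirroring the sequence model described above. */
typedef enum {
    TOUCH_BEGIN,
    TOUCH_UPDATE,
    TOUCH_END,
    TOUCH_CANCEL
} TouchEventType;

typedef struct {
    TouchEventType type;
    unsigned int   sequence_id;  /* identifies which sequence this belongs to */
    double         x, y;
} TouchEvent;

#define MAX_SEQUENCES 16

/* One slot per sequence id: nonzero while that sequence is live. */
int touch_active[MAX_SEQUENCES];

void
handle_touch (const TouchEvent *ev)
{
  switch (ev->type)
    {
    case TOUCH_BEGIN:
      assert (!touch_active[ev->sequence_id]); /* exactly one BEGIN */
      touch_active[ev->sequence_id] = 1;
      break;
    case TOUCH_UPDATE:
      /* UPDATEs are only valid between BEGIN and END/CANCEL */
      assert (touch_active[ev->sequence_id]);
      break;
    case TOUCH_END:
    case TOUCH_CANCEL:
      assert (touch_active[ev->sequence_id]); /* exactly one END or CANCEL */
      touch_active[ev->sequence_id] = 0;
      break;
    }
}
```

Two fingers down at once simply means two live sequence ids at the same time; each sequence still follows the contract independently.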

> There are some differences however, and I'm not sure if we need to
> care about those right now or if we can think about them later:
> - CANCEL vs END may be a flag on a single END event (X11) or two
>   separate events (OS X). (I personally prefer the OS X approach,
>   because cancelling vs performing an action are very different things.)
> - Some operations on X11 require "accepting" a touch sequence (in
>   particular for grabs).
> - Some update events distinguish between "did move" and "did not move"
>   (OS X, Qt).

> The current multitouch patchset implements touch events by reusing
> GdkEventButton and GdkEventMotion and appending a sequence id to them.
> Apart from the fact that I'm not sure that's API-stable (my guess
> would be "hell no"), I don't like that approach. The platforms above
> seem to agree with me: Qt, Windows and OS X each have a dedicated
> touch event type for that purpose.
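For comparison, a dedicated struct could look roughly like this. The field list is purely a guess, loosely modelled on GdkEventButton, and the GDK pointer types are stubbed out with void * to keep the sketch self-contained:

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical event type tags. */
typedef enum {
    GDK_TOUCH_BEGIN,
    GDK_TOUCH_UPDATE,
    GDK_TOUCH_END,
    GDK_TOUCH_CANCEL
} GdkTouchEventType;

/* Hypothetical dedicated touch event, instead of overloading the
 * button/motion events with a bolted-on sequence id. */
typedef struct {
    GdkTouchEventType type;
    void     *window;          /* would be GdkWindow * in GDK proper */
    uint32_t  time;            /* event timestamp in ms */
    double    x, y;            /* window-relative coordinates */
    double    x_root, y_root;  /* root-window coordinates */
    void     *device;          /* would be GdkDevice * */
    void     *sequence;        /* opaque per-sequence handle */
} GdkEventTouch;
```

Keeping the sequence as its own field (rather than abusing, say, the button number) means the rest of the struct can stay parallel to the existing pointer events.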

> I did not look at all yet at what contents a GdkTouchEvent should have
> and how references to previous events should be handled. My gut
> feeling says "let the sequence take care of it", Qt says "we'll give
> you a list of all motion points and lots of other points, too", I
> didn't look at other toolkits.

> I would also prefer the touch sequence to not be hardcoded as an
> integer, but encapsulated, a pointer typedef for example. This would
> allow adding features to touch sequences later on (like the accepting
> stuff from X11 grabs).
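A minimal sketch of that encapsulation, with made-up names; the accept call stands in for the X11 touch-acceptance machinery that an opaque type would let us bolt on later without breaking API:

```c
#include <assert.h>
#include <stdlib.h>

/* Opaque handle: callers only ever see the pointer typedef, so fields
 * (and new operations such as accept/reject) can be added later. */
typedef struct _GdkTouchSequence GdkTouchSequence;

struct _GdkTouchSequence {
    unsigned int id;        /* the raw sequence id from the backend */
    int          accepted;  /* later addition enabled by the opaque design */
};

GdkTouchSequence *
touch_sequence_new (unsigned int id)
{
  GdkTouchSequence *seq = calloc (1, sizeof *seq);
  seq->id = id;
  return seq;
}

/* Hypothetical later extension, e.g. for X11 touch grabs. */
void
touch_sequence_accept (GdkTouchSequence *seq)
{
  seq->accepted = 1;
}

int
touch_sequence_is_accepted (const GdkTouchSequence *seq)
{
  return seq->accepted;
}
```

With a bare integer in the event struct, adding the accept/reject state later would mean new parallel API; with the opaque pointer it is just a new function.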

> That's it for the events. Some comments not directly related, but
> relevant to a patchset based on these ideas:
> - Qt has device type TouchScreen vs TouchPad. Should we have that?
> - I suppose just having a GDK_TOUCH_MASK is enough?
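To make those two questions concrete, the corresponding definitions might look like this; both the mask bit position and the enum values are invented for illustration:

```c
#include <assert.h>

/* Hypothetical: a single event-mask bit covering all four touch event
 * types, in the style of GDK's existing GDK_*_MASK flags (the bit
 * position here is made up). */
#define GDK_TOUCH_MASK (1 << 22)

/* Hypothetical device kinds, mirroring Qt's TouchScreen/TouchPad split. */
typedef enum {
    TOUCH_DEVICE_TOUCHSCREEN,  /* direct: finger position maps to a screen position */
    TOUCH_DEVICE_TOUCHPAD      /* indirect: pointer-like, relative surface */
} TouchDeviceKind;
```

One mask bit keeps event selection simple; the screen/pad distinction matters to gesture code above GDK, since a touchpad's coordinates are not screen positions.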

> Does this look sane? Or are there any platform issues? Did I miss anything?

gtk-devel-list mailing list
gtk-devel-list@gnome.org

Alberto Ruiz
