[g-a-devel] Allowing assistive technologies to work with touch gestures.
- From: Luke Yelavich <luke.yelavich@canonical.com>
- To: desktop-devel-list@gnome.org, gnome-accessibility-devel@gnome.org, unity-dev@lists.launchpad.net
- Subject: [g-a-devel] Allowing assistive technologies to work with touch gestures.
- Date: Mon, 12 Nov 2012 15:13:51 +1100
Greetings all.
This is meant as a cross-list discussion between GNOME Shell developers, Unity developers, and accessibility developers. Please make sure you keep everybody in CC when replying.
With Ubuntu desktop development partly focusing on the desktop experience on the Google Nexus 7, I have decided to start work on implementing touch gesture support for assistive technologies, with the primary use case being screen reader use. To that end, my focus will be on Orca, basing my work on the 3.7/3.8 branch. The reason I have started this discussion across several lists should become apparent later in this email. I reference magnification below, although I am not going to look into it myself.
Other touch platforms, namely iOS and Android, already have differing levels of assistive technology enablement, as outlined below:
iOS:
Screen reader: VoiceOver on the iOS platform replaces the standard iOS gestures with its own, allowing the user to explore what is on the screen and perform different actions on the currently selected item, where selected means the last item the user located with their finger and whose name or description was spoken.
Magnifier: The iOS magnifier augments the existing set of gestures with its own, allowing the user to turn magnification on and off, zoom in and out, and navigate the screen with 3-finger drags to find the content they are looking for. The user can then use standard iOS gestures to perform the desired action on the content they are working with.
Android:
Screen reader: The Android screen reader as of 4.1, called TalkBack, replaces the standard Android gestures with its own, along similar lines to VoiceOver on iOS.
Magnifier: As far as I know, Android 4.1 does not include any magnification yet; supposedly this is coming in 4.2.
Given the other touch platforms, it would be best to follow similar lines to iOS in terms of the gestures used. Where possible, users should also be able to customize the gestures used to perform various tasks. Whilst I intend to extend Orca's existing system of managing commands to add touch gestures, I am not likely to get to extending Orca's preferences UI to let users customize touch gestures yet; that is more likely for next cycle.
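To make the command extension concrete, here is a rough sketch of how touch gestures might sit alongside Orca's existing command bindings. None of these class, method, or handler names exist in Orca today; they are purely illustrative.

    # Purely illustrative: a hypothetical gesture binding layer for
    # Orca; none of these names exist in Orca today.

    class GestureBinding:
        """Ties a gesture descriptor to an existing Orca command handler."""
        def __init__(self, gesture, handler):
            self.gesture = gesture   # e.g. ("drag", 1) for a 1-finger drag
            self.handler = handler   # an existing Orca command callable

    class GestureBindings:
        """Lookup table from gesture descriptors to command handlers."""
        def __init__(self):
            self._map = {}

        def bind(self, gesture, handler):
            self._map[gesture] = GestureBinding(gesture, handler)

        def handler_for(self, gesture):
            binding = self._map.get(gesture)
            return binding.handler if binding else None

    # Mirroring the VoiceOver model, a script might then do something
    # like the following (both handlers are hypothetical):
    #   bindings.bind(("drag", 1), script.sayItemUnderFinger)
    #   bindings.bind(("tap", 2), script.activateCurrentItem)

The point is that a gesture descriptor would simply become another kind of key in the same command-dispatch machinery that keyboard and braille bindings already use, which is also what would eventually make the gestures customizable from the preferences UI.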
For those who are interested in the progress of this work in general, you can find more information and follow along at (1). Note this blueprint has an Ubuntu focus, but I intend to do as much as I can to make sure all desktop environments benefit.
Unity devs, please correct me or add to the discussion if I am wrong here or have missed something important. Unity uses the geis framework (2) to implement multi-touch gestures for use on either touch screens or multi-touch trackpads. Gestures such as 3-finger drag to move windows around the screen, 3-finger tap to bring up grab handles, and 4-finger tap to bring up the Unity dash are currently implemented. To make sure these gestures work, Unity needs to take control of touch input on the root X window. The touch system works such that any gesture Unity does not use is allowed to move up the stack, to be consumed by other apps that support touch gestures.
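For those unfamiliar with geis, subscribing to a gesture looks roughly like the sketch below. It assumes a thin Python binding whose names simply mirror the geis v2 C API (geis_new, geis_subscription_new, and so on); treat every identifier here as illustrative rather than a documented binding.

    # Illustrative only: assumes a hypothetical thin Python wrapper
    # over the geis v2 C API; real binding names may differ.
    from geis import Geis, Subscription, Filter   # hypothetical module

    g = Geis()                                    # c.f. geis_new()

    # Subscribe to 3-finger drags, the gesture Unity uses to move windows.
    sub = Subscription(g, "window-move")          # geis_subscription_new()
    f = Filter(g, "3-finger-drag")                # geis_filter_new()
    f.add_term(gesture_class="Drag", touches=3)   # geis_filter_add_term()
    sub.add_filter(f)
    sub.activate()                                # geis_subscription_activate()

    # Gesture events then arrive through geis's event-loop integration,
    # carrying the gesture class, touch count, and coordinates.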
Since Orca will need to accept basic gestures such as single-finger dragging and tapping, Orca needs to somehow be at the bottom of the stack, to make sure it can be the first to act on any such gesture the user performs, since, as mentioned above for the iOS implementation, such gestures are used for desktop/app navigation, exploration, and activation of widgets. Gestures that Orca doesn't care about, such as Unity's mentioned above, still have to work.
My initial thought is to implement some form of hand-off process, where an assistive technology like Orca can request to take over gesture processing at the root X window. Any desktop environment that wants priority for root X window gesture processing would register with an arbitrator that manages who has that access. Orca would then register with the same arbitrator and request to be handed control of gesture processing, possibly being given a pointer back into the relinquishing environment's code, so that gestures Orca does not listen for can still be processed by the environment.
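From Orca's side, the hand-off might look something like the following over D-Bus. The bus name, object path, interface, and method names are all invented for the sake of discussion; no such service exists today.

    # Sketch of the proposed hand-off as seen from Orca. The bus name,
    # object path, interface, and methods are all hypothetical.
    import dbus

    bus = dbus.SessionBus()
    proxy = bus.get_object('org.a11y.GestureArbitrator',
                           '/org/a11y/GestureArbitrator')
    arbitrator = dbus.Interface(proxy, 'org.a11y.GestureArbitrator')

    # Ask to be first in line for root X window gestures; the desktop
    # environment registered earlier and keeps a lower priority.
    arbitrator.AcquireGestures('org.gnome.Orca')

    # For any gesture Orca does not listen for (e.g. Unity's 3- and
    # 4-finger gestures), hand it back down the stack:
    # arbitrator.PassGestureDown(gesture_id)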
Since this is for assistive technologies, and given that at-spi is the de facto standard for GUI environment accessibility on GNOME/Unity/Qt, I propose that this arbitrator could be added to at-spi. It's slightly outside at-spi's scope, but any desktop environment that wishes to offer accessibility will have this service running in the background, and I don't see the point in writing another daemon to run in the user session just for this arbitration process.
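On the service side, the arbitrator could be a small D-Bus object running inside the at-spi registry daemon. Again, every name below is hypothetical; this is only meant to show how little code the arbitration itself would need.

    # Hypothetical arbitrator service; all names invented for discussion.
    import dbus
    import dbus.service
    from dbus.mainloop.glib import DBusGMainLoop
    from gi.repository import GLib

    class GestureArbitrator(dbus.service.Object):
        """Tracks which client has root X window gesture priority."""

        def __init__(self, bus):
            super().__init__(bus, '/org/a11y/GestureArbitrator')
            self._stack = []   # last entry gets gestures first

        @dbus.service.method('org.a11y.GestureArbitrator', in_signature='s')
        def AcquireGestures(self, client):
            # An AT such as Orca pushes itself above the environment.
            self._stack.append(client)

        @dbus.service.method('org.a11y.GestureArbitrator', in_signature='s')
        def ReleaseGestures(self, client):
            # The environment resumes priority when the AT exits.
            self._stack.remove(client)

    DBusGMainLoop(set_as_default=True)
    session_bus = dbus.SessionBus()
    name = dbus.service.BusName('org.a11y.GestureArbitrator', session_bus)
    GestureArbitrator(session_bus)
    GLib.MainLoop().run()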
There may of course be alternatives that I haven't yet thought of, given that my understanding of how touch works in Unity/X/GTK etc. is by no means complete yet. I would particularly like to hear from Unity developers, since they can shed more light on Unity's touch requirements. From what I have been told, GNOME Shell doesn't yet do any such advanced gesture processing, but there may be a desire to do so in the future, hence the inclusion of GNOME developers in this discussion.
Thanks in advance for your time in reading what has turned out to be a lengthy email, and I hope we can work out a solution that satisfies all parties.
Luke
P.S. I had thought of including KDE/Qt folks, but as far as I know, KDE accessibility efforts to work well with at-spi are still in their infancy, and I wasn't sure who to contact there. However, if any Qt-related folks are reading this, please feel free to chime in, and/or forward this message to anyone else you feel should know about and participate in this discussion.
(1) https://blueprints.launchpad.net/ubuntu/+spec/desktop-r-accessibility-touch-gestures
(2) https://launchpad.net/geis