Re: [g-a-devel] Allowing assistive technologies to work with touch gestures.



Luke Yelavich <luke yelavich canonical com> wrote:
 
> My initial thought is to implement some form of hand-off process, where an
> assistive technology like Orca can request to take over control of gesture
> processing at the root X window. A desktop environment that wants priority
> for root X window gesture processing would register with an arbitrator that
> manages who has access to the root X window for gesture processing. Orca
> would then register with the same arbitrator and request to be handed
> control of gesture processing, and possibly be given a pointer to the
> relinquishing environment's code, so that further gesture processing can
> take place for the environment if the gesture used is not one that Orca
> listens for.
> 
> Since this is for assistive technologies, and given that at-spi is the
> de facto standard for GUI environment accessibility on GNOME/Unity/Qt, I
> propose that this arbitrator could be added to at-spi. It's slightly outside
> at-spi's scope, but any desktop environment that wishes to offer
> accessibility will have this service running in the background, and I don't
> see the point in writing another daemon to run in the user session just for
> this arbitration process.
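
To make the hand-off a bit more concrete, here is a very rough sketch of what
the arbitrator could look like from the AT side, assuming it were exposed over
D-Bus alongside the rest of AT-SPI. None of these bus names, object paths or
methods exist today; they are purely illustrative.

import dbus

# Gestures this AT wants to handle itself; everything else is passed back.
ORCA_GESTURES = {'three-finger-swipe-left', 'three-finger-swipe-right'}

bus = dbus.SessionBus()

# Hypothetical arbitrator service, object path and interface names.
arbitrator = dbus.Interface(
    bus.get_object('org.a11y.GestureArbitrator',
                   '/org/a11y/GestureArbitrator'),
    dbus_interface='org.a11y.GestureArbitrator')

# Ask for root X window gesture processing to be handed over to this AT.
token = arbitrator.RequestGestureControl('org.gnome.Orca')

def on_gesture(name, x, y):
    if name in ORCA_GESTURES:
        pass  # Orca-specific handling would go here.
    else:
        # Not a gesture Orca listens for: forward it back to the desktop
        # environment that previously held gesture control.
        arbitrator.ForwardGesture(token, name, x, y)

Passing a token back through ForwardGesture is just one way of letting
unhandled gestures fall through to the relinquishing environment, as described
above.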

If I remember rightly, AT-SPI 2 already handles keyboard events (including the
synthesis of keyboard input), so there is a precedent within the existing
infrastructure for the above proposal.
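
For what it's worth, the existing keyboard side looks roughly like the
following from an AT written in Python against pyatspi (the exact parameters
may differ slightly from what I show here), which is the sort of registry
role the gesture arbitrator would mirror:

import pyatspi

def on_key(event):
    # event.event_string names the key we were notified about.
    print('AT-SPI key event:', event.event_string)
    return False  # Do not consume the event; let the application see it.

# Listen for key presses system-wide via the AT-SPI registry.
pyatspi.Registry.registerKeystrokeListener(
    on_key, kind=(pyatspi.KEY_PRESSED_EVENT,))

# Synthesis goes the other way, e.g. injecting a Return key press:
# pyatspi.Registry.generateKeyboardEvent(0xff0d, None, pyatspi.KEY_SYM)

pyatspi.Registry.start()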

It would also be prudent to consider how the anticipated move by desktop
environments from X to Wayland affects the proposal. We don't want
accessibility infrastructure developers scrambling to catch up when a widely
used desktop environment makes that transition, since we know well in advance
that it is likely to happen.


