On 05/12/16 00:43, Luke Yelavich wrote:
> Hey folks.
> I am working on accessibility for Unity 8. One thing that we need to solve is
> input event capture/processing for Orca. Whilst things work well enough with
> Qt via its X QPA plugin, the Mir QPA plugin does not do any form of keyboard
> snooping, therefore when using Orca with Qt apps under Mir, Orca's commands
> cannot be used. I am pretty sure the same applies for Wayland as well.
Yes, the same applies to Wayland. It is one of our oldest TODOs. We even had
some email threads about the topic on wayland-devel some years ago, without
much success. The thing is that they are open to discussing it, but they
didn't give many hints about how to implement it, and we would be the ones
who have to do it.

FWIW, the issue is not only about snooping input events, but also about
synthesizing them.
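For what it's worth, the synthesis side already has a client API today:
at-spi exposes key generation to ATs, and pyatspi wraps it. A minimal sketch
(assuming pyatspi is installed; on X11 I believe this ends up going through
XTEST, and that backend is exactly the piece with no Wayland/Mir equivalent):

    # Synthesize a press+release of the "a" key by keysym through the
    # existing at-spi client API.  Works on X11 today; there is currently
    # nothing behind this call on Wayland/Mir.
    import pyatspi

    pyatspi.Registry.generateKeyboardEvent(ord('a'), None, pyatspi.KEY_SYM)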
> I know in the case of Qt's Mir support, the developers do not want to add
> any form of keyboard snooping such as is present in Gtk/Clutter/Qt via its
> X QPA plugin.
Depending on who you ask, the same is true of GTK, even on X11. Fortunately,
after our insistence, they kept it on X11. Its future on Wayland is uncertain.
> I'm wondering whether anybody has done any work to spec out a cross-desktop
> solution for this problem. My understanding is that any solution would not
> involve working on Wayland/Mir directly, since the compositor/shell is the
> arbiter of all things input. Instead any solution would be implemented in
> the compositor/shell, so mutter, KWin, and the Unity 8 shell.
Yes, as far as I remember the chats, the first step would be solving it in
the compositor. For GNOME on Wayland, that would be gnome-shell.
> Without having tried to spec something out myself, I don't think it would be
> that complex: probably something over DBus that allows an assistive
> technology such as Orca to register with the shell its interest in processing
> input events, at which point it is notified via DBus when an input event
> needs processing, and signals the shell appropriately.
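(To make the shape of that idea concrete, the registration could look roughly
like the sketch below from the AT side. Every D-Bus name in it is invented;
no such shell interface exists today.)

    # Hypothetical only: bus name, object path, interface and method names
    # are all made up to illustrate the registration described above.
    from gi.repository import Gio, GLib

    bus = Gio.bus_get_sync(Gio.BusType.SESSION, None)
    snooper = Gio.DBusProxy.new_sync(
        bus, Gio.DBusProxyFlags.NONE, None,
        'org.example.Shell',                  # hypothetical shell bus name
        '/org/example/Shell/InputSnooper',    # hypothetical object path
        'org.example.Shell.InputSnooper',     # hypothetical interface
        None)

    # Register our interest in key events before clients see them.
    snooper.call_sync('RegisterKeyListener',
                      GLib.Variant('(s)', ('org.a11y.Orca',)),
                      Gio.DBusCallFlags.NONE, -1, None)

    def on_signal(proxy, sender, signal, params):
        if signal == 'KeyEvent':
            keysym, modifiers = params.unpack()
            # Decide whether this is an Orca command; a real protocol would
            # also need a way to tell the shell "consumed" or "pass it on".
            print('key event', keysym, modifiers)

    snooper.connect('g-signal', on_signal)
    GLib.MainLoop().run()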
About exposing it through D-Bus: at-spi already provides the API. In the old
days it was implemented using the X extension XEvIE [1]. When that became
unsupported, it was reimplemented with a combination of snooping and event
polling.

In an ideal world we would like to reuse the existing at-spi API, so Orca and
any other AT could keep using it, reimplementing it on top of "something"
available on Wayland.
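A minimal sketch of that existing client API as seen from an AT (using
pyatspi; exact keyword arguments may differ slightly between versions, but
this is essentially what Orca does on X11 today):

    # Snoop key events through the existing at-spi client API.  The client
    # side could stay the same on Wayland/Mir; only the backend is missing.
    import pyatspi

    def on_key(event):
        # event is an at-spi DeviceEvent; returning True would consume it.
        print('key', event.id, 'string', event.event_string)
        return False

    pyatspi.Registry.registerKeystrokeListener(
        on_key,
        kind=(pyatspi.KEY_PRESSED_EVENT, pyatspi.KEY_RELEASED_EVENT))

    pyatspi.Registry.start()   # enter the AT-SPI event loop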
> I guess this is something that would require amending an at-spi spec
> somewhere, or would this be more along the lines of XDG?
As I mentioned, at-spi already includes the client API (which could obviously
be improved). What is missing is the Wayland/Mir part, which, as you mention,
is more of an XDG thing. The most promising lead (the "something" I mentioned
before) that I have found is this:

https://www.x.org/wiki/Events/XDC2014/XDC2014DodierPeresSecurity/

It is promising because, although the presentation points to accessibility as
one of the main affected areas, it mentions that other use cases are affected
too, meaning that we could get more traction.

Unfortunately I haven't had time to take a deep look at this proposal, and I
don't know its current status either.
Finally, another place to look is on-screen keyboards. They face similar
problems, and I bet there are Mir-based platforms with some kind of on-screen
keyboard (Ubuntu Touch?).
Thanks for the interest, and good luck
[1] http://www.freedesktop.org/wiki/Software/XEvIE