Re: [gdm-list] Bringing back a11y
- From: Brian Cameron <Brian Cameron Sun COM>
- To: Francesco Fumanti <francesco fumanti gmx net>
- Cc: gdm-list gnome org, gnome-accessibility-list gnome org
- Subject: Re: [gdm-list] Bringing back a11y
- Date: Thu, 07 Feb 2008 13:48:09 -0600
Francesco:
> Your question is about the transfer of the AT starting mechanism from
> gdm 2.20 to the new branch. I would like to use the opportunity to
> look at the problem from another point of view:
Thanks for bringing this up. I definitely agree that we should
think about this, and design a better solution for users who need
to launch AT programs.
Let's take a step back and review the requirements:
- We need the ability to launch AT programs via gestures
(key-combinations, mouse button presses, and dwell activities).
This is because some users, such as people who are blind
or who can only use a switch-button, need some mechanism
to launch AT programs in order to interact with the GUI at
all.
- Commands that get launched need to be configurable since
users may need to launch AT programs with special options
or use an AT program that might not be integrated into GDM
by default.
- All functionality in the GUI needs to be keyboard navigable.
- We need some mechanism to switch to an a11y theme. To
support multi-user systems, it would be ideal to allow users
to be able to switch to the a11y themes via a hotkey or
dwell gesture.
Note that the GNOME desktop session has similar problems. There
is no way to launch AT programs on-demand in your GNOME session.
A person with a11y needs would require that someone configure
their system to autolaunch the tools that they need via GConf
settings.
It would be ideal if there were a common solution to solve this
problem for both GDM and the GNOME desktop session. Now that GDM
2.21 is using metacity, perhaps the best long-term solution
would be to simply improve metacity so users can define hotkey,
button-press and dwell gestures to launch AT programs (or other
programs) on-demand. Then this functionality could just be used
in both GDM and for the user desktop session.
> My first question: Why not make the presence of Assistive Technology
> visible in the greeter, for example by providing an Accessibility
> menu that contains menu items for the different ATs offered at login
> time?
I think such a feature would be very helpful. It would help users
who are able to navigate the UI to launch AT programs that they
need.
Ideally, such a solution should be configurable. Users may need
to launch AT programs with different arguments or may need to
launch AT programs that are not integrated into GDM. So, if a
menu or pop-up dialog is shown, users should be able to add new
programs to the list of choices via configuration.
This is not strictly a requirement if users can also launch AT
programs via hotkey and dwell gestures, but it would definitely
improve usability for users who are able to navigate the UI
without first needing to launch an AT program.
> The listener can then use gdk_screen_get_toplevel_windows to get a
> list of windows running in the session, iterate over them, use the
> GDK_WINDOW_XWINDOW () macro on the resulting GDK windows to get their
> corresponding XIDs, and use XGetWindowProperty to find the window with
> _NET_WM_NAME set to "Login Window" or WM_WINDOW_ROLE set to
> "greeter-login-window".
> And what if, instead of using the window border crossing system, we
> used software capable of recognising mouse gestures independently of
> whatever window is on the screen? (For example, similar to the gesture
> recognition offered by add-ons in browsers?)
I agree. The existing dwellmouselistener only supports recognizing
events based on window border crossings. While this meets the
requirements of a dwell gesture, there are probably better types of
dwell gestures that could be supported.
There probably should be a usability study to determine what gestures
are required and most usable for launching various types of AT programs.
Whatever types of dwell gestures we decide to support should likewise
be configurable so that users can specify what programs should be
launched when gestures are received.
> Moreover, I would like to use this opportunity to ask whether, in
> addition to adding the AT starting methods that were available in
> GNOME 2.20, some other, visible methods could also be added?
I think this is a very good idea. It probably requires a bit of work to
make the mechanism properly configurable, but I am sure that if someone
wanted to do the work to add a menu, pop-up dialog, or similar, it
would make sense to include it in GDM.
It might also be handy if this GUI highlighted which
key/mousebutton/dwell gestures can be used to launch the various AT
programs.
> For people who are not able to click, I would suggest a method provided
> by the greeter (for example a dwellable area/icon) to enable mousetweaks,
> the new software that was accepted into GNOME 2.22 and that provides
> systemwide dwelling. You can see it running here in GDM 2.20, started by
> a libdwellmouselistener gesture:
> https://help.ubuntu.com/community/Accessibility/OnboardAndMousetweaksAtGDM
> (The onscreen keyboard is a separate program, not related to mousetweaks.)
I think it makes the most sense to make things configurable. If users
want to use onscreen, they should be able to configure GDM to use it.
By saying this, I am not suggesting that we don't include onscreen
support in the default configuration, but we should not limit the user
to only be able to use the tools which are integrated into GDM's default
configuration.
> Another point that might also deserve some attention is the passing of
> the activated ATs to the following GNOME session, and similar on logout.
> I think that the following bug addressed the issue:
> http://bugzilla.gnome.org/show_bug.cgi?id=411501
This is a good idea, but there are some difficulties that make this
a challenge:
+ The needs of navigating the login screen are fairly trivial compared
with those of navigating your GNOME session. So you might want
to launch AT programs differently at login time to reflect this.
+ Also, on multi-user systems the AT programs may be shared by multiple
users with different needs. For example, it might make the most sense
to launch GOK in "dwell" mode rather than "click" mode at login time,
since this is more usable for a wider audience. However, some users
might really want "click" mode in their actual GNOME session.
So, it probably does not make sense to pass the actual command that
GDM uses to the GNOME session.
In the above bug report, it was decided that the best way to deal
with this could be to pass a label such as "text-to-speech" or
"on-screen-keyboard" to the session. This way the user could
configure what "text-to-speech" means differently in the GNOME
session versus at login time.
> As I don't have a good overview of the different Assistive Technologies,
> I am also sending a copy of this message to the gnome-accessibility-list,
> in the hope that people with more knowledge will add their contribution.
> (I think it is best to send all answers to the gdm list, where the
> discussion started.)
As I suggested, I think the best long-term solution would be to find
a way to support keybindings and dwell gestures to launch AT programs
that works both in the GNOME session and at login time.
However, in the meanwhile, it might make the most sense to continue
supporting something similar to keymouselistener and dwellmouselistener.
These are fairly trivial to port to the new GDM 2.21 and would be
immediately useful. Any other design would likely take some release
cycles to design and complete.
Brian