Re: [gdm-list] Bringing back a11y
- From: Brian Cameron <Brian Cameron Sun COM>
- To: Francesco Fumanti <francesco fumanti gmx net>
- Cc: Ray Strode <halfline gmail com>, gdm-list gnome org, gnome-accessibility-list gnome org
- Subject: Re: [gdm-list] Bringing back a11y
- Date: Thu, 07 Feb 2008 17:21:57 -0600
Francesco:
> > - Commands that get launched need to be configurable since
> >   users may need to launch AT programs with special options
> >   or use an AT program that might not be integrated into GDM
> >   by default.
> Indeed, for example in the case of an accessibility menu, the same
> program could be listed several times (with varying names) for
> different configurations.
Right, this is currently done with orca to specify whether to
run in magnifier, text-to-speech, or both modes. Check the
gesture configuration files in your /etc/X11/gdm/modules directory.
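For illustration, the entries in those files look roughly like this
(the gesture field layout and orca flags below are from memory, so
treat this as a sketch rather than a verbatim copy of a shipped file):

    # AccessKeyMouseEvents (roughly): <gesture> <repetitions> <interval> <timeout> <command>
    <Control>s 1 1000 10000 orca -n -d main-window -d magnifier -e speech
    <Control>m 1 1000 10000 orca -n -d main-window -d speech -e magnifier
    <Control>o 1 1000 10000 orca -n -d main-window -e speech -e magnifier

The same binary (orca) appears several times with different options,
which is exactly the "same program listed several times" case you
describe.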
> > - All functionality in the GUI needs to be keyboard navigable
> I supposed that by now this would be obvious for any program
> that would like to call itself "modern" or up to date.
> (I suppose that the new GDM will be fully navigable by keyboard.)
> And vice versa: any command available somewhere in the program
> should also be available in the menus of the program.
Unfortunately GDM 2.21 is not fully keyboard navigable. Anybody
want to help? :)
It would be ideal if there were a common solution to this
problem for both GDM and the GNOME desktop session. Now that GDM
2.21 is using metacity, perhaps the best long-term solution
would be to simply improve metacity so users can define hotkey,
button-press, and dwell gestures to launch AT programs (or other
programs) on demand. Then this functionality could be used both
in GDM and in the user desktop session.
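For the hotkey half, metacity's existing GConf command keybindings
already come close. Something like the following (key names as I
remember them, so verify against your metacity version) binds a key
to launch an AT program:

    gconftool-2 --type string \
        --set /apps/metacity/global_keybindings/run_command_1 "<Control><Alt>s"
    gconftool-2 --type string \
        --set /apps/metacity/keybinding_commands/command_1 "orca"

Dwell and button-press gestures, on the other hand, would still need
new code in metacity itself.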
> Maybe a standard should be created about how to launch the different
> Assistive Technologies. Or does such a standard already exist?
I'm not aware of any such standard. It would be very useful. However,
I'd think a usability study would be helpful to figure out what users
really want. It also would probably be good to engage the KDE community
so that gestures can be standardized across free desktops.
> It was not accessibility related, but I saw a mouse gesture program
> (on a commercial platform) where the user could define the gesture
> and the action triggered by the gesture. Among the actions were
> launching/quitting applications/scripts, sending shortcuts/keystrokes,
> selecting menu items, ...
That does sound cool. One problem with dwell gestures is that they can
be wasteful of the CPU, with listeners polling for mouse movements all
the time. That's one reason it might be good to integrate the gesture
listeners into a program that already does such polling, such as
metacity.
> > In the above bug report, it was decided that the best way to deal
> > with this could be to pass a label such as "text-to-speech" or
> > "on-screen-keyboard" to the session. This way the user could
> > configure what "text-to-speech" means differently in the GNOME
> > session versus at login time.
> What about an option on the greeter (or a11y theme) that gives the
> user the choice? This could even be fine-tuned:
> - enable ATs in the GNOME session only if the corresponding AT is
>   not set to start automatically
> - override the AT settings of the GNOME session
> (These are only quick examples with no serious reflection.)
That's reasonable, but the biggest problem is how to manage the
situation where you want the login program and the desktop session
to run different programs for the same feature. Perhaps I want to
use onscreen in GDM, but GOK in my user session. Ideally, it should
be possible for the user to log in for the first time, switch the
program they want to use, and configure it to their needs without
needing assistance.
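To make the label idea concrete, one could imagine GDM and the user
session each resolving the label through its own small mapping,
something like this (file name and format invented here purely for
illustration):

    # hypothetical ~/.config/gnome-at/launch.conf for the user session
    text-to-speech=orca
    on-screen-keyboard=gok

    # while GDM's copy of the same mapping might instead say:
    # on-screen-keyboard=onscreen

Switching programs after first login would then just be a matter of
editing the user's own copy of the mapping.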
> As I don't have a good overview of the different Assistive
> Technologies, I am also sending a copy of this message to the
> gnome-accessibility-list in the hope that people with more knowledge
> will add their contribution.
> (I think it is best to send all answers to the gdm list, where the
> discussion started.)
As I suggested, I think the best long-term solution would be to find
a way to support keybindings and dwell gestures to launch AT programs
that works both in the GNOME session and at login time.
> In other words, we need standards!?
Before that, we probably need a usability study focused on this issue.
Or we could try to hack together a solution without a usability study,
I guess. At the very least, it would be helpful to get feedback from
people about how this could be standardized.
Brian