[g-a-devel] GNOME Accessibility presentation - contribution to stock GNOME presentations



Hi all,

Way back in July, I gave a presentation (once in English, once in
French) on GNOME accessibility technologies - I thought it might be a
useful "stock" presentation for others to reuse.

Some things definitely need improvement - simple inaccuracies like
talking about Gnopernicus, outdated screenshots of GNOME 2.4, the photo
of my brother & the family (perhaps too personal for a stock
presentation) - and the presentation needs a narrative. I've attached my
notes from the presentation below to give you an idea; it's a good
45-minute to 1-hour presentation. I also have a French translation.

The core goal of the presentation is to show that accessibility is
important because of the people we help. It's important not because
having a certain level of conformance with standards opens the door to
government contracts, or as a selling point for the software, but
because it helps users & developers, sometimes (like through Strongwind,
Dogtail and LDTP) in unexpected ways.

The presentation is too big to send to the list, so I've put it on a
website - you can get it at: http://dneary.free.fr/Presentations/Digital
ramps and handrails.pdf (versions in .ppt and .odp will also be there,
but perhaps with missing bitmaps, etc).

The general thrust of the presentation is:

 * We use computers with standard input & output devices - a mouse, a
keyboard, a screen.
 * But that doesn't cover all use-cases. Blind people can't see screens.
People with degenerative motor illnesses can't use mice or keyboards.
Older people with "normal" conditions like arthritis and vision
impairments can't easily use all this stuff either. And kids (and
parents holding babies ;) also have trouble with these devices, which
require sophisticated hand-eye co-ordination.
 * There are other hardware inputs & outputs that can help:
  * Joysticks instead of mice
  * Drawing tablets
  * Braille keyboards
  * Audio input & output (speech synthesis, audio signals, speech
recognition for commands)
  * A whole range of things like accelerometers, championed by the
iPhone and the Wii, and in general the whole range of video game
controls which make you think differently about interacting with a computer
  * More specialised: eye trackers that can use eye movement and blink
patterns to issue commands
  * And finally, software to make things easier

 * Here's where GNOME fits in
 * Project founded on the principle of universal access - making
computer technology available to anyone, not just geeks, regardless of
culture, technical or physical ability - in three main ways: consistent,
usable, learnable user interfaces; internationalised and localised
applications (a chance to explain the difference between internationalised
("take out all the local assumptions") and localised ("add back in all
the local constraints for many cultures") - see the little gettext
sketch below); work on accessibility (a nod to Sun Microsystems and IBM,
who have been long-time champions of this).
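
For anyone reworking the slides: a tiny sketch of what that i18n/l10n
split looks like in practice, using Python's standard gettext module.
The "stockdemo" domain name and the locale directory are just
placeholders for illustration.

    import gettext

    # Internationalisation: the code makes no local assumptions, it
    # just marks user-visible strings as translatable.
    # "stockdemo" and the locale directory are placeholder values.
    gettext.install("stockdemo", localedir="/usr/share/locale")

    # Localisation: translators supply .mo catalogues per locale
    # (fr, de, ...) which add the local constraints back in at runtime.
    print(_("Universal access means everyone, not just geeks."))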

The rest of the presentation is a demo of various accessibility features
in GNOME. I discovered several quirks & bugs while doing the demos :-}

The demos split into three parts:

 1. General GNOME features which are useful to people with disabilities
 2. Accessibility features available to all GNOME applications,
regardless of the desktop configuration
 3. Features that depend on AT-SPI being activated, and which can be
considered "advanced" accessibility tools


 * Keyboard shortcuts: the entire GNOME desktop is available through use
of only the keyboard. Remove mouse, start demo:

Basics:
 1. Switch applications (Alt-Tab)
 2. Choose panel (Ctrl-Alt-Tab) - open a new application through panel
  (BUG #542325: When you open a menu while navigating with the keyboard,
you cannot again navigate with the keyboard until you click somewhere
with the mouse)
 3. Alt-<key> to navigate menus of an application
 4. Tab, Shift-Tab to navigate through interface elements in an
application (including web applications) (would be nice to show
navigation to the toolbar, but I can't figure out how to do it)
 5. Each application has a set of shortcuts - show that standard
shortcuts are used across all applications to make it easier for users
to pick up a new application.

 * Themes
  1. Show high contrast themes, and explain how they help colorblind or
visually impaired users.
  2. Show configurability of things like font sizes
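
  For a scripted demo, these can also be flipped with the python-gconf
  bindings rather than through the dialogs - a rough sketch, assuming
  the stock GNOME 2 keys and the HighContrast theme name:

    import gconf

    # Switch to the high contrast theme and a bigger default font;
    # the keys and theme name assume a stock GNOME 2 desktop.
    client = gconf.client_get_default()
    client.set_string("/desktop/gnome/interface/gtk_theme", "HighContrast")
    client.set_string("/desktop/gnome/interface/font_name", "Sans 16")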

 * Audio
  1. Black screen represents what a blind person sees when turning on
their computer. Ask the crowd: "it takes 30s to 2 mins to boot a
computer - how does the user know when they can log in?" There's an
audio signal emitted when GDM is ready to rock which serves that purpose.
  2. Show audio events config

 * Sticky keys
  1. Explain: You can press one key at a time and still do Alt-F or
Alt-Tab. Useful if you have a baby in your lap, or any of a range of
physical disabilities that make chording difficult.
  2. Activate, and do Alt Tab, Ctrl S, Alt Shift Tab, etc.
    (NOTE: I discovered after the presentation to do something like
"cycle through application windows", you need to hit Alt-Alt to get the
Alt key to stay "stuck" while you tab through the windows. To unstick
it, hit Alt again)

 * Slow keys
  1. Explain: Allows you to specify a minimum time a key must be pressed
to register a keypress (A11y guys: I found this to be more annoying than
anything else. Can you give me a use case where this would be useful?)
  2. Demonstrate.

 * Bounce keys (NOTE: I didn't show or explain this):
  1. Explain: Bounce keys makes only the first tap of a rapid series of
repeated presses register. Useful for people with conditions such as
Parkinson's who have trembling hands and poorer co-ordination
  2. Demonstrate (see the sketch below for toggling these keyboard
filters from a script)
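
  Side note for anyone scripting the demo: sticky, slow and bounce keys
  can all be toggled from GConf as well as from the keyboard preferences
  dialog. A minimal sketch, assuming the standard GNOME 2 keys under
  /desktop/gnome/accessibility/keyboard and the python-gconf bindings:

    import gconf

    client = gconf.client_get_default()
    base = "/desktop/gnome/accessibility/keyboard"

    # Keyboard accessibility (AccessX) has to be switched on first.
    client.set_bool(base + "/enable", True)

    # Then toggle the individual filters used in the demo.
    client.set_bool(base + "/stickykeys_enable", True)
    client.set_bool(base + "/slowkeys_enable", False)
    client.set_bool(base + "/bouncekeys_enable", False)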

 * Dasher
  1. Launch the application, and give a demo (it's useful to have a
sentence you know you want to write prepared)
  2. Most people won't understand what they're seeing; explain: Dasher
allows you to write text by driving a pointer through the alphabet.
Letters are bigger or smaller depending on the relative probability
that they'll come next in the word you're writing, and training text
allows the application to set those probabilities
  (BUGs: Dasher on Ubuntu (and on Linux in general) appears to have a
number of bugs around choosing the alphabet (application crashes),
importing training texts (app crashes) and in at least Edgy Eft, writing
text (app stopped after about 20 letters). Also, on Linux, I don't know
how to send the text to another window.)
  3. Invite someone from the audience to come up & give it a go, and
while they're writing, explain the letter sizes and use cases where this
is useful (eye tracker, joystick, etc.)


!! From here, AT-SPI needs to be activated (it's also useful for Dasher)
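
For reference, here's how I turn on assistive technology support before
the demo from a script - it toggles the same GConf key as the Assistive
Technologies preferences dialog (this assumes the GNOME 2 key and the
python-gconf bindings; you need to log out and back in for running
applications to pick it up):

    import gconf

    # Enable assistive technology support (AT-SPI); a logout/login is
    # needed before already-running applications register with it.
    client = gconf.client_get_default()
    client.set_bool("/desktop/gnome/interface/accessibility", True)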


 * Mouse Tweaks
  1. Explain: Mouse Tweaks allows you to click the mouse without
pressing a button, or to emulate a right-button click with just the left
button (yes! All you one-button Mac users can open contextual menus now
without going to the keyboard!)
  2. Demonstrate delay-click: Click & hold to raise a right-click
contextual menu
  3. Demonstrate dwell-click: Leave the mouse steady for a second, and
when the pointer changes, move up, down, left or right to get single,
double, right click or click & drag functionality

 * Orca
  1. Explain: A screen reader which hooks into several speech engines
and braille displays, and includes a screen magnifier
  2. Demonstrate screen reader (works best when hooked into the sound system):
   * Open pre-edited document in gedit, read it with Orca
   * Navigate using keyboard
   * Type a sentence in gedit
   * Read a PDF document in Evince
   * Navigate the web (extra brownie points if you can find a page with
awful accessibility, and say "this is what the web looks like to a blind
person" - no harm underlining that web accessibility is important)
  3. Demonstrate Screen magnifier (very quick, turn it on, show what it
does)
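
  To show what Orca actually works from, a tiny pyatspi sketch that
  walks the accessible desktop and prints each running application and
  its top-level windows can be a nice aside (this assumes AT-SPI is
  enabled and the Python AT-SPI bindings are installed):

    import pyatspi

    # Walk the AT-SPI desktop: every accessible application registers
    # itself here, and this is what Orca reads from.
    desktop = pyatspi.Registry.getDesktop(0)
    for i in range(desktop.childCount):
        app = desktop.getChildAtIndex(i)
        if app is None:
            continue
        print(app.name)
        for j in range(app.childCount):
            window = app.getChildAtIndex(j)
            if window is not None:
                print("  %s (%s)" % (window.name, window.getRoleName()))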

 * GNOME Onscreen keyboard
  1. Sad confession: I never figured out how to use this, and so I
showed screenshots. Every time I try to turn on GOK, I end up losing
control of my mouse (something to do with a secondary mouse pointer not
being available, so it disables my primary one) and a modal dialog comes
up that I can't easily dismiss.
  2. Explain that GOK allows more than just typing letters: you can
configure it to type words or activate menu items, and it adapts to the
screen context (modal dialogs, file dialogs, different applications, etc.)

I'd love to be able to demo this on my laptop, if someone could show me
how :)

  * Braille keyboard support - obviously I didn't demo this, but spoke
about how Grade 2 (contracted) Braille support was recently added,
allowing Braille typists to chord more complex words more easily now. I
hope I understood this properly :)

  * Convenient side effects
   In general, when people think about a11y features, they think about
stuff useful only to people with disabilities. At the start I tried to
explain that it's not just people with disabilities that benefit from
things like ramps and kerb cuts; they also help people with push-chairs
or shopping trolleys, older people and children.

   Similarly, a11y features for the computer don't just help people with
disabilities (in the classical sense) - high-contrast themes also help
older people, and sticky keys also help dads with babies (Federico Mena
wrote a blog entry about this, by the way, and it echoes my personal
experiences, which is why I keep bringing it up).

   Making websites accessible, for example, also makes them more useful
in space-constrained environments like cellphones, which are
increasingly used as web browsing devices by people without a disability.

   Finally, AT-SPI makes your application's GUI scriptable - allowing
the creation of automated testing frameworks like Accerciser, Strongwind,
LDTP and Dogtail. These tools weren't the primary motivation behind it,
but they are a nice, useful side-effect.
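
   For the curious, here is roughly what that scripting looks like with
dogtail - a rough sketch, assuming dogtail is installed, AT-SPI is
enabled, and gedit is the application being driven:

    from dogtail.utils import run
    from dogtail import tree

    # Launch gedit and look it up on the accessible desktop.
    run("gedit")
    gedit = tree.root.application("gedit")

    # Drive the GUI through AT-SPI: type into the text area, then open
    # the File menu, exactly the way a user (or a test) would.
    gedit.child(roleName="text").typeText("Hello from AT-SPI!")
    gedit.child(name="File", roleName="menu").click()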

Thanks! Questions?


-- 
Dave Neary
GNOME Foundation member
bolsh gnome org

