CSUN trip report



Greetings,

Last month was the annual CSUN Conference on Technology and Persons with
Disabilities.  Sun Microsystems highlighted accessibility solutions for
computers running UNIX and UNIX-like systems (such as the Solaris
operating environment and GNU/Linux).  There was a UNIX Accessibility
series of conference sessions on Thursday, including demonstrations of
StarOffice accessibility on Windows systems as well as those running UNIX. 
Sun's booth was an entire room at the conference hotel, and Sun held a
series of hands-on guided tours of the accessible GNOME desktop for UNIX
where attendees used the Gnopernicus screen reader/magnifier and the GOK
dynamic on-screen keyboard to navigate the graphical desktop, create text
documents, and even read the Los Angeles Times website in Braille.  This
year also saw the third annual Linux Accessibility Conference, which was
held on Thursday afternoon and part of Friday.


This lengthy trip report describes in some detail all of the events relating
to UNIX Accessibility at CSUN, as well as the demonstration of StarOffice
accessibility on Windows and UNIX systems.


The key messages from Sun and the UNIX Accessibility community at CSUN were:

 1. That the UNIX environment, with the GNOME 2 graphical desktop, is
    becoming a very accessible alternative desktop for users with
    disabilities.  GNOME 2 is a free, open source graphical desktop for
    UNIX, with accessibility support built in as a forethought (vs. bolted
    on as an afterthought).  GNOME 2 provides full keyboard access to
    the desktop and applications, rich theming support with
    pre-configured options like high-contrast and large-print, and a
    comprehensive and extensible accessibility architecture implemented in
    the core of the graphical user interface.

 2. The development of a fundamentally different approach to 
    accessibility, where assistive technologies get all of the information 
    they need from supported programming interfaces - no more patching the 
    operating system or building off-screen models for screen access.

 3. The development of Gnopernicus - a free, open source screen reader
    and magnifier for GNOME by BAUM Retec AG.  Gnopernicus was 
    demonstrated on both an Intel RedHat Linux system and a Sun Ray
    network terminal running Solaris.  Many attendees
    participated in a guided tour of the GNOME 2 desktop using
    Gnopernicus - some with speech, some with Braille, and some with
    full-screen magnification - at the Sun booth.

 4. The development of GOK - a free, open source, dynamic on-screen 
    keyboard for GNOME by the University of Toronto Adaptive Technology
    Resource Centre.  GOK was demonstrated on both a Sun Solaris system
    and an Intel RedHat Linux system, with both single switch access
    and support for the Madentec Tracker head-mouse.

 5. A demonstration of accessibility support built into both StarOffice 
    and the open source OpenOffice.org office productivity suite of
    applications (word processor, spreadsheet, presentation package, and 
    drawing package - with full support for reading and writing Microsoft 
    Office file formats).  The demonstration showed how a user can
    fully interact with StarOffice applications without using the mouse,
    how StarOffice respects the user's desktop theme settings, and
    highlighted a number of specific accessibility preference settings
    in StarOffice.  StarOffice support for assistive technologies was
    shown on both the UNIX and Windows platforms - the latter in
    conjunction with ZoomText Xtra for Windows.

 6. A demonstration of accessibility support built into the open source
    Mozilla web browser (which also includes applications for web page
    creation and electronic mail).  The demonstration showed how a user
    can fully interact with Mozilla without using the mouse, how Mozilla
    respects the user's desktop theme settings, and highlighted a number 
    of specific accessibility features in Mozilla.  During the guided
    tour sessions, several attendees used Mozilla on UNIX with the 
    Gnopernicus screen reader/magnifier, including one user who used
    the time to catch up on current events through the Los Angeles Times
    web site.


Below is a fairly detailed summary of each of the three sessions relating
to GNOME and UNIX Accessibility:


 o The first session in the UNIX Accessibility series was "UNIX and
   GNOME Accessibility overview" - highlighting the Accessible GNOME 2
   desktop, presented by Gary Little and Peter Korn of Sun Microsystems.

   Peter Korn began with an overview of this session and the
   two that would follow, as well as the other events relating to
   UNIX Accessibility at the conference.  He outlined Sun's goals and
   vision for accessibility ("Anyone. Anywhere. Any time. on Any device"),
   and then briefly described UNIX and GNOME, and outlined the
   accessibility functionality in GNOME.

   Next up was Gary Little, who talked about GNOME in more detail.  Gary
   talked about the goals for the GNOME project, and described many of
   the features of the GNOME 2 desktop.  Gary noted that Sun and others
   have already released "Phase I" of the accessible GNOME desktop, with
   full keyboard navigation and theme support; it can be downloaded for
   free from Sun for Solaris at http://www.sun.com/gnome/ and also
   ships with a number of Linux distributions.  Finally, Gary noted
   that the "Phase II" release of GNOME accessibility will include
   two open source assistive technologies: Gnopernicus and GOK (which
   were the subject of the third presentation in the UNIX Accessibility
   series - see below).

   Peter Korn then returned to the stage, and spent the rest of the
   presentation demonstrating the accessibility features that are
   in the shipping "Phase I" GNOME desktop, as well as those coming in 
   "Phase II".  Specially, Peter demonstrated keyboard navigation of the 
   desktop, the high-contrast large print theme (and the themeing engine 
   in general), and several special features of the Nautilus file manager 
   which is part of GNOME - the ability to "zoom" the content region to 
   see things up to 400% larger, and they way that Nautilus knows about
   a variety of file types and will render them in the file view or in
   the case of sound files play them when the user selects one.

   Peter then launched the Gnopernicus screen reader/magnifier, and 
   showed how Gnopernicus tracks the user focus and reads the item the 
   user is interacting with as well as pertinent information about it 
   (e.g. telling the user that they have activated a menu, the name of
   that menu, and the number of items in that menu).  He explained that
   Gnopernicus treats speech and Braille as different modalities, and
   the information rendered in speech is different from what is rendered
   in Braille (using menu items as an example: what the item is and the
   accelerator keys for invoking it are spoken in a particular order in
   particular voices, while in Braille they are rendered very
   differently, with the accelerator keys set off from the text of the
   menu item in parentheses).  Peter
   also demonstrated Gnopernicus' support for Braille, including the 
   built-in Braille Monitor which displays the characters being rendered 
   to the attached Braille display.  Peter further showed the 
   magnification features of Gnopernicus, including multiple zoom levels, 
   picture smoothing functionality, and optional inverse video.  Peter 
   demonstrated Gnopernicus with a variety of applications on the
   desktop, including the Nautilus file manager, the simple text editor
   application, and the GNOME menu panel.
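
   For readers curious what "tracking the user focus" looks like at the
   API level, here is a minimal sketch, assuming the pyatspi Python
   bindings for AT-SPI (a convenience binding used purely for
   illustration; the callback below is hypothetical, not Gnopernicus
   code).  A screen reader registers for focus events and is handed the
   accessible object the user moved to, along with its role and name:

      import pyatspi

      def on_focus(event):
          # event.source is the accessible object whose state changed;
          # detail1 == 1 means the "focused" state was just gained.
          if event.detail1:
              acc = event.source
              print("focus:", acc.getRoleName(), acc.name or "(no name)")

      pyatspi.Registry.registerEventListener(
          on_focus, "object:state-changed:focused")
      pyatspi.Registry.start()   # blocks, dispatching accessibility events

   Everything the callback sees arrives through the public interfaces;
   no video driver hooks or off-screen model are involved.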


   Peter then launched the GOK dynamic on-screen keyboard, and explained
   how GOK differs from traditional on-screen keyboards.  He showed
   the main keyboard - which has the keys "Compose", "Window", "Pointer",
   "Launcher", "Activate", "Settings", "Menus", "Toolbars", and "UI Grab".
   As GOK is really a complete user interface and user interaction model
   for users with significant physical disabilities (such as single 
   switch users, and people using head-tracking devices or eye-gaze
   technology), it provides much more functionality than any other
   on-screen keyboard.  

   Peter showed how the GOK "Launcher" keyboard provides a programmable
   set of application buttons that the user can configure to directly
   launch commonly used applications.  Peter used GOK configured for
   single-switch access to select the Launcher keyboard and launch the
   GNOME Text Editor application.  He then went to the "Menus" keyboard
   to directly interact with the menus in the Text Editor application:
   GOK dynamically built a special keyboard showing the Text Editor menu
   bar, and when Peter selected the "File" menu, GOK built and presented
   a second special keyboard showing the menu items inside the "File"
   menu.  The traditional on-screen keyboard
   can be found under the "Compose" button, which provides the ability to 
   simply insert keystrokes into the topmost application, and is how
   a GOK user would enter text into the Text Editor application, or any
   other application on the desktop.

   Peter noted that while neither Gnopernicus nor GOK was shipping yet,
   Sun planned to begin a beta testing program in the near future, and
   is soliciting volunteers who would be interested in beta testing the
   accessible GNOME desktop with Gnopernicus and GOK.  Peter then
   opened the floor for questions.

   
   For more information about the GNOME accessibility architecture,
   see the following web pages:

    http://www.sun.com/gnome
    http://www.gnome.org/start
    http://developer.gnome.org/projects/gap


 o The second session in the UNIX Accessibility series was "Accessible
   UNIX Applications: StarOffice and Mozilla" - showcasing the 
   accessibility features of these important applications, presented
   by Peter Korn of Sun Microsystems with Malte Timmermann of the
   Sun StarOffice development team on hand to answer questions.

   Peter first highlighted the main features of StarOffice: that it
   is a complete office suite with a full-featured word processor, 
   a powerful spreadsheet, and a very flexible presentation package,
   as well as a database and equation editor; that StarOffice uses
   XML as its native file format but can read and write Microsoft Office
   files; and that there is an open source edition: OpenOffice.org,
   which has been ported to the Macintosh in addition to running on
   the same Linux, Solaris, and Windows platforms that StarOffice runs on.
   Peter noted that a large and growing number of people are using
   StarOffice - 15% of Windows office users use StarOffice according
   to a poll by Windows Magazine.  Peter further noted that Sun has
   donated ~$6 Billion worth of StarOffice software to schools worldwide,
   and that virtually every Linux distribution ships with either
   StarOffice or OpenOffice.org included, as do a growing number of
   Windows PCs.

   Peter then talked about the accessibility features available in
   StarOffice version 6.1 beta 1, which is now available on the web
   for download for Windows, Solaris, and Linux.  These features include
   full mouseless operation (everything can be done from the keyboard);
   full theme support for things like high-contrast and large print;
   and support for cross-platform accessibility APIs which support
   the Gnopernicus screen reader/magnifier and the GOK dynamic on-screen
   keyboard under UNIX, as well as early JAWS and ZoomText support under
   Microsoft Windows.  He then proceeded to demonstrate these features,
   launching StarOffice on the GNOME desktop with the high-contrast
   large-print theme set, which StarOffice respected.  Peter showed the
   StarOffice "Zoom" feature, which allows a user to have StarOffice
   render the content portion of the document larger (the user can enter
   a zoom percentage).  Peter navigated through the user interface via
   the keyboard (which follows the GNOME desktop keyboard navigation
   conventions).  Peter brought up the Accessibility preferences dialog
   in StarOffice, and showed the various special settings for supporting
   accessibility in StarOffice, including things like a special text
   selection cursor for read-only text, configurability of the tool
   tip time-out, and whether to allow animation in graphics and text.

   Peter then moved to a PC running Windows, ZoomText, and StarOffice
   version 6.1 beta 1 for Windows.  Peter demonstrated how ZoomText
   tracks the user's keyboard interaction with menus - reading them 
   and moving the magnifier to magnify the item the user is interacting
   with.  When Peter opened a spreadsheet and moved between the cells,
   ZoomText read the name and contents of the current cell (speaking
   "Cell A2", "Cell B2", etc.), and the magnifier tracked the cell
   selection as well.

   Peter noted that the first public beta release of StarOffice
   accessibility was recently posted to the web and is available for
   download.  He then opened the floor for a few minutes for StarOffice
   accessibility questions prior to giving a demonstration of Mozilla
   accessibility.  Malte Timmermann of the Sun StarOffice engineering
   team also answered questions.


   For more information about StarOffice accessibility, see the following
   web pages:

     http://www.sun.com/staroffice/accessibility
     http://ui.openoffice.org/accessibility
     http://www.sun.com/software/star/staroffice/beta/


   The second half of the presentation was focused on Mozilla
   accessibility.  Peter highlighted the key features of Mozilla: that
   it is a full-featured, cross-platform browser; it is a web page
   editor; a powerful electronic mail client supporting IMAP and
   POP3 mail; and a netnews client.  Peter then talked about the
   accessibility features being built into Mozilla.  These features
   include full mouseless operation (everything can be done from the 
   keyboard); full theme support for things like high-contrast and large 
   print; and support for cross-platform accessibility APIs which support
   the Gnopernicus screen reader/magnifier and the GOK dynamic on-screen
   keyboard under UNIX.  He then proceeded to demonstrate these features,
   launching Mozilla on the GNOME desktop with the high-contrast
   large-print theme set, which Mozilla respected.  Peter showed the
   Mozilla "Zoom" feature, which allows a user to have web page content
   rendered larger (the user can enter a zoom percentage).  He navigated 
   through the user interface via the keyboard (which follows the GNOME 
   desktop keyboard navigation conventions).

   Peter then launched the Gnopernicus screen reader, and showed how
   Mozilla supports the GNOME accessibility architecture, through which
   Gnopernicus is able to provide blind and low vision access to web
   browsing in UNIX environments.  Peter used Gnopernicus to track
   keyboard interaction with the Mozilla user interface (reading menus
   and dialog boxes), and then opened the CSUN conference web page and
   used Gnopernicus to read the information on that web site.  Peter
   explained that HTML accessibility information as detailed by the
   Web Accessibility Initiative is being exposed through the GNOME
   Accessibility Framework, making it available to screen access
   technologies such as Gnopernicus.  Peter noted specifically the
   AccessibleHypertext interface, which provides a list of all of the
   hyperlinks on a web page for alternate presentation by an assistive
   technology.
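
   To make this concrete, here is a minimal sketch of how an assistive
   technology might enumerate the hyperlinks of a page through these
   interfaces.  It uses the pyatspi Python bindings for AT-SPI (an
   assumption on my part, chosen for brevity; the helper function is
   hypothetical) and walks the accessible hierarchy looking for objects
   that implement the Hypertext interface:

      import pyatspi

      def list_links(acc, found=None):
          """Collect (link text, URI) pairs exposed by an accessible
          object and its descendants via the Hypertext interface."""
          if found is None:
              found = []
          try:
              hypertext = acc.queryHypertext()
              text = acc.queryText()
              for i in range(hypertext.getNLinks()):
                  link = hypertext.getLink(i)
                  # The link text lives in the enclosing Text interface;
                  # the URI comes from the Hyperlink's first anchor.
                  label = text.getText(link.startIndex, link.endIndex)
                  found.append((label, link.getURI(0)))
          except NotImplementedError:
              pass   # this object does not implement Hypertext
          for child in acc:
              if child is not None:
                  list_links(child, found)
          return found

      # Hypothetical usage: print every hyperlink exposed by a running
      # Mozilla-like browser on the accessible desktop.
      desktop = pyatspi.Registry.getDesktop(0)
      for app in desktop:
          if app and "mozilla" in (app.name or "").lower():
              for label, uri in list_links(app):
                  print(label, "->", uri)

   The resulting list is exactly the sort of alternate presentation of a
   page's links that a screen reader can offer its user.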

   Next, Peter launched the GOK dynamic on-screen keyboard.  GOK 
   enumerated all of the menus in Mozilla and presented a dynamic
   keyboard giving a single switch or head-tracker user direct access
   to all of the items on all of the Mozilla menus.  Likewise, he
   showed the "Toolbar" keyboard of GOK, which listed all of the
   buttons on the Mozilla toolbar - including the special buttons for
   directly launching the Mozilla e-mail client and address book.
   Peter noted that only those toolbar buttons which are active in
   Mozilla are shown as available in GOK.

   Peter then took questions about Mozilla accessibility.


   For more information about Mozilla accessibility, see the following
   web pages:

     http://www.mozilla.org/projects/ui/accessibility
     http://www.mozilla.org/projects/ui/accessibility/unix
     http://www.mozilla.org/docs/end-user/moz_shortcuts.html


 o The third session in the UNIX Accessibility series was "Assistive
   Technology for UNIX: The Gnopernicus Screen Reader/Magnifier and
   The GNOME On-screen Keyboard" by Thomas Friehoff of BAUM and
   Simon Bates of the University of Toronto Adaptive Technology Resource
   Centre.  Peter Korn of Sun Microsystems briefly introduced the
   session, and then introduced Thomas Friehoff - the Vice President 
   of R&D at BAUM Retec A.G. and the person in charge of Gnopernicus 
   screen reader/mangifier development.

   Thomas gave an overview of his talk: that he would describe BAUM's 
   motivation  for doing Gnopernicus; talk about the architecture and 
   targeted platforms of Gnopernicus; show the user interface design of 
   Gnopernicus; and talk about BAUM's development plans going forward.

   Thomas described the core Gnopernicus development team: 4 engineers
   working in Romania full time for the last 18 months (with some of
   that time devoted to learning about UNIX/Linux development).  He
   gave the BAUM mission statement: "To offer Products and Services to
   Blind and Visually impaired persons, to make them more successful in
   their business and private life!"  BAUM achieves this mission through
   a focus on development, distribution & service of products, 
   installation & training.  Thomas noted that as Gnopernicus is open
   source, BAUM expects to make money from their development efforts
   through Gnopernicus distribution, installation, and training.

   Thomas noted BAUM's motivation for developing Gnopernicus: that today
   Windows dominates the market, that they and their customers are 
   looking for alternatives, and that they want to be early adopters
   of new technologies.  Further, Linux systems are popular in BAUM's
   home in Germany.  BAUM is getting many questions from users seeking
   access to graphical environments in Linux.  Recently the German
   Parliament decided to standardize on Linux for their workstations, and
   a town near BAUM's home in Heidelberg plans to have all desktops
   running Linux by 2004.  Finally, Thomas noted that BAUM's development
   staff has frankly gotten tired of Windows development - they wanted
   to do something new.  So, when Sun introduced the GNOME Accessibility
   architecture to BAUM, they decided to "go for it" and develop an
   open source screen reader for UNIX systems.
   
   Thomas stated that BAUM is targeting two platforms: Linux with Intel
   PC hardware and Sun Solaris systems.  The BAUM development team does
   almost all of their development under Linux, and has been delighted
   to find that Gnopernicus compiles and runs on Solaris with virtually
   no modification - proving one of the values of having
   a defined accessibility architecture vs. the Windows approach of
   hacking into an undocumented system.  

   Thomas showed a diagram of the Gnopernicus architecture: that it
   is simply another application on the desktop - like Mozilla or
   StarOffice or the Text Editor - and that Gnopernicus simply uses the
   standard GNOME Accessibility interfaces to communicate with these
   applications in order to provide an alternate presentation in speech,
   magnification, and/or Braille of these applications.  Furthermore,
   there is a standard way for new and potentially novel applications to 
   support the accessibility interfaces, so Gnopernicus need not be
   modified in order to provide access to them.  The hope is that once
   the screen reader is done, all further energies will go toward 
   improving the user interface, as opposed to their work in Windows
   where they are constantly having to re-engineer their screen
   reader in order to be able to get at what is on the screen.
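
   As a rough illustration of what "just another application on the
   desktop" means in practice, the short sketch below (using the pyatspi
   Python bindings for AT-SPI, which are my choice for illustration
   rather than anything BAUM uses) connects to the accessibility
   registry and prints the role and name of every accessible object in
   every running application - the same channel through which a screen
   reader obtains everything it speaks, magnifies, or renders in
   Braille:

      import pyatspi

      def dump(acc, depth=0):
          """Print an accessible object's role and name, then recurse
          into its children - all through the public interfaces."""
          print("  " * depth + "%s: %s"
                % (acc.getRoleName(), acc.name or "(no name)"))
          for child in acc:
              if child is not None:
                  dump(child, depth + 1)

      # The desktop object enumerates every accessibility-enabled app.
      desktop = pyatspi.Registry.getDesktop(0)
      for app in desktop:
          if app is not None:
              dump(app)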

   Thomas noted that the architecture of Gnopernicus is different from
   that of other screen readers - the core of the product contains no
   user interface code; rather that code lives in a separate series of
   modules (for speech, magnification, and Braille), making it very
   straightforward to build different products for other user needs
   (for example for people with learning disabilities or the elderly).
   Thomas described the two parts of the Gnopernicus user interface:
   the series of configuration dialog boxes (for output devices, for
   keyboard key assignment, and to load and save settings); and the
   direct keyboard access interface to the functions of Gnopernicus
   (using the numeric keypad, through the use of the standard keyboard
   keys with special modifiers a user might define, and through the
   buttons of an attached Braille display).  With the configuration
   dialogs, everything is done through the graphical display.  The 
   direct keyboard interface includes a set of "immediate" commands
   (speak the contents of the status bar, read the items on the menu
   bar, make the speech faster/slower, etc.), and there is generally
   no graphical visual feedback.
   
   Thomas then showed a series of slides containing all of the graphical
   configuration dialogs of Gnopernicus.  He also showed the Braille
   Monitor - a window showing visually on screen what is being sent
   to the Braille display.  One of the graphical configuration dialogs
   Thomas talked about was for magnification settings: Gnopernicus
   supports a range of magnification features including separate mouse
   cursor magnification, differential (x,y) coordinate magnification up 
   to 16x, full-screen crosshairs (in a user-selectable color), a 
   variety of picture smoothing options, several mouse tracking options,
   panning and inversion options, and a number of "zoom" regions so that
   the user can have one portion of their screen dedicated to magnifying
   one source while other portions of their screen are magnifying other
   sources.  Thomas also noted that all of these specific settings can be
   invoked directly from the direct keyboard interface commands.

   Another series of graphical dialogs Thomas talked about were the
   Braille settings dialogs.  Options Thomas highlighted included a
   choice of Braille devices connected to the serial ports (currently
   the BAUM Vario and ALVA lines of displays are supported), a choice
   of Braille translation table (currently English, German, Spanish,
   and Swedish are supported), and a choice of action to be taken
   when one of up to two rows of touch cursors is selected (including
   mouse movement/click/double-click, moving the text caret, and
   presenting various sorts of information about the object/character
   at that Braille cell).  Thomas also demonstrated how a user can
   map specific commands to various other buttons on a Braille display.

   Thomas then showed how the Gnopernicus direct keyboard interface
   can be configured - where each command can be mapped to various
   keys on the numeric keypad, or to user-defined key combinations.
   Gnopernicus uses the concept of "layered" keypads which a user
   can toggle between, thereby making a much larger set of keys available
   for the direct keyboard interface, and grouping related commands
   onto the same layer (e.g. all magnification commands on one layer)
   for more logical use.  The user can choose a specific named command
   and map it to a particular key on a particular layer on the numeric
   keypad.

   Thomas talked about Gnopernicus' flexible presentation of information.
   Through the Presentation dialog box, a user can configure precisely
   what information is rendered in speech, Braille, or magnification for
   each type of event in the user interface.  For example, a Braille user 
   might want menu items to be rendered with a three character 
   abbreviation of the role of the object ("MNU"), followed by the text 
   of the menu item, followed by any accelerator keys associated with 
   that menu item shown in parentheses (so the user would immediately
   know that this text isn't literally what appears, character for
   character, on the screen).  Likewise a speech user might want to have
   menu items
   rendered with the role of the object ("Menu") spoken in a high-pitched
   "menu" voice, the text of the menu item spoken after it in a 
   medium-pitched "text" voice, and any accelerator keys spoken in a 
   low-pitched "accelerator" voice. Thomas also noted that these named 
   "Gnopernicus voices" are completely configurable by the user, who can 
   collect a particular set of speech parameters for a particular 
   text-to-speech engine together into a named "Gnopernicus voice" (such 
   as "accelerator"), and then have Gnopernicus use that voice for 
   presenting specific things in the user interface, in response to 
   specific events on the desktop.
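
   As a toy illustration of this modality split, the following sketch
   (entirely hypothetical - the data layout, function names, and voice
   names are mine, not Gnopernicus') renders the same menu-item event
   one way as a compact Braille line and another way as a list of
   (voice, text) segments for a speech synthesizer:

      # Hypothetical event describing a menu item the user landed on.
      event = {"role": "menu item",
               "label": "Save As...",
               "accelerator": "Ctrl+Shift+S"}

      ROLE_ABBREV = {"menu item": "MNU", "push button": "BTN"}

      def to_braille(ev):
          """Single-line rendering for a Braille display: role
          abbreviation, label, accelerator in parentheses."""
          parts = [ROLE_ABBREV.get(ev["role"], ev["role"].upper()),
                   ev["label"]]
          if ev.get("accelerator"):
              parts.append("(%s)" % ev["accelerator"])
          return " ".join(parts)

      def to_speech(ev):
          """Speech rendering: each piece of information is tagged with
          the named voice it should be spoken in."""
          segments = [("menu", ev["role"]), ("text", ev["label"])]
          if ev.get("accelerator"):
              segments.append(("accelerator", ev["accelerator"]))
          return segments

      print(to_braille(event))   # MNU Save As... (Ctrl+Shift+S)
      print(to_speech(event))    # [('menu', 'menu item'), ('text', ...)]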

   Running out of time, Thomas skipped over many of his slides, only
   briefly mentioning the Gnopernicus Find command (which allows a
   user to search not only for text, but also for named graphics and
   for attribute runs such as "find the next bit of italicized text",
   or "find the next bit of underlined text that is selected").  Thomas
   then briefly showed on his slides the complete default configuration
   of the keyboard interface - all of the commands on each layer of the
   keypad.  Finally, Thomas gave a brief report on the state of the
   project and the plans going forward.  As of February 20th, Gnopernicus
   is "feature complete", and BAUM is now in the "application testing
   phase".  BAUM hopes to have "product quality" by the middle of this
   year.

   
   For more information about Gnopernicus and BAUM, see the following
   web pages:

     http://www.baum.de
     http://www.baum.ro/gnopernicus.html


   Peter Korn briefly returned to the stage, and introduced Simon
   Bates of the University of Toronto Adaptive Technology Resource
   Centre and one of the developers of the GOK dynamic on-screen keyboard.
   Simon passed along regrets from Jutta Treviranus, who had intended
   to be at CSUN and give this presentation.

   Simon introduced GOK and the GOK project - an open source on-screen
   keyboard that uses the GNOME Accessibility architecture to provide
   a richer set of functions than the traditional on-screen keyboards
   of other platforms.  Like all on-screen keyboards, GOK displays
   a set of keys in a window that is always top-most.  GOK supports
   multiple input devices (single switch with delay, head-tracker, and 
   eye-gaze devices), and multiple access methods (direct selection, 
   scanning and inverse scanning, and dwell selection).  Simon explained
   how these access methods work: direct selection activates keys on
   the keyboard by moving a pointing device over a key and clicking
   it; dwell selection activates keys by moving the pointing device over
   a key and letting it "dwell" there for a specified duration; and
   scanning and inverse scanning activate keys through the press of a
   single switch (or pair of switches): rows of keys are highlighted in
   sequence, the user presses the switch to select the desired row, and
   then the individual keys within that row are highlighted in turn
   until the user presses the switch again to choose the specific key.
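
   The row/column scanning idea is simple enough to sketch in a few
   lines.  The snippet below is a deliberately simplified, hypothetical
   model (a real implementation such as GOK's is event-driven and
   highlights keys on screen; debouncing and timeouts are omitted): it
   steps through the rows at a fixed interval until the switch selects
   one, then steps through the keys of that row:

      import time

      KEYBOARD = [["a", "b", "c", "d"],
                  ["e", "f", "g", "h"],
                  ["i", "j", "k", "l"]]

      SCAN_INTERVAL = 0.8   # seconds each row or key stays highlighted

      def scan(rows, switch_pressed, highlight):
          """Automatic row/column scanning driven by a single switch.

          switch_pressed() -> True if the user's switch is down now.
          highlight(item)  -> visually indicate the scanned row or key.
          """
          while True:
              for row in rows:               # phase 1: scan the rows
                  highlight(row)
                  time.sleep(SCAN_INTERVAL)
                  if switch_pressed():       # first press picks a row
                      for key in row:        # phase 2: scan its keys
                          highlight(key)
                          time.sleep(SCAN_INTERVAL)
                          if switch_pressed():
                              return key     # second press picks a key
                      break                  # no key chosen; start over

   GOK's actual implementation adds configurable feedback, scan
   intervals, and cycle limits, as Simon described next.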

   While GOK can of course replace the physical keyboard, Simon explained 
   that GOK goes beyond these basic on-screen keyboard functions, 
   providing direct access to applications, and supporting desktop 
   interaction from the GOK dynamic keyboards.  Further, GOK is very
   extensible and customizable.  With GOK, a user with a significant 
   physical disability has complete and efficient access to their
   entire desktop and application suite, via the GNOME Accessibility
   architecture.

   Simon then went into some detail on the various access methods,
   showing how they work, and how they can be configured.  For example,
   GOK can be configured to flash a key and/or play a brief sound when
   it is selected.  In dwell and automatic scanning
   modes, the user can specify the dwell timeout and the scanning
   interval.  The user can also configure the number of times the
   automatic scanning will cycle through the keys before resetting.

   After this general introduction, Simon gave a tour of GOK, starting
   with the keys on the main GOK keyboard.  Simon first showed the
   GOK Compose keyboard (which replaces the user's physical alphanumeric
   keyboard).  The Compose keyboard supports word completion, works with
   the AccessX Sticky Keys functionality (for latching modifier keys like
   Shift, Ctrl, and Alt), provides visual feedback of the latched modifier
   state, and is dynamically created when launched to match the actual 
   physical keyboard on the user's computer.

   Simon then described the functionality of three of the keys on the main
   GOK keyboard that provide direct access to the general desktop and
   applications: the "Menus" key, the "Toolbars" key, and the "UI Grab"
   key.  These functions work by using the support for the GNOME
   Accessibility architecture built into the GNOME desktop and
   applications - including applications like StarOffice, Mozilla, and
   those Java applications which implement the Java Accessibility API.
   The Menus keyboard is a dynamic keyboard whose keys are the items
   of the menu bar.  When a key on the Menus keyboard is selected, a
   new Menus keyboard appears whose keys are the contents of that 
   menu (e.g. the Menu keyboard for the File menu of the GNOME Text
   Editor application would be "New", "Open...", "Open Location...",
   "Save", "Save As...", "Revert", "Print Preview...", "Print...",
   "Close", and "Quit").  This provides a user with direct access to
   all of the menus in their applications.  Likewise, the Toolbars
   keyboard is a window of keys showing all of the toolbar elements of
   a GNOME application.  Finally, the UI Grab dynamic keyboard presents
   a set of keys for all of the user-interface elements in the active
   window that can be directly activated (the buttons, radio buttons,
   and check boxes) - particularly useful for direct interaction with
   dialog boxes like the Save dialog of an application.
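
   A rough sketch of how such a dynamic keyboard can be derived from the
   accessibility interfaces (again using the pyatspi Python bindings as
   an illustration; the helper below is my own, not GOK code): find the
   application's menu bar in its accessible hierarchy and turn each menu
   and each menu's items into would-be keys:

      import pyatspi

      def menu_keyboard(app):
          """Build a {menu name: [item names]} map from an application's
          accessible hierarchy - the raw material for a dynamic "Menus"
          keyboard.  Key layout, scanning, and key synthesis omitted.
          Note: some toolkits only populate a menu's children once the
          menu has actually been shown."""
          menubar = pyatspi.findDescendant(
              app, lambda a: a and a.getRole() == pyatspi.ROLE_MENU_BAR)
          keyboard = {}
          if menubar is None:
              return keyboard
          for menu in menubar:
              items = pyatspi.findAllDescendants(
                  menu, lambda a: a and a.getRole() == pyatspi.ROLE_MENU_ITEM)
              keyboard[menu.name] = [item.name for item in items]
          return keyboard

      # Hypothetical usage: build the Menus keyboard for a running GNOME
      # Text Editor (gedit) and print it.
      desktop = pyatspi.Registry.getDesktop(0)
      for app in desktop:
          if app and "gedit" in (app.name or "").lower():
              for menu, items in menu_keyboard(app).items():
                  print(menu, "->", items)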

   Simon continued the tour of GOK with another set of three keys on
   the main GOK keyboard: "Launcher", "Activate", and "Pointer", which
   provide access to the general desktop (rather than access within
   a particular application the user is running).  Launcher is a
   customizable keyboard whose buttons will launch any application that
   the user placed there - so that commonly used applications like
   web browsers and e-mail can be rapidly launched by the GOK user.
   The Activate keyboard is a dynamic set of keys representing all of
   the running applications on the user's desktop.  Selecting one of
   these will bring the application it represents to the front,
   ready to accept keyboard focus.  Finally, the Pointer button is
   used to release the mouse pointing device (if it is being used for
   direct or dwell selection) for use on the desktop - important if
   the system is being shared by a GOK and non-GOK user (for example
   in a training situation).
 
   Simon then talked about the final two keys on the main GOK keyboard:
   "Window" and "Settings".  The Window keyboard is a set of keys for
   moving the visual GOK window around on the screen - for example to
   move it out of the way of a window underneath that the user is
   interacting with.  The Settings button brings up the GOK configuration
   dialog box, which is where much of the configuration of GOK occurs.
   Simon didn't have time to go through all of the GOK Settings dialog,
   but showed briefly some of the settings, like the ability to configure
   the visual display of the GOK keyboards.

   Simon then invited Peter back onto the stage, and together they gave
   a brief demonstration of GOK on the GNOME desktop.  Peter started
   GOK, used a Tash USB switch and automatic scanning to bring up the
   Launcher keyboard, and from there launched the Text Editor application.
   Peter then brought up the Menus keyboard, and chose first File and
   then Open to bring up the Open File dialog box for the Text Editor.
   Changing his mind, Peter decided he really wanted to write a new
   letter, and so he again used the USB switch to bring up the UI Grab
   keyboard and then chose the "Cancel" button in the Open File dialog.
   Peter continued to change his mind, deciding instead that he wanted
   to launch an application that hadn't been pre-installed on GOK's
   Launcher keyboard.  He again used the USB switch to select the
   Activate keyboard, and from there activated the GNOME desktop menu
   panel.  Then he selected the Menus keyboard, and from there the
   Applications menu, which promptly displayed a new keyboard listing
   all of the accessible graphical applications on his GNOME desktop.
   Peter chose
   to launch the GNOME Help application.  This concluded the GOK
   demonstration, and this third presentation of the day.

   For more information about GOK, see the following web pages:

     http://www.gok.ca
     http://gok.ca/csun2003/  (slides from Simon's presentation)


After these three presentations, Sun hosted a series of "Accessibility
Experience" sessions in their booth.  Up to six attendees at a time attended
these hands-on hour-long sessions on either the Gnopernicus screen
reader/magnifier, or the GOK dynamic on-screen keyboard.  Several of the
systems were set up with the BAUM Vario 40-cell Braille displays, some with
the Madentec TrackerOne head-tracking device, and all with the Tash USB
switch devices.  Nearly 50 users signed up for these sessions, and several
additional folks who hadn't signed up in advance joined sessions just as
they were starting.  Attendees to these sessions were quite enthusiastic
about the technology.  We received many excellent suggestions for additional
features to incorporate into GOK.  One Gnopernicus user launched Mozilla and
found the Los Angeles Times web site so he could read about the unfolding
war in Iraq.  Another Gnopernicus user was an accessibility consultant who
had written custom Java applications adhering to the Java Accessibility
API.  We downloaded one of his Java applications and Gnopernicus had no
problem reading it, magnifying it, and rendering the Java application's user
interface in Braille. 



On Thursday afternoon and a few hours on Friday JP Schnapper-Casteras
convened the third Linux Accessibility Conference in the La Jolla room of
the Marriott hotel.  Attendees included representatives from Sun's
Accessibility team, Sun's StarOffice development team, RedHat, Adobe, the
American Foundation for the Blind, and the Cincinnati Federation for the
Blind, as well as a number of other interested individuals.  Sun
Microsystems gave an update on the GNOME Accessibility architecture, and
discussed hopes for seeing several additional applications supporting that
architecture.  Sun also gave an update on the state of
StarOffice/OpenOffice.org accessibility.  There was discussion about
building an open source Daisy reader - so that users with print impairments
on UNIX systems would be able to read electronic books such as those
available from bookshare.org.  JP gave an update on the KDE Accessibility
effort - there is now a formal KDE Accessibility module where work is
underway.  There was a lively discussion about Adobe PDF accessibility in UNIX
environments, including discussions about authoring accessible PDF (perhaps
from StarOffice/OpenOffice.org?).  Finally there was a general and
open-ended discussion about a variety of open issues - getting the word out,
recruiting more volunteers to the effort, getting development versions of
the GNOME assistive technologies into users' hands for testing, etc.


This was an exciting conference, with a dizzying series of demonstrations of
accessibility on the UNIX platform and applications using the GNOME
desktop.  The features and flexibility of the assistive technologies being
developed are very impressive.  The promise from Sun that these assistive
technologies will be bundled with their desktop computers, and the
expectation that various Linux vendors will also bundle these technologies
with their UNIX offerings, is particularly exciting!


I would like to thank Tash Inc. for their loan of a dozen USB Switch Click
and USB Mini Click single switch devices for use at CSUN.  These switches
work nicely with the GOK dynamic on-screen keyboard on both Intel Linux
systems and Sun Solaris systems, as was demonstrated last month at the
conference.  Numerous people used these switches in Sun's booth and also as
part of their hands-on Accessibility Experience sessions (see above).  You
can get information about these switches at: http://www.tashinc.com/

I would also like to thank Madentec for their loan of several Tracker One
head pointing devices.  Like the Tash switches, these USB head trackers work
very well with the GOK dynamic on-screen keyboard on both Intel Linux
systems and Sun Solaris systems.  Numerous people used the Tracker One at
CSUN in Sun's booth and also as part of their hands-on Accessibility
Experience sessions (see above).  You can get more information about the
Tracker line of head pointing devices at: http://www.madentec.com/

Finally, I would like to thank BAUM for their loan of several Vario 40 cell
Braille displays, which work flawlessly with the BAUM Gnopernicus screen
reader/magnifier on both Intel Linux systems and Sun Solaris systems, as was
demonstrated at CSUN.  Attendees seemed particularly pleased by the degree
to which Gnopernicus supported all of the features of these displays.


Sun will be making the slides from the conference presentations available in
the near future on the web, at: http://www.sun.com/access



Regards,

Peter Korn
Sun Accessibility team


