Re: Help needed: About gnome architecture - auditory desktop
- From: Bill Haneman <bill.haneman@sun.com>
- To: Eduardo Trapani <etrapani@unesco.org.uy>
- Cc: Bill Haneman <bill.haneman@ireland.sun.com>, gnome-accessibility-list@gnome.org
- Subject: Re: Help needed: About gnome architecture - auditory desktop
- Date: Tue, 08 May 2001 21:00:02 +0100
Eduardo Trapani wrote:
>
> Hi Bill,
>
> > I recommend that you look at the Gnome Accessibility Project pages
> > (http://developer.gnome.org/projects/gap) and review the ATK API.
>
> OK, I am doing that right now. I meant to hold off on the rest of the
> message until I had read enough, but I can't wait. It was great to
> receive such a prompt response.
I hope you are finding it useful. Welcome to the Gnome accessibility
effort!
> > There is work already underway that, in effect, creates ATK "peers"
> > for desktop objects. The textual and other information provided by
> > these ATK implementations is intended for use by alternative interface
> > presentations, such as auditory interfaces.
>
> I would then just have to deal with the ATK peers and from them gateway
> the information to my interface.
Yes, though I think that eventually what you are hoping to do might be
better served by the out-of-process SPI rather than by using ATK
directly. If you use ATK, of course, your solution will have to "live"
in the same process space as the application, but if you use the SPI
interfaces (which, unfortunately, are not yet implemented, but should
be ready in a couple of months) then your "auditory desktop" process
can have its own pid. In essence, what you are proposing to do is
write a type of "screen reader".
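For the in-process ATK route, a rough sketch of what such a client
might look like follows. atk_add_focus_tracker(), atk_object_get_name(),
atk_object_get_role() and atk_role_get_name() are the real ATK calls;
speak() is a hypothetical placeholder for your speech output:

    /* Minimal sketch of an in-process ATK client: speak whatever
     * gains focus.  speak() is a hypothetical text-to-speech hook. */
    #include <atk/atk.h>

    static void speak (const gchar *text);  /* hypothetical TTS hook */

    /* Invoked by ATK whenever an object in this process gains focus. */
    static void
    focus_tracker (AtkObject *obj)
    {
      const gchar *name = atk_object_get_name (obj);

      if (name)
        speak (name);
      /* e.g. speaks "OK" followed by "push button" */
      speak (atk_role_get_name (atk_object_get_role (obj)));
    }

    void
    auditory_client_init (void)
    {
      /* Register for focus notifications process-wide. */
      atk_add_focus_tracker (focus_tracker);
    }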
> > > b) send events to widgets so that I can gateway the input from the
> > > auditory desktop to the gnome desktop.
>
> > GTK widgets are being patched to better support keybindings. Vision
> > impaired users should be able to access all of the GTK widget actions
> > from the keyboard, and by using ATK an auditory interface can be
> > notified of the user's actions.
>
> Mmm, I am not sure about this. Let's say the auditory interface says
> "OK button" and the user says "go ahead" into the microphone. How
> would the auditory interface then "click" the button? Maybe it is not
> voice, but there could be a shortcut key to click the last heard
> button. Would that be covered?
The AtkAction interface (and the SPI's AccessibleAction interface)
will support the idea of "actions" that certain user interface
elements can have. These interfaces allow clients to invoke actions
on widgets. However, for a GtkButton the normal flow is for the
auditory interface to speak "OK button" in response to a keyboard
focus traversal to the button. Since the button then has focus,
pressing the Enter key will activate it.
If you are talking about speech recognition as well, then there are
two approaches: either the speech input system can synthesize a key
event for "Enter" (and TAB for "next", etc.), or the speech system can
invoke the button's action via the AtkAction or corresponding SPI
AccessibleAction interface.
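As a sketch, the two approaches might look like this in C.
XTestFakeKeyEvent() is from the X Test extension and
atk_action_do_action() is part of AtkAction; the wrapper functions
are illustrative:

    /* Approach 1: synthesize an Enter key event via the XTest
     * extension. */
    #include <X11/Xlib.h>
    #include <X11/keysym.h>
    #include <X11/extensions/XTest.h>

    static void
    press_enter (Display *dpy)
    {
      KeyCode kc = XKeysymToKeycode (dpy, XK_Return);
      XTestFakeKeyEvent (dpy, kc, True, CurrentTime);   /* key press   */
      XTestFakeKeyEvent (dpy, kc, False, CurrentTime);  /* key release */
      XFlush (dpy);
    }

    /* Approach 2: invoke the widget's action through AtkAction. */
    #include <atk/atk.h>

    static void
    activate_accessible (AtkObject *obj)
    {
      if (ATK_IS_ACTION (obj))
        {
          /* Action 0 is conventionally the default, e.g. "click". */
          atk_action_do_action (ATK_ACTION (obj), 0);
        }
    }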
> > Perhaps you could describe in more detail the particular applications
> > and "use cases" which you would like to support, then we could discuss
> > how existing and planned Gnome accessibility interfaces could best be
> > used.
>
> Well, this is just an idea. I have been reading a lot about
> accessibility projects (Mercator/UltraSonix, etc.), seen patches to
> Xlib, and what I liked most is the notion of a room-based auditory
> interface. Basically you have rooms where things are located. Things
> are applications, documents, or doors. That is what concerns the
> desktop.
This maps reasonably well onto the visual metaphor of multiple virtual
"desktops".
> For running applications you could have a menu shelf or ... I don't
> know; I have not thought about that part yet, and there are still some
> documents on auditory interfaces waiting to be read.
>
> My idea is to gather as much information as I can from the widgets and
> their relationships and then build an object model that the auditory
> desktop can use.
>
> (user) -- [auditory desktop]
>                |
>                |
>           ------------
>           (translator)
>           ------------
>                |
>                |
> (application) -- [ gnome desktop ]
>                  [ gnome applications ]
>
> This would be the situation. The auditory desktop only sees/uses the
> objects from the translator. This isolates the auditory desktop from
> Gnome, KDE, or whatever is on the other side.
The Gnome Accessibility SPI serves the role of the translator in your
diagram, if I understand you correctly (see
http://developer.gnome.org/projects/gap/SPIBlockDiagram_small.png).
If your auditory desktop interfaces with the SPI you will be
"isolated" in the way you describe.
> The translator builds its object model (if there is already a suitable
> one it will use that directly) from the graphical one. Gnome seems
> to be accessible directly from the auditory interface, but I would
> rather have the translator, even as a very thin direct-mapping layer,
> just for the sake of portability. Besides, there might be some small
> adjustments, and certainly the relationship between graphical and
> auditory widgets will not be one-to-one.
The adjustments to the "object model" are typically taken care of
inside the AT (adaptive technology) if necessary. If you don't do
lots of special-casing on a per-application basis you will probably
find that the widget hierarchies (GtkWidget hierarchy vs. AtkObject
hierarchy) match more closely than you might at first expect.
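One quick way to see this for yourself is to walk the AtkObject tree
alongside the GtkWidget tree. A sketch, using the real
gtk_widget_get_accessible() and atk_object_* calls (the printing is
just illustrative):

    /* Recursively dump the accessible hierarchy under a widget. */
    #include <gtk/gtk.h>
    #include <atk/atk.h>

    static void
    dump_accessible_tree (AtkObject *obj, gint depth)
    {
      gint i, n = atk_object_get_n_accessible_children (obj);
      const gchar *name = atk_object_get_name (obj);

      g_print ("%*s%s (%s)\n", depth * 2, "",
               name ? name : "(unnamed)",
               atk_role_get_name (atk_object_get_role (obj)));

      for (i = 0; i < n; i++)
        {
          AtkObject *child = atk_object_ref_accessible_child (obj, i);
          dump_accessible_tree (child, depth + 1);
          g_object_unref (child);  /* ref_accessible_child adds a ref */
        }
    }

    /* Usage:
     *   dump_accessible_tree (gtk_widget_get_accessible (window), 0);
     */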
> The goal is that at least well behaved gnome applications could be
> transparently routed to the auditory desktop.
The goal, which we are making good progress towards, is for all
applications that use stock GTK+-2.0 widgets to be fully accessible
(usable via ATK interfaces) unless they do things like overriding
standard keybindings or other nonstandard behavior. For those and
other applications, the application maintainer can take additional
steps (like supporting the ATK interfaces for custom widgets) to
ensure that the applications remain fully compatible with the Gnome
accessibility architecture.
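The simplest such step is just to give the custom widget's accessible
peer a name and description, so an auditory interface has something to
speak. A sketch (atk_object_set_name() and atk_object_set_description()
are the real calls; the widget and strings are invented for
illustration):

    #include <gtk/gtk.h>
    #include <atk/atk.h>

    static void
    label_custom_widget (GtkWidget *widget)
    {
      AtkObject *acc = gtk_widget_get_accessible (widget);

      /* Hypothetical example widget: a rotary volume control. */
      atk_object_set_name (acc, "Volume dial");
      atk_object_set_description (acc,
          "Rotary control that adjusts the playback volume");
    }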
> > If you truly want to present a different metaphor, then perhaps GTK+
> > is not the right tool, you might consider other ways of providing
> > alternate interfaces to applications.
>
> You might be right, but applications do not know they are going to be
> made accessible, and to translate one metaphor to the other I need to
> know the logical structure of the graphical one.
Of course. But this is why we are focusing on adapting the existing
graphical metaphors to more general meta-interfaces, ATK/AT-SPI.
> I have thought of many ways to do it, but you always have to "cut in"
> somewhere, and so far Gnome seems to provide the most comfortable place
> to sit and have access to everything. If I were to, for example, create
> a pseudo X server, then I would lose a lot of information about the
> widgets. (I think that in this case some "common sense" logic could
> help guess the logical structure, but it would be really complicated.)
You might get some widget containership info out of X, but that's about
all; you would probably lose a lot of textual information. If you
can't do what you are planning with ATK and the Accessibility SPI,
then those interfaces need to be extended, because that's precisely
what they are for :-)
Best regards,
Bill
> Any other ideas on how to do it?
>
> Eduardo.
>
> _______________________________________________
> gnome-accessibility-list mailing list
> gnome-accessibility-list@gnome.org
> http://mail.gnome.org/mailman/listinfo/gnome-accessibility-list
--
--------------
Bill Haneman
Gnome Accessibility / Batik SVG Toolkit
Sun Microsystems Ireland