[orca-list] Input methods and audio feedback

Hello,

I have been wondering about the audio feedback when using an input
method. In case you don't know: an input method is a non-standard way to
type text, for instance typing Asian characters by entering their
reading and choosing the correct character. There are a couple of
implementations for braille that use a standard PC keyboard and perform
braille uncontraction.

At the moment, what happens is that gtk sends the keypresses to orca,
which speaks them. When the input method commits the input, nothing is
sent to orca.

In the braille-on-PC-keyboard typing case, when typing '⠁⠃', gtk
will send 'f', then 'f' and 'g' to Orca, which speaks them, while it
would make better sense to speak 'a' and 'b'. And when uncontraction
produces "about", nothing is spoken at all.
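To make the translation step concrete, here is a minimal, hypothetical sketch, assuming the partial key-to-dot mapping from the example above ('f' is dot 1, 'g' is dot 2) and the Grade 2 contraction "ab" -> "about"; real implementations of course carry full dot and contraction tables:

```python
# Hypothetical sketch of six-key braille entry on a PC keyboard.
# Assumed partial key-to-dot mapping, matching the example above:
# 'f' is dot 1 and 'g' is dot 2 (a real layout defines all six dots).
DOT_KEYS = {'f': 1, 'g': 2}

# Braille cells as sets of dots, mapped to letters (partial table).
CELLS = {
    frozenset({1}): 'a',      # ⠁
    frozenset({1, 2}): 'b',   # ⠃
}

# Toy contraction table: in Grade 2 braille, "ab" stands for "about".
CONTRACTIONS = {'ab': 'about'}

def chord_to_char(keys):
    """Translate one simultaneous chord of PC keys into a braille letter."""
    return CELLS[frozenset(DOT_KEYS[k] for k in keys)]

def uncontract(text):
    """Expand a contracted word if it is in the table."""
    return CONTRACTIONS.get(text, text)

# Typing ⠁ (key 'f') then ⠃ (chord 'f'+'g') yields 'ab',
# which uncontraction expands to 'about'.
word = uncontract(chord_to_char({'f'}) + chord_to_char({'f', 'g'}))
```

The point of the sketch is that the per-key echo ('f', 'g') and the meaningful output ('a', 'b', "about") live at different layers, which is exactly where the feedback problem comes from.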

I was thinking that perhaps gtk should, when it gets the uncontracted
text from the input method, send that word to orca and let orca speak
it. But that would not fix the 'f' and 'g' utterances. Now I am thinking
that perhaps the input method should produce the audio feedback itself,
because only it knows what the keystrokes actually mean and how the
committed result should be spoken (perhaps nothing at all, if it is
already obvious from what was spoken during typing).
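As a sketch of what that could look like (a purely hypothetical interface, not anything gtk or orca currently provides): the input method would return, for each event, the text to announce, and could return nothing on commit when the expansion adds nothing over what was already spoken. The class name, methods, and tables below are all assumptions for illustration:

```python
# Purely hypothetical sketch: an input method that decides its own
# audio feedback, instead of the screen reader echoing raw keypresses.
class BrailleInputMethod:
    DOT_KEYS = {'f': 1, 'g': 2}   # partial key-to-dot mapping, as above
    CELLS = {frozenset({1}): 'a', frozenset({1, 2}): 'b'}
    CONTRACTIONS = {'ab': 'about'}

    def __init__(self):
        self.buffer = ''

    def feed_chord(self, keys):
        """Handle one chord; return the text the screen reader should speak."""
        char = self.CELLS[frozenset(self.DOT_KEYS[k] for k in keys)]
        self.buffer += char
        return char               # announce 'a', not the raw key 'f'

    def commit(self):
        """Commit the buffered word; return the text to speak, or None when
        the committed result matches what was already spoken letter by
        letter, so there is nothing new to announce."""
        word = self.CONTRACTIONS.get(self.buffer, self.buffer)
        announce = word if word != self.buffer else None
        self.buffer = ''
        return announce

im = BrailleInputMethod()
im.feed_chord({'f'})          # 'a' is spoken
im.feed_chord({'f', 'g'})     # 'b' is spoken
im.commit()                   # 'about' is spoken, since it differs from 'ab'
```

The design choice here is that the input method returns feedback text rather than speaking directly, so the screen reader keeps control of the speech channel while the input method keeps the knowledge of what the events mean.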

What do people think about it?

Samuel

