Re: [orca-list] Most accessible IDE for java

On 12/20/2016 11:39 AM, Tom Masterson wrote:
> To make accessibility universal there are a couple of things that need
> to happen.
>
> 1.  We need to make it part of the education system and start teaching
> its importance and how to do it as part of coding classes.  It is not
> taught and therefore not even thought of by most developers.
>
> 2.  Make it easy if not almost automatic so that developers don't have
> to spend extra time adding it.  Especially when using an IDE it should
> automatically add at least some form of accessibility items even ...

I think this model of accessibility is fundamentally flawed.

I'm not visually impaired; my hands don't work right, and I use speech
recognition. If you think support for visually impaired users is bad,
there is effectively nothing for speech recognition users.

My experience building my own speech interfaces has led me to believe
that trying to maintain a GUI with lumps bolted onto the side for
text-to-speech or speech recognition support is completely flawed. Our
user experience has almost nothing to do with how you represent
information on the screen.

A button, visually, is something you press. From a general user
experience perspective, it's "something you can do". To build good
interfaces for the likes of us, we need to know about, and be able to
drive, the interface at the "what you can do" level, not by knowing
there is a green oval with the word "drink me" on it.

The engineering/technology change (and maybe a legislative one) is that
all applications should provide an API/micro-services interface that
exposes everything you can do with the application, and every user
interface would be built on top of it. For example, the GUI would live
at a separate URI from the text-to-speech or speech recognition
interface, but it should be possible to run all the interfaces in
parallel so that I can listen to the system's response to what I've
said.
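
To make that concrete, here is a minimal sketch in Java (since this
thread is about Java IDEs) of an application publishing its actions as
a machine-usable HTTP service with the JDK's built-in HttpServer. The
"drink-me" action and the /actions routes are inventions of mine for
illustration, not any existing framework:

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ActionService {
    // "What you can do" -- actions, independent of any green oval.
    private static final Map<String, Runnable> ACTIONS = new ConcurrentHashMap<>();

    public static void main(String[] args) throws Exception {
        ACTIONS.put("drink-me", () -> System.out.println("shrinking..."));

        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/actions", exchange -> {
            String method = exchange.getRequestMethod();
            if ("GET".equals(method)) {
                // GET /actions -> enumerate what the user can do right now.
                byte[] body = String.join("\n", ACTIONS.keySet())
                        .getBytes(StandardCharsets.UTF_8);
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream os = exchange.getResponseBody()) {
                    os.write(body);
                }
            } else if ("POST".equals(method)) {
                // POST /actions/<name> -> perform that action.
                String name = exchange.getRequestURI().getPath()
                        .replaceFirst("^/actions/?", "");
                Runnable action = ACTIONS.get(name);
                if (action != null) action.run();
                exchange.sendResponseHeaders(action != null ? 204 : 404, -1);
            } else {
                exchange.sendResponseHeaders(405, -1);
            }
            exchange.close();
        });
        server.start(); // GUI, screen reader, and speech box are all just clients.
    }
}

The point is that the action list itself is the interface; a GUI, a
screen reader, and my speech box would all be peers consuming it.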

Part of the reason why I advocate for this approach comes from my
observation that people with working hands and eyes can't even produce
a good GUI. How could you expect them to produce a good accessibility
interface when they have no experience with how disabilities affect
one's ability to use a computer?

Another reason why I advocate for the micro-services model is that I
want to carry one box with all of my speech controls on it, so that any
system I use becomes accessible just by connecting my little
accessibility system (sketched below). You don't make every system
accessible by loading it down with accessibility tools. You make
systems accessible by providing an interface usable by another machine.
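
Here is that sketch: the box's side of the conversation, discovering
and invoking actions from the hypothetical service above (Java 11+ for
java.net.http):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SpeechBox {
    private static final HttpClient HTTP = HttpClient.newHttpClient();

    public static void main(String[] args) throws Exception {
        String target = "http://localhost:8080";

        // Discover what this system lets me do -- no screen scraping.
        HttpRequest list = HttpRequest.newBuilder(
                URI.create(target + "/actions")).GET().build();
        String actions = HTTP.send(list,
                HttpResponse.BodyHandlers.ofString()).body();
        System.out.println("You can say: " + actions);

        // A recognized utterance maps straight onto an action name.
        String heard = "drink-me"; // stand-in for real recognizer output
        HttpRequest invoke = HttpRequest.newBuilder(
                URI.create(target + "/actions/" + heard))
                .POST(HttpRequest.BodyPublishers.noBody()).build();
        int status = HTTP.send(invoke,
                HttpResponse.BodyHandlers.discarding()).statusCode();
        System.out.println(heard + " -> HTTP " + status);
    }
}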

A year or two ago I ran an experiment with a Windows VM running
NaturallySpeaking driving the host machine (Linux). It worked rather
well. I didn't have enough time, money, or energy to build a full
accessibility framework, but the results were good enough that I think
it's worth exploring further.
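
To give a feel for how small the host side of such a setup can be,
here is a sketch: a relay that accepts recognized text from the VM
over TCP and types it into whatever window has focus using xdotool.
The port number and the line-per-utterance protocol are made up for
illustration; they are not taken from NaturallySpeaking or any
existing tool:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class DictationRelay {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(5050)) {
            while (true) {
                try (Socket vm = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(vm.getInputStream(),
                                     StandardCharsets.UTF_8))) {
                    String utterance;
                    while ((utterance = in.readLine()) != null) {
                        // Inject the recognized text as host keystrokes.
                        new ProcessBuilder("xdotool", "type", utterance)
                                .inheritIO().start().waitFor();
                    }
                }
            }
        }
    }
}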

Slightly shifting topics, I find the question of making Eclipse
accessible interesting. I've been exploring programming by speech
recognition for far too many years. Progress has been hampered by the
almost completely orthogonal needs of a disabled user versus a TAB
(temporarily able-bodied) user. From a speech recognition perspective,
the interface needs to expose logical elements, not characters. For
example, I want to be able to change the second argument or navigate
to the next predicate, not move right 20 characters and then delete
the next six (see the toy sketch below). I suspect a similar UX change
would help visually impaired users as well.
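
Here is that toy sketch: a "change the Nth argument" command applied
to a simple call expression. A real implementation would hang off the
editor's syntax tree rather than strings; the flat parser below tracks
parenthesis depth but ignores string literals and brackets:

import java.util.ArrayList;
import java.util.List;

public class ArgumentEditor {
    /** Replace the 1-based argument n of a call like "f(a, g(x, y), c)". */
    static String changeArgument(String call, int n, String replacement) {
        int open = call.indexOf('(');
        int close = call.lastIndexOf(')');
        String args = call.substring(open + 1, close);

        // Split on commas at parenthesis depth zero, so that a nested
        // call such as "g(x, y)" stays one argument.
        List<String> parts = new ArrayList<>();
        int depth = 0, start = 0;
        for (int i = 0; i < args.length(); i++) {
            char c = args.charAt(i);
            if (c == '(') depth++;
            else if (c == ')') depth--;
            else if (c == ',' && depth == 0) {
                parts.add(args.substring(start, i).trim());
                start = i + 1;
            }
        }
        parts.add(args.substring(start).trim());

        parts.set(n - 1, replacement); // "the second argument" is parts.get(1)
        return call.substring(0, open + 1) + String.join(", ", parts) + ")";
    }

    public static void main(String[] args) {
        // "Change the second argument to 42":
        System.out.println(changeArgument("draw(x, y, color)", 2, "42"));
        // prints: draw(x, 42, color)
    }
}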

So while I appreciate the work that folks have put into the various
accessibility toolkits, I think the toolkits should be thrown away:
they exclude a whole class of accessibility needs and they are, quite
frankly, a dead end, because they don't provide the information that
accessibility solutions need.
