Re: no joy...



On Thu, Dec 22, 2011 at 02:34:58PM -0700, Michael Torrie wrote:
Date: Thu, 22 Dec 2011 14:34:58 -0700
From: Michael Torrie <torriem gmail com>
Subject: Re: no joy...
CC: Gtk-app <gtk-app-devel-list gnome org>

On 12/22/2011 01:58 PM, Gary Kline wrote:
    i am really not doing anything that arcane.  the nutshell
    of it is that, in a

    "while (!done)" loop,

    gvim [ or another editor that can use abbreviations ]
    creates a series of text files, 1 to some N.  what is
    written to each file is then read aloud via espeak -f;
    this application is an attempt to help those who are speech
    impaired or mute and have a small laptop.  i have tried
    smallish gadgets that lack a keyboard.
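
    roughly, the loop amounts to this (a sketch of the idea only, not
    the actual code; the talk.N.txt names come up again further down):

    /* sketch only: open the editor on the next talk.N.txt, then have
     * espeak read aloud whatever was saved there */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        char cmd[128];
        int n = 1, done = 0;

        while (!done) {
            snprintf(cmd, sizeof cmd, "gvim talk.%d.txt", n);      /* type, with abbrevs */
            system(cmd);

            snprintf(cmd, sizeof cmd, "espeak -f talk.%d.txt", n); /* speak it aloud */
            system(cmd);

            n++;   /* 'done' gets set by whatever convention ends the conversation */
        }
        return 0;
    }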

Okay that explains things a little bit better.  Why are you involving a
text editor like gvim or kate?


        abbrevs.  FWIW, my last brain op messed up my entire right
        side, and because my left hand wasn't that good, i type only
        around 20wpm.  by learning only 130 or so abbrevs, you can gain
        roughly 30%.  so imagine some poor kid [or woman--or, for
        that matter, anybody who has a driving goal to learn and
        to *communicate*]: there are roughly 100 million people with
        some kind of physical disability.

        typing in an editor like vi/gvim that has built-in
        abbreviations means fewer keystrokes.

        my Xlib code only builds on my server ... right now.  i
        think there was an easy graphical editor; adding the abbrev
        code to the Xlib editor shouldn't be that hard.

Shouldn't you just either write the text you want to speak to a file and
then espeak that?  Or use a pipe to send espeak the text?  Or maybe use
some kind of speaking API (maybe espeak has one)?
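
It does, as far as I know: libespeak ships a C header,
<espeak/speak_lib.h> (libespeak-dev on Debian/Ubuntu).  A rough sketch,
with the exact calls to be double-checked against the header on your
system:

/* rough sketch of speaking via the espeak library instead of the
 * command-line tool; link with -lespeak */
#include <espeak/speak_lib.h>
#include <string.h>

int main(void)
{
    const char *text = "hello from libespeak";

    /* initialise for playback straight to the sound device */
    if (espeak_Initialize(AUDIO_OUTPUT_PLAYBACK, 0, NULL, 0) < 0)
        return 1;

    /* queue the text, then wait until it has been spoken */
    espeak_Synth(text, strlen(text) + 1, 0, POS_CHARACTER, 0,
                 espeakCHARS_AUTO, NULL, NULL);
    espeak_Synchronize();

    espeak_Terminate();
    return 0;
}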


        that's what gvim does.  my default filename is 'talk.[N].txt'.
        after i've typed "[esc]:x[enter]", espeak -f <file> reads
        it, then "talk.[N+1].txt" is opened and waits for keyboard input.

        but say that somebody wants to hear what i said several
        minutes before.  i have to search all my *.txt files to find
        the one he wants.  the display button would bring up 500
        windows.  i need buttons on the popped-up window.  or rather,
        one window, with buttons like [prev], [next], [speak], [quit
        window].
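
        something like this skeleton is what i have in mind (a sketch
        of the window only, not my actual code; the callbacks are just
        stubs):

/* sketch: one display window with a text view for the current
 * talk.N.txt and a row of [prev] [next] [speak] [quit window] buttons.
 * GTK+ 2 API; build with:
 *   gcc talkwin.c -o talkwin `pkg-config --cflags --libs gtk+-2.0`
 */
#include <gtk/gtk.h>

static void on_button(GtkWidget *button, gpointer label)
{
    /* stub: a real version would load talk.[N-1].txt / talk.[N+1].txt
     * or hand the displayed text to espeak */
    g_print("button pressed: %s\n", (const char *) label);
}

int main(int argc, char *argv[])
{
    static const char *labels[] = { "prev", "next", "speak", "quit window" };
    GtkWidget *window, *vbox, *hbox, *view;
    int i;

    gtk_init(&argc, &argv);

    window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    gtk_window_set_title(GTK_WINDOW(window), "display");
    g_signal_connect(window, "destroy", G_CALLBACK(gtk_main_quit), NULL);

    vbox = gtk_vbox_new(FALSE, 4);
    view = gtk_text_view_new();          /* shows the current talk.N.txt */
    hbox = gtk_hbox_new(TRUE, 4);

    for (i = 0; i < 4; i++) {
        GtkWidget *b = gtk_button_new_with_label(labels[i]);
        if (i == 3)   /* "quit window" just closes the display */
            g_signal_connect(b, "clicked", G_CALLBACK(gtk_main_quit), NULL);
        else
            g_signal_connect(b, "clicked", G_CALLBACK(on_button),
                             (gpointer) labels[i]);
        gtk_box_pack_start(GTK_BOX(hbox), b, TRUE, TRUE, 0);
    }

    gtk_box_pack_start(GTK_BOX(vbox), view, TRUE, TRUE, 0);
    gtk_box_pack_start(GTK_BOX(vbox), hbox, FALSE, FALSE, 0);
    gtk_container_add(GTK_CONTAINER(window), vbox);

    gtk_widget_show_all(window);
    gtk_main();
    return 0;
}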

If I wanted to espeak something I would use fopen to write the text to a
temporary file, then spawn espeak -f to read that.  Or most probably I'd
use popen() and send espeak the text through a pipe. 
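
Something along these lines, for instance (an untested sketch; espeak
with no file or text argument reads from stdin, so the text can be
written straight down the pipe):

#include <stdio.h>

/* untested sketch of the popen() route */
static int speak(const char *text)
{
    FILE *p = popen("espeak", "w");
    if (p == NULL)
        return -1;
    fputs(text, p);
    return pclose(p);     /* blocks until espeak has finished speaking */
}

int main(void)
{
    return speak("hello, this came through a pipe") != 0;
}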


        all of my text files are in ~/VBC/<directory[s]> ...
        everything is saved at least until the conversation is
        over.  no need to make temp copies unless i wanted an exact
        record in the event that i added a few words to an earlier
        file.  --this is for future discussion!

That's more basic
Linux programming than GTK programming of course.

    my app is not targeted at people who would use the device
    that has a touchscreen [plus hard drive + batteries].  i
    tried one of these things in 2003 and a later model in '09.
    my disability is fairly pronounced, but i could barely lift
    this box.  i believe you could even play games on it.
    for me, the screen was not that easy to press.  i prefer an
    actual keyboard.


    if i'm talking to a person or a group, i am hard to understand
    until they've had a few weeks of getting used to my speech patterns;

Well you are understandable now in e-mail, and what you are trying to do
is becoming more clear.

    with a shell script that i put together in 20 minutes, i
    could type on my EEE-900A and the computer would be my
    voice.  i have been in touch with the people who are
    developing the "$100 laptop" that is being used globally.
    they said: sure, create a gui app that can be used by the
    physically disabled or deaf.

Okay so you are trying to come up with a graphical program whereby you
can type something (say in a text box) and have espeak speak it so that
others can hear and understand you?  Do I have this right?
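
If so, the core of it could be as small as a text entry plus a "speak"
callback along these lines (only a sketch; the names here are made up
for illustration):

/* sketch: a "speak" button callback that reads the contents of a
 * GtkEntry and pipes it to espeak */
#include <gtk/gtk.h>
#include <stdio.h>

static void on_speak_clicked(GtkButton *button, gpointer user_data)
{
    GtkEntry *entry = GTK_ENTRY(user_data);
    const gchar *text = gtk_entry_get_text(entry);

    FILE *p = popen("espeak", "w");   /* espeak reads the text from stdin */
    if (p != NULL) {
        fputs(text, p);
        pclose(p);                    /* waits until the speech finishes */
    }
}

It would be hooked up with something like
g_signal_connect(speak_button, "clicked", G_CALLBACK(on_speak_clicked), entry);
and in a real GUI you would probably run espeak asynchronously
(g_spawn_async() or similar) so the window doesn't freeze while it speaks.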


        i think so; it isn't rocket science ...  i'll send you the
        code with the gcc line if you'd like.



    this morning, i got gvim to spawn a Konsole; espeak echoes
    what i typed.  but while the display button (with other
    buttons) can find something i typed earlier, there is no
    way to close the display window.  i need some means of
    putting buttons on the display window.

Hmm.  Maybe you should post your code so that others can see what it
does so far.


        better yet :)


    in my 11.10 ubuntu, the makefile for one zetcode example did
    not build the top menu bar.  the two buttons below it were
    there.  either i'm missing some gtk package, or something
    else is broken.  [?]

I'm not familiar with zetcode.


        there was a zip file and a "Makefile" that looked straight
        out of the DOS/Doze playbook.


_______________________________________________
gtk-app-devel-list mailing list
gtk-app-devel-list gnome org
http://mail.gnome.org/mailman/listinfo/gtk-app-devel-list

-- 
 Gary Kline  kline thought org  http://www.thought.org  Public Service Unix
           Journey Toward the Dawn, E-Book: http://www.thought.org
          The 8.57a release of Jottings: http://jottings.thought.org
             Twenty-five years of service to the Unix community.



