I have a (glib) program that takes an input and an output file descriptor, makes IO channels of them, and then runs a g_main_loop. When input appears, the input handler gets called; when the output channel is ready, the output handler gets called.

The input side seems to be fine: if there's nothing to read, the handler doesn't get called. The trouble is that the output descriptor might well be ready for writing when I don't yet have anything to write. What I'd like is for the output handler not to be called unless the descriptor is ready *and* there's data to write. As it stands, the main loop calling the output handler over and over spikes my load, which isn't a nice thing for a server to do when there's no work.

Perhaps the only solution is for the output handler to de-register itself every time the work queue empties and to have the enqueue operation re-register it. But this seems a horrible kludge. Any suggestions on how to structure this?

TIA.

-- 
Jeff

Jeff Abrahamson <http://www.purple.com/jeff/>
GPG fingerprint: 1A1A BA95 D082 A558 A276 63C6 16BF 8C4C 0D1D AE4B
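
P.S. A rough sketch of the de-register/re-register approach I mean, in case that helps. This assumes the GLib 2.x GIOChannel API and ignores partial writes, errors, and channel encoding; the names (write_queue, out_watch_id, enqueue_output) are just placeholders of mine, not anything from existing code.

    #include <glib.h>

    /* Initialize once at startup: write_queue = g_queue_new (); */
    static GQueue *write_queue = NULL;   /* strings waiting to go out   */
    static guint   out_watch_id = 0;     /* 0 => no G_IO_OUT watch set  */

    /* Output handler: only registered while write_queue is non-empty. */
    static gboolean
    output_ready (GIOChannel *channel, GIOCondition cond, gpointer data)
    {
        gchar *msg = g_queue_pop_head (write_queue);

        if (msg != NULL) {
            gsize written = 0;
            /* Ignoring partial writes and errors for brevity. */
            g_io_channel_write_chars (channel, msg, -1, &written, NULL);
            g_io_channel_flush (channel, NULL);
            g_free (msg);
        }

        if (!g_queue_is_empty (write_queue))
            return TRUE;            /* more queued, keep the watch */

        /* Queue drained: returning FALSE removes this watch, so the
           main loop stops waking us while there's nothing to write. */
        out_watch_id = 0;
        return FALSE;
    }

    /* Enqueue operation: re-registers the output watch if needed. */
    static void
    enqueue_output (GIOChannel *channel, const gchar *msg)
    {
        g_queue_push_tail (write_queue, g_strdup (msg));

        if (out_watch_id == 0)
            out_watch_id = g_io_add_watch (channel, G_IO_OUT,
                                           output_ready, NULL);
    }

The "de-register" half is just the return value: a GIOFunc that returns FALSE has its source removed from the main loop, so enqueue_output only has to check whether a watch is already pending before adding one.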