Re: [gtk-list] Re: :gtk design question (events vs signals)
- From: "Jason Stokes" <jstok bluedog apana org au>
- To: gtk-list redhat com
- Subject: Re: [gtk-list] Re: :gtk design question (events vs signals)
- Date: Mon, 26 Apr 1999 16:16:44 +1000
Ok, I'm new to X programming, so forgive me for any confusion on my part. I
still think this is a valid question from the point of view of considering GTK
as an abstract virtual machine offering a particular interface to the
programmer.
> Events are a stream of messages received from the X server. They drive the
> Gtk main loop, which more or less amounts to "wait for events, process
> them" (not exactly, it is really more general than that and can wait on
> many different input streams at once). Events are a Gdk/Xlib concept; Xt
> does not come into play at all.
As I understand it, Xt harnesses events by creating an event processing loop,
processing events in the queue, and associating them with actions and
callbacks. Gdk works in a similar way, no?
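Just to check my understanding, the skeleton of a Gtk program seems to boil
down to handing control to that loop. This is only a sketch in the Gtk 1.2
style, with names of my own choosing:

#include <gtk/gtk.h>

int main (int argc, char *argv[])
{
  GtkWidget *window;

  gtk_init (&argc, &argv);      /* connect to the X server, set up Gdk */

  window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
  gtk_widget_show (window);

  /* Hand control to Gtk: Gdk reads events from the X server and the
     main loop dispatches them (along with any other input sources). */
  gtk_main ();

  return 0;
}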
> Signals are a feature of GtkObject and its subclasses. They have nothing
> to do with any input stream; really a signal is just a way to keep a list
> of callbacks around and invoke them ("emit" the signal). There are lots of
> details and extra features of course. Signals are emitted by object
> instances, and are entirely unrelated to the Gtk main loop.
> Conventionally, signals are emitted "when something changes" about the
> object emitting the signal.
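So, if I follow, connecting and emitting a signal is purely an object-level
affair; something like this (a rough sketch against the Gtk 1.2 API, with
handler names I've invented):

static void
on_clicked (GtkButton *button, gpointer data)
{
  g_print ("\"clicked\" was emitted\n");
}

/* ... */

/* Add a callback to the button's list for "clicked". */
gtk_signal_connect (GTK_OBJECT (button), "clicked",
                    GTK_SIGNAL_FUNC (on_clicked), NULL);

/* Emitting the signal just runs those callbacks; no X event or
   main loop is involved. */
gtk_signal_emit_by_name (GTK_OBJECT (button), "clicked");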
From the viewpoint of the programmer who has handed control to the standard GTK
event loop, it looks like this: you've got your GTK machine that drives your
event-driven application. From time to time things happen, like a
toggle-button widget being toggled; these user actions emit signals that you
associate with callback functions. From time to time other things happen,
such as a DestroyEvent; but these are called "events", not signals, even
though you handle them in much the same way, by associating them with
callback functions, only with a different syntax.
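For instance, the two cases look like this in code (again just a sketch,
assuming the Gtk 1.2 API; the handler names are my own):

/* "toggled" is a plain signal: the handler only sees the object. */
static void
on_toggled (GtkToggleButton *toggle, gpointer data)
{
  g_print ("toggle is now %s\n", toggle->active ? "down" : "up");
}

/* "delete_event" is an event signal: the handler also receives the
   GdkEvent and returns TRUE/FALSE to say whether it consumed it. */
static gint
on_delete_event (GtkWidget *widget, GdkEvent *event, gpointer data)
{
  gtk_main_quit ();
  return FALSE;   /* let the default handler destroy the window */
}

/* ... */

gtk_signal_connect (GTK_OBJECT (toggle), "toggled",
                    GTK_SIGNAL_FUNC (on_toggled), NULL);
gtk_signal_connect (GTK_OBJECT (window), "delete_event",
                    GTK_SIGNAL_FUNC (on_delete_event), NULL);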
> Signals and events only come together because GtkWidget happens to emit
> signals when it gets events. This is purely a convenience, so you can
> connect callbacks to be invoked when a particular widget receives a
> particular event. There is nothing about this that makes signals and
> events inherently related concepts, any more than emitting a signal when
> you click a button makes button clicking and signals related concepts.
Yes, I understand this, but why couldn't events be abstracted away behind a
uniform signal interface? Is being notified of a GdkExpose event *that
different in underlying concept* from being notified, via a signal, that a
toggle button has been pressed? They're both "things that happen in user space
that we've asked to trigger callbacks in our program."
> In light of this, I don't see how your question really makes sense; I
> don't see how events could be considered a "lower level signal" or how you
> could merge events with signals.
I haven't thought about the details, but it seems that, from the standpoint of
an application programmer rather than an implementor, it would be preferable
to have one unified interface for "events happening out there", based on the
signal mechanism.
Hope this makes more sense.
--
Jason Stokes: jstok@bluedog.apana.org.au