[gtk-list] Re: gtk design question (events vs signals)



Jason Stokes writes:

>> Signals and events only come together because GtkWidget happens to
>> emit signals when it gets events. This is purely a convenience, so
>> you can connect callbacks to be invoked when a particular widget
>> receives a particular event. There is nothing about this that makes
>> signals and events inherently related concepts, any more than
>> emitting a signal when you click a button makes button clicking and
>> signals related concepts.
> 
> Yes, I understand this, but why couldn't events be abstracted away
> behind a uniform signal interface?  Is being notified of a GdkExpose
> event *that different in underlying concept* to being notified, via
> signal, that a toggle button has been pressed?  They're both "things
> that happen in user space that we've asked to trigger callbacks in
> our program."
[...]
> I haven't thought about the details, but it just seems that from the
> standpoint of an applications programmer, not an implementor, it
> would be preferable to have one unified interface for "events
> happening out there", based on the signals mechanism.

I agree with this in principle, but don't know how you would handle
the pragmatic issues.  In my mind, the difference between a Signal and
an Event is entirely the level of abstraction that takes place.  (I'll
use Signal and Event, capitalized, for the GTK meanings of the terms -
event is a particularly lousy name for a callback mechanism.)  A
Signal is an event generated entirely at the whim of the widget,
without any specific regard to the reason for generating it.  The
example of a "click" action occurring when the user presses return or
presses a button is reasonable.  An Event is mapped directly to some
kind of input device or output device activity.  Exposure, mouse
motion, enter/leave, etc. are all Events that are not significantly
abstracted from the I/O devices.  The only reason that
you would want to handle these Events directly is if you also wanted
some low-level, device-dependent information about them.  For example,
if I only care that a GtkButton was clicked, I use the Signal.  If I
want to know the absolute position of the mouse cursor at the time,
and only want to respond to a left-mouse click, I use the Event.  In
effect, Signals can abstract sequences of one or more Events, dropping
low-level information along the way.  I have a tough time calling them
orthogonal, but they are, in my mind, quite different.
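To make this concrete, here is a rough sketch in GTK-style C (the
window setup and handler names are mine, purely for illustration):
the "clicked" Signal tells you nothing about the device, while the
"button_press_event" Event hands you the raw GdkEventButton.

#include <gtk/gtk.h>

/* Signal handler: all we learn is that the button was clicked.  The
 * click may have come from the mouse, the keyboard, or a synthesized
 * action; no device-level information is available here. */
static void
on_clicked (GtkWidget *widget, gpointer data)
{
  g_print ("button clicked\n");
}

/* Event handler: we receive the raw device event, so we can read the
 * pointer position and react only to the left mouse button. */
static gint
on_button_press (GtkWidget *widget, GdkEventButton *event,
                 gpointer data)
{
  if (event->button == 1)
    g_print ("left press at (%g, %g)\n", event->x, event->y);

  /* Return FALSE so the button's own handling still runs and the
   * "clicked" Signal is still emitted. */
  return FALSE;
}

int
main (int argc, char *argv[])
{
  GtkWidget *window, *button;

  gtk_init (&argc, &argv);

  window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
  button = gtk_button_new_with_label ("Press me");
  gtk_container_add (GTK_CONTAINER (window), button);

  /* The abstract notification... */
  gtk_signal_connect (GTK_OBJECT (button), "clicked",
                      GTK_SIGNAL_FUNC (on_clicked), NULL);
  /* ...and the device-level one. */
  gtk_signal_connect (GTK_OBJECT (button), "button_press_event",
                      GTK_SIGNAL_FUNC (on_button_press), NULL);

  gtk_widget_show_all (window);
  gtk_main ();
  return 0;
}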

The question of whether we should offer Signals and Events as a
unified mechanism really comes down to: Should we offer
device-dependent information to Signal handlers?  Can we do this in
such a way that the abstract nature of Signals is preserved?  As it
stands, you can still acquire the device event information through a
GDK call to retrieve the last event, though this seems to be somewhat
unreliable to me (e.g., what happens if you are looking at the last
event during a synthesized "click" action?).
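For what it's worth, a sketch of that lookup (I'm assuming the call
in question is gtk_get_current_event(), which returns a copy of the
event currently being dispatched, or NULL if there isn't one - that
NULL case is exactly the synthesized-"click" worry):

/* Variant of a "clicked" Signal handler that tries to recover the
 * device event behind the emission, if there is one. */
static void
on_clicked_inspect (GtkWidget *widget, gpointer data)
{
  GdkEvent *event = gtk_get_current_event ();  /* copy; may be NULL */

  if (event == NULL)
    {
      /* e.g. a synthesized click: no device event to inspect */
      g_print ("clicked, but no current event\n");
      return;
    }

  if (event->type == GDK_BUTTON_PRESS ||
      event->type == GDK_BUTTON_RELEASE)
    g_print ("clicked via mouse button %u\n", event->button.button);

  gdk_event_free (event);
}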

Andrew


