Re: [gtk-list] Re: gtk design question (events vs signals)



>  Ok, I'm new to X programming, so forgive me for any confusion on my
>  part.  I still think this is a valid question from the point of
>  view of considering GTK as an abstract virtual machine offering a
>  particular interface to the programmer.

Bzzzt, buzzword alert :-)

>  As I understand it, Xt harnesses events by creating an event
>  processing loop, processing events in the queue, and associating
>  them with actions and callbacks.  Gdk works in a similar way, no?

Gdk is just a thin wrapper over Xlib.  Gtk creates a main loop using
Glib, then gets events from the X server using Gdk.  Once it gets an
event, it looks up the widget that the event refers to, and emits the
proper signal for that widget.
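
To make that concrete, here is a rough skeleton (GTK+ 1.x-style C;
take it as a sketch, not a complete program).  gtk_main() is the loop
in which Gdk reads events from the X server and Gtk+ emits the
corresponding signals on the widgets they belong to:

    #include <gtk/gtk.h>

    int
    main (int argc, char *argv[])
    {
      GtkWidget *window;

      gtk_init (&argc, &argv);            /* Gdk opens the X display here */

      window = gtk_window_new (GTK_WINDOW_TOPLEVEL);

      /* "destroy" is an object signal; it is emitted when the widget is
       * destroyed, e.g. after the window manager closes the window.
       */
      gtk_signal_connect (GTK_OBJECT (window), "destroy",
                          GTK_SIGNAL_FUNC (gtk_main_quit), NULL);

      gtk_widget_show (window);

      gtk_main ();      /* read events, find the widget, emit its signals */
      return 0;
    }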

>  Yes, I understand this, but why couldn't events be abstracted away
>  behind a uniform signal interface?  Is being notified of a
>  GdkExpose event *that different in underlying concept* to being
>  notified, via signal, that a toggle button has been pressed?
>  They're both "things that happen in user space that we've asked to
>  trigger callbacks in our program."

Huh?  Events *are* presented to the programmer via object signals.
However, X events and Gtk+ object signals are orthogonal to each other:

	- Events are just notifications from the X server to the
          application.

	- Signals are notifications from a Gtk+ object to the
          programmer.  They are used to inform the program that
          something interesting happened to an object.
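
A quick way to see the difference is in the handler signatures.  This
is just a sketch with made-up handler names: an event-signal handler
receives the Gdk event and returns whether it handled it, while a
plain object-signal handler never sees an event at all:

    /* Event signal ("button_press_event"): the handler gets the Gdk
     * event more or less straight from the X server.
     */
    static gint
    on_button_press (GtkWidget *widget, GdkEventButton *event, gpointer data)
    {
      g_print ("press at (%g, %g)\n", event->x, event->y);
      return TRUE;      /* we handled it */
    }

    /* Object signal ("clicked"): no event here; the widget just tells
     * you that something interesting happened to it.
     */
    static void
    on_clicked (GtkWidget *widget, gpointer data)
    {
      g_print ("clicked\n");
    }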

Gtk+ wraps X events (or Gdk events, if you will) using object signals,
for convenience.  Sometimes events are too low level, so the widgets
abstract them further into higher-level signals.  For example,
GtkButton handles the EnterNotify/LeaveNotify/ButtonPress/ButtonRelease
events itself, prelighting and depressing the button as appropriate,
and when a complete click has taken place it emits a higher-level
"clicked" signal.  This signal can also be triggered by a non-mouse
action, such as the user pressing Enter when that button is the
default.

This is nice because the application usually doesn't care about the
sequence of events that trigger some behavior in a widget.  From the
point of view of the application, the button was simply "clicked", and
that is the signal it emits.  It may also have emitted button_press,
button_release, and other event signals in the process, but GtkButton
was nice enough to wrap them for you.
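
In code this means the application normally connects to "clicked"
alone and never bothers with the low-level event signals.  A sketch,
assuming "button" is a GtkButton you created somewhere and on_clicked
is a handler like the one above:

    /* Fires after a full press+release over the button, and also for
     * keyboard activation (Enter on the default button, for instance).
     */
    gtk_signal_connect (GTK_OBJECT (button), "clicked",
                        GTK_SIGNAL_FUNC (on_clicked), NULL);

    /* You could also connect to "button_press_event" and friends, but
     * you rarely need to; GtkButton already digests those for you.
     */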

As for your particular question: yes, widgets do receive exposure
events via the "GtkWidget::expose_event" signal.

  Federico


