Widget Portal? OpenGL widgets?
- From: Juhana Sadeharju <kouhia nic funet fi>
- To: gtk-devel-list gnome org, gtk-app-devel-list gnome org
- Subject: Widget Portal? OpenGL widgets?
- Date: Tue, 13 Jan 2004 18:09:53 +0200
Hello.
The widget portal text below is a repost from gtk-app-devel; I still
want to know whether GTK widgets can be rendered to an arbitrary pixmap.
What changes would have to be made to GTK if it is not capable of
rendering widgets to arbitrary pixmaps?
Such pixmaps, being undisplayed, would not be bound to the window
manager, but that is not a problem.
I also asked whether there are other ways to achieve the same results,
but there may not be another way. If widgets are rendered to the screen,
they are bound to the window manager and to the screen size. An arbitrary
number of virtual screens would help, provided that does not involve X11
re-configuration.
Widget zooming would be possible if I wrote zoomable widgets myself or
discarded GTK completely (e.g., created one drawing area and then
built my own windows and widgets on that DA). I could still use Pango
for drawing text in my own windows and widgets, so not everything would
be lost.
-*-
Now, since the widget portal idea comes from 3D graphics, I would like to
know whether there are plans to support OpenGL widgets in GTK. I have
heard that many games render their widgets with the same 3D graphics
engine that renders the game graphics. Is this true?
How are GTK widgets rendered? If I have a 3D graphics accelerator
card, are the widgets rendered there? If not, what acceleration
does GTK use, if any?
OpenGL widgets in GTK could be implemented in parallel with
non-OpenGL widgets, but OpenGL widgets would have to live on
OpenGL drawing areas. GTK2 seems not to have a GtkGLArea widget; why?
Again, I could code my own OpenGL windows and widgets on an OpenGL
drawing area (in 3D space?), but it would be better if all the GTK
mechanisms were usable inside the drawing area's windows and widgets.
-*-
Summary: Instead of a widget portal (changing existing widgets to
render to arbitrary pixmaps), it would be better to extend the
GTK mechanisms inside the drawing areas and build a library of
new widgets (perhaps OpenGL widgets first).
How would DRI help with this? Does DRI make OpenGL widgets possible
in X11 as easily as any other widgets? Would that be an easier route
to getting OpenGL widgets into GTK?
Do you understand? Are you still with me? ;-)
Regards,
Juhana
== cut ==
Widget Portal
-------------
WP makes possible the following features:
-A flicker-free display for a GUI having hundreds of widgets
-An external zoom operation for the widgets
-An arbitrary warping of the widgets
-A post-processing of the widget graphics
WP is a container associated with a pixmap. The container size
and the pixmap size match.
The widgets contained in the WP are rendered to the associated pixmap.
This pixmap may then be copied, with an arbitrary mapping, to another
pixmap or to the screen. The pixmap can also be processed with
image-processing operations.
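For concreteness, here is a sketch in C of what the "arbitrary mapping" copy could look like. All the types and names (Pixmap, InverseMap, wp_copy_mapped) are my own illustrations, not GTK API; the sketch assumes an inverse mapping (each destination pixel asks where it comes from), which avoids holes in the destination.

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative pixmap: row-major, width*height 32-bit pixels. */
typedef struct {
    int width, height;
    uint32_t *pixels;
} Pixmap;

/* Maps a destination pixel (dx,dy) to a source pixel (*sx,*sy). */
typedef void (*InverseMap)(int dx, int dy, int *sx, int *sy, void *user);

/* Copy the WP pixmap into a destination pixmap through an arbitrary
 * inverse mapping.  Pixels mapped from outside the source stay black. */
void wp_copy_mapped(const Pixmap *src, Pixmap *dst,
                    InverseMap map, void *user)
{
    for (int dy = 0; dy < dst->height; dy++) {
        for (int dx = 0; dx < dst->width; dx++) {
            int sx, sy;
            map(dx, dy, &sx, &sy, user);
            uint32_t v = 0;
            if (sx >= 0 && sx < src->width && sy >= 0 && sy < src->height)
                v = src->pixels[sy * src->width + sx];
            dst->pixels[dy * dst->width + dx] = v;
        }
    }
}

/* Identity mapping: a plain copy, the degenerate case of the warp. */
void identity_map(int dx, int dy, int *sx, int *sy, void *user)
{
    (void)user;
    *sx = dx;
    *sy = dy;
}
```

The same InverseMap signature could carry a water refraction, a zoom, or any other warp; the WP itself would not need to know which.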
The events related to the associated pixmap are forwarded to the
widgets contained in the WP.
Example application 1:
The GUI is seen through water.
-A top level window with a pixmap
-Pointer/mouse callbacks on the pixmap
-Periodic timeout callback
-A WP of the same size as the above pixmap
-A bunch of widgets contained in the WP
The timeout callback animates a water refraction, which is then used
to map the WP pixmap to the pixmap in the top-level window.
The inverse of the refraction is used to map pointer-related
events from the top-level pixmap back to the WP.
Post-processing on the top-level pixmap can be used to add
light reflections.
Example application 2:
A modular synth editor with a zoom. (See Quasimodo and Nord Modular.)
-A top level window with menubar, module lists, etc., and a pixmap
with scrollbars
-Multiple dynamically instantiated invisible module GUIs
-One WP per module GUI
The pixmap with scrollbars works as a canvas where modules are grabbed
and dragged. Each WP's geometry is taken from the module database. Because
WPs are containers, each module GUI will pack to the correct size.
The WP pixmaps are mapped onto the canvas pixmap based on the module
locations and on the pan and zoom values (zoom out is the preferred
zoom operation here). Pointer events are mapped from the canvas pixmap
back to the corresponding WP.
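The pan-and-zoom part of that mapping is simple affine arithmetic; a sketch, with all struct and function names invented for illustration (zoom < 1.0 being the zoom-out case preferred above):

```c
/* Hypothetical view state for the canvas. */
typedef struct {
    double pan_x, pan_y;   /* canvas-space offset of the view */
    double zoom;           /* visible pixels per canvas pixel */
} View;

/* Hypothetical module placement, taken from the module database. */
typedef struct {
    double x, y;           /* module origin in canvas space */
} Module;

/* Where a WP-local point (wx,wy) of a module lands on the visible canvas. */
void canvas_from_wp(const View *v, const Module *m,
                    double wx, double wy, double *cx, double *cy)
{
    *cx = (m->x + wx - v->pan_x) * v->zoom;
    *cy = (m->y + wy - v->pan_y) * v->zoom;
}

/* Inverse: a pointer event on the canvas mapped back to WP-local
 * coordinates, so the module GUI sees ordinary widget coordinates. */
void wp_from_canvas(const View *v, const Module *m,
                    double cx, double cy, double *wx, double *wy)
{
    *wx = cx / v->zoom + v->pan_x - m->x;
    *wy = cy / v->zoom + v->pan_y - m->y;
}
```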
Example application 3:
A wrapper GUI for the visually impaired.
-A top level window with a pixmap and a WP
-An application with a GUI
The application's top-level window is replaced with the WP.
Pan and zoom are implemented in the mapping between the WP pixmap
and the visible pixmap. The colors can also be modified: remapped,
reduced in number, or increased in contrast.
Implementation questions:
How can the WP widgets be rendered to a pixmap instead of the screen?
How does GTK render widgets now?
I would not like a solution where one fools X11 into believing
that there are multiple screens and the widgets are visible
on a fake screen which is then copied. One reason is that screen
space runs out sooner than arbitrary pixmap space.
If an internal control signal moves a scale widget (say), the widget
should move even when it is not on the screen. The WP pixmap is the
widgets' "screen". The widgets should not know about this difference,
i.e., the WP should be transparent to the widgets.
How should events be delivered to the WP? Events like "mouse press at
point (x,y)" should be delivered to the WP, with (x,y) relative to the
WP container.
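A minimal sketch of that delivery step, with invented names: each WP occupies a rectangle on the visible pixmap, and the dispatcher finds the topmost WP under the press and translates the point into WP-local coordinates. A real implementation would instead walk the GTK container tree.

```c
/* Hypothetical WP placement on the visible pixmap. */
typedef struct {
    int x, y, width, height;
} WPRect;

/* Return the index of the WP that should receive a press at (px,py),
 * writing WP-local coordinates to (*lx,*ly); return -1 if no WP is hit.
 * Later entries in the array are treated as stacked on top. */
int wp_deliver(const WPRect *wps, int n, int px, int py, int *lx, int *ly)
{
    for (int i = n - 1; i >= 0; i--) {
        if (px >= wps[i].x && px < wps[i].x + wps[i].width &&
            py >= wps[i].y && py < wps[i].y + wps[i].height) {
            *lx = px - wps[i].x;
            *ly = py - wps[i].y;
            return i;
        }
    }
    return -1;
}
```

When the WP pixmap is shown through a warp (water, zoom), the event coordinates would first be pulled back through the warp's inverse, then passed through this hit test.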
== end ==