[Nautilus-list] Re: Feature request: mouse gestures in Nautilus



Hi, Simos, et al.  My two bits on your questions below are as follows:

>	There are decisions to be made on where the mouse gesture support
>should be put.
>	a. How much should be abstracted? Should it be only for pointing
>	devices like mice? Or for both mice and pens?

I see no reason not to make it available to any pointing device.

>	b. Should there be an X extension or is there one that can
>	accommodate the mouse gesture functionality?

I'm not sure that I'm qualified to answer this authoritatively, but my 
inclination is to have the stroke events sent to the X clients as a message 
containing the raw stroke code, and to add convenience functions to the GNOME 
libraries (or a CORBA service?) that would manage the mappings of stroke codes 
to discrete strokes, and perhaps to keymappings.
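
To make that a bit more concrete, here is roughly the kind of convenience layer 
I mean.  The binding table, function names, and specific sequences are all made 
up for illustration; the only real assumption is that the raw stroke code is a 
digit string over LibStroke's 3x3 grid:

/* Hypothetical convenience layer: map raw stroke codes (digit
 * sequences over a 3x3 grid, the encoding LibStroke produces) to
 * named actions or key mappings.  Names and bindings are invented. */
#include <stdio.h>
#include <string.h>

struct stroke_binding {
	const char *sequence;   /* raw stroke code, e.g. "654" = right-to-left */
	const char *action;     /* discrete action (or a key mapping) */
};

static const struct stroke_binding bindings[] = {
	{ "654", "history-back" },      /* leftward stroke,  e.g. Alt+Left  */
	{ "456", "history-forward" },   /* rightward stroke, e.g. Alt+Right */
	{ "258", "close-window" },      /* downward stroke */
	{ NULL,  NULL }
};

/* Return the action bound to a raw stroke code, or NULL if unbound. */
static const char *stroke_lookup(const char *sequence)
{
	const struct stroke_binding *b;

	for (b = bindings; b->sequence != NULL; b++)
		if (strcmp(b->sequence, sequence) == 0)
			return b->action;
	return NULL;
}

int main(void)
{
	const char *action = stroke_lookup("654");

	printf("stroke 654 -> %s\n", action ? action : "(unbound)");
	return 0;
}

An application (or the GNOME libraries on its behalf) would just feed the code 
it got from the X message into something like stroke_lookup() and dispatch on 
the result.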

>	c. Should the mouse gesture core be put in the X server, the
>	graphical environment (like GNOME or KDE), the Window manager
>	(like sawfish), or at an even higher level, in applications like
>	the Sensiva solution?

The window manager is a quick and easy hack, but only works for window manager 
functions (or key mappings if it sends the events to a client window, I 
suppose).

I think the right place for it is in the window manager.  Send the events to 
whichever window the stroke was started on.  (Or catch some strokes in the 
window manager itself.)  I hacked stroke support into fvwm2 a long time ago and 
it only took an hour or so.  (It was even hooked into the config file parser, 
etc., so it was a really clean addition.)
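
To illustrate the "send it to the client window" idea, something like the 
following is what I had in mind.  The _STROKE atom name and the payload layout 
are invented for this sketch; only the Xlib calls themselves are real:

/* Sketch: deliver a recognized stroke to the window the stroke was
 * started on, as an X ClientMessage.  The "_STROKE" atom name and
 * the data layout are made up for illustration. */
#include <X11/Xlib.h>
#include <string.h>

static void send_stroke(Display *dpy, Window target, const char *sequence)
{
	XClientMessageEvent ev;

	memset(&ev, 0, sizeof(ev));
	ev.type = ClientMessage;
	ev.window = target;
	ev.message_type = XInternAtom(dpy, "_STROKE", False);
	ev.format = 8;                                   /* byte data */
	strncpy(ev.data.b, sequence, sizeof(ev.data.b) - 1);

	XSendEvent(dpy, target, False, NoEventMask, (XEvent *) &ev);
	XFlush(dpy);
}

int main(void)
{
	Display *dpy = XOpenDisplay(NULL);
	Window focus;
	int revert;

	if (dpy == NULL)
		return 1;
	/* Stand-in for "the window the stroke was started on". */
	XGetInputFocus(dpy, &focus, &revert);
	if (focus != None && focus != PointerRoot)
		send_stroke(dpy, focus, "654");
	XCloseDisplay(dpy);
	return 0;
}

A client that understood the convention would then map the sequence to an 
action (or fall back to a key mapping); anything else would just ignore the 
message.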

On the recognition ability of LibStroke, I am confident that it can reliably 
recognize all of the strokes that the CAD tools do.  I am not sure how it 
compares to the accuracy of, say, Palm's recognizer.  It's certainly accurate 
enough for general GNOME use.

Regarding the utility of stroke interfaces, I'm sure it will be an important 
feature in windowing systems in the future, and am a little surprised that it 
isn't already.  ;-)

My co-maintainer (Dan Nicolaescu) and I (mostly Dan!) have made good progress 
in making LibStroke GNOME-aware, so that work is already started.

Mike (of wayV fame), have you any thoughts on the right way to integrate 
strokes into GNOME apps?  wayV has a great feature that temporarily draws the 
stroke on the screen as you're entering it.  That should surely 
be a feature of whichever layer is capturing the stroke.
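
For whoever ends up implementing that: the usual X trick is an XOR GC on the 
root window, so drawing the same segments a second time erases them.  A rough 
sketch (the function names are mine, the Xlib calls are standard):

/* Sketch of "draw the stroke as you enter it": XOR line segments on
 * the root window, so redrawing the same segments erases the trail. */
#include <X11/Xlib.h>
#include <unistd.h>

static GC make_feedback_gc(Display *dpy, Window root)
{
	XGCValues gcv;

	gcv.function = GXxor;
	gcv.foreground = WhitePixel(dpy, DefaultScreen(dpy));
	gcv.subwindow_mode = IncludeInferiors;  /* draw over client windows too */
	return XCreateGC(dpy, root,
	                 GCFunction | GCForeground | GCSubwindowMode, &gcv);
}

/* Call with each pair of consecutive pointer positions while the
 * stroke is in progress; call again with the same pairs to erase. */
static void draw_segment(Display *dpy, Window root, GC gc,
                         int x1, int y1, int x2, int y2)
{
	XDrawLine(dpy, root, gc, x1, y1, x2, y2);
	XFlush(dpy);
}

int main(void)
{
	Display *dpy = XOpenDisplay(NULL);
	Window root;
	GC gc;

	if (dpy == NULL)
		return 1;
	root = DefaultRootWindow(dpy);
	gc = make_feedback_gc(dpy, root);

	draw_segment(dpy, root, gc, 100, 100, 300, 200);  /* draw */
	sleep(1);
	draw_segment(dpy, root, gc, 100, 100, 300, 200);  /* erase */

	XFreeGC(dpy, gc);
	XCloseDisplay(dpy);
	return 0;
}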

The LibStroke philosophy is to make the library portable and independent of 
the environment (X11, GNOME, embedded, whichever) but to include lots of 
support functions for various environments, plus example code, so that 
developers can make strokes available to their applications.  In the case of 
GNOME, which I'm a big fan of, I think making a complete support package and 
integrating it in a clean way would be warranted -- that way a large pool of 
developers can use strokes through the GNOME framework they're already used 
to, without having to do any low-level hacking.
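
To give an idea of how little glue the application side needs, here's a rough 
GTK sketch of the capture loop.  I'm assuming the stroke_init() / 
stroke_record() / stroke_trans() entry points from stroke.h (check the header 
for the exact signatures); the button choice, buffer size, and everything else 
are arbitrary:

/* Rough sketch: record pointer motion while button 3 is held, then
 * translate it with LibStroke on release.  Assumes the stroke_init /
 * stroke_record / stroke_trans entry points from stroke.h. */
#include <gtk/gtk.h>
#include <stroke.h>

static gboolean recording = FALSE;

static gint on_press(GtkWidget *widget, GdkEventButton *event, gpointer data)
{
	if (event->button == 3)
		recording = TRUE;
	return TRUE;
}

static gint on_motion(GtkWidget *widget, GdkEventMotion *event, gpointer data)
{
	if (recording)
		stroke_record((int) event->x, (int) event->y);
	return TRUE;
}

static gint on_release(GtkWidget *widget, GdkEventButton *event, gpointer data)
{
	char sequence[64];

	if (event->button == 3 && recording) {
		recording = FALSE;
		if (stroke_trans(sequence))
			g_print("stroke: %s\n", sequence);  /* hand off to the action map */
	}
	return TRUE;
}

int main(int argc, char *argv[])
{
	GtkWidget *window;

	gtk_init(&argc, &argv);
	stroke_init();

	window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
	gtk_widget_add_events(window, GDK_BUTTON_PRESS_MASK |
	                      GDK_BUTTON_RELEASE_MASK | GDK_POINTER_MOTION_MASK);
	gtk_signal_connect(GTK_OBJECT(window), "button_press_event",
	                   GTK_SIGNAL_FUNC(on_press), NULL);
	gtk_signal_connect(GTK_OBJECT(window), "motion_notify_event",
	                   GTK_SIGNAL_FUNC(on_motion), NULL);
	gtk_signal_connect(GTK_OBJECT(window), "button_release_event",
	                   GTK_SIGNAL_FUNC(on_release), NULL);
	gtk_signal_connect(GTK_OBJECT(window), "destroy",
	                   GTK_SIGNAL_FUNC(gtk_main_quit), NULL);

	gtk_widget_show_all(window);
	gtk_main();
	return 0;
}

The point of a proper GNOME support package would be that applications never 
write even this much -- they'd get a signal or callback with the translated 
stroke and just bind actions to it.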

Mark




In message <Pine LNX 4 33 0105080615080 12376-100000 pc96 ma rhbnc ac uk>, Simos 
Xenitellis writes:
>
>Dear All,
>	This is an initial e-mail about mouse gestures,
>or in the general view, a way to associate input device events
>to specific actions.
>	For example, if someone is browsing using Mozilla,
>she may use the "secondary" mouse button to click and drag
>an imaginary line (or "stroke") to the left in order to
>go to the previous page in the history.
>	Programmatically, this associates the sequence of
>events:
>	. secondary button clicked (get position)
>	. mouse move (record positions while moving!)
>	. secondary button released (get position)
>
>to the History Back action (Alt-Left arrow, for example).
>
>Using the position values, an application can determine the
>direction of the mouse.
>
>	This functionality is quite common in PDAs
>and, strangely enough, in CAD products.
>
>	It has become very popular after a couple of postings
>on Slashdot, one discussing the new version of the Opera
>WWW browser that has "mouse gesture" support,
>and another mentioning a company called "Sensiva"
>(www.sensiva.com) that has Win/Mac and Linux application software
>to enable gesture support.
>
>	Reading the comments on mouse gestures, there were two
>groups of people: those who adored them and those who disliked
>them. The negative reactions could be due to user-interface
>unfamiliarity.
>
>	Thus, one could use the Sensiva Linux client and configure
>it to be used with Linux X applications.
>
>	Apart from the Sensiva client, there are several
>open-source initiatives such as:
>
>	http://wayv.sourceforge.net 	(Mike Bennett)
>	http://www.etla.net/libstroke/ (Mark Willey)
>	http://www.handhelds.org/projects/xscribble.html (HandHelds.org)
>
>This "mouse gesture" functionality looks to be a required feature in
>embedded versions of GNOME and a highly desired one in GNOME in general.
>
>	There are decisions to be made on where the mouse gesture support
>should be put.
>	a. How much should be abstracted? Should it be only for pointing
>	devices like mice? Or for both mice and pens?
>	b. Should there be an X extension or is there one that can
>	accommodate the mouse gesture functionality?
>	c. Should the mouse gesture core be put in the X server, the
>	graphical environment (like GNOME or KDE), the Window manager
>	(like sawfish), or at an even higher level, in applications like
>	the Sensiva solution?
>
>	Since Nautilus is enhancing the user experience quite
>dramatically, there might be a need to add such support for mouse gestures
>in a clean way. Mouse gestures constitute a big enhancement to the UI.
>
>	Mouse gestures do not appear to be present in either Windows
>or the Mac, and additionally they appear not to be patented.
>
>I would be happy if I managed to stir up a conversation on mouse gestures.
>
>Thanks,
>simos
>
>






