Re: set_encoding() in VncDisplay

On Tue, Sep 01, 2009 at 01:52:05PM -0300, Jonh Wendell wrote:
> > I'm not really understanding the point of this new function, adding
> > two tunables
> > 
> >   VNC_DISPLAY_ENCODING_SPEED
> >   VNC_DISPLAY_ENCODING_QUALITY
> > 
> > When 'SPEED' is set, it is just adding the ZRLE encoding, swapping
> > the order of RRE/COPYRECT, and then duplicating the existing tunable
> > for allowing JPEG or not. ZRLE, RRE and COPYRECT are all lossless
> > encodings, so if one ordering is faster it should be used all the
> > time; there is no quality issue involved in that choice.
> 
> My idea is: slow (low-power CPU) machines accessing servers on a local
> network could use the 'QUALITY' flag, thus avoiding zlib operations.

Ah, so it's not really a speed/quality tradeoff. It is really just about
shifting CPU overhead between the client and the server. I'm not sure
what we should call this idea, though.

Maybe something like

 VNC_DISPLAY_OPTIMIZE_CLIENT_CPU
 VNC_DISPLAY_OPTIMIZE_SERVER_CPU

and then later adding 

 VNC_DISPLAY_OPTIMIZE_NETWORK

(which would do vncviewer-like optimization)
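
To make that concrete, here is a rough sketch of how the knob might look
in vncdisplay.h -- none of this exists today, the names are just the ones
suggested above, and the exact entry point / signature is still up for
discussion:

  typedef enum {
      /* Prefer encodings that are cheap for the client to decode */
      VNC_DISPLAY_OPTIMIZE_CLIENT_CPU,
      /* Prefer encodings that are cheap for the server to produce */
      VNC_DISPLAY_OPTIMIZE_SERVER_CPU,
      /* Later: prefer compact encodings, vncviewer-style */
      VNC_DISPLAY_OPTIMIZE_NETWORK
  } VncDisplayOptimization;

  /* Hypothetical setter - name and signature still to be decided */
  void vnc_display_set_optimization(VncDisplay *display,
                                    VncDisplayOptimization mode);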

> Also, my idea is to implement the 'AUTO' flag, just like vncviewer does,
> based on network throughput. Most of the time applications will want to
> use the AUTO flag, but 'advanced' users might want to tune that setting.

Doing auto-selection as per vncviewer is really not a good idea. The
way tightvnc does it is inherently flawed at the VNC protocol level,
since it has a designed-in race condition: once the client has asked
for a different pixel format, updates the server queued beforehand can
still arrive encoded in the old format, and the protocol gives the
client no way to tell which format a given update actually uses. This
is why you'll often see vncviewer crash when it changes colour depth -
this is particularly bad when talking to QEMU's VNC server, which
exposes the design flaw more readily.

Auto-selection is a nice idea, but we'd quite likely need to do a new
VNC extension to have it work at all reliably. There might be an
existing one that we've not implemented yet, but it'll need
investigating...
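
For anyone unfamiliar with the race, here is a minimal sketch of the
problematic sequence at the RFB level (the helper names are invented for
illustration; this is not gtk-vnc code):

  #include <stdint.h>

  typedef struct {
      uint8_t bits_per_pixel;
      uint8_t depth;
      /* ... remaining RFB PIXEL_FORMAT fields ... */
  } PixelFormat;

  /* Hypothetical wrappers around the raw RFB messages */
  static void send_set_pixel_format(int sock, const PixelFormat *fmt)
  { (void)sock; (void)fmt; /* would write a SetPixelFormat message */ }
  static void read_framebuffer_update(int sock, const PixelFormat *fmt)
  { (void)sock; (void)fmt; /* would parse an update using 'fmt' */ }

  static void switch_colour_depth(int sock, const PixelFormat *new_fmt)
  {
      send_set_pixel_format(sock, new_fmt);

      /* The race: an update the server queued *before* it processed
       * our SetPixelFormat can still arrive now, encoded with the old
       * format, and nothing in the protocol tells us which format a
       * given update was encoded with.  Decoding it against new_fmt
       * mis-sizes every rectangle - hence the crashes on colour depth
       * changes mentioned above. */
      read_framebuffer_update(sock, new_fmt);
  }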

> > If that ordering is determined, then having a set_encoding method is
> > redundant, since we already have a set_lossy method for turning on/off
> > the JPEG encodings, which do impact quality.
> 
> I think that even when choosing the 'SPEED' setting we should not use
> Tight JPEG by default. So this function should still be used, just as
> it is used currently.

I still think that, whether we optimize for client or server CPU
utilization, either mode should allow JPEG to be turned on/off with the
existing flag.
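
For reference, the existing knob would keep working unchanged alongside
either optimize mode. Used roughly like this today (from memory - check
vncdisplay.h for the exact name, but I believe it is the lossy-encoding
setter):

  #include <gtk/gtk.h>
  #include <vncdisplay.h>

  int main(int argc, char **argv)
  {
      GtkWidget *window, *vnc;

      gtk_init(&argc, &argv);

      window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
      vnc = vnc_display_new();
      gtk_container_add(GTK_CONTAINER(window), vnc);

      /* Existing tunable: allow (or forbid) lossy Tight JPEG data,
       * independently of any new optimize mode. */
      vnc_display_set_lossy_encoding(VNC_DISPLAY(vnc), TRUE);

      vnc_display_open_host(VNC_DISPLAY(vnc), "localhost", "5900");

      gtk_widget_show_all(window);
      gtk_main();
      return 0;
  }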


Regards,
Daniel
-- 
|: http://berrange.com/     -o-    http://www.flickr.com/photos/dberrange/ :|
|: http://libvirt.org  -o-  http://virt-manager.org  -o-  http://ovirt.org :|
|: http://autobuild.org       -o-         http://search.cpan.org/~danberr/ :|
|: http://freshmeat.net/~danielpb/    -o-   http://gtk-vnc.sourceforge.net :|

