Re: row and column size not being preserved when switching X displays
- From: sllewbj blueyonder co uk (Joe Wells (reverse mailbox letters only for non-public replies))
- To: Alan Horkan <horkana maths tcd ie>
- Cc: Gnumeric Dev List <gnumeric-list gnome org>
- Subject: Re: row and column size not being preserved when switching X displays
- Date: 05 Nov 2004 21:36:42 +0000
Alan Horkan <horkana maths tcd ie> writes:
> > For what it's worth, the Gimp "first run" installer asks for a DPI,
> > offers a way to determine it (measure the on-screen rulers it draws
> > and input that length), and allows you to trust the OS (even on
> > Windows).  On Windows, at least, it appears to get values that look
> > pretty good.
"First run" installers are a disaster, far worse than extra options in the
preferences dialog and in some cases you need to put them in the
Preferences dialog anyway to allow for later configuration.
A "first run" installer would certainly not solve the problem I am
reporting, which is that Gnumeric documents that work on one X display
fail when Gnumeric is run on the same machine but using a different X
display. You certainly can not expect two different X displays to
have the same dpi values. That is why the X server should be queried
for what it thinks its dpi values are.
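(Querying the server's idea of its resolution is nearly a one-liner in
Xlib, by the way.  Here is an untested sketch, which just converts the
pixel and millimetre sizes the server advertises for the default screen
into dots per inch:

    #include <stdio.h>
    #include <X11/Xlib.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);   /* honours $DISPLAY */
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }
        int scr = DefaultScreen(dpy);
        /* 25.4 mm per inch; the server advertises the screen size
           in both pixels and millimetres */
        double xdpi = 25.4 * DisplayWidth(dpy, scr)  / DisplayWidthMM(dpy, scr);
        double ydpi = 25.4 * DisplayHeight(dpy, scr) / DisplayHeightMM(dpy, scr);
        printf("server claims %.1f x %.1f dpi\n", xdpi, ydpi);
        XCloseDisplay(dpy);
        return 0;
    }

You can see the same numbers without writing any code by running
"xdpyinfo | grep resolution" against each display.)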
> Hopefully a smart and almost completely automatic method can be
> implemented by taking the DPI provided by the X server as a
> 'recommendation' (but without trusting it so much as to get caught
> out by disinformation).
Some library used by Gnumeric already *is* trusting this dpi value!
So unless you have some way to stop that library from using the X
server's reported dpi, the main body of Gnumeric must also trust it,
to keep the sizes of the various displayed things consistent.
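(For concreteness, I take it Alan's 'recommendation' approach would
amount to a sanity clamp along these lines; the 50-300 range and the
96 dpi fallback here are my own guesses, not anything Gnumeric does:

    /* Hypothetical clamp: accept the server's reported dpi only if
       it is plausible for a real monitor. */
    static double sanitize_dpi (double reported)
    {
        /* Real monitors fall roughly in the 50-300 dpi range;
           anything outside it is treated as a misconfigured
           server and replaced by the common 96 dpi default. */
        if (reported < 50.0 || reported > 300.0)
            return 96.0;
        return reported;
    }

But unless the font-rendering library applies exactly the same clamp,
Gnumeric's row and column geometry and its font sizes would disagree,
which is the problem all over again.)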
--
Joe