Re: row and column size not being preserved when switching X displays

Jody Goldberg <jody gnome org> writes:

> On Sat, Oct 30, 2004 at 11:45:42PM +0100, Joe Wells wrote:
> > I recently switched to a new computer with a display that has 141
> > pixels per inch.  When I start the X server on my new computer, I
> > inform it that it has 141 dpi.  When I run Gnumeric 1.2.12 (Gentoo
> > ebuild) on the new computer displaying on its X server, my old
> > documents get their row and column sizes screwed up.  When I run
> > Gnumeric 1.2.12 on the new computer but instead displaying on the X
> > server of my old computer (which thinks it is 75 dpi), my documents
> > look the way they should.
> >
> > Using both X displays, the rows and columns get the same number of
> > _pixels_.  For example, I have a row in which I have 8 point text, and
> > Gnumeric reports the height of the row as 9.75 points (13 pixels) on
> > both X servers.  However, on the new X server, 13 pixels is way too
> > small to display the text, and everything gets displayed as ####.  On
> > the old X server, 13 pixels is a good size.  When I print to
> > PostScript, 8 point text looks exactly the same size.  So it is the
> > row/column size that is changing.
> >
> > I looked in the source code, and it appears that while it is running,
> > Gnumeric keeps track of the row and column sizes in pixels.  However,
> > in the Gnumeric XML document format, it appears that Gnumeric keeps
> > track of row and column sizes in points.
> >
> > How can I get this fixed?  If I knew which source code lines to hack
> > on, I could do it, but I am way too busy to learn now.  I am happy to
> > test patches that anyone sends me.
>
> Gnumeric stores things as pts.  However, it defaults to using 96 dpi
> for the display, which is what XL appears to do.  You can edit the
> settings in the prefs dialog.  When the code was written, most X
> servers got the dpi measurements wrong; we should probably add a
> setting to trust them now.


Thanks very much for answering!

There's an important bit of information in my report that I failed to
emphasize, and I'm not sure it was obvious to everyone why it matters.
Here is the important bit of information:

  *PART* of Gnumeric *ALREADY IS* paying attention to the dpi reported
  by the X server, and another part is not.

I am guessing the part that is paying attention is one of the GNOME
libraries that deals with font issues.

When some library used by Gnumeric adjusts the size of things based on
the dpi reported by the X server, and the main body of Gnumeric either
takes the default value of 96 or the fixed value recorded somewhere
under ~/.gconf, then things are pretty much guaranteed to go wacky,
because the two values will differ.  Furthermore, the ratio between
the two values can change when one runs Gnumeric from the same machine
but using a different X display.  This is the problem I am reporting.
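To make the mismatch concrete, here is the arithmetic as I understand
it, sketched in plain Python (this is an illustration, not Gnumeric
source code), using the numbers from my report:

```python
# Sketch of the units mismatch (illustration only, not Gnumeric source).
# Conversion: 1 point = 1/72 inch, so pixels = points * dpi / 72.

def points_to_pixels(points, dpi):
    """Convert a size in points to whole pixels at the given dpi."""
    return round(points * dpi / 72.0)

ROW_HEIGHT_PT = 9.75  # the row height Gnumeric reports for my 8pt text
FONT_SIZE_PT = 8.0

# Gnumeric sizes the row assuming its fixed 96 dpi:
row_px = points_to_pixels(ROW_HEIGHT_PT, 96)       # -> 13 pixels

# ...but the font library renders the 8pt text at the server's real dpi:
text_px_old = points_to_pixels(FONT_SIZE_PT, 75)   # -> 8 pixels: fits in 13
text_px_new = points_to_pixels(FONT_SIZE_PT, 141)  # -> 16 pixels: overflows
                                                   #    the 13px row, so ####
```

The row stays 13 pixels on both displays because it is computed from
the fixed dpi, while the text grows with the server's real dpi, which
is exactly the behaviour I observe.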

Currently what this means is that I have to invoke some command (I
forget the details) to edit the GConf registry every time I *start*
Gnumeric to make sure the dpi value stored in GConf matches the X
server's dpi.  If I want to run two copies of Gnumeric simultaneously
on the same machine, but using two different displays, I have to edit
the GConf registry after starting the first copy but before starting
the second.
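For the record, the workaround is something along these lines (the
exact GConf key names here are a guess from memory, so treat them as an
assumption; `gconftool-2 -R /apps/gnumeric` should show the real ones):

```shell
# HYPOTHETICAL sketch -- key paths recalled from memory, not verified.
# Set Gnumeric's stored dpi to match the X server before starting it.
DPI=141
gconftool-2 --set --type float /apps/gnumeric/core/gui/screen/horizontaldpi "$DPI"
gconftool-2 --set --type float /apps/gnumeric/core/gui/screen/verticaldpi "$DPI"
```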

Can't Gnumeric just use the same code used by the other GNOME library
(whichever one Gnumeric is using that is noticing the X server's
reported dpi)?  Is there no standard GNOME way for this to work?  Or
is each GNOME library and application going to figure out the dpi to
use in a way that is not guaranteed to make them all match?
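For what it's worth, my understanding is that the dpi a server reports
is normally derived from the screen's pixel and millimetre dimensions
(the quantities GDK exposes as gdk_screen_width() and
gdk_screen_width_mm()).  A sketch of that arithmetic in plain Python,
with made-up screen dimensions chosen to match my display:

```python
# Sketch of how an X server's dpi is typically derived from screen
# geometry.  The dimensions below are made-up example values, not
# queried from a real X server.

MM_PER_INCH = 25.4

def screen_dpi(width_px, width_mm):
    """dpi = pixels / inches, where inches = millimetres / 25.4."""
    return width_px / (width_mm / MM_PER_INCH)

# A display advertising 1600 px across 288 mm works out to ~141 dpi:
print(round(screen_dpi(1600, 288)))  # -> 141
```

If Gnumeric computed its dpi this way (or asked the same library the
font code asks), the two values could never drift apart.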

As a completely separate question, why does this affect printing at
all?  How can printing possibly be dependent on the dpi of the X
server?
