Differences in displayed font sizes



I have a small program that displays loaded text in a GtkTextView,
using tags to change font size. The font size is set by code like:
    tag = gtk_text_buffer_create_tag (buffer, NULL,
                                      "size", 10 * PANGO_SCALE,
                                      NULL);
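
In case the surrounding code matters, here is a condensed, illustrative
version of what the program does with the tag; apart from the
gtk_text_buffer_create_tag call above, the names and text are only
placeholders:

    #include <gtk/gtk.h>

    static void
    fill_view (GtkTextView *view)
    {
        GtkTextBuffer *buffer = gtk_text_view_get_buffer (view);
        GtkTextTag *tag;
        GtkTextIter start, end;

        /* "size" is in Pango units, i.e. points * PANGO_SCALE,
           so this requests a 10 pt font. */
        tag = gtk_text_buffer_create_tag (buffer, NULL,
                                          "size", 10 * PANGO_SCALE,
                                          NULL);

        gtk_text_buffer_set_text (buffer, "Some loaded text", -1);
        gtk_text_buffer_get_bounds (buffer, &start, &end);
        gtk_text_buffer_apply_tag (buffer, tag, &start, &end);
    }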

I am using GTK+ 2.10.
When I compile under Dev C++ and run locally on Windows XP, the size
looks fine.

When I compile on RedHat Enterprise Linux and run it remotely, displaying
via my local X server, the size is several points smaller.

When I run the same Linux-compiled executable on my local Fedora Linux
box, the size is the same as when it is run remotely and displayed on
the local X server.

When I run the executable on the remote RedHat Linux box, sending output
to my local Fedora box, the size is the same as when it is run locally
on the Linux box.

To summarize, the only time the size looks "correct" is when it is run
locally on WinXP.  All other times, it is smaller.  The windows are the
exact same size, so it is not a question of logical pixel dimensions
being miscalculated.

The question is: is there some initialization necessary to adjust for
the underlying font rendering engine? I do not yet understand how fonts
are rendered on Win32 versus X11.
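
If it is relevant, one per-screen value I can query is the logical
resolution GDK reports, which, as I understand it, Pango uses to
convert point sizes to pixels; I have not verified that this is the
culprit. A minimal check with the GTK+ 2.10 API would be something
like:

    #include <gtk/gtk.h>

    int
    main (int argc, char *argv[])
    {
        gdouble dpi;

        gtk_init (&argc, &argv);

        /* Logical resolution used to map point sizes to pixels;
           returns -1 if no resolution has been set for the screen. */
        dpi = gdk_screen_get_resolution (gdk_screen_get_default ());
        g_print ("logical resolution: %g dpi\n", dpi);

        return 0;
    }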

Stewart


