Information re. display gamma



Hello,

While translating gcm, I spotted the following:

--
Gamma is adjusted to affect the color of the attached monitor.
Traditionally Linux has used a gamma value of 1.0, but this makes
monitors look washed out compared Windows XP or OS X. Apple
traditionally used a value of 1.8 for a long time, but now use the same
value as Microsoft. Microsoft has always used a value of 2.2.
--

Where does the information that Linux has used a gamma of 1.0 come from?


I would assume it is a misinterpretation of the technical data.

Both the Mac and Windows tools report the final viewing gamma to the user
(which is the combination of the gamma adjustment in the graphics card's
LUT and the monitor's native gamma). These tools expect the monitor's
native gamma to be 2.2 (which is close to sRGB). In the past, Mac
monitors had a native gamma of 1.8.

Linux tools - like xgamma and gcm - instead report the number that is
applied to the LUT only. Note that the final viewing gamma is equal to
the monitor's native gamma divided by the LUT gamma.
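To make the arithmetic concrete, here is a minimal Python sketch of that
relationship. The function names are mine, and the 2.2/1.8 figures are just
the values quoted above, nothing from gcm itself:

    def viewing_gamma(native_gamma, lut_gamma):
        # Final viewing gamma = monitor's native gamma / gamma applied in the LUT
        return native_gamma / lut_gamma

    def lut_gamma_for(native_gamma, target_viewing_gamma):
        # LUT gamma needed to reach a desired viewing gamma on a given monitor
        return native_gamma / target_viewing_gamma

    # Linux default: LUT gamma 1.0 on a typical 2.2 monitor
    print(viewing_gamma(2.2, 1.0))     # 2.2 -- the number the Mac/Win tools report

    # LUT gamma a user would enter to emulate the old Mac look of 1.8
    print(lut_gamma_for(2.2, 1.8))     # ~1.22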

In other words, an image (untagged by an ICC profile) displayed in Windows
and Linux will appear the same, provided that the same video card and
monitor are used and no colour management/calibration is applied. This is
also the reason why display ICC profiles created in Windows can be used
in Linux.
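For illustration, this is roughly what a LUT-only adjustment such as
xgamma's amounts to: a ramp of output = input ** (1 / lut_gamma) loaded
into the video card. This is only a sketch under my own assumptions (256
entries, 16-bit values); the real ramp size and rounding depend on the
driver:

    def gamma_ramp(lut_gamma, size=256, max_value=65535):
        # Build a video-card gamma ramp: output = input ** (1 / lut_gamma).
        # With lut_gamma = 1.0 the ramp is a straight line, so the monitor's
        # native response passes through unchanged -- the Linux default above.
        ramp = []
        for i in range(size):
            x = i / (size - 1)
            ramp.append(round((x ** (1.0 / lut_gamma)) * max_value))
        return ramp

    identity = gamma_ramp(1.0)    # no correction: viewing gamma == native gamma
    brighter = gamma_ramp(1.22)   # lifts mid-tones, lowering the viewing gamma

The monitor then applies its native gamma on top of this ramp, which is why
the viewing gamma works out to native gamma divided by LUT gamma.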

If the information quoted above remains, users will be confused and will
try to set a gamma of 2.2 in gcm.

I would propose changing the text so that it is clear that the user is
correcting the monitor's native gamma and that the value is not the final
viewing gamma.

Or am I wrong?

Milan Knizek
knizek (dot) confy (at) volny (dot) cz
http://www.milan-knizek.net - About linux and photography (Czech
language only)


