Date: Mon, 19 Dec 2011 08:52:39 +1030
From: 00ai99 gmail com
To: stecnico0506 hotmail com
CC: gimp-developer-list gnome org
Subject: Re: [Gimp-developer] Luminosity in LAB does not agree with Wikipedia or Matlab
What you have not quite said, Richard, is that this adjustment is the one you would need in order to convert linear RGB to standard sRGB -- in other words, the output of GIMP's decompose is not gamma corrected. IMO this is correct, since gamma-correcting would introduce inaccuracies and further reduce precision (which in this case is already limited to 1/256). The Wikipedia images are based on converting the L channel to RGB while treating the a* and b* channels as zero. This produces an image that *looks* accurate but is mathematically inaccurate.
That is the part I was not sure about, but I erred on the side of avoiding speculation and chose simply to record observations as I came across them.
Out of curiosity I looked through the C source for the decompose plugin and noticed that the LAB decomposition actually applies a cube root (and an offset) to its input values, which is why a linear input gradient produces a nonlinear result.
But, as the original poster stated, this behavior is incompatible with the results produced by Adobe Photoshop or Matlab in their LAB color modes. Wikipedia states that "uniform changes in L*a*b* components should correspond to uniform changes in perceived color" -- uniform in perception, not in absolute intensity. IMO, if you take a greyscale gradient that looks linear and uniform in GIMP's native RGB space and decompose it to LAB, the L channel should still look linear.
Perhaps an additional L*a*b* color option could be added to the plugin that incorporates the gamma correction into its processing? Performing the correction manually after decomposition loses some quality in the color channels for the obvious reason that the values have already been quantized to 8 bits, but if it were included as part of the plugin's execution it would not.
strata_ranger hotmail com
Numbers may not lie, but neither do they tell the whole truth.