Re: [Gimp-developer] Gegl gaussian blur gamma error



On Sat, Aug 4, 2012 at 11:44 PM, Elle Stone <l elle stone gmail com> wrote:
> While working on the article, I noticed that the Gimp Gegl gaussian
> blur nicely blurs *regular* sRGB images without the darkening
> artifacts that normally accompany gaussian blurring. But when used on
> a linear gamma image, the Gegl gaussian blur makes it look like the
> image was actually blurred in a gamma=0.45 color space (there are
> "lightening" artifacts). So similar to the problem when opening a
> 16-bit tif, there seems to be a strange gamma error involved.

How do you create a linear gamma image to do your tests? The only way
you can actually do that is to open a 16-bit PNG image, or to create a
16-bit / 32-bit image in the first place. If you take a gamma-encoded
image and change its profile to "linear light" with lcms, you are
changing the pixel values of the image, but the metadata passed around
in terms of babl formats in the GeglBuffer will still state that this
is sRGB data, and when gegl:gaussian-blur blurs it, that sRGB data
will be converted to linear light for the actual blurring.
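
To make that concrete, here is a minimal sketch (my illustration, not
part of the original mail) using the public babl API: the primed
format name "R'G'B'A u8" tags data as gamma-encoded sRGB, "RGBA float"
tags it as linear light, and babl_process() performs the same kind of
conversion that gegl:gaussian-blur triggers before it averages pixels:

/* Minimal babl sketch: the format string, not the pixel values,
 * decides whether data is treated as gamma encoded or linear. */
#include <babl/babl.h>
#include <stdio.h>

int
main (void)
{
  unsigned char srgb_pixel[4]   = { 128, 128, 128, 255 };
  float         linear_pixel[4] = { 0, };

  babl_init ();

  /* "R'G'B'A u8": gamma-encoded sRGB; "RGBA float": linear light. */
  const Babl *fish = babl_fish (babl_format ("R'G'B'A u8"),
                                babl_format ("RGBA float"));
  babl_process (fish, srgb_pixel, linear_pixel, 1);

  /* 128/255 ~ 0.502 gamma encoded comes out as ~ 0.216 linear. */
  printf ("linear: %f\n", linear_pixel[0]);

  babl_exit ();
  return 0;
}

If the pixel values are already linear but the format still says
sRGB, that decode step is applied anyway, which matches the
"gamma=0.45" look described above.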

For newly created 8-bit images in GIMP-2.9 the buffers are created as
sRGB (gamma encoded), since that is where gamma encoding makes sense:
it gives higher fidelity in the shadows. For higher bit depths
GIMP-2.9 creates linear light buffers.
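
Again just as an illustrative sketch (mine, not from the original
mail), that distinction shows up directly in the babl format passed to
gegl_buffer_new():

/* Sketch: the babl format chosen at buffer creation records whether
 * the buffer stores gamma-encoded or linear-light pixels. */
#include <gegl.h>

int
main (int argc, char **argv)
{
  gegl_init (&argc, &argv);

  GeglRectangle extent = { 0, 0, 256, 256 };

  /* 8-bit: gamma encoded, better shadow fidelity per bit. */
  GeglBuffer *buf_u8    = gegl_buffer_new (&extent,
                                           babl_format ("R'G'B'A u8"));
  /* Higher precision: linear light. */
  GeglBuffer *buf_float = gegl_buffer_new (&extent,
                                           babl_format ("RGBA float"));

  g_object_unref (buf_u8);
  g_object_unref (buf_float);
  gegl_exit ();
  return 0;
}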

/Ø
-- 
«The future is already here. It's just not very evenly distributed»
                                                 -- William Gibson

