Re: [Gimp-developer] Appropriate scales for curves and levels

Hi Elle,

On Mon, 2013-06-10 at 15:28 -0400, Elle Stone wrote:
Hi all,

When working on a 16-bit integer or 32-bit floating point image, the
"levels" tool in Gimp from git shows increments of 0.1 on a scale from
0 to 100.0, theoretically giving 999 increments.

However, when using levels to slowly raise or lower the black point or
the white point, the displayed input/output number jumps by 1 halfway
between integer increments - e.g. it jumps from 0 to 1 at 0.5, from 1
to 2 at 1.5, and so forth. And the image display doesn't change until
the jump occurs. So instead of 999 increments, there are really only
99 increments, less than half as many as are available with any 8-bit
image editor using a scale that runs from 0 to 255.

Whether 99 increments or 255 increments, both scales are too coarse
for accurately setting a black point (levels or curves) or modifying
tonality in the shadows (curves) of a linear gamma image. 99
increments is not all that great for more perceptually uniform color
spaces, either.
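A quick numeric sketch (plain Python, not Gimp code) of why roughly
100 steps is too coarse near the black point of a linear-gamma image:
the smallest nonzero step on a 0-100 integer scale is 1% of linear
intensity, and the standard sRGB transfer function maps that single
step to about 10% of the encoded, roughly perceptual, range.

```python
def srgb_encode(linear):
    """Standard sRGB transfer function: linear light -> encoded value."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

# One step on a coarse 0-100 integer levels scale is 0.01 in linear terms.
step = 1 / 100
print(f"linear {step:.2f} -> sRGB-encoded {srgb_encode(step):.3f}")
# prints: linear 0.01 -> sRGB-encoded 0.100
# The very first step above black already lands near 10% of the
# perceptual range, so all shadow detail below it is unreachable.
```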

That was simply a bug, fixed now:

commit 875b1705fc9fa61fb742c6609f46b1f753c41f29
Author: Michael Natterer <mitch gimp org>
Date:   Mon Jun 10 21:51:15 2013 +0200

    app: don't ROUND() all input/output values to int in GimpLevelsTool
    That was a leftover from before the change to 0..100 sliders for
    non-8-bit images. Spoted by Elle Stone.

At one point Gimp had a very fine-grained scale for curves, even
though it only supported 8-bit image editing. Now that Gimp supports
high bit depths, are there plans to (re)introduce fine-grained
curves and levels scales to match the higher bit depths?

Don't assume that all bugs are intentional features ;) The != 8-bit
code generally seems to work flawlessly, but there are still rough
edges, particularly in the GUI display of values. They are all bugs.

Matching the scales for curves, levels, eye-dropper/pointer/sample points:

The levels scale is from 0.0 to 100.0 except for 8-bit images, where
the scale is from 0 to 255.

The curves scale and histogram "info" scale are always from 0 to 255,
regardless of the image bit depth.

That's work in progress; I have already changed the histogram code
in my tree to move it towards behaving properly.

The RGB eye-dropper (which recently has seen some serious love from
someone - whoever you are, thank you!) displays RGB values from 0 to
255, 0 to 65535, or 0 to 1.00000, depending on the bit-depth of the
image. (The floating point eye-dropper also shows values that are less
than 0 and greater than 1 - again, thank you most wonderful person for
making possible the proper eyedroppering of floating point hdr/exr
images without having to resort to Cinepaint or Krita!)

I proudly take the credit for that :)

As going back and forth between curves, levels, and the eye-dropper is
such a basic and common action while editing, it would be nice if the
various scales always matched each other at any given bit depth. As it
is, I end up dividing and multiplying a lot to translate between the
different scales.
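The translation being done by hand above is simple arithmetic; a
sketch (plain Python, hypothetical helper names) of the back-and-forth
between the current curves/histogram scale (0-255) and the non-8-bit
levels scale (0.0-100.0):

```python
def curves_to_levels(v255):
    """Map the curves/histogram scale (0-255) to the levels scale (0.0-100.0)."""
    return v255 / 255.0 * 100.0

def levels_to_curves(v100):
    """Map the levels scale (0.0-100.0) back to the curves scale (0-255)."""
    return v100 / 100.0 * 255.0

print(curves_to_levels(128))   # ~50.2
print(levels_to_curves(50.0))  # 127.5
```

Note that the two scales have no common grid: integer curves values
mostly land between levels increments, which is exactly why switching
tools forces this mental arithmetic.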

That is definitely the plan...

I would request/suggest making all 8-bit scales (eye-dropper, levels,
curves, and of course the pointer information and sample points) as
0-255, all 16-bit integer scales as 0-65535, and all floating point
scales (eye-dropper, levels, curves, pointer, sample points) as 0 to
1.000000, so at any given bit depth all three ways of
sampling/viewing/modifying the image match.
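As a sketch of the suggested convention (a hypothetical helper, not
Gimp API; the precision labels are made up for illustration), a single
formatter that renders a normalized channel value on the display scale
matching the image's bit depth:

```python
def format_channel(value, precision):
    """Render a normalized channel value (0.0-1.0 for in-gamut data)
    on the display scale matching the image precision.
    precision: 'u8', 'u16', or 'float' (hypothetical labels)."""
    if precision == 'u8':
        return str(round(value * 255))
    if precision == 'u16':
        return str(round(value * 65535))
    # Floating point: keep six decimals and let out-of-range (HDR)
    # values below 0.0 or above 1.0 pass through unclipped.
    return f"{value:.6f}"

print(format_channel(1.0, 'u8'))     # 255
print(format_channel(1.0, 'u16'))    # 65535
print(format_channel(1.25, 'float')) # 1.250000
```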

...however I am not sure if we really want the values from the
eyedropper's "Pixel" mode in all sorts of GUIs. But it will definitely
not stay 0..255, that's for sure :)
