Re: [Gimp-developer] 10-bit display color depth support (30-bit, deep color)
- From: Magnus Larsson <k magnus larsson bahnhof se>
- To: gimp-developer-list gnome org
- Subject: Re: [Gimp-developer] 10-bit display color depth support (30-bit, deep color)
- Date: Sun, 5 Dec 2021 12:58:52 +0100
A fix to the message below:
I am using an EIZO ColorEdge CG279X monitor, and the quoted message was from Nov 2020.
Magnus
On 12/5/21 12:43 PM, Magnus Larsson wrote:
Hello gimp-developer-list,
In this mail to gimp-user,
https://mail.gnome.org/archives/gimp-user-list/2020-November/msg00002.html,
from LKH, about 10-bit display color depth support (30-bit, deep color), the discussion (from last year, Nov 2020) ends with:
"
Re: [Gimp-user] 10-bit display color depth support (30-bit, deep color)
------------------------------------------------------------------------
* From: LKH <lkhgraf protonmail com>
* To: Liam R E Quin <liam holoweb net>, "gimp-user-list gnome org" <gimp-user-list gnome org>
* Subject: Re: [Gimp-user] 10-bit display color depth support (30-bit, deep color)
* Date: Sun, 15 Nov 2020 06:19:32 +0000
------------------------------------------------------------------------
I am still experiencing banding with a gradient made from scratch in GIMP 2.99 in 32-bit floating point mode on X11 when xdpyinfo and xwininfo say the window is 30-bit. If dithering is off the banding is obvious. I believe 16-bit support for TIFF is in GIMP anyway because it says it is in 16-bit mode after loading it.

I have tested Adobe Photoshop CC on both Windows and macOS and the monitor does successfully display in 30 bit. No amount of zooming in on the test gradient produces banding in those cases.
"
Issue:
I am seeing the same thing as user LKH, that is, GIMP does not render 30-bit deep color on my Linux system. (I do not have Adobe Photoshop on Windows, so I have not been able to test that.)
I understand many conditions have to be fulfilled:
1) Application can render: I am using GIMP_2_99_8-110-g44f6ee36fe
compiled from git source.
2) OS can render: I am using Debian 11 with X11 (not Wayland) and 30-bit depth enabled (see also the sketch after this list):
xwininfo -root | grep Depth => Depth: 30
glxinfo | grep "direct rendering" => direct rendering: Yes
3) GPU and driver can render: I have an NVIDIA Quadro RTX 4000 GPU with NVIDIA drivers that support 30 bits, and the NVIDIA X Server Settings app reports 30 bits when checked.
4) Monitor can render: I am using an EIZO ColorEdge CG279X wide-gamut monitor, and the monitor reports that it is operating at 10 bits when queried: EIZO monitor | Signal => Full Range, RGB 10 bit
5) Test data has the precision: a set of test files that actually contain 10 bits or more of raster data, with GIMP in 16-bit mode: GIMP | Image | Encoding => 16 bit
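As a cross-check of point 2, the deep-color visuals the X server actually exposes can be listed programmatically, as in the sketch below. This assumes plain Xlib is available; the file name and build line are only examples:

/* list_visuals.c -- sketch: list 30-bit TrueColor visuals on the default
 * screen, a programmatic counterpart to "xwininfo -root | grep Depth".
 * Assumed build line: cc list_visuals.c -o list_visuals -lX11 */
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <stdio.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    XVisualInfo tmpl;
    tmpl.screen = DefaultScreen(dpy);

    int n = 0;
    XVisualInfo *vi = XGetVisualInfo(dpy, VisualScreenMask, &tmpl, &n);

    int found = 0;
    for (int i = 0; i < n; i++) {
        if (vi[i].depth == 30 && vi[i].class == TrueColor) {
            found++;
            printf("visual 0x%lx: depth %d, %d bits per RGB\n",
                   (unsigned long) vi[i].visualid,
                   vi[i].depth, vi[i].bits_per_rgb);
        }
    }
    printf("%d deep-color (30-bit) TrueColor visual(s) found\n", found);

    if (vi) XFree(vi);
    XCloseDisplay(dpy);
    return 0;
}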
I can still see no difference between an 8-bit gradient (in 8-bit precision) and a 16-bit gradient (in 16-bit precision).
I use these test images (I have tried other test files as well, with the same result: no 10-bit gradation is rendered on the GIMP display):
https://www.eizo-apac.com/support-service/tech-library/monitor-test
16_bit_TIFF:
https://www.eizo-apac.com/static/uploads/files/16bitgradation-wfuncyaztzod.tif
8_bit_gradient:
https://www.eizo-apac.com/static/uploads/files/8bitgradation-wfqfukzeuago.tif
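If it helps, a test gradient with more than 8 bits of real precision can also be generated locally instead of downloading the EIZO files. The sketch below writes a 16-bit-per-channel binary PPM; the file name is only an example, and the assumption that GIMP's PNM plug-in imports 16-bit samples at full precision is mine:

/* gradient16.c -- sketch: write a horizontal 16-bit-per-channel gradient
 * as a binary PPM (maxval 65535), a local stand-in for the EIZO 16-bit
 * gradation TIFF. Assumption: GIMP's PNM plug-in imports 16-bit samples
 * at full precision. Build: cc gradient16.c -o gradient16 */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    const int width = 4096, height = 256;
    FILE *f = fopen("gradient16.ppm", "wb");   /* example file name */
    if (!f) { perror("gradient16.ppm"); return 1; }

    /* P6 header; a maxval above 255 means two bytes per sample, big-endian. */
    fprintf(f, "P6\n%d %d\n65535\n", width, height);

    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            /* Map the column to the full 16-bit range. With 4096 columns
             * each 8-bit level spans 16 columns, so an 8-bit rendering
             * shows 256 visible bands while a true >8-bit path stays smooth. */
            uint16_t v = (uint16_t) ((uint32_t) x * 65535u / (width - 1));
            unsigned char px[6] = {
                (unsigned char) (v >> 8), (unsigned char) (v & 0xff),  /* R */
                (unsigned char) (v >> 8), (unsigned char) (v & 0xff),  /* G */
                (unsigned char) (v >> 8), (unsigned char) (v & 0xff)   /* B */
            };
            fwrite(px, 1, sizeof px, f);
        }
    }

    fclose(f);
    return 0;
}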
What do you think?
I respect the complexity of points 1-5 above, but the feedback in the gimp-user thread referred to above seems to indicate that GIMP can use 30-bit visuals, at least in GIMP 2.99.x.
Best regards,
Magnus Larsson