[GtkGLExt] Bug in shaders with Mesa 7.5-devel driver with Intel 965GM chipset



We develop an application that displays video using OpenGL and a GLSL shader.
The shader performs the YUV to RGB color space conversion and image enhancements such as edge enhancement.
OpenGL is used for resizing (all kinds of zoom).
This application was first developed to run on a PC with an NVIDIA Quadro FX GPU, Linux Fedora 10 and the Nvidia closed-source driver,
and was written in Java with the LWJGL OpenGL wrapper.
We have test programs developed in C for GTK that use the gtkglext OpenGL wrapper.
These programs have been tested with the following wrappers:
GLUT, gtkglext, gtkglarea and native GLX.
We now want to develop the same kind of application for a low-power embedded platform,
so we want it to run on hardware based on the Intel 965GM chipset.
We are trying our test programs on this platform with Fedora 11, the Intellinux 2D driver and the Mesa 3D 7.5-devel driver.
The drivers come from the Fedora repositories.
In this case the colors appear pink with the gtkglarea, gtkglext and LWJGL wrappers.
The colors appear correct with GLUT and GLX.
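
For reference, the YUV to RGB shader mentioned above looks roughly like the following minimal sketch (this is not the exact xx.fs from the attached demo; it assumes planar YUV stored in three single-channel textures, placeholder sampler names and BT.601 limited-range coefficients):

uniform sampler2D y_tex;   /* placeholder names, not the demo's */
uniform sampler2D u_tex;
uniform sampler2D v_tex;

void main(void)
{
    float y = texture2D(y_tex, gl_TexCoord[0].st).r;
    float u = texture2D(u_tex, gl_TexCoord[0].st).r - 0.5;
    float v = texture2D(v_tex, gl_TexCoord[0].st).r - 0.5;

    /* expand BT.601 limited-range luma */
    y = 1.1643 * (y - 0.0625);

    float r = y + 1.5958 * v;
    float g = y - 0.39173 * u - 0.8129 * v;
    float b = y + 2.017 * u;

    gl_FragColor = vec4(r, g, b, 1.0);
}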
After a lot of experiments, I changed the YUV to RGB conversion shader to the following:

void main(void)
{
    float r, g, b;
    r = 0.5; g = 0.5; b = 0.5;
    gl_FragColor = vec4(r, g, b, 1.0);
}

I expected a grey display.
I get grey with GLUT and GLX, but black with gtkglext and gtkglarea (not tried with the Java wrapper).

I then changed the shader to the following:

void main(void)
{
    float r, g, b;
    r = 1.0 / 2.0; g = 1.0 / 2.0; b = 1.0 / 2.0;
    gl_FragColor = vec4(r, g, b, 1.0);
}

I get grey with all wrappers.
Fun, isn't it!

So this leads to a workaround for the above problem.
In the YUV to RGB shader, I can replace all the color space
conversion matrix coefficient constants with fractional constants.
For example, replace:

y = 1.1643 * (y - 0.0625);

by

y = 2.3286 / 2.0 * (y - 0.125 / 2.0);

And the colors are correct.
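
Applied to the whole conversion, the workaround looks like the sketch below (same assumptions and placeholder sampler names as the reference sketch above, with every float literal written as a fraction of two literals):

uniform sampler2D y_tex;
uniform sampler2D u_tex;
uniform sampler2D v_tex;

void main(void)
{
    float y = texture2D(y_tex, gl_TexCoord[0].st).r;
    float u = texture2D(u_tex, gl_TexCoord[0].st).r - 1.0 / 2.0;
    float v = texture2D(v_tex, gl_TexCoord[0].st).r - 1.0 / 2.0;

    /* same coefficients as before, each written as a fraction */
    y = 2.3286 / 2.0 * (y - 0.125 / 2.0);

    float r = y + 3.1916 / 2.0 * v;
    float g = y - 0.78346 / 2.0 * u - 1.6258 / 2.0 * v;
    float b = y + 4.034 / 2.0 * u;

    gl_FragColor = vec4(r, g, b, 1.0);
}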

BUT, if I try to display a pattern of pure red, green, blue and black colors, the final computed RGB values depend on the wrapper used.

Fed with this pattern, my shader is expected to compute the following R, G, B values:

        R    G    B
blue    0    0    255
green   0    255  0
red     255  0    0
black   0    0    0

We get the following values with GLUT and GLX:

        R    G    B
blue    17   63   255
green   32   245  0
red     209  0    0
black   1    0    1

We get the following values with gtkglarea and gtkglext:

        R    G    B
blue    26   101  255
green   34   181  0
red     210  0    6
black   17   16   17

Conclusion:
- there is obviously a bug in the way gtkglarea and gtkglext drive the shader compilation;
- there is some extra processing in the OpenGL pipeline, different between the two kinds of framework, that modifies the colors.
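
One way to separate these two hypotheses (just a thought, not included in the attached demo) would be a pass-through fragment shader with no constants and no conversion at all; if the output still differs between the wrappers, the difference comes from the pipeline state rather than from the shader compilation. A minimal sketch, reusing the placeholder sampler name from above:

uniform sampler2D y_tex;   /* placeholder name, as in the sketches above */

void main(void)
{
    /* no constants, no conversion: output the sampled texel unchanged */
    gl_FragColor = texture2D(y_tex, gl_TexCoord[0].st);
}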

Does anybody have an idea?

I attach a tar file with the demo and its makefile.
This demo has been tested on a Lenovo R61 laptop with the Intel 965GM chipset and Fedora 11 (pink colors with gtkglext and gtkglarea), and on a Dell M4400 laptop with an Nvidia Quadro GPU, the Nvidia closed-source driver and Fedora 10 (correct colors with all wrappers).

The demo is intended to display a red/green/blue checkerboard pattern.
The glxinfo output for the two machines above is included in the tar file.
Just replace the xx.fs shader with the shaders above.

Thank you.

Attachment: yuvglxdemo.tar
Description: Binary data


