[Gegl-developer] Boundary bugs gegl_buffer_set for RGB buffers?
- From: Dov Grobgeld <dov grobgeld gmail com>
- To: gegl-developer-list gnome org
- Subject: [Gegl-developer] Boundary bugs gegl_buffer_set for RGB buffers?
- Date: Tue, 4 Jun 2013 22:37:28 +0300
While trying to make the gaussian-blur plugin work with different input formats, I ran into what looks like a bug: I get different results depending on the output format. For example, with RGB I see what appear to be tile boundary artifacts that I don't see with RGBA. I have verified that the "iir" algorithm is used in both cases and that the per-channel pixel sums of the buffers are identical for the two formats.
It seems I get a fade-off effect at the tile edges when doing gegl_buffer_set() into an RGB buffer that I don't get with RGBA. Is this a known problem? Should I file a bug about it?
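For reference, here is a rough sketch of the kind of standalone test I have in mind: write the same constant-colour pixels into an "RGB float" and an "RGBA float" buffer across several tiles, read both back, and compare the channel sums. This is not the plugin code itself, and the gegl_buffer_set()/gegl_buffer_get() signatures shown follow current GEGL headers (the mipmap-level argument may not exist in older releases), so treat it only as an illustration of the setup:

    #include <gegl.h>
    #include <stdio.h>

    int
    main (int argc, char **argv)
    {
      gegl_init (&argc, &argv);

      GeglRectangle rect = { 0, 0, 300, 300 };   /* spans several default-sized tiles */
      const Babl *rgba = babl_format ("RGBA float");

      GeglBuffer *buf_rgb  = gegl_buffer_new (&rect, babl_format ("RGB float"));
      GeglBuffer *buf_rgba = gegl_buffer_new (&rect, rgba);

      /* Constant mid-grey source pixels in RGBA float. */
      gfloat *src = g_new (gfloat, rect.width * rect.height * 4);
      for (int i = 0; i < rect.width * rect.height; i++)
        {
          src[i * 4 + 0] = 0.5f;
          src[i * 4 + 1] = 0.5f;
          src[i * 4 + 2] = 0.5f;
          src[i * 4 + 3] = 1.0f;
        }

      /* Same source data written into both buffers; GEGL converts to
       * each buffer's storage format internally.                     */
      gegl_buffer_set (buf_rgb,  &rect, 0, rgba, src, GEGL_AUTO_ROWSTRIDE);
      gegl_buffer_set (buf_rgba, &rect, 0, rgba, src, GEGL_AUTO_ROWSTRIDE);

      /* Read both back as RGBA float and sum the channels; a fade-off
       * at tile edges in the RGB buffer shows up as a lower sum.      */
      gfloat *out = g_new (gfloat, rect.width * rect.height * 4);
      gdouble sums[2] = { 0, 0 };
      GeglBuffer *bufs[2] = { buf_rgb, buf_rgba };

      for (int b = 0; b < 2; b++)
        {
          gegl_buffer_get (bufs[b], &rect, 1.0, rgba, out,
                           GEGL_AUTO_ROWSTRIDE, GEGL_ABYSS_NONE);
          for (int i = 0; i < rect.width * rect.height * 4; i++)
            sums[b] += out[i];
        }

      printf ("RGB buffer sum: %f   RGBA buffer sum: %f\n", sums[0], sums[1]);

      g_free (src);
      g_free (out);
      g_object_unref (buf_rgb);
      g_object_unref (buf_rgba);
      gegl_exit ();
      return 0;
    }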
Thanks,
Dov