Hi Victor,

I have an NVIDIA GPU with 1 GB of dedicated VRAM, a Core i7 CPU, and 16 GB of RAM, so a 150x150 radius should be doable. In any case, I am sure that once GIMP goes to production, this is bound to come up.

Any further report on the Mac? While I am finding it quite fast on a Mac with 8 GB of RAM, I am not sure whether the GPU is being used. Note that everything I have seen or read about OpenCL says that you don't need to load the library separately, at least on a Mac.

Thanks,
Partha
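One way to check whether OpenCL actually sees the GPU is to enumerate the devices the runtime reports. The sketch below is illustrative rather than anything from GIMP/GEGL; on OS X the headers and library ship with the system, so it only needs <OpenCL/opencl.h> and -framework OpenCL, with no separate library to load:

    /* Minimal sketch: list every OpenCL device and its global memory. */
    #include <stdio.h>
    #ifdef __APPLE__
    #include <OpenCL/opencl.h>
    #else
    #include <CL/cl.h>
    #endif

    int
    main (void)
    {
      cl_platform_id platforms[4];
      cl_uint        n_platforms = 0;

      clGetPlatformIDs (4, platforms, &n_platforms);

      for (cl_uint p = 0; p < n_platforms; p++)
        {
          cl_device_id devices[8];
          cl_uint      n_devices = 0;

          clGetDeviceIDs (platforms[p], CL_DEVICE_TYPE_ALL,
                          8, devices, &n_devices);

          for (cl_uint d = 0; d < n_devices; d++)
            {
              char     name[256];
              cl_ulong mem = 0;

              clGetDeviceInfo (devices[d], CL_DEVICE_NAME,
                               sizeof (name), name, NULL);
              clGetDeviceInfo (devices[d], CL_DEVICE_GLOBAL_MEM_SIZE,
                               sizeof (mem), &mem, NULL);
              printf ("%s: %llu MB global memory\n",
                      name, (unsigned long long) (mem / (1024 * 1024)));
            }
        }
      return 0;
    }

If the GPU shows up in such a listing, at least the OpenCL platform sees it.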
On Fri, Feb 15, 2013 at 7:22 PM, Victor Oliveira <victormatheus gmail com> wrote:
Hi Partha,

Thanks for the bug report, I'll take a look when I can. Note that a 150x150 radius is very big and maybe your GPU doesn't have enough memory for it, but in that case it was supposed to fall back to the CPU.

Thanks
Victor
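For a rough sense of scale (illustrative numbers, not figures from the thread): GEGL typically processes buffers as 4-channel 32-bit float, so a 4000 x 3000 pixel image takes about 4000 * 3000 * 4 * 4 bytes, roughly 183 MB, per buffer. With separate input and output buffers plus intermediate data for the blur passes, the working set can approach half a gigabyte, which leaves little headroom on a 1 GB card that is also driving the display.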
On Fri, Feb 15, 2013 at 8:47 AM, Tobias Ellinghaus <houz gmx de> wrote:

On Friday, 15 February 2013, 04:52:49, Partha Bagchi wrote:
> Hi Victor,
>
> Latest git. I have an NVIDIA GeForce GT 230M card with 1 GB VRAM. The OpenCL
> version is 1.1 (CUDA 4.2.1, etc.), Windows 7 64-bit.
>
> I think OpenCL is taking down my video every time. Here is a simple
> repeatable test for me.
>
> 1. Open a 16-bit TIFF.
> 2. Duplicate layer.
> 3. Layer - desaturate -> invert
> 4. GEGL gaussian blur x = y = 150.
>
> Takes the screen down (goes black and recovers) and leaves a dark tile on
> the layer.
>
> Any ideas?
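For developers, a stand-alone graph that exercises roughly the same blur as step 4 can be built with the GEGL C API. This is a hypothetical sketch: the file names are placeholders, and whether the dialog's x = y = 150 corresponds to the operation's std-dev-x/std-dev-y properties or to a radius is an assumption to check against the GEGL version in use.

    /* Hypothetical repro of the blur step with the GEGL C API. */
    #include <gegl.h>

    int
    main (int argc, char **argv)
    {
      gegl_init (&argc, &argv);

      GeglNode *graph = gegl_node_new ();
      GeglNode *load  = gegl_node_new_child (graph,
                                             "operation", "gegl:load",
                                             "path", "input.tif",   /* placeholder */
                                             NULL);
      GeglNode *blur  = gegl_node_new_child (graph,
                                             "operation", "gegl:gaussian-blur",
                                             "std-dev-x", 150.0,    /* assumed mapping */
                                             "std-dev-y", 150.0,
                                             NULL);
      GeglNode *save  = gegl_node_new_child (graph,
                                             "operation", "gegl:save",
                                             "path", "output.png",  /* placeholder */
                                             NULL);

      gegl_node_link_many (load, blur, save, NULL);
      gegl_node_process (save);

      g_object_unref (graph);
      gegl_exit ();
      return 0;
    }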
That is normal when running OpenCL (or CUDA) kernels on a GPU that has a
monitor connected. On Windows it will time out quite quickly, on Linux AFAIK
not. However, that can be configured. Google is your friend, just search for
"nvidia watchdog".
[...]
> Thanks,
> Partha
Tobias
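On Windows, the watchdog Tobias mentions is the GPU Timeout Detection and Recovery (TDR) mechanism: by default the OS resets a display driver whose GPU work runs longer than about two seconds, which matches the screen going black and then recovering. For experimentation the timeout can be changed through registry values (a reboot is needed for them to take effect):

    HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers
      TdrDelay  (REG_DWORD)  timeout in seconds, default 2
      TdrLevel  (REG_DWORD)  0 disables timeout detection entirely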
_______________________________________________
gimp-developer-list mailing list
gimp-developer-list gnome org
https://mail.gnome.org/mailman/listinfo/gimp-developer-list