Re: stuck in file crawl task loop


> Hmm, this might be the cause of the loop.  The code always checks the
> sqlite database first because it has to[1], which means that if the old
> ones aren't being dropped from the DB we could be pulling old
> information.
> I will look into it.

Joe checked in some fixes and debugging hooks to track these problems in svn trunk.
We expect this to improve the situation and hope the problem is fixed. If it is 
not, the logs should now contain enough information to help nail it down.

If you are seeing these loops, it would be very helpful if you could test the 
changes. If you are building from svn, just check out the trunk. If you are 
using packages, I have built the 0.2.14 source plus these changes, and I am 
putting the patched binary files BeagleDaemonLib.dll and IndexHelper.exe at

Back up your original BeagleDaemonLib.dll and IndexHelper.exe (found 
in /usr/lib/beagle or /usr/local/lib/beagle) and replace them with the new 
ones. I would also recommend backing up your current ~/.beagle - there is no 
harm in a backup.

I don't have Evolution, so the evo-related backends are not enabled in this 
dll. That should not matter, since you will be testing the filesystem backend.

Then start beagled with the parameter "--backend Files" to enable only the 
filesystem backend, and monitor for looping. If you suspect a loop, please 
restart with "export BEAGLE_DEBUG_FSQ=1" set; the log will then contain enough 
information for us to nail the bug completely.
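Put together, the test run could look like this. Only "--backend Files" and 
BEAGLE_DEBUG_FSQ=1 come from this mail; how you stop any already-running 
daemon first is up to you.

```shell
# Enable verbose file-system-queue debugging before the daemon starts
export BEAGLE_DEBUG_FSQ=1

# Start beagled with only the filesystem backend enabled
beagled --backend Files
```

Setting the variable before starting the daemon matters: the daemon reads it 
from its environment at startup, so exporting it afterwards has no effect.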

Lastly, I hope that simply replacing the binary files works. I have not tried 
this before, but in theory it should.

Thanks in advance,
- dBera

Debajyoti Bera @
beagle / KDE fan
Mandriva / Inspiron-1100 user
