beagle-build-index performance with large fileset

I have a fileset of approximately 120,000 files in 22,000 folders. My allow
pattern narrows this down to 43,000 files. Building the index takes a long
time, and when I run it again with the index already built and nothing
changed, it still takes a long time. Is an incremental update supposed to
run faster, or is there no incremental update at all? Can I profile this
somehow to see whether there is a way to speed it up? Or is this simply the
expected runtime?

Here's what 'time' says on an initial run:

real    161m1.564s
user    21m2.444s
sys     11m23.006s

Here's what 'time' says on a re-run (three files updated):

real    165m30.974s
user    95m13.815s
sys     39m42.865s

BTW, I get the message "Debug: Scanned 43356 files in 22663 directories" in 
less than five minutes.
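
Since the scan itself finishes quickly, the time presumably goes into
indexing file contents rather than walking the tree. To check whether
unchanged files are actually being re-read on a re-run, I figure I could
trace open() calls against the fileset, roughly like this (the path and
"[options]" are placeholders):

strace -f -e trace=open -o /tmp/beagle-open.log beagle-build-index [options]
grep -c '/path/to/fileset' /tmp/beagle-open.log

If the count on a no-change re-run is anywhere near 43,000, the files are
being re-read rather than skipped.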

-- 
Pat Double, pat patdouble com
"In the beginning God created the heaven and the earth."


