Beagle indexing gthumb comments


I have been using gthumb for ages now to associate comments with my
digital photos.  I'm fairly happy with it (i.e. I'm not planning on
swapping to F-Spot etc.), but I would like beagle to be able to index
the comments.  Essentially, given a file:

gthumb stores its comments in a gzipped xml file:

Which looks something like this when decompressed:
<?xml version="1.0"?>
<Comment format="2.0"><Place></Place><Time>0</Time><Note>Richard reading a bedtime story to Logan and Katie</Note><Keywords></Keywords></Comment>

I have set up a simple external filter like this:

Where /usr/local/bin/ looks like this:
#!/bin/sh
# Emit the gthumb comment text for the image passed as $1, if any.
if [ "$1" ]; then
    comments="$(dirname "$1")/.comments/$(basename "$1").xml"
    if [ -f "$comments" ]; then
        zcat "$comments" | sed -e "s|<Time>0</Time>||g" -e "s/<[^>]\+>//g" | grep -v "^$"
    fi
fi
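For anyone wanting to sanity-check the pipeline, here's a quick run
against a hand-made comment file (the /tmp paths and the pic.jpg name
are just for illustration):

```shell
# Build a sample gthumb-style comment file, then run the same
# zcat | sed | grep pipeline the filter script uses.
mkdir -p /tmp/photos/.comments
printf '%s\n%s\n' \
    '<?xml version="1.0"?>' \
    '<Comment format="2.0"><Place></Place><Time>0</Time><Note>Richard reading a bedtime story to Logan and Katie</Note><Keywords></Keywords></Comment>' \
    | gzip > /tmp/photos/.comments/pic.jpg.xml
zcat /tmp/photos/.comments/pic.jpg.xml \
    | sed -e "s|<Time>0</Time>||g" -e "s/<[^>]\+>//g" \
    | grep -v "^$"
# prints: Richard reading a bedtime story to Logan and Katie
```

The first sed expression drops the useless <Time>0</Time> element, the
second strips all remaining tags, and grep drops the now-empty lines.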

The output from beagle-extract-content looks good. Yay!  I'm assuming
beagle will automatically realize it has to re-index all the matching
files.

My first question is about what happens when multiple filters are
associated with a particular mimetype.  Do they all get a crack at the
file and extract information, or is it first come, first served?  That
is, will enabling my comment extractor interfere with any metadata
extracted by the existing jpeg image filter?

A second (and probably more important) question: As I understand it,
beagle will only index/update this information when the image file
changes, rather than when the xml comment file changes -- is there any
way to get the correct behaviour?  (Perhaps it would be possible to
work around this with a cron entry that looks for xml file changes and
tells beagle to update the relevant images?  Pointers welcome.)
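For what it's worth, the cron idea might be sketched like this.  The
PHOTO_DIR variable, the one-day window, and the touch-the-image trick
are all my assumptions, not anything beagle documents; the point is
just to bump the image's mtime so beagle treats it as changed.

```shell
#!/bin/sh
# Hypothetical nightly cron job: find comment files modified in the
# last day and touch the matching image so beagle re-indexes it.
# gthumb keeps <dir>/photo.jpg comments in <dir>/.comments/photo.jpg.xml.
PHOTO_DIR="${PHOTO_DIR:-$HOME/photos}"    # assumed photo root

find "$PHOTO_DIR" -path '*/.comments/*.xml' -mtime -1 | while read -r xml; do
    dir=$(dirname "$(dirname "$xml")")    # parent of the .comments dir
    img="$dir/$(basename "$xml" .xml)"    # drop the trailing .xml
    [ -f "$img" ] && touch "$img"         # bump mtime -> beagle re-indexes
done
```

Something inotify-based would react faster, but a daily cron entry is
probably good enough for photo comments.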
