Re: Meta-Data Import
- From: Warren Baird <photogeekmtl gmail com>
- To: David Moore <dcm MIT EDU>
- Cc: F-Spot list <f-spot-list gnome org>
- Subject: Re: Meta-Data Import
- Date: Mon, 10 Apr 2006 14:33:01 -0400
David Moore wrote:
> Rather than add this kind of import feature into f-spot, my feeling is
> that it would be more appropriate to write a tool that would write the
> metadata directly into an XMP or IPTC header in the image. Then f-spot
> will automatically see the metadata.
I must admit I'm not that familiar with XMP and IPTC headers... If
they are rich enough to represent all of the metadata I need, that
sounds like a good solution.
Does f-spot already read this data when importing photos?
I'll do some more reading on XMP tonight...
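For anyone else reading up on XMP: in a JPEG, the XMP packet is carried in an APP1 segment whose payload begins with the namespace URI "http://ns.adobe.com/xap/1.0/" followed by a NUL byte. A minimal sketch of pulling that packet out (this is just the segment-walking part, not a full XMP parser; the function name is my own):

```python
# Minimal sketch of locating an embedded XMP packet in a JPEG.
# XMP lives in an APP1 (0xFFE1) segment whose payload starts with
# the namespace URI plus a NUL byte.

XMP_SIG = b"http://ns.adobe.com/xap/1.0/\x00"

def extract_xmp(jpeg_bytes):
    """Return the raw XMP packet (bytes) from a JPEG, or None."""
    if jpeg_bytes[:2] != b"\xff\xd8":              # SOI marker
        raise ValueError("not a JPEG")
    i = 2
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:                  # lost marker sync
            break
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:                         # SOS: compressed data follows
            break
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        payload = jpeg_bytes[i + 4:i + 2 + length]
        if marker == 0xE1 and payload.startswith(XMP_SIG):
            return payload[len(XMP_SIG):]
        i += 2 + length                            # skip to next segment
    return None
```

The packet itself is an RDF/XML document, so once extracted it can be handed to any XML parser.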
> I'd like to see the various photo applications get away from having a
> central metadata database at all. I feel that they should use the image
> data as the authoritative database and generate a cache as necessary for
> fast performance. But upon deleting the cache, switching applications,
> reinstalling the OS, moving to a different machine, etc., all metadata
> will seamlessly remain because the image files themselves are the
> repository. Maybe such a philosophy is already in mind for f-spot, but
> I haven't had much success pushing gthumb in that direction.
I'm a little concerned by the possible performance implications of
keeping the in-image metadata as the authoritative source... I probably
already have about 10k images - and that number is growing rapidly.
With 100k images, I suspect just checking the timestamps against a
database might take some time...
However, as long as there's an option to turn off the scanning for new
metadata, and a way to trigger it manually when you know the image files
are more up-to-date than the cache, then I think it would probably be fine.
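The staleness check itself doesn't have to be expensive: a sketch of the scheme being discussed, caching (path, mtime) pairs in SQLite and reporting only files that changed since the last scan. All names here are mine, not f-spot's:

```python
import os
import sqlite3

def stale_files(db_path, roots):
    """Return image paths whose on-disk mtime differs from the cached
    one, or that are missing from the cache entirely.  A photo manager
    could run this only when the user explicitly asks for a rescan."""
    db = sqlite3.connect(db_path)
    db.execute("CREATE TABLE IF NOT EXISTS cache "
               "(path TEXT PRIMARY KEY, mtime REAL)")
    cached = dict(db.execute("SELECT path, mtime FROM cache"))
    stale = []
    for root in roots:
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                if cached.get(path) != os.path.getmtime(path):
                    stale.append(path)
    db.close()
    return stale
```

Even for 100k files this is one stat() per file plus one indexed lookup, so a manual rescan should take seconds, not minutes, on a local disk.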
> The main drawback of such an approach is that some people don't like
> touching the original image file for any reason whatsoever. In that
> case (or for formats that don't allow embedded XMP, such as RAW), the XMP
> data can be stored in a separate file alongside each image. An
> important aspect of this approach is that there is only one image per
> XMP file so that it can be copied anywhere the image goes.
Ick. I don't like this solution much. I haven't moved to RAW yet ---
but I'm contemplating it. My concern is that if anyone uses a file
management tool that doesn't understand whatever mechanism is used to
associate the metadata file with the RAW file, you lose all of your
metadata.
The only solution I can think of that might address this (but probably
has more serious flaws) is a centralized database indexed by some kind
of hash. As long as all tools that manipulate the image are smart
enough to update the hash value stored in the database, it might work.
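A sketch of the hash-indexed database idea, keying tags on a content hash so renames and moves don't break the association. It also makes the flaw visible: hashing the whole file means any in-place edit changes the key, so every tool touching the image would have to re-register it (all names here are hypothetical):

```python
import hashlib
import sqlite3

def file_key(path):
    """Content hash used to look up metadata regardless of filename."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def store_tags(db, path, tags):
    db.execute("CREATE TABLE IF NOT EXISTS meta "
               "(key TEXT PRIMARY KEY, tags TEXT)")
    db.execute("INSERT OR REPLACE INTO meta VALUES (?, ?)",
               (file_key(path), ",".join(tags)))

def lookup_tags(db, path):
    db.execute("CREATE TABLE IF NOT EXISTS meta "
               "(key TEXT PRIMARY KEY, tags TEXT)")
    row = db.execute("SELECT tags FROM meta WHERE key = ?",
                     (file_key(path),)).fetchone()
    return row[0].split(",") if row else None
```

A smarter key would hash only the pixel data and skip the metadata sections, so that editing embedded tags doesn't invalidate the key --- but that requires a per-format parser, which is the kind of complexity Warren is worried about.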
It's been a while since I've looked at the spec of the open RAW format
standard --- does it support XMP or something like it? If so, then a
workable solution for RAW would just be to convert everything to the
open RAW format.