Re: large import



On Sun, Dec 23, 2007 at 09:35:00PM -0600, Timothy Millard wrote:
> 
> On Mon, 2007-12-24 at 01:19 +0100, Lorenzo Milesi wrote:
> > > Any suggestions how best to import?  I've got about 80GB of images to
> > > import.
> > >   
> > I strongly doubt you will be able to import them in one shot. Sadly
> > f-spot suffers from some memory leaks, and it has been reported that
> > importing MANY pictures at a time will fail.
> > You should import smaller directories, one at a time.
> 
> I found that F-Spot crashes when /tmp is smaller than my import. This
> was several versions ago and may not be applicable.
> 
> My solution was to increase the size available to /tmp to 2 GiB (used
> tmpfs so the files live inside memory).

I'm thinking I'll write a script to move the files and add the photos
directly to the SQLite database.  That would avoid those problems.


I have not looked at the f-spot source, but the SQLite schema looks
quite simple.  Seems like I just need to insert into the "photos"
table.  I suppose I could try to figure out some way to split the
images into "rolls".

But, I'm not sure how to generate the thumbnail images.  Do I need to
do that or will f-spot create them?  I could see that being a
time-consuming task.

I think most of my images have embedded thumbnails. Could I extract
those to use as thumbnails?
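Most camera JPEGs do carry an EXIF thumbnail in the APP1 segment near the start of the file.  A crude heuristic for pulling it out (this is just a sketch, not how f-spot does it): the thumbnail is itself a complete JPEG, so look for a second SOI marker (FF D8) and the first EOI (FF D9) after it.  Entropy-coded data byte-stuffs 0xFF, so a bare FF D9 that early in the file is almost certainly the thumbnail's end:

```python
def extract_embedded_jpeg(data):
    """Return the embedded EXIF thumbnail from raw JPEG bytes, or None.

    Heuristic: the thumbnail is a complete JPEG inside the APP1 segment,
    so find the second SOI (FF D8) and the first EOI (FF D9) after it.
    """
    start = data.find(b"\xff\xd8", 2)   # skip the outer SOI at offset 0
    if start == -1:
        return None
    end = data.find(b"\xff\xd9", start)
    if end == -1:
        return None
    return data[start:end + 2]
```

A real EXIF parser (or `exiftool -b -ThumbnailImage file.jpg`) is more robust, but something like this avoids decoding 80GB of full-size images just to get thumbnails.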

I actually already have a large number of thumbnails in ~/.thumbnails
from a previous installation.  How is the file name hash determined?

~/.thumbnails/normal$ ls | wc -l
2989
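The naming comes from the freedesktop.org Thumbnail Managing Standard: the file name is the MD5 hex digest of the image's canonical file:// URI, plus ".png", under ~/.thumbnails/normal (128px) or ~/.thumbnails/large (256px).  A sketch (assuming plain ASCII paths; special characters would need URI-escaping first):

```python
import hashlib
import os

def thumbnail_path(absolute_path, size="normal"):
    """Per the freedesktop.org thumbnail spec: MD5 hex digest of the
    file's canonical URI, with a .png extension."""
    uri = "file://" + absolute_path
    digest = hashlib.md5(uri.encode("utf-8")).hexdigest()
    return os.path.expanduser("~/.thumbnails/%s/%s.png" % (size, digest))
```

So as long as the files keep the same absolute paths, the old thumbnails should still match up.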


I think I asked before, but is SQLite the only supported database?

-- 
Bill Moseley
moseley hank org


