Re: Beagle support for Liferea 1.4+



(Oops. Missed the mailing list earlier)

Currently, whenever the Liferea cache file is updated, the entire file
is parsed again and all the feeds are re-indexed. The same approach
could be used here: whenever the db file changes, index everything
from scratch.
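
To make that concrete, here is a rough sketch of such a full re-index,
assuming the db sits at ~/.liferea_1.4/liferea.db and has an items
table with item_id/title/source/description columns plus some indexer
hook to push entries into (all of those names are guesses on my part,
not the real schema or API):

    import os
    import sqlite3

    DB_PATH = os.path.expanduser("~/.liferea_1.4/liferea.db")   # assumed location

    def index_everything(indexer):
        # Re-read the whole db and hand every item to the indexer,
        # the same way the old cache-file parser re-read the whole file.
        conn = sqlite3.connect(DB_PATH)
        try:
            for row in conn.execute(
                    "SELECT item_id, title, source, description FROM items"):
                indexer.add(row)          # hypothetical indexer hook
        finally:
            conn.close()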

This could be expensive, or maybe not. SQLite is generally pretty
fast, but adding and removing hundreds (thousands?) of feed entries
might be slow. I am not sure how the current system scales with
lots of feeds.

The tables in the Liferea db might have some suitable datetime field
which could be used to fetch only newly added data.
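
If such a field exists, the incremental pass could be as simple as
this (column names are again guessed, not the real schema):

    def index_changes_since(conn, last_index_time, indexer):
        # Pull only the rows added/updated after the previous indexing run.
        rows = conn.execute(
            "SELECT item_id, title, source, description FROM items"
            " WHERE updated > ?",
            (last_index_time,))
        for row in rows:
            indexer.add(row)

The obvious hole is deletes: a deleted row simply disappears, so a
timestamp query never sees it.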

I can think of one other way to handle updates, but it requires
modifying the sqlite db. During the initial scan, add a sqlite
trigger to the database which copies any new rows into a new table,
NotificationTable (at the beginning of a full re-index, simply clear
this NotificationTable). That way, when we later look for
incremental changes, we can simply fetch the new rows from the
NotificationTable. Deletes can be handled similarly: keep a flag field
in the NotificationTable indicating whether the row is a delete or an
add. The same strategy would work for all db-based backends.
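
A sketch of what I mean, again with made-up table and column names:

    def install_notification_trigger(conn):
        # One-time setup during the initial scan: a shadow table plus
        # triggers recording every add/delete on the (assumed) items table.
        conn.executescript("""
            CREATE TABLE IF NOT EXISTS NotificationTable (
                item_id   INTEGER,
                is_delete INTEGER        -- 0 = added, 1 = deleted
            );
            CREATE TRIGGER IF NOT EXISTS beagle_item_added
            AFTER INSERT ON items
            BEGIN
                INSERT INTO NotificationTable (item_id, is_delete)
                VALUES (new.item_id, 0);
            END;
            CREATE TRIGGER IF NOT EXISTS beagle_item_deleted
            AFTER DELETE ON items
            BEGIN
                INSERT INTO NotificationTable (item_id, is_delete)
                VALUES (old.item_id, 1);
            END;
        """)

    def fetch_and_clear_changes(conn):
        # Incremental pass: read whatever queued up, then clear the table.
        changes = conn.execute(
            "SELECT item_id, is_delete FROM NotificationTable").fetchall()
        conn.execute("DELETE FROM NotificationTable")
        conn.commit()
        return changes

The triggers would be installed once during the initial scan; every
incremental pass then calls fetch_and_clear_changes() and looks up the
full rows for the added item_ids in Liferea's own tables.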

The above will not interfere with the usual functioning of Liferea.
The only thing I don't like about the scheme is that it modifies
user data.

- dBera

PS: FYI, there is a bugzilla entry for this bug. I don't have the
number right now, but feel free to add comments to that bug too.

-- 
-----------------------------------------------------
Debajyoti Bera @ http://dtecht.blogspot.com
beagle / KDE fan
Mandriva / Inspiron-1100 user


