Re: [Tracker] Large sparql query results



On Thu, 2010-03-04 at 22:07 -0500, Spivak, Max wrote:

Hi Max,

> I have a question about large query results from tracker-store. In my
> experience with SQL databases and their client APIs, when query
> results are generated, only a subset is initially transferred to the
> client. The rest are loaded on demand as the client iterates over the
> results.

This is usually only the case when the client requests iteration over
the result set using a so-called cursor.

Usually those cursors are also one-direction only (you can't go back).

Because SQLite isn't MVCC, this would mean that for the whole duration
that a client keeps such a cursor open, all other database connections
are more or less blocked from doing anything.

This is why we don't support a cursor API.

You can emulate a cursor API with LIMIT and OFFSET, of course.
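For what it's worth, that emulation is just a loop that bumps OFFSET by
the page size until a query returns fewer rows than LIMIT. Here is a
rough sketch in C, assuming the synchronous
tracker_resources_sparql_query() call from libtracker-client and its
GPtrArray-of-rows result; the header path, the exact names and
nie:InformationElement as the queried class are from memory here, so
check them against your Tracker version:

#include <libtracker-client/tracker.h>

#define PAGE_SIZE 500

static void
fetch_all_urns (TrackerClient *client)
{
        guint offset = 0;

        for (;;) {
                gchar *query;
                GPtrArray *results;
                GError *error = NULL;
                guint i, n_rows;

                /* Ask for one page of results at a time */
                query = g_strdup_printf ("SELECT ?u WHERE { ?u a nie:InformationElement } "
                                         "ORDER BY ?u OFFSET %u LIMIT %u",
                                         offset, PAGE_SIZE);

                results = tracker_resources_sparql_query (client, query, &error);
                g_free (query);

                if (error) {
                        g_warning ("Query failed: %s", error->message);
                        g_error_free (error);
                        return;
                }

                if (!results)
                        break;

                n_rows = results->len;

                for (i = 0; i < n_rows; i++) {
                        gchar **row = g_ptr_array_index (results, i);
                        g_print ("%s\n", row[0]);
                        g_strfreev (row);
                }
                g_ptr_array_free (results, TRUE);

                /* A short page means we've reached the end */
                if (n_rows < PAGE_SIZE)
                        break;

                offset += PAGE_SIZE;
        }
}

Note the ORDER BY: without a stable ordering, consecutive OFFSET
windows aren't guaranteed to line up, and rows can be skipped or
repeated if the store changes between pages.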

> How does libtracker-client handle this? What if the result set is too
> big to fit in client memory? Is there any sort of incremental
> result returning ability or callback support with partial results?

At this moment we don't provide this ourselves in libtracker-client.

LibQtTracker has a "streaming model" that will do this for you on a
QAbstractDataModel, as far as I know.

http://maemo.gitorious.org/maemo-af/libqttracker


Cheers,

Philip

-- 
Philip Van Hoof, freelance software developer
home: me at pvanhoof dot be 
gnome: pvanhoof at gnome dot org 
http://pvanhoof.be/blog
http://codeminded.be



