On Thu, 2014-09-04 at 07:07 +0200, Milan Crha wrote:
> Hi, I see. Do you compute the hash on your own, or is it provided by the OAB?

I'm computing it on my own. I'm just using the SHA1 of the binary record in the OAB file, since that's easy enough to calculate, then sticking it in the 'bdata' field in the sqlite.

The update process in my tree now goes as follows:

 - Build a GHashTable of uid -> bdata (i.e. SHA1) for each entry in the addressbook¹.
 - For each item in the new OAB:
   - Look up the UID in the hash table to find the old SHA1 if it exists, and *remove* the UID from the hash table.
   - If the record existed and the SHA1 matches the record now being processed, do nothing.
   - If the SHA1 doesn't match or the record didn't previously exist, insert it into the database.
 - When all records are processed, any UID *left* in the GHashTable is for an entry that no longer exists. Delete them.

(There's a rough sketch of that loop at the end of this mail.)

This works fairly nicely, although there are some optimisations I can still do.

Firstly, I'd like a way to 'SELECT uid,bdata from folder_id'. Getting the initial list of UIDs to build my hash table takes about 95ms; getting the bdata for each one individually after that takes another 7½ seconds. Tristan? (A sketch of the query I mean is at the end too.)

(I did try it with the e_book_sqlite_get_extra_data() call in the OAB processing loop instead of storing the values in the GHashTable in advance. That was actually slower; the overall update time for a null update went from 33 seconds to 38 seconds.)

The majority of the time is still taken in the OAB binary processing. We currently generate the full EContact for every record, *then* pass it back up to do the SHA1 comparison, and for most records we end up throwing it away unused. I should fix it to do the SHA1 comparison first.
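To make the shape of that update loop concrete, here's a rough sketch. It assumes the raw OAB records come through as (uid, data, length) triples, and load_existing_uid_to_sha1(), store_record() and remove_uid() are hypothetical placeholders for the real sqlite operations, not actual EBookSqlite function names:

#include <glib.h>

typedef struct {
	const gchar *uid;
	const guchar *data;	/* raw binary record from the OAB file */
	gsize len;
} OabRecord;

/* Hypothetical helpers, standing in for the real sqlite operations. */
extern GHashTable *load_existing_uid_to_sha1 (void);	/* uid -> bdata (SHA1) */
extern void store_record (const OabRecord *rec, const gchar *sha1);
extern void remove_uid (const gchar *uid);

static void
apply_oab_update (const OabRecord *records, guint n_records)
{
	/* uid -> SHA1 of each record as it currently exists in the addressbook */
	GHashTable *existing = load_existing_uid_to_sha1 ();
	GHashTableIter iter;
	gpointer uid, unused;
	guint i;

	for (i = 0; i < n_records; i++) {
		const OabRecord *rec = &records[i];
		gchar *sha1 = g_compute_checksum_for_data (G_CHECKSUM_SHA1,
							   rec->data, rec->len);
		const gchar *old = g_hash_table_lookup (existing, rec->uid);
		gboolean unchanged = (old != NULL && g_strcmp0 (old, sha1) == 0);

		/* Whatever happens, this UID still exists, so it must not
		 * be deleted at the end. */
		g_hash_table_remove (existing, rec->uid);

		if (!unchanged)
			store_record (rec, sha1);	/* new or modified record */

		g_free (sha1);
	}

	/* Anything still left in the table wasn't in the new OAB at all. */
	g_hash_table_iter_init (&iter, existing);
	while (g_hash_table_iter_next (&iter, &uid, &unused))
		remove_uid (uid);

	g_hash_table_destroy (existing);
}

And this is the kind of bulk uid/bdata fetch I'm asking for, sketched straight against sqlite with the table and column names from the query above. It obviously wants to be exposed through the proper API rather than poking at the table directly; it's only here to show the shape of it:

#include <sqlite3.h>
#include <glib.h>

static GHashTable *
load_uid_to_bdata (sqlite3 *db)
{
	GHashTable *table = g_hash_table_new_full (g_str_hash, g_str_equal,
						   g_free, g_free);
	sqlite3_stmt *stmt = NULL;

	if (sqlite3_prepare_v2 (db, "SELECT uid, bdata FROM folder_id",
				-1, &stmt, NULL) != SQLITE_OK)
		return table;

	while (sqlite3_step (stmt) == SQLITE_ROW) {
		const gchar *uid = (const gchar *) sqlite3_column_text (stmt, 0);
		const gchar *bdata = (const gchar *) sqlite3_column_text (stmt, 1);

		if (uid != NULL)
			g_hash_table_insert (table, g_strdup (uid),
					     g_strdup (bdata != NULL ? bdata : ""));
	}

	sqlite3_finalize (stmt);
	return table;
}

-- 
dwmw2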