Re: [xml] Question about memory constraints when using SAX parser

On Fri, Nov 19, 2004 at 06:59:20PM +0200, Veiko Sinivee wrote:

I'm writing a digital signature and XML encryption library, and I want
to support encryption and decryption of very large files too. Encryption
works fine, because I read, encrypt, base64-encode and write the
data to the file in blocks of 8 KB or so. For decryption I chose the
SAX interface, because it is probably the fastest and needs the least
amount of memory. Now I've found that the characters() SAX callback
is still invoked with huge amounts of data. Earlier versions of libxml2
used to deliver the data in blocks of x KB. Now, if I have something like:

700 MB of base64 encoded data here

then the parser attempts to collect all of that data and deliver it
to my callback function in a single event. This obviously doesn't work.
How can I turn this off and tell the parser to deliver at most x KB
per event?

  It completely depends on how you invoke the parser!
If you mmap the 700 MB file and no modification needs to be done
to the characters passed in the callback, you will get pointers
directly into the mmap'ed area, without any allocation.

  In a nutshell, your report as it stands cannot be answered.


Daniel Veillard      | Red Hat Desktop team
veillard redhat com  | libxml GNOME XML XSLT toolkit | Rpmfind RPM search engine
