Re: [xml] How to parse a file too large



On Mon, Apr 05, 2010 at 11:30:08AM -0300, Fábio Bertinatto wrote:
Hello friends!

I'm using the libxml2 library to parse a 15,000-line file.

After parsing the entire file I freed the memory and used the
malloc_trim(0) function to return the memory to the kernel (as described on
the xmlsoft website).

The problem is that every time I parse the file the process's memory usage
grows by 200 kB, and I need to parse the file several times.
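
For reference, the cycle described above boils down to something like the
minimal sketch below. This is only an illustration of the described pattern,
not the poster's actual code: the file name and the parse_once() helper are
made up, and malloc_trim() is a glibc extension declared in <malloc.h>.

  #include <libxml/parser.h>
  #include <malloc.h>               /* malloc_trim() is glibc-specific */

  /* Illustrative only: parse the file into a tree, use it, free it. */
  static int parse_once(const char *path)
  {
      xmlDocPtr doc = xmlReadFile(path, NULL, 0);
      if (doc == NULL)
          return -1;

      /* ... walk or query the document tree here ... */

      xmlFreeDoc(doc);              /* release the whole tree */
      return 0;
  }

  int main(void)
  {
      int i;

      for (i = 0; i < 100; i++) {
          parse_once("dba100000.xml");
          malloc_trim(0);           /* ask glibc to return free heap pages to the kernel */
      }
      xmlCleanupParser();           /* once only, at process exit */
      return 0;
  }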

  paphio:~/XML -> wc dba100000.xml
    800004  1000004 14282040 dba100000.xml
  paphio:~/XML -> xmllint --repeat --noout dba100000.xml

  and looking in top one can see the memory used doesn't increase.

  at the end of the run, since it's built with the memory debug code:

  paphio:~/XML -> cat .memdump
  02:07:03 PM

  MEMORY ALLOCATED : 0, MAX was 175216622
  BLOCK  NUMBER   SIZE  TYPE
  paphio:~/XML ->

No leak; the maximum memory used is the same as for a single run.
Either you didn't properly free the tree, or you're unable to get a
precise memory-usage reading from your libc memory allocator, but libxml2
does free everything.
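
If the libc numbers remain ambiguous, one way to double-check on the
application side is to ask libxml2's own allocator for its accounting.
The sketch below is only an illustration and assumes a libxml2 configured
with memory debugging (the same build feature that produces the .memdump
above); without it the debug counters are not maintained.

  #include <stdio.h>
  #include <libxml/parser.h>
  #include <libxml/xmlmemory.h>

  int main(void)
  {
      /* Assumes libxml2 was configured with memory debugging enabled. */
      xmlDocPtr doc = xmlReadFile("dba100000.xml", NULL, 0);
      if (doc != NULL)
          xmlFreeDoc(doc);

      xmlCleanupParser();           /* release parser globals before measuring */
      printf("libxml2 still holds %d bytes in %d blocks\n",
             xmlMemUsed(), xmlMemBlocks());
      xmlMemoryDump();              /* writes a .memdump file like the one above */
      return 0;
  }

If both counts are zero after xmlFreeDoc() and xmlCleanupParser(), any growth
seen in top comes from the libc allocator's heap management, not from libxml2.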

Daniel

-- 
Daniel Veillard      | libxml Gnome XML XSLT toolkit  http://xmlsoft.org/
daniel veillard com  | Rpmfind RPM search engine http://rpmfind.net/
http://veillard.com/ | virtualization library  http://libvirt.org/


