Re: [xml] How to parse a file too large



Daniel, thank you for your reply.

You are right.

I don't understand why this is happening, since I freed all the memory...

I know this is not cool, but could I send my code to see if anyone can spot a clue?
I've been working on this code for weeks...

Best Regards,

Fábio



2010/4/6 Daniel Veillard <veillard redhat com>
On Mon, Apr 05, 2010 at 11:30:08AM -0300, Fábio Bertinatto wrote:
> Hello friends!
>
> I'm using the libxml2 library to parse a 15,000-line file.
>
> After parsing the entire file I freed the memory and called the
> malloc_trim(0) function to return the memory to the kernel (as described on
> the xmlsoft website).
>
> The problem is that every time I parse the file the process grows by 200 kB
> of memory, and I need to parse the file several times.

 paphio:~/XML -> wc dba100000.xml
   800004  1000004 14282040 dba100000.xml
 paphio:~/XML -> xmllint --repeat --noout dba100000.xml

 and looking in top one can see the memory used doesn't increase.

 at the end of the run, since it's built with the memory debug code:

 paphio:~/XML -> cat .memdump
 02:07:03 PM

 MEMORY ALLOCATED : 0, MAX was 175216622
 BLOCK  NUMBER   SIZE  TYPE
 paphio:~/XML ->
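
 As a side note, when libxml2 is built with that memory debug code (the
 --with-mem-debug configure option), an application can produce the same
 kind of report itself. A small sketch, assuming such a build and reusing
 the test file above:

   #include <stdio.h>
   #include <libxml/parser.h>
   #include <libxml/xmlmemory.h>

   int main(void)
   {
       xmlDocPtr doc;

       xmlInitParser();

       doc = xmlReadFile("dba100000.xml", NULL, 0);
       if (doc != NULL)
           xmlFreeDoc(doc);

       xmlCleanupParser();

       /* non-zero values here would point at blocks libxml2 still holds */
       printf("still allocated: %d bytes in %d blocks\n",
              xmlMemUsed(), xmlMemBlocks());
       xmlMemoryDump();    /* writes the .memdump file shown above */
       return 0;
   }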

No leak; the maximum memory used is the same as for a single instance.
Either you didn't properly free the tree, or you're unable to get a
precise memory-usage figure from your libc memory allocator, but libxml2
does free everything.
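
For reference, a minimal sketch of the kind of parse-and-free loop being
discussed; the filename, loop count and tree walk are placeholders, not
Fábio's actual code. The tree is released with xmlFreeDoc() after every
pass, xmlCleanupParser() runs once per process, and malloc_trim(0) is the
glibc call mentioned in the original message:

   #include <stdio.h>
   #include <libxml/parser.h>
   #include <malloc.h>          /* malloc_trim() is glibc-specific */

   int main(void)
   {
       int i;

       xmlInitParser();

       for (i = 0; i < 100; i++) {
           xmlDocPtr doc = xmlReadFile("dba100000.xml", NULL, 0);

           if (doc == NULL) {
               fprintf(stderr, "parse failed on pass %d\n", i);
               break;
           }
           /* ... walk or query the tree here ... */
           xmlFreeDoc(doc);     /* frees the whole tree */
       }

       xmlCleanupParser();      /* global cleanup, once per process */
       malloc_trim(0);          /* ask glibc to return free pages to the kernel */
       return 0;
   }

With a loop shaped like this, the resident size reported by top should stay
flat across iterations; any remaining growth is more likely heap
fragmentation in the allocator than a libxml2 leak.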

Daniel

--
Daniel Veillard      | libxml Gnome XML XSLT toolkit  http://xmlsoft.org/
daniel veillard com  | Rpmfind RPM search engine http://rpmfind.net/
http://veillard.com/ | virtualization library  http://libvirt.org/



--


Kind regards,


Fábio J. Bertinatto
NetEye Tecnologia
E-mail: bertinatto neteye com br
Phone: (51) 3590-8637
Site: www.neteye.com.br

