Re: [xml] Memory not going away..

On Tue, Mar 08, 2005 at 05:32:20PM -0500, David W. Bauer Jr. wrote:
> Yes, but then I should be able to get that memory back in future calls to
> (m/c)alloc.  The real problem I am fighting with here is that my XML files
> are large (10^8 bytes), and I am unable to parse the entire file with
> xmlParseFile _and_ create my data structures.  I realize that I should be
> using some form of reader, but all I really need is for the xmlDoc to
> release the memory on calls to xmlFreeDoc, which it is still not doing
> after all this time.

   it *does*! Run it under a debugger and put a breakpoint in your free()
routine; you will see that it is called :-(

> I wrote a test script which tests the basic behaviour, and the doc is

   The testing methodology is flawed: the libc memory allocator may simply
compact the heap and release the extra pages because you added more calls
to malloc() or free(), not because anything in libxml2 changed.

> freed, but when I carry over the solution to my code, the memory will not
> go away.  The .memdump IS listing items now, and I am struggling to figure
> out why these objects are not being removed.

> Coincidentally, if I have a system with 3 GB of RAM, and I allocate and free
> 3 GB of RAM in my process but then do not exit, any other processes would
> be forced into swap waiting for my process to complete and free up
> resources (namely, the 3 GB of RAM).  In Linux, calls to free() in the
> process return the memory to the OS.

   not instantly; that part is wrong, you don't get a syscall per call to
free(). There is something else going on, I can't tell what, and I suspect
that your testing methodology, based on OS-level lookups, does not match
what happens at the libc interface.


Daniel Veillard      | Red Hat Desktop team
veillard redhat com  | libxml GNOME XML XSLT toolkit | Rpmfind RPM search engine
