Re: [xml] stopping parsing

On Fri, Oct 31, 2003 at 10:35:35AM +0100, Bjorn Reese wrote:
On Fri, 2003-10-31 at 00:01, Daniel Veillard wrote:

  Hum, the problem is that the parser drives the data consumption,
and unless we add checks for the condition set by xmlStopParser() in
a lot of places, it's hard to make an instantaneous stop.

Would longjmp in the SAX callbacks do the trick, or would that
leave the parser in an inconsistent state (risking memory leaks
and worse)?

  Hard to tell without testing. I love setjmp/longjmp for historical
reasons, but I'm not sure I want to rely on them for the parser.
In the pre-2.6.x era memory leaks would have been nearly guaranteed with
this approach. Now that we use a dictionary attached to the parser
context it is less obvious, but I still think there are serious risks:
for example, attribute values sometimes need to be duplicated as new
strings, so if the client code does a longjmp from the startElement()
callback back to a state saved in xmlParseDocument(), then yes, I'm
afraid this could leak. Similarly, when element content is passed down
and needs to be copied, longjmp'ing from the characters() callback
might leak too.
  Really, I think staying within the structured processing is simply safer.


Daniel Veillard      | Red Hat Network
veillard redhat com  | libxml GNOME XML XSLT toolkit | Rpmfind RPM search engine
