[xml] xmlreader and chunked parsing?
- From: Nick Kew <nick webthing com>
- To: <xml gnome org>
- Subject: [xml] xmlreader and chunked parsing?
- Date: Sat, 1 Nov 2003 08:25:59 +0000 (GMT)
I regularly use the SAX API with chunked parsing ([x|ht]mlParseChunk
and family), as this is ideal for a pipelined processing environment
where data are naturally available in chunks.
I've had a brief look at xmlreader with a view to using it as
an alternative, but I haven't found anything similar. In principle
I could use it with something like:
while ( ! end ) {
    status = [ process something in xmlreader ]
    switch ( status ) {
        case OK: [ process and continue ]
        case Out of Data:
            [ if the parser internally remains in a consistent
              state then we can feed it another chunk and continue ]
        default: [ handle error ]
    }
}
The crucial question is: can I catch out-of-data whilst preserving
internal parser state, and without significant overhead? Is this
realistic, or would I be wasting my time trying?
--
Nick Kew