[xml] xmlreader and chunked parsing?

I regularly use the SAX API with chunked parsing (xmlParseChunk,
htmlParseChunk and friends), as this is ideal for a pipelined
processing environment where data are naturally available in chunks.

I've had a brief look at xmlreader with a view to considering it as
an alternative, but I haven't found anything similar.  In principle
I could use it with something like

while ( !end ) {
  status = [ process something in xmlreader ]
  switch ( status ) {
    case OK:
        [ process and continue ]
    case OUT_OF_DATA:
        [ if the parser internally remains in a consistent
          state, we can feed it another chunk and continue ]
    default:
        [ handle error ]
  }
}

The crucial question is: can I catch out-of-data whilst preserving
internal parser state, and without significant overhead?  Is this
realistic, or would I be wasting my time trying?

Nick Kew
