"Re: [xml] xmlParseChunk with UTF-16LE"



Hi,

On 12/1/2003 1:01 PM Daniel Veillard wrote:

On Mon, Dec 01, 2003 at 12:34:24PM +0100, Kasimier Buchcik wrote:

I seem to be confused here. How can I pass the real encoding (UTF-16LE) 
at parser creation time?


  With xmlCtxtResetPush() for example.

I forgot to mention that we are using libxml2 version 2.5.10 at this 
time. We cannot switch to the current version yet, and 
"xmlCtxtResetPush" does not seem to exist in that version.
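For reference, a minimal sketch of what the xmlCtxtResetPush() route
looks like on later libxml2 releases where the function is available;
the wrapper function name and the error handling here are only
illustrative, not something from this thread:

    #include <libxml/parser.h>

    static xmlDocPtr
    parse_utf16le_push(const char *buf, int len)
    {
        xmlDocPtr doc = NULL;

        /* Create a push parser context with no initial data. */
        xmlParserCtxtPtr ctxt = xmlCreatePushParserCtxt(NULL, NULL, NULL, 0, NULL);
        if (ctxt == NULL)
            return NULL;

        /* Reset it and declare the real encoding up front. */
        xmlCtxtResetPush(ctxt, NULL, 0, NULL, "UTF-16LE");

        /* Feed all the data and terminate. */
        xmlParseChunk(ctxt, buf, len, 1);

        if (ctxt->wellFormed)
            doc = ctxt->myDoc;
        else if (ctxt->myDoc != NULL)
            xmlFreeDoc(ctxt->myDoc);
        xmlFreeParserCtxt(ctxt);
        return doc;
    }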


Finally: I don't necessarily have to use the push parser; I only used 
it as an easy way to fake the first 4 bytes of a UTF-16 encoded entity, 
which libxml2 autodetects. If I could set the "real" encoding directly, 
I would prefer to use "xmlCreateMemoryParserCtxt" and "xmlParseDocument".
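One reading of that push-parser trick, as a sketch: hand the first 4
bytes (a BOM, or the start of the XML declaration, in UTF-16LE) to
xmlCreatePushParserCtxt() so libxml2's encoding autodetection sees
them, then push the rest. The helper name below is illustrative, and
this assumes the buffer really starts with such detectable bytes:

    #include <libxml/parser.h>

    static xmlDocPtr
    parse_utf16le_autodetect(const char *buf, int len)
    {
        xmlDocPtr doc = NULL;

        if (len < 4)
            return NULL;

        /* Autodetection runs on the initial chunk of 4 bytes. */
        xmlParserCtxtPtr ctxt = xmlCreatePushParserCtxt(NULL, NULL, buf, 4, NULL);
        if (ctxt == NULL)
            return NULL;

        /* Push the remainder and terminate. */
        xmlParseChunk(ctxt, buf + 4, len - 4, 1);

        if (ctxt->wellFormed)
            doc = ctxt->myDoc;
        else if (ctxt->myDoc != NULL)
            xmlFreeDoc(ctxt->myDoc);
        xmlFreeParserCtxt(ctxt);
        return doc;
    }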


  Then there are a *lot* of parser creation interfaces taking an
encoding declaration, why don't you use one of them ?!? ALL the
xmlReadxxx APIs take a const char *encoding argument.
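A minimal sketch of that xmlReadxxx route, using xmlReadMemory; the
wrapper name and the placeholder URL are illustrative:

    #include <libxml/parser.h>

    static xmlDocPtr
    read_utf16le(const char *buf, int len)
    {
        /* The URL argument is only used for error reporting;
         * 0 selects the default parser options. */
        return xmlReadMemory(buf, len, "in-memory.xml", "UTF-16LE", 0);
    }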

Well, we did not want to use the xmlReadxxx API since it is slower 
(isn't it still?). Could you think of a way to accomplish this with 
version 2.5.10?
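(Not an answer from this thread: just an untested sketch of one way
this might be attempted against older releases, using xmlSwitchEncoding()
from <libxml/parserInternals.h> to force UTF-16LE before parsing; the
helper name is illustrative and whether this behaves correctly on
2.5.10 is an open assumption.)

    #include <libxml/parser.h>
    #include <libxml/parserInternals.h>
    #include <libxml/encoding.h>

    static xmlDocPtr
    parse_utf16le_2_5_10(const char *buf, int len)
    {
        xmlDocPtr doc = NULL;

        xmlParserCtxtPtr ctxt = xmlCreateMemoryParserCtxt(buf, len);
        if (ctxt == NULL)
            return NULL;

        /* Force the input encoding before parsing starts. */
        xmlSwitchEncoding(ctxt, XML_CHAR_ENCODING_UTF16LE);

        xmlParseDocument(ctxt);

        if (ctxt->wellFormed)
            doc = ctxt->myDoc;
        else if (ctxt->myDoc != NULL)
            xmlFreeDoc(ctxt->myDoc);
        xmlFreeParserCtxt(ctxt);
        return doc;
    }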

Thanks,

Kasimier



