"Re: [xml] xmlParseChunk with UTF-16LE"



Hi,

On 12/1/2003 1:20 PM Kasimier Buchcik wrote:
On 12/1/2003 1:01 PM Daniel Veillard wrote:
On Mon, Dec 01, 2003 at 12:34:24PM +0100, Kasimier Buchcik wrote:

I seem to be confused here. How can I pass the real encoding (UTF-16LE) 
at parser creation time?

 With xmlCtxtResetPush() for example.
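
If I understand the suggestion correctly, it would look roughly like this (an untested sketch; "buf", "len" and the file name are made up, and xmlCtxtResetPush() is not available in the 2.5.10 release mentioned below):

#include <libxml/parser.h>

/* Sketch only: "buf"/"len" stand for a UTF-16LE document already in
 * memory; the exact call sequence is an assumption, not taken from
 * this thread. */
xmlDocPtr
parse_utf16le(const char *buf, int len)
{
    xmlParserCtxtPtr ctxt;
    xmlDocPtr doc = NULL;

    /* Create a push parser context without feeding any data yet. */
    ctxt = xmlCreatePushParserCtxt(NULL, NULL, NULL, 0, "noname.xml");
    if (ctxt == NULL)
        return NULL;

    /* Reset the context and hand over the real encoding explicitly,
     * overriding whatever the document itself declares. */
    if (xmlCtxtResetPush(ctxt, NULL, 0, "noname.xml", "UTF-16LE") != 0) {
        xmlFreeParserCtxt(ctxt);
        return NULL;
    }

    /* Push the whole document and terminate the parse. */
    xmlParseChunk(ctxt, buf, len, 1);

    if (ctxt->wellFormed)
        doc = ctxt->myDoc;
    else if (ctxt->myDoc != NULL)
        xmlFreeDoc(ctxt->myDoc);
    xmlFreeParserCtxt(ctxt);
    return doc;
}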

I forgot to write that we are using libxml2 version 2.5.10 at this time. We 
cannot switch to the current version yet, and "xmlCtxtResetPush" does not 
seem to exist in that version.

Lastly: I don't necessarily have to use the push parser; I only used it as 
an easy way to fake the first 4 bytes of a UTF-16-encoded entity, which 
libxml2 autodetects. If I could set the "real" encoding, I would prefer to 
use "xmlCreateMemoryParserCtxt" and "xmlParseDocument".
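
In rough terms, that trick amounts to something like this (an untested, illustrative sketch with made-up names, assuming the first 4 bytes of the data drive libxml2's encoding autodetection):

#include <libxml/parser.h>

/* Hand the first 4 bytes to xmlCreatePushParserCtxt() so libxml2
 * autodetects UTF-16LE, then push the remainder of the buffer. */
xmlDocPtr
parse_utf16le_push(const char *buf, int len)
{
    xmlParserCtxtPtr ctxt;
    xmlDocPtr doc = NULL;

    if (len < 4)
        return NULL;

    /* The first 4 bytes trigger the encoding autodetection. */
    ctxt = xmlCreatePushParserCtxt(NULL, NULL, buf, 4, "noname.xml");
    if (ctxt == NULL)
        return NULL;

    /* Feed the rest of the document and terminate the parse. */
    xmlParseChunk(ctxt, buf + 4, len - 4, 1);

    if (ctxt->wellFormed)
        doc = ctxt->myDoc;
    else if (ctxt->myDoc != NULL)
        xmlFreeDoc(ctxt->myDoc);
    xmlFreeParserCtxt(ctxt);
    return doc;
}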

 Then there are a *lot* of parser creation interfaces that take an 
encoding declaration; why don't you use one of them?!? ALL the 
xmlReadxxx APIs take a const char *encoding argument.
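
For reference, such a call would look roughly like this (assuming a libxml2 release that already has the xmlReadxxx API, i.e. newer than the 2.5.10 discussed here; buffer name and length are placeholders):

#include <libxml/parser.h>

/* "buf"/"len" are placeholders for the UTF-16LE document in memory;
 * the encoding argument tells the parser the real encoding up front. */
xmlDocPtr
read_utf16le(const char *buf, int len)
{
    return xmlReadMemory(buf, len, "noname.xml", "UTF-16LE", 0);
}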

Well, we did not want to use the xmlReadxxx API since it is slower 
(isn't it still?). Can you think of a way to accomplish this with 
version 2.5.10?
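
To make the question concrete: the kind of workaround one might try against 2.5.10 is something along these lines, using xmlSwitchEncoding() on a memory parser context. This is purely an untested sketch with made-up names, and whether it copes correctly with a declared encoding that differs from the real one is exactly what is unclear:

#include <libxml/parser.h>
#include <libxml/parserInternals.h>
#include <libxml/encoding.h>

/* Untested sketch: force the known encoding on the context before
 * parsing, instead of relying on autodetection or the XML declaration. */
xmlDocPtr
parse_utf16le_2_5(const char *buf, int len)
{
    xmlParserCtxtPtr ctxt;
    xmlDocPtr doc = NULL;

    ctxt = xmlCreateMemoryParserCtxt(buf, len);
    if (ctxt == NULL)
        return NULL;

    /* Override the autodetected/declared encoding with the known one. */
    xmlSwitchEncoding(ctxt, XML_CHAR_ENCODING_UTF16LE);

    xmlParseDocument(ctxt);

    if (ctxt->wellFormed)
        doc = ctxt->myDoc;
    else if (ctxt->myDoc != NULL)
        xmlFreeDoc(ctxt->myDoc);
    xmlFreeParserCtxt(ctxt);
    return doc;
}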

Sorry for asking once again, but since there was no response to my last 
mail, I assume that there is no way, with version 2.5.10, to correctly 
parse an XML document whose actual encoding diverges from the declared 
encoding.

I need this to be confirmed. Note that this is not a big problem, but I 
just need to know whether I have to freeze this issue on my side until we 
can move to a more up-to-date version of libxml2.

Thanks,

Kasimier



