Re: "Re: [xml] xmlParseChunk with UTF-16LE fails on special occasion"
- From: Daniel Veillard <veillard@redhat.com>
- To: Kasimier Buchcik <kbuchcik@4commerce.de>
- Cc: xml@gnome.org
- Subject: Re: "Re: [xml] xmlParseChunk with UTF-16LE fails on special occasion"
- Date: Mon, 1 Dec 2003 07:01:06 -0500
On Mon, Dec 01, 2003 at 12:34:24PM +0100, Kasimier Buchcik wrote:
> I seem to be confused here. How can I pass the real encoding (UTF-16LE)
> at parser creation time?
With xmlCtxtResetPush() for example.
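A minimal sketch of that route, assuming the UTF-16LE document sits in a
buffer buf of length len (the helper name and the dummy filename are
placeholders, not part of the discussion):

    #include <libxml/parser.h>

    /* Push-parse a UTF-16LE buffer, declaring the encoding explicitly
     * instead of relying on autodetection of the first bytes. */
    int parse_utf16le_chunk(const char *buf, int len)
    {
        xmlParserCtxtPtr ctxt;
        int ret;

        ctxt = xmlCreatePushParserCtxt(NULL, NULL, NULL, 0, "noname.xml");
        if (ctxt == NULL)
            return -1;

        /* Reset the context and pass the real encoding up front. */
        if (xmlCtxtResetPush(ctxt, NULL, 0, "noname.xml", "UTF-16LE") != 0) {
            xmlFreeParserCtxt(ctxt);
            return -1;
        }

        xmlParseChunk(ctxt, buf, len, 0);
        ret = xmlParseChunk(ctxt, NULL, 0, 1);  /* terminate */

        if (ctxt->myDoc != NULL)
            xmlFreeDoc(ctxt->myDoc);
        xmlFreeParserCtxt(ctxt);
        return ret;
    }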
> Lastly: I don't necessarily have to use the push parser; I only used it
> to easily fake the first 4 bytes of a UTF-16 encoded entity, which are
> autodetected by libxml2. If I could set the "real" encoding, I would
> prefer to use xmlCreateMemoryParserCtxt and xmlParseDocument.
Then there are a *lot* of parser creation interfaces taking an
encoding declaration; why don't you use one of them?!? ALL the
xmlReadxxx APIs take a const char *encoding argument.
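A minimal sketch of that route, again assuming the document is in a
buffer buf of length len; the dummy URL is a placeholder:

    #include <libxml/parser.h>

    /* Parse an in-memory UTF-16LE document; xmlReadMemory() takes the
     * encoding name directly, so no byte faking is needed. */
    xmlDocPtr read_utf16le_doc(const char *buf, int len)
    {
        return xmlReadMemory(buf, len, "noname.xml", "UTF-16LE", 0);
    }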
Daniel
--
Daniel Veillard | Red Hat Network https://rhn.redhat.com/
veillard@redhat.com | libxml GNOME XML XSLT toolkit http://xmlsoft.org/
http://veillard.com/ | Rpmfind RPM search engine http://rpmfind.net/