[xml] Re: 'Re: "Control over encoding declaration (prolog and meta)'
- From: Igor Zlatkovic <igor zlatkovic com>
- To: Kasimier Buchcik <kbuchcik 4commerce de>, xml gnome org
- Subject: [xml] Re: 'Re: "Control over encoding declaration (prolog and meta)'
- Date: Thu, 15 Jan 2004 15:57:06 +0100
Kasimier Buchcik wrote:
> Ok, this issue is DOM 3 related. As you might remember, I'm still
> struggling with "to DOMString serialization" and "from DOMString
> parsing", which always has to be UTF-16 encoded, regardless of the
> content; so if I have e.g. an ISO-8859-1 document, I still need it to be
> serialized to UTF-16, but it still *has to* contain an encoding
> declaration of ISO-8859-1. It sounds like no big deal, but if I don't
> have control over both the target encoding and the declared encoding, I
> can't fulfill the requirements of the DOM 3 spec.
You want to put the serialised data in a DOMString? Having a document as
it would be on permanent storage, verbatim, in a DOMString? Why in
the name of God would you ever want to do this? DOM is there to access
parsed data, not serialised data, no?
> Finally, I must admit that there would be a workaround for me: I could
> serialize with the existing API, then encode to UTF-16LE. But since we
> are using quite huge documents, I guess it will not be acceptable in
> terms of performance and seems rather stupid.
You have huge documents and want to put them in a DOMString, unparsed. A
DOMString is an object instance which exists in the computer's RAM, no? If
those documents fit in RAM, then how huge can they be?
I don't see a problem with post-recoding here; putting serialised data in
a DOMString instance sounds like a much worse idea. What are you trying to
do? Perhaps there is a better solution.
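
For illustration, the post-recoding could look roughly like this -- a
minimal sketch assuming libxml2's xmlDocDumpMemoryEnc() and POSIX iconv(),
with error handling trimmed to the bare minimum:

#include <stdlib.h>
#include <iconv.h>
#include <libxml/tree.h>

/* Serialize doc with an ISO-8859-1 declaration, then recode the whole
 * byte stream to UTF-16LE in a second pass. Returns a malloc'ed buffer
 * and sets *out_len, or returns NULL on failure. */
static char *dump_as_utf16le(xmlDocPtr doc, size_t *out_len)
{
    xmlChar *mem = NULL;
    int size = 0;

    /* First pass: the dump keeps encoding="ISO-8859-1" in its declaration. */
    xmlDocDumpMemoryEnc(doc, &mem, &size, "ISO-8859-1");
    if (mem == NULL)
        return NULL;

    /* Second pass: every ISO-8859-1 character becomes exactly two bytes
     * in UTF-16LE, so 2 * size is a sufficient output buffer. */
    size_t in_left = (size_t) size;
    size_t out_left = 2 * (size_t) size;
    char *out = malloc(out_left);
    char *in_ptr = (char *) mem;
    char *out_ptr = out;

    if (out == NULL) {
        xmlFree(mem);
        return NULL;
    }

    iconv_t cd = iconv_open("UTF-16LE", "ISO-8859-1");
    if (cd == (iconv_t) -1) {
        xmlFree(mem);
        free(out);
        return NULL;
    }

    iconv(cd, &in_ptr, &in_left, &out_ptr, &out_left);
    iconv_close(cd);
    xmlFree(mem);

    *out_len = (size_t) (out_ptr - out);
    return out;
}

The re-encoding pass is a straight byte-stream copy at worst twice the
document size, so for documents that already fit in RAM it should hardly
be the bottleneck.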
Ciao,
Igor