Re: [xslt] xsltproc memory consumption w/ large DTD / docbook

On Sun, 2 Feb 2003, Sebastian Rahtz wrote:

> >   DTDs won't go away soon, especially since the various flavors of
> > schemas don't handle the entity functionality at this point. We can
> > hope for a better future, but one still needs to fix the problems
> > arising right now :-)
> The problem being that each document load involves a DTD load,
> and the user is running out of memory? Pre-process the files
> with a processor like rxp, which would expand all the entities
> and throw away the DTD references. Build a complete parallel
> tree of files and generate the website from that. Clumsy, but
> very easy to implement (since disk space is about the cheapest
> component around).
I've just played around with that a bit. rxp doesn't seem to be that good a
choice because it emits lots of warnings on the DocBook DTD. But you can do
the same with xmllint --noent --dropdtd and xsltproc --novalid. That works
quite well and gives some speedup. Unfortunately it also breaks the <olink>
linking mechanism: the targetdocent attribute holds only the name of an
unparsed entity, which is resolved with the unparsed-entity-uri() XPath
function at transformation time, and dropping the DTD also drops that
entity declaration. Do you have an idea how to get around this?
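
For illustration, the preprocessing I mean looks roughly like this
(untested sketch; the file and stylesheet names are just placeholders):

    # Expand all entity references and drop the DOCTYPE declaration,
    # so later parses no longer load the DocBook DTD.
    xmllint --noent --dropdtd src/chapter.xml > prepared/chapter.xml

    # Transform without loading or validating against the DTD.
    xsltproc --novalid docbook-html.xsl prepared/chapter.xml > html/chapter.html

The olink breakage comes from declarations like the following, which
--dropdtd throws away together with the rest of the DTD (entity and file
names again made up):

    <!DOCTYPE book [
      <!NOTATION xml SYSTEM "xml">
      <!ENTITY otherdoc SYSTEM "otherdoc.xml" NDATA xml>
    ]>
    ...
    <olink targetdocent="otherdoc">see the other document</olink>

The stylesheets call unparsed-entity-uri('otherdoc') at transformation
time to turn the entity name into a URI; once the declaration is gone,
that call just returns an empty string.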
-- 
bye, Micha


