Re: [xml] Catalog Loader extended for http://
- From: Daniel Veillard <veillard redhat com>
- To: Johann Richard <Johann richard dspfactory ch>
- Cc: xml gnome org
- Subject: Re: [xml] Catalog Loader extended for http://
- Date: Tue, 20 Aug 2002 09:55:48 -0400
On Tue, Aug 20, 2002 at 03:38:12PM +0200, Johann Richard wrote:
> Hi all,
>
> I didn't file this as a bug/enhancement report as I think it's probably
> worth some discussion first.
>
> I recently realized that libxml2's Catalog resolving mechanism loads
> (nested) catalogs with fopen(), thus being limited to the local file
> system. Would it be possible to use libxml2's EntityLoader mechanism to
> also load remote (http://) catalogs? And could Daniel (or whoever is in
> charge of catalog.c) include this in a future version?
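For context, the EntityLoader hook Johann refers to is the per-application
callback installed with xmlSetExternalEntityLoader(); it covers document and
entity loading, while catalog.c's own fopen() path bypasses it, which is
exactly what the request is about. A minimal sketch (the http:// policy
branch is purely illustrative, not an existing feature):

    #include <string.h>
    #include <libxml/parser.h>
    #include <libxml/xmlIO.h>

    static xmlExternalEntityLoader defaultLoader = NULL;

    /* Wraps the stock loader; the http:// branch is where remote
     * fetching (or a refusal) would have to be wired in. */
    static xmlParserInputPtr
    httpAwareLoader(const char *URL, const char *ID, xmlParserCtxtPtr ctxt)
    {
        if ((URL != NULL) && (strncmp(URL, "http://", 7) == 0)) {
            /* hypothetical: fetch/cache the remote resource here */
        }
        return defaultLoader(URL, ID, ctxt);
    }

    int main(void)
    {
        defaultLoader = xmlGetExternalEntityLoader();
        xmlSetExternalEntityLoader(httpAwareLoader);
        /* ... parsing done after this point goes through the hook ... */
        return 0;
    }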
Hum, there are a number of reasons, both practical and theoretical, which
make it a not-so-good idea:
- catalogs are here basically to ensure that you will get instantaneous
  access to some information and extract it from a finite universe
  (the set of local resources referenced from your on-disk catalog; this
  local-only flow is sketched after this list); starting to add HTTP
  resources to the set makes the universe infinite and possibly not atomic
  (i.e. resources fetched may change in the process of loading the various
  parts of the catalog)
- using the entity loader would make catalog resolution recursive during
  catalog building, which is clearly forbidden by the XML Catalogs spec;
  technically that could be disabled but well ... that would be messy
- moreover, SGML catalogs need a full load before doing any resolution
  (while XML catalogs are loaded progressively), and forcing the fetch of
  a lot of remote resources before being able to process a local one seems
  really contradictory to the design of Catalogs.
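For contrast, the local-only flow the design assumes looks roughly like
this (a minimal sketch; the catalog path and public identifier are just
common examples, not taken from this thread):

    #include <stdio.h>
    #include <libxml/catalog.h>
    #include <libxml/xmlmemory.h>

    int main(void)
    {
        xmlChar *uri;

        /* The catalog file is read from the local filesystem only. */
        if (xmlLoadCatalog("/etc/xml/catalog") < 0) {
            fprintf(stderr, "no catalog loaded\n");
            return 1;
        }
        /* Resolution stays within the finite, on-disk universe. */
        uri = xmlCatalogResolvePublic(
            (const xmlChar *) "-//OASIS//DTD DocBook XML V4.1.2//EN");
        if (uri != NULL) {
            printf("resolved to %s\n", (char *) uri);
            xmlFree(uri);
        }
        xmlCatalogCleanup();
        return 0;
    }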
> A possible use for such an extended catalog loader would be that a web
> server dynamically creates such a catalog for a revision-control
> repository (cvs, subversion, ...), including version numbers, or that
> one can access remote files, including XML sources, by specifying a URI
> or whatever that will resolve by using a dynamic list of files on the
> server [etc.]. I think you should get the point.
Seems to me a simpler solution would be to either NFS-mount that part, or
WebDAV-mount it, or even get those resources cached locally on disk.
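The "cached locally on disk" option can even be done with libxml2's own
mini HTTP client, fetching once ahead of time and letting an ordinary
catalog entry point at the copy; a sketch, with made-up URL and cache path:

    #include <stdio.h>
    #include <libxml/nanohttp.h>

    int main(void)
    {
        /* Fetch once, outside the parsing hot path; afterwards a normal
         * local catalog entry can point at the cached copy. */
        if (xmlNanoHTTPFetch("http://example.com/dtd/report.dtd",
                             "/var/cache/xml/report.dtd", NULL) != 0) {
            fprintf(stderr, "fetch failed\n");
            return 1;
        }
        xmlNanoHTTPCleanup();
        return 0;
    }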
> I am also interested to know whether others would like to see such an
> extension, or if I am the only one using remote catalog files :-D And
> if I got something wrong, let me know!
My main problem is that Catalogs are here to make sure processing speed
will be basically bound to processor speed, not to resource access speed
(well, that's why I use them!), and breaking that assumption, so that for
example xsltproc --nonet might still fetch and wait on remote resources
for the catalogs, doesn't sound good to me.
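(In newer libxml2 releases the same no-network guarantee is exposed
directly as a parser option; a minimal sketch, with a placeholder
filename:)

    #include <libxml/parser.h>

    int main(void)
    {
        /* XML_PARSE_NONET makes the parser error out instead of
         * touching the network. */
        xmlDocPtr doc = xmlReadFile("doc.xml", NULL, XML_PARSE_NONET);

        if (doc != NULL)
            xmlFreeDoc(doc);
        xmlCleanupParser();
        return 0;
    }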
Wouldn't making those resources available locally be simpler and less
error-prone?
Daniel
--
Daniel Veillard | Red Hat Network https://rhn.redhat.com/
veillard redhat com | libxml GNOME XML XSLT toolkit http://xmlsoft.org/
http://veillard.com/ | Rpmfind RPM search engine http://rpmfind.net/