Re: Minutes of Gnome 2 release team meeting (late): 2002-05-17
- From: James Henstridge <james daa com au>
- To: veillard redhat com
- Cc: Jens Finke <jens triq net>, Mikael Hallendal <micke codefactory se>, Malcolm Tredinnick <malcolm commsecure com au>, GNOME Hackers <gnome-hackers gnome org>, GNOME Desktop Devel <desktop-devel-list gnome org>
- Subject: Re: Minutes of Gnome 2 release team meeting (late): 2002-05-17
- Date: Tue, 28 May 2002 16:57:07 +0800
Daniel Veillard wrote:
On Tue, May 28, 2002 at 03:32:53PM +0800, James Henstridge wrote:
Jens Finke wrote:
On 28 May 2002, Mikael Hallendal wrote:
Heh, if all of this has to be done, something is wrong (you also have to take
into account that the user might very well have the XML catalog file in
his $HOME). So just checking /etc/xml/catalog will not be enough.
First, sure, one can ask libxslt/libxml2 not to fetch anything over the net;
the xsltproc command has an option to disable net access. It's simply based
on providing your own resolution routine for external entity fetching;
it's a matter of cut-and-pasting 50 lines from xsltproc.c. I'm pretty sure
I indicated this already to the people working on Yelp.
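(A minimal sketch of what that loader setup can look like, for anyone who hasn't read xsltproc.c: libxml2 ships a catalog-aware no-net entity loader you can install, which is roughly what an option like xsltproc's --nonet amounts to. This is not the actual xsltproc code, and the compile line is only an assumption; the fuller variant wraps xmlGetExternalEntityLoader() in your own routine, which is the 50-lines-of-cut-and-paste mentioned above.)

/* nonet.c -- sketch: make libxml2 (and therefore libxslt) refuse to
 * fetch anything over the net while still honouring XML catalogs.
 * Assumed compile line:
 *   cc nonet.c `xml2-config --cflags --libs`
 */
#include <libxml/parser.h>
#include <libxml/xmlIO.h>
#include <libxml/catalog.h>

int main(void)
{
    /* Load /etc/xml/catalog (or whatever $XML_CATALOG_FILES lists),
     * so http:// identifiers can still be resolved to local copies. */
    xmlInitializeCatalog();

    /* xmlNoNetExternalEntityLoader resolves through the catalogs first
     * and only fails if the result would still require http/ftp. */
    xmlSetExternalEntityLoader(xmlNoNetExternalEntityLoader);

    /* ... parse documents / apply stylesheets as usual here ... */

    xmlCleanupParser();
    return 0;
}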
In my opinion, if a program says "please include the contents of the
stylesheet available at http://...", I would expect it to go out onto
the internet to fetch it. The XML catalog is simply a cache for XML
Depends; that's the point of catalogs: sometimes you really don't
want this to happen while keeping your data portable. Having
file:///usr/local/share/docbook/xsl/html/docbook.xsl
references in your data is a guaranteed way to get:
1/ maintenance nightmares (oh, the stylesheets are located/named
differently on your system ... yes, some people use filesystems
where file:///usr/local/ ... won't work).
2/ a guaranteed flow of bug reports in the long term. That XML
catalogs are not well deployed today is a fact, but it will change.
Think about the libfoo.so.12.4.5 resolution: guess what, to avoid craziness
they are using a catalog. Same problem; you just don't want to face this kind
of maintenance in the long term.
Daniel, I am not trying to be confrontational about this. I was trying
to point out that:
1. if you make use of http:// URLs for resources, you shouldn't be
surprised if http requests are made on systems that don't have a
correctly set up catalog.
2. if you use file:// URLs, then this problem goes away, but you run
into the problem that your documents lose their system independence,
which is the reason why http:// identifiers were being used for the
resources in the first place.
3. having a correctly configured catalog solves both the
system-independence and no-network-access requirements quite elegantly.
(I probably didn't make myself clear enough.) I think the best solution
is to require a working catalog (as we do now). It shouldn't be
difficult to add checks to make sure that this is so. The commands
Malcolm posted could easily be converted to autoconf checks (make them
check all the catalogs in XML_CATALOG_FILES too, if that variable is
set), and error out if the files can't be found. The error message could
even point people at the simple-to-install DocBook packages provided by
the scrollkeeper guys.
If having the stylesheets installed is considered a requirement for
yelp, then yelp's configure script should error out if they are
missing. At the moment, it sounds like many people are building and
installing yelp without any error, but without the required DocBook files.
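(Such a check could boil down to a tiny test program along these lines, which configure could compile, run, and fail on. A sketch only: the DocBook URI below is just an example of the kind of identifier the docs might reference, not something taken from yelp, and the autoconf wiring is left out.)

/* check-catalog.c -- sketch of a configure-time check: exit non-zero
 * unless an example DocBook URI resolves through the installed XML
 * catalogs.  Assumed compile line:
 *   cc check-catalog.c `xml2-config --cflags --libs`
 */
#include <stdio.h>
#include <libxml/catalog.h>
#include <libxml/xmlmemory.h>

int main(void)
{
    const char *uri =
        "http://docbook.sourceforge.net/release/xsl/current/html/docbook.xsl";
    xmlChar *resolved;

    /* Honours $XML_CATALOG_FILES if set, /etc/xml/catalog otherwise. */
    xmlInitializeCatalog();

    resolved = xmlCatalogResolveURI((const xmlChar *) uri);
    if (resolved == NULL) {
        fprintf(stderr, "no catalog entry for %s\n", uri);
        return 1;   /* configure would error out at this point */
    }
    printf("%s -> %s\n", uri, resolved);
    xmlFree(resolved);
    return 0;
}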
resources such as DTDs and stylesheets, so that they don't need to be
downloaded. If you want to make sure that you only access local files
when processing a document, then don't use http:// URLs for the
resources you use (use file:// ones instead). You lose the system
Okay, you give the advice, you also promise to handle the long-term
maintenance of that, okay?
That is a solution to the problem Mikael brought up (making sure that we
don't do any network access), but I also pointed out that this has the
drawback of losing system independence. Maybe I should have made it
more clear that I don't see this as a good tradeoff.
independence by using local paths, but you know it won't make a network
request for the file.
Wrong way to ensure this!!! Really.
yes.
Maybe it would be nice if libxml could dynamically build a cache of such
resources (download files somewhere and add pointers to a per-user
catalog file for them). That would be another way around the slowdown
on broken systems (so that it would only be slow once).
No, that's a policy, I won't implement such a policy in the library,
but apps should feel free to do so if they can.
This might be worth investigating for yelp (in the post-2.0 timeframe).
There is always the possibility that some docs registered with
scrollkeeper will reference some resource we don't have (maybe some KDE
documentation?). Caching these would improve the speed of loading those
documents, and from what you said above, libxml2 already provides the
hooks necessary to do this.
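(A minimal sketch of that per-user cache catalog using libxml2's catalog API: the paths and the example URI are assumptions, the actual download step is omitted, and a real implementation would presumably load and extend an existing catalog with xmlLoadACatalog rather than overwrite one.)

/* cache-catalog.c -- sketch: after fetching a remote resource once,
 * record a uri mapping in a per-user catalog so later lookups stay
 * local.  Assumed compile line:
 *   cc cache-catalog.c `xml2-config --cflags --libs`
 */
#include <stdio.h>
#include <libxml/catalog.h>

int main(void)
{
    const char *remote =
        "http://docbook.sourceforge.net/release/xsl/current/html/docbook.xsl";
    const char *local   = "file:///home/user/.xml-cache/docbook.xsl";
    const char *catfile = "/home/user/.xml-cache/catalog";
    xmlCatalogPtr cat;
    FILE *out;

    /* Build a fresh XML (non-SGML) catalog with one uri mapping. */
    cat = xmlNewCatalog(0);
    if (cat == NULL)
        return 1;
    xmlACatalogAdd(cat, (const xmlChar *) "uri",
                   (const xmlChar *) remote, (const xmlChar *) local);

    /* Write it out; listing this file in $XML_CATALOG_FILES (or in a
     * catalog the app loads itself) keeps later lookups off the net. */
    out = fopen(catfile, "w");
    if (out == NULL)
        return 1;
    xmlACatalogDump(cat, out);
    fclose(out);
    xmlFreeCatalog(cat);
    return 0;
}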
James.
--
Email: james daa com au | Linux.conf.au 2003 Call for Papers out
WWW: http://www.daa.com.au/~james/ | http://conf.linux.org.au/cfp.html