Re: Request: Test suite for EFS.



> I still think that a "newbie (who just wants to have a document)" would
> not be using the command line to do his/her copying. They would be using
> gmc or nautilus. The gurus would be the only ones doing that
> command line copying, and they could grasp the concept. Most Windows users
> never touch DOS, so why do you think the newbie GNOME users would touch
> the console?

We aren't thinking ahead here. Documents aren't just copied and opened in
their parent app. They are also sent as email attachments, FTP'ed, and so
on and so forth. If we leave a "document" as a plain directory with stuff
in it, "newbies" (who, as you say, won't use the command line -- which, by
the way, I disagree with) will have a hell of a time doing all of that.

I'm not a newbie, but I still think that a plain directory is not the correct
solution. Here is what I've gathered so far. People on this issue fall into
four groups:

	(1) A document should be a directory with files and subdirectories.
	(2) A document should be one file.
		(a) The file should be serialized text (e.g. XML).
		(b) The file should be in "package" form.
			i.  The "package" format should be an existing,
			    widely used format (e.g. tar, tar.gz, zip),
			    even if there are a few deficiencies.
			ii. The "package" format should be something newer
			    that doesn't have those deficiencies (e.g. EFS).

My take:

As is evident from my first paragraph, I disagree with (1). There is
potential for lots of problems. Besides, my common sense tells me that a
document is one entity, and it should be treated as such.

Now for (a). XML is great, but it is not a cure-all. In this case it isn't
such a good idea because of the problems of accessing part of a LARGE
document, searching for a field across MANY documents, re-saving after a
minor change, and so on and so forth. Yes, it is true one CAN still solve
most of this in XML. For example, the first fields in the file could be an
"index" with references to where in the file each component is. But it
still seems we are bending over backwards here.
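To make the "bending over backwards" concrete, here is a minimal sketch (in
Python, with a made-up layout, not any real format) of the offset-index idea:
a header maps each component name to a byte offset and length, so a reader
can seek straight to one component without parsing the whole file. Note how
much bookkeeping is needed just to imitate what a directory gives for free.

```python
# Sketch of a single-file document with a leading byte-offset index.
# The layout is invented for illustration; component names must not
# contain spaces or newlines in this toy format.
import io

def build(components):
    """Pack {name: bytes} into an (index header, body) pair."""
    body = io.BytesIO()
    offsets = {}
    for name, data in components.items():
        offsets[name] = (body.tell(), len(data))
        body.write(data)
    index = "\n".join(f"{n} {off} {ln}" for n, (off, ln) in offsets.items())
    return index.encode() + b"\n\n", body.getvalue()

def read_component(header, body, name):
    """Use the index to pull out one component without scanning the body."""
    for line in header.decode().splitlines():
        if not line:
            break
        n, off, ln = line.split()
        if n == name:
            return body[int(off):int(off) + int(ln)]
    raise KeyError(name)

header, body = build({"text.xml": b"<doc/>", "img0": b"\x89PNG..."})
print(read_component(header, body, "img0"))
```

Every save that changes a component's size forces the index (and every later
offset) to be rewritten, which is exactly the kind of contortion a proper
package format avoids.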

Okay, so we either go for i. or ii. 

i. sounds good. The contents of the zip or tar are basically a directory
structure, and individual components could be accessed easily. I guess the
text/main part of any document would be an XML file in the root, and it would
have references to other XML files, image files, or whatever is embedded
using Bonobo or what not.
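As a sketch of option i., here is the layout above expressed with Python's
standard zipfile module (the file names are illustrative, not any real
spec): an XML file in the root referencing an image stored alongside it, all
inside one ordinary zip that any existing tool can open.

```python
# A document "package" as a plain zip: XML main part in the root,
# referenced components in subdirectories. Names are made up.
import io
import zipfile

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
    z.writestr("content.xml", "<doc><img href='images/logo.png'/></doc>")
    z.writestr("images/logo.png", b"\x89PNG fake image bytes")

# Any zip tool can list the package or pull out a single component
# without touching the rest of the document:
with zipfile.ZipFile(buf) as z:
    print(z.namelist())
    print(z.read("content.xml").decode())
```

The appeal of i. is exactly this: the "package" tooling already exists on
every system, so a document stays one file for mail and FTP while remaining
a directory structure inside.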
 
ii. sounds good too. I don't know the particulars of EFS, but if it has
serious advantages over i., it ought to be done. All we'd need is a couple
of command-line tools that convert EFS files to directories and vice versa
(the equivalents of tar, zip, and unzip). For GUI access, gnome-vfs and
Nautilus could have a module to view/modify the contents of EFS files, just
as they can for RPMs and tarballs right now. That way EFS files are as
easily accessible as RPMs and tarballs, yet applications that use them
natively can interface with them through the efs library.
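The pair of converters described above can be sketched as follows. Since the
EFS on-disk format isn't specified in this thread, the sketch uses zip as a
stand-in container, and the names dir2doc/doc2dir are made up; the point is
only the shape of the two tools, not EFS internals.

```python
# Sketch of the two command-line converters: directory tree -> single
# document file, and back. zipfile stands in for the real EFS library.
import os
import zipfile

def dir2doc(src_dir, doc_path):
    """Pack a directory tree into a single document file."""
    with zipfile.ZipFile(doc_path, "w", zipfile.ZIP_DEFLATED) as z:
        for root, _dirs, files in os.walk(src_dir):
            for f in files:
                full = os.path.join(root, f)
                z.write(full, os.path.relpath(full, src_dir))

def doc2dir(doc_path, dest_dir):
    """Unpack a document file back into a directory tree."""
    with zipfile.ZipFile(doc_path) as z:
        z.extractall(dest_dir)
```

With a real efs library underneath, these two entry points are all a guru at
the console would need, while gnome-vfs/Nautilus modules cover everyone else.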

	= L

/-------------------------------------------------------------------\
|   LOBAN AMAAN RAHMAN  <-- anagram of -->  AHA! AN ABNORMAL MAN!   |
|  MSC #763, Caltech, Pasadena, CA 91126, USA. Tel: 1-626-395-1407  |
|     loban@earthling.net, loban@caltech.edu, http://i.am/loban     |
\-------------------------------------------------------------------/


