[Glade-devel] libglade ideas
- From: taf2 lehigh edu (todd)
- Subject: [Glade-devel] libglade ideas
- Date: Thu, 08 Apr 2004 03:46:41 -0400
Olexiy Avramchenko wrote:
todd wrote:
Hi all,
I'm building an application that relies heavily on libglade and another
XML binding. I've noticed that because every part of the UI is built with
Glade and loaded through libglade, there is a noticeable performance lag
when loading the .glade files. To give you an idea of the size of these
interface files, here is a line count.
195 simo-attached-documents.glade
1030 simo_calendar_view.glade
202 simo_dialogs.glade
1306 simo_event_view.glade
278 simo-fsa-msa-hsa-reimbursements-eob.glade
3314 simo.glade
1799 simo-home-view.glade
1305 simo-person-med-visit-detail.glade
101 simo-person-med-visit-summary.glade
100 simo-person-summary.glade
1009 simo-person-view.glade
563 simo-plan-reimbursements-eob.glade
152 simo_plan_view.glade
158 simo-provider-payments.glade
343 simo-sidebar-dialogs.glade
11551 simo-wizards.glade
23406 total
I'm not loading them all at once; in fact each file is loaded only once,
and only when that part of the UI is requested. My idea is to provide a
method for dumping the .glade data into a binary format. After looking at
the libglade source, it looks like it could be possible to bypass the XML
parsing code when this binary file exists and just fread() the data into
libglade's hash tables. This should improve the load time by reducing the
disk I/O (I think).
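To make the idea concrete, here is a minimal sketch of the control flow I
have in mind. Everything here is hypothetical: the "<name>.glade.bin" cache
path scheme and both function names are stand-ins I made up, not libglade
API.

```c
/* Sketch: prefer a binary cache sitting next to the .glade file when it
   is at least as new as the XML; otherwise the caller would parse the XML
   and (re)write the cache.  Hypothetical helpers, not libglade API. */
#include <stdio.h>
#include <string.h>
#include <sys/stat.h>

/* Returns 1 when `cache` exists and is not older than `xml`. */
int cache_is_fresh(const char *xml, const char *cache)
{
    struct stat sx, sc;
    if (stat(xml, &sx) != 0 || stat(cache, &sc) != 0)
        return 0;
    return sc.st_mtime >= sx.st_mtime;
}

/* Builds "<xml>.bin" into out; returns 0 on success, -1 if out is too small. */
int cache_path(const char *xml, char *out, size_t outlen)
{
    if (strlen(xml) + 5 > outlen)   /* ".bin" plus the NUL terminator */
        return -1;
    snprintf(out, outlen, "%s.bin", xml);
    return 0;
}
```

The mtime comparison is just one possible freshness test; the checksum idea
further down would be stricter.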
Just compress your glade XML files with gzip if you want to reduce disk
I/O. libglade and the Glade UI builder work well with compressed XML
files.
I'm at a loss here, because doesn't it take quite a bit of time to
uncompress a file compared to just reading an uncompressed one? Yes, I
agree this would reduce the disk I/O if you read the whole file into
memory and then uncompress it. But if you play it safe and instead read a
little, uncompress, then read a little more, isn't that going to be a
slower process? You might be reading less from disk, but you go through
the extra step of decompressing. Perhaps the decompression stage is very
fast on high-end machines, but on slower machines wouldn't that be an
added performance hit? I'm not ruling out compressed XML files for my
project; I'm just thinking it might be even better to convert the XML
files into a binary format that might be larger than the XML but is
byte-aligned with the data structures inside libglade. Loading then
becomes a much more straightforward process of reading the binary into
memory: all the bytes are already in the right locations, so there is
effectively zero parsing overhead.
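As a sketch of what "byte-aligned with the data structures" could mean,
here is a round trip through flat fixed-size records. WidgetRecord is a
made-up struct for illustration, not libglade's real representation:

```c
/* Sketch: dump fixed-size records with one fwrite() and load them back
   with one fread() -- no per-field text parsing on the load path.
   WidgetRecord is a hypothetical layout, not libglade internals. */
#include <stdio.h>

typedef struct {
    char name[32];      /* widget name, e.g. "button1"   */
    char klass[32];     /* widget class, e.g. "GtkButton" */
    int  width, height;
} WidgetRecord;

/* Returns the number of records written. */
size_t dump_records(const char *path, const WidgetRecord *recs, size_t n)
{
    FILE *f = fopen(path, "wb");
    if (!f) return 0;
    size_t w = fwrite(recs, sizeof *recs, n, f);
    fclose(f);
    return w;
}

/* Returns the number of records read (at most `max`). */
size_t load_records(const char *path, WidgetRecord *recs, size_t max)
{
    FILE *f = fopen(path, "rb");
    if (!f) return 0;
    size_t r = fread(recs, sizeof *recs, max, f);
    fclose(f);
    return r;
}
```

Note that such a file is tied to the struct layout, padding, and
endianness of the machine that wrote it, which is exactly why it would be
a per-machine cache rather than a distribution format.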
If I'm not mistaken, this is what Mozilla has started doing to improve
browser performance: they cache the XUL files in a binary format after the
first run of the application. What I'm saying is, why can't we do
something similar with glade files?
You can edit a text/XML file with any editor from vi to MS Word, and
compressed text will take less space than a parsed binary file, I believe.
This is true, but which one should load faster?
   /* XML path: scan the byte stream, collect tag names, dispatch per tag */
   char tag[64];
   size_t i = 0, j;
   while (i < stream_len) {
       if (buf[i] == '<') {
           j = 0;
           while (++i < stream_len && buf[i] != '>' && j < sizeof(tag) - 1)
               tag[j++] = buf[i];
           tag[j] = '\0';
           func = lookup(tag);     /* hash lookup for every element */
           func(tag, ...);         /* which then parses attribute text: */
           /*   if (isdigit(value[0])) x = atoi(value);   */
       } else {
           i++;
       }
   }

versus

   /* Binary path: the value is already in its final representation */
   int x;
   fread(&x, sizeof(int), 1, file);
This is pretty contrived, but the point is that it should take much less
code, and therefore time (I would imagine), to load a binary file versus
an XML file.
If I'm right about this, then doing something like this gives you the best
of both worlds: first, you get the portability and expressiveness of an
XML format; second, you get the speed of loading a binary format, once the
XML file is on your machine and the application has been loaded once.
http://www.catb.org/~esr/writings/taoup/html/textualitychapter.html
The material there about transmission of data is why libglade uses XML,
but that doesn't have to mean it can't parse an XML file once and from
then on load a binary format that is optimal for the machine the XML
resides on. You could keep an MD5 checksum of the libglade XML so that,
before loading the binary, you do a quick check that the two still match.
This would of course add some disk I/O, but it could still reduce the cost
of loading a complex XML format versus a straightforward binary one
specific to the machine it is being run on.
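The checksum gate could look something like the sketch below. I use
64-bit FNV-1a as a stand-in hash, since MD5 would pull in a crypto
library; the idea that the expected digest is stored alongside the binary
cache is also just my assumption about the layout.

```c
/* Sketch of the freshness check: hash the .glade XML and compare it with
   a digest stored alongside the binary cache.  FNV-1a stands in for the
   MD5 suggested above (MD5 would need e.g. OpenSSL). */
#include <stdint.h>
#include <stdio.h>

/* 64-bit FNV-1a over an in-memory buffer. */
uint64_t fnv1a(const unsigned char *buf, size_t len)
{
    uint64_t h = 14695981039346656037ULL;      /* FNV-1a offset basis */
    for (size_t i = 0; i < len; i++) {
        h ^= buf[i];
        h *= 1099511628211ULL;                 /* FNV-1a prime */
    }
    return h;
}

/* Returns 1 when the file's hash matches `expected` (read from the cache). */
int cache_matches(const char *xml_path, uint64_t expected)
{
    unsigned char buf[4096];
    uint64_t h = 14695981039346656037ULL;
    size_t n;
    FILE *f = fopen(xml_path, "rb");
    if (!f) return 0;
    while ((n = fread(buf, 1, sizeof buf, f)) > 0)
        for (size_t i = 0; i < n; i++) {
            h ^= buf[i];
            h *= 1099511628211ULL;
        }
    fclose(f);
    return h == expected;
}
```

Hashing the whole XML does re-read it from disk, which is the extra I/O
mentioned above; a cheaper variant would compare only size and mtime.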
These are just some ideas, and something I'd like to help build and test
to see whether or not it actually improves performance. I mean, maybe
you're right: maybe just compressing the XML files yields a sufficient
performance gain on the disk I/O side to make up for any extra overhead
added by decompressing the files. At any rate, I think it's something
interesting to think about.
-todd