GHashTable memory usage



Hi,

I am currently using the GLib library in my program to build a search
engine based on hash tables, via the g_hash_table_* functions. While
using these functions I have noticed some peculiar behavior: after
loading a fairly large data file, my program takes up approximately
331 MB of memory, which really pushes the limits of my system, since it
only has 256 MB of physical memory. For some reason, when the system
reaches a very high load, the hash table seems to get compressed by a
fairly large amount (150-70 MB), which generally reduces the program's
memory usage down to about 184 MB. Unfortunately, when I add additional
RAM to the system this optimization doesn't occur, and I have not been
able to consistently replicate the conditions that cause it, other than
the times when the system seems to be running out of physical memory. I
was wondering whether the hash table functions inside GLib have some
internal optimization that causes this. What I would really like to know
is whether it is possible to trigger this behavior deliberately, without
the system coming close to dying.

The current system config is as follows:
Dual Celeron 433
256 MB of RAM
Linux 2.2.16 [Slackware]
