Re: GHashTable memory usage



On Thu, 20 Jul 2000, Slava Poliakov wrote:

 Hi,

 I would recommend trying other approaches for this, for example the following:

* the hash table from the C++ STL by SGI (shipped with gcc) - it should be
  faster, since much of the code (comparison functions, hash functions, and
  your own allocator) can be inlined if you write it the right way - see
  hash{_map,_set,table}.h. I don't know how it behaves with such amounts of
  data, but at least you will be able to store the "value" structure *inside*
  the hash item, rather than a pointer to it as in glib - this means less
  fragmentation (and you can use your own allocator!). A hash_multimap is
  also available (look in hash_map).

  On most Linux systems, these files are in /usr/include/g++-2

  In general, write and use C++ templates - they are faster and more
  powerful than what can be achieved in C with the same level of
  flexibility.

* alternatively, use a database (such as PostgreSQL or db*).

 Feel free to contact me for discussion.

> Hi,
> 
> I am currently using the GLIB library within my program for the purpose
> of creating search engine based on Hash Tables. For that purpose I'm
> using the GLIB functions such as g_hash_table_*. Through my usage of those
> functions I had noticed a peculiar behavior, after loading a fairly
> large data file my program takes up approximately 331 Megs of memory
> which really pushes the limits of my system, which currently has only 256
> megs of physical memory. What happens is that for some reason it seems that
> when the system reaches a very high load the hash table seems to get
> compressed by a fairly large amount (150-70 megs) which generally causes
> the program to reduce its own memory usage down to 184 megs or so.
> Unfortunately when I add additional RAM to the system this optimization
> doesn't seem to occur and I have not been able to consistently replicate
> the conditions which cause this optimization, other than the time when
> it seems the system is running out of physical memory. I was wondering
> if perhaps Hash Table functions inside glib have some internal
> optimization functions which cause this to occur. What I would really
> like to know if it is possible to replicate usage of those functions
> without the system coming close to dying.
> 
> The current system config is as follows:
> Dual Celeron 433
> 256 megs of Ram
> Linux 2.2.16 [Slackware]
> 
> _______________________________________________
> gtk-list mailing list
> gtk-list@gnome.org
> http://mail.gnome.org/mailman/listinfo/gtk-list
> 

 Best regards,
  -Vlad
