Re: https server - memory leaks



Can you define "clients"? I only used one PC/one MAC address/localhost and sent that very same curl request multiple times (bash loop). My tests were limited to 100 requests, but up to that point memory grew linearly - which feels just wrong given my testing method...

Any additional bits would be appreciated :>

On Tue, Dec 3, 2013 at 10:26 PM, Dan Winship <danw gnome org> wrote:
On 12/03/2013 10:50 AM, Bernhard Schuster wrote:
I am running a decent HTTP server which fulfills just one specific task. I have trimmed almost all memory leaks so far (what remains is setup stuff), except for one. Massif/Valgrind tell me that it is part of libgiognutls.so/libgnutls.so (g_memdup -> g_bytes_new). I had a very brief look at the code but could not spot anything suspicious in gtls{server,}connection.[ch] - but as said, it was only a very brief look. Also, when I do just a few requests and set breakpoints on g_bytes_new and g_bytes_unref, their counts are equal, and g_memdup is called far too frequently to set a breakpoint on.
This looks like the session cache; it's not actually leaking, it's just keeping a bit of information about each recent connection that allows those clients to do a simplified (ie, faster) TLS handshake the next time they connect. Currently it only evicts cache entries when they're more than an hour old, so if you get lots of connections from lots of different clients, the cache might get pretty large... maybe the expiration time should be shorter. Or adjustable...

-- Dan
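
For anyone following along, here is a minimal sketch in GLib-style C of the kind of time-based session cache Dan describes. This is NOT the actual libgiognutls code; the names (SESSION_CACHE_EXPIRY, CacheEntry, session_cache_*) are made up for illustration. It just shows where a g_memdup/g_bytes_new copy per connection and an hour-long eviction window would come from:

#include <glib.h>

/* Illustrative expiry window, mirroring the "more than an hour old"
 * behavior described above. */
#define SESSION_CACHE_EXPIRY (60 * 60 * (gint64) G_USEC_PER_SEC)

typedef struct {
  GBytes *session_data;  /* copied session blob (hence g_memdup -> g_bytes_new) */
  gint64  last_used;     /* monotonic timestamp of last use */
} CacheEntry;

static void
cache_entry_free (gpointer data)
{
  CacheEntry *entry = data;
  g_bytes_unref (entry->session_data);
  g_free (entry);
}

/* cache: session-id string -> CacheEntry */
static GHashTable *
session_cache_new (void)
{
  return g_hash_table_new_full (g_str_hash, g_str_equal,
                                g_free, cache_entry_free);
}

static void
session_cache_store (GHashTable   *cache,
                     const char   *session_id,
                     const guchar *data,
                     gsize         len)
{
  CacheEntry *entry = g_new0 (CacheEntry, 1);
  entry->session_data = g_bytes_new (data, len); /* the copy Massif sees */
  entry->last_used = g_get_monotonic_time ();
  g_hash_table_insert (cache, g_strdup (session_id), entry);
}

static gboolean
entry_is_expired (gpointer key, gpointer value, gpointer user_data)
{
  CacheEntry *entry = value;
  gint64 now = *(gint64 *) user_data;
  return now - entry->last_used > SESSION_CACHE_EXPIRY;
}

/* Run periodically: drops entries idle for more than an hour. */
static void
session_cache_expire (GHashTable *cache)
{
  gint64 now = g_get_monotonic_time ();
  g_hash_table_foreach_remove (cache, entry_is_expired, &now);
}

If the cache works roughly like this, it would also explain the linear growth in the curl test: each curl invocation does a full handshake and stores a fresh entry, and nothing is evicted until the hour-long window has passed, so memory climbs steadily even with a single client.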

