Re: freeing memory

From: Daniel Stenberg <daniel-curl_at_haxx.se>
Date: Fri, 28 May 2004 08:48:11 +0200 (CEST)

On Thu, 27 May 2004, Patrick Mealey wrote:

> Due to some design restrictions on my program, I have a situation where I
> need to free up all dynamically allocated memory in between calls to
> libcurl.

If I were you, I would *seriously* reconsider that decision.

> From my inspection of easy_cleanup() and global_cleanup(), there are some
> pointers in the session handle that do not get freed.
>
> 1) Is this true?

No.

If there are any left-overs, those are memory leaks and we should fix them.

We check for memory leaks when the test suite is run (when the lib is
debug-built), and we do this on a fair number of platforms and option combos,
and I can happily report that we don't often find any leaks. (We also run the
tests using valgrind when available, and that too scans for memory leaks.)
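
To be clear about what "no left-overs" means: if the init and cleanup calls
are properly paired, a debug-built libcurl (or valgrind) should report nothing
still allocated at exit. A rough, untested sketch (the URL is just a
placeholder):

#include <curl/curl.h>

int main(void)
{
  CURL *curl;

  curl_global_init(CURL_GLOBAL_ALL);

  curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);   /* frees everything the handle allocated */
  }

  curl_global_cleanup();       /* frees the remaining global state */

  /* running this under valgrind (or with a debug-built libcurl) should
     show no memory still allocated at exit */
  return 0;
}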

> 2. If true, can I just write a function to walk through the curl handle
> freeing up anything I find? Has anyone attempted this before? Is there a
> simpler way to do it?

If you mention/identify any data you think is left allocated after this
procedure, we can take each specific case into consideration and fix it.

> Keep in mind, I will have to do multiple POSTs using libcurl during the same
> process session. I just have to do them as though it has never been done
> before (i.e. always do global_init, always get a new handle, always set
> options and headers, always do a cleanup, etc.).

I can't help but object to this stupid approach. Your program will run much
slower than it needs to. Also, I'm not sure that you can do this without
problems with regard to how we do the global_init/global_cleanup of the
OpenSSL stuff. That area is what we could call a white spot on my map.
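
To illustrate the alternative I'm suggesting: do the global_init once, create
one easy handle and reuse that same handle for every POST. A rough, untested
sketch; the URL, header and post data are of course just placeholders:

#include <curl/curl.h>

int main(void)
{
  CURL *curl;
  struct curl_slist *headers = NULL;
  int i;

  curl_global_init(CURL_GLOBAL_ALL);  /* once per program */

  curl = curl_easy_init();            /* one handle, reused below */
  if(curl) {
    headers = curl_slist_append(headers, "Content-Type: text/plain");

    curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload");
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);

    for(i = 0; i < 3; i++) {
      /* only the data that actually changes needs to be set again */
      curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "field=value");
      curl_easy_perform(curl);
    }

    curl_easy_cleanup(curl);          /* frees everything the handle holds */
    curl_slist_free_all(headers);
  }

  curl_global_cleanup();              /* once per program */
  return 0;
}

Reusing the handle lets libcurl keep connections and SSL sessions alive
between the POSTs, which is where most of the speed difference comes from.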

-- 
     Daniel Stenberg -- http://curl.haxx.se -- http://daniel.haxx.se
      Dedicated custom curl help for hire: http://haxx.se/curl.html