
Re: Memory leak with curl_multi_socket_action

From: James Read via curl-library <curl-library_at_cool.haxx.se>
Date: Tue, 26 May 2020 15:15:24 +0100

On Tue, May 26, 2020 at 2:31 PM James Read <jamesread5737_at_gmail.com> wrote:

>
>
> On Tue, May 26, 2020 at 12:47 PM James Read <jamesread5737_at_gmail.com>
> wrote:
>
>>
>>
>> On Tue, May 26, 2020 at 7:30 AM Patrick Monnerat via curl-library <
>> curl-library_at_cool.haxx.se> wrote:
>>
>>> On 5/26/20 1:15 AM, James Read via curl-library wrote:
>>> >
>>> > git clone https://github.com/JamesRead5737/libcurlmemoryleak.git
>>> >
>>> > No need to make. Just compile with gcc crawler.c -g -lssl -lcurl
>>> > Run valgrind with valgrind -v --tool=memcheck --leak-check=full
>>> > --show-reachable=yes --track-origins=yes --log-file=memcheck.log
>>> > ./a.out
>>> >
>>> > This should reproduce what I've been talking about.
>>> >
>>> Upon termination, you call curl_multi_cleanup() on a multi handle that
>>> still has easy handles attached to it. As a consequence, they are never
>>> freed. See the first paragraph of the description at
>>> https://curl.haxx.se/libcurl/c/curl_multi_cleanup.html.
>>>
>>> Unfortunately, there is no API that lists the attached easy handles, so
>>> you have to keep a list externally and unlink + clean up each of them
>>> explicitly before the curl_multi_cleanup() call. I tried that (with a
>>> very primitive singly-linked list in ConnInfo) and it works: all the
>>> memory leaks disappear.
>>>
>>
>> Does the example at https://curl.haxx.se/libcurl/c/ephiperfifo.html have
>> the same problem? The code I'm trying to debug seems to follow that pattern
>> very closely. Do you have any example code that fixes the problem?
>>
>
> I guess on reflection what I really need to do is figure out how I can
> finish off all running connections before exiting when a Ctrl-C is
> encountered. Would that clear up the memory leaks?
>

OK. I modified the code to exit more cleanly and to clean up in-progress
downloads.

git clone https://github.com/JamesRead5737/libcurlmemoryleak.git

This seems to have cleared up some of the memory loss reported by valgrind,
but valgrind is still reporting leaks. Also consider the following output
from my program:

Parsed sites: 0, 54 parallel connections, 55 still runninggg
Exiting normally.
Parsed sites: 0, 0 parallel connections, 14 still runningg
Finished all in progress downloads.
Exiting.

When parallel connections reaches 0, g->still_running still reports a
number of easy handles in progress. How can this be? Surely the answer to
this question is also the answer to the memory leaks?

James Read

>
> James Read
>
>
>>
>> James Read
>>
>>
>>
>>>
>>> Patrick
>>>
>>
>>

-------------------------------------------------------------------
Unsubscribe: https://cool.haxx.se/list/listinfo/curl-library
Etiquette: https://curl.haxx.se/mail/etiquette.html
Received on 2020-05-26