curl-library
Re: Bug in libcurl in multithreaded program
Date: Wed, 22 May 2002 15:07:28 -0400
I wasn't aware of the CURLOPT_DNS_USE_GLOBAL_CACHE option. (Is the default
on? If so, the example multithreaded program should be updated.)
However, even after I set it to false, the problem persists.
My program is fairly simple. Basically, it generates a list of ftp and http
urls to a specified server and creates a bunch of threads. Each thread randomly
picks a url from the list, fetches the file, and repeats. In terms of using
libcurl, the only functions I'm using are curl_easy_init and curl_easy_setopt
to set the url. Since the threads keep on fetching until the program is killed,
I never call curl_easy_cleanup. I've tried compiling both the library and
my program with gcc -O0 (no optimizations) but the segfaulting persists.
A couple of other interesting things: if I continue from gdb instead of
stopping, the rest of the threads keep running (until another dies),
so it doesn't appear to corrupt any global variables in libcurl. Also,
every time I do a backtrace, it has just finished an ftp transfer (not http).
Hope this helps,
Avery Fay
From: Daniel Stenberg <daniel_at_haxx.se>
Date: 05/22/2002 02:44 PM
To: Avery Fay <avery_fay_at_symantec.com>
cc: libcurl Mailing list <curl-library_at_lists.sourceforge.net>
Subject: Re: Bug in libcurl in multithreaded program
On Wed, 22 May 2002, Avery Fay wrote:

> I recently wrote a program using libcurl that starts up a bunch of threads
> (default: 16) and has each thread repeatedly get files from an FTP server
> and web server (IIS). The program currently runs on Debian linux with
> kernel 2.4.18. libcurl version is 7.9.7 (latest). Program seg faults after
> a certain amount of time (very fast if the files are small but takes longer
> for larger files; small files make connections rapidly etc.). Here is a
> backtrace:
>
> Program received signal SIGSEGV, Segmentation fault.
> [Switching to Thread 1026 (LWP 21615)]
> 0x4004555c in Curl_hash_clean_with_criterium () from /usr/lib/libcurl.so.2
> (gdb) bt
> #0  0x4004555c in Curl_hash_clean_with_criterium () from /usr/lib/libcurl.so.2
> #1  0x4002f82f in Curl_global_host_cache_dtor () from /usr/lib/libcurl.so.2
You're using a global DNS cache here, right?
That is not thread-safe. You cannot use a global DNS cache within libcurl
when using it multi-threaded.
Switch CURLOPT_DNS_USE_GLOBAL_CACHE to FALSE and it is likely to work
better.
No, there is currently no way for the different threads to share the same DNS
cache. We've discussed how to support that, but we haven't implemented
anything like it yet.

If this is not the reason you have this problem, please elaborate a bit
further on what libcurl options you use and what your program actually does
libcurl-wise.
-- Daniel Stenberg -- curl groks URLs -- http://curl.haxx.se/

Received on 2002-05-22