curl-library
libcurl segfault in curl_free()
Date: Mon, 13 Feb 2006 14:28:08 -0700 (MST)
Hello.
I've got a multi-threaded app that uses curl in each thread. The core of
the thread is here:
    tmp_url = get_next_url();
    while (tmp_url) {
        curl = curl_easy_init();
        curl_easy_setopt(curl, CURLOPT_URL, tmp_url->url);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, outfile);
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, my_write_func);
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1);
        // curl_easy_setopt(curl, CURLOPT_NOPROGRESS, 1);
        curl_easy_setopt(curl, CURLOPT_TIMEOUT, 30);
        curl_easy_perform(curl);
        ret = curl_easy_getinfo(curl, CURLINFO_SIZE_DOWNLOAD, &size);
        curl_easy_getinfo(curl, CURLINFO_RESPONSE_CODE, &retcode);
        if (retcode == 200) {
            tmp_url->new_size = size;
        } else {
            tmp_url->new_size = tmp_url->old_size;
        }
        /* retcode must be a long for CURLINFO_RESPONSE_CODE, hence %ld */
        fprintf(stderr, "thread #%02d, size:(%6g/%6g), ret: %3ld, url: %s\n",
                thread_num, tmp_url->old_size, size, retcode, tmp_url->url);
        curl_easy_cleanup(curl);
        tmp_url = get_next_url();
    }
When the threads all exit, I (occasionally) get a segfault in curl_free().
It doesn't happen every time - I may need to run it several times before
I get a segfault, but it *does* happen. The source spawns 99 threads;
you can change that if you'd like, but keep the data file the same:
shorter data files segfault less often, so the longer data file should
produce a segfault more quickly.
How to reproduce:
cd /tmp
wget http://badcheese.com/~steve/crawler.tar.gz
tar xzvf crawler.tar.gz
cd crawl
make
./crawl
Any help would be greatly appreciated!
- Steve
--
EMAIL: (h) steve@badcheese.com WEB: http://badcheese.com/~steve

Received on 2006-02-13