curl-library
Re: libcurl segfault in curl_free()
Date: Mon, 13 Feb 2006 17:31:10 -0700 (MST)
I'm using the C library, not the Perl lib, and I fixed the problem.
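
For the archive, in case anyone else hits this: a very common cause of this
exact symptom is that curl_easy_init() calls curl_global_init() lazily the
first time it runs, and curl_global_init() is not thread-safe, so many
threads racing through their first curl_easy_init() can corrupt libcurl's
global state and crash much later, e.g. in curl_free() at teardown. The
documented pattern is to call curl_global_init() once before spawning any
threads and curl_global_cleanup() once after they have all joined. A minimal
sketch of that setup (pthreads assumed; worker() is an illustrative stand-in
for the crawler's thread body):

    #include <pthread.h>
    #include <curl/curl.h>

    #define NTHREADS 99  /* the crawler's thread count */

    /* Illustrative thread body: this is where the easy-handle loop
       from the original post would go. */
    static void *worker(void *arg)
    {
        CURL *curl = curl_easy_init();  /* safe: globals already set up */
        /* ... curl_easy_setopt() / curl_easy_perform() loop ... */
        curl_easy_cleanup(curl);
        return NULL;
    }

    int main(void)
    {
        pthread_t tid[NTHREADS];
        int i;

        /* Not thread-safe: must run once, before any thread uses libcurl. */
        curl_global_init(CURL_GLOBAL_ALL);

        for (i = 0; i < NTHREADS; i++)
            pthread_create(&tid[i], NULL, worker, NULL);
        for (i = 0; i < NTHREADS; i++)
            pthread_join(tid[i], NULL);

        /* Also not thread-safe: run once, after every thread has joined. */
        curl_global_cleanup();
        return 0;
    }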
Thanks.
- Steve
On Mon, 13 Feb 2006, Kevin Carothers wrote:
> Date: Mon, 13 Feb 2006 16:01:39 -0800
> From: Kevin Carothers <kevindotcar_at_gmail.com>
> Reply-To: libcurl development <curl-library_at_cool.haxx.se>
> To: libcurl development <curl-library_at_cool.haxx.se>
> Subject: Re: libcurl segfault in curl_free()
>
> Hi Steve,
>
> I have no clue what your problem is, but I'd like to ask you a favor:
> can you send me a copy of your "WWW::Curl::Easy" .pm file?
> I need it to try to get a working build of curl so I can integrate
> libcurl on a Win32 machine.
>
> Thanks, and good luck with your problem
>
> Kevin
>
> On 2/13/06, Steve Webb <steve_at_badcheese.com> wrote:
> >
> > Hello.
> >
> > I've got a multi-threaded app that uses libcurl in each thread. The core
> > of each thread is here:
> >
> >     tmp_url = get_next_url();
> >     while (tmp_url) {
> >         curl = curl_easy_init();
> >         curl_easy_setopt(curl, CURLOPT_URL, tmp_url->url);
> >         curl_easy_setopt(curl, CURLOPT_WRITEDATA, outfile);
> >         curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, my_write_func);
> >         curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
> >         // curl_easy_setopt(curl, CURLOPT_NOPROGRESS, 1L);
> >         curl_easy_setopt(curl, CURLOPT_TIMEOUT, 30L);
> >         curl_easy_perform(curl);
> >         /* size is a double, retcode a long, per the CURLINFO docs */
> >         ret = curl_easy_getinfo(curl, CURLINFO_SIZE_DOWNLOAD, &size);
> >         curl_easy_getinfo(curl, CURLINFO_RESPONSE_CODE, &retcode);
> >         if (retcode == 200) {
> >             tmp_url->new_size = size;
> >         } else {
> >             tmp_url->new_size = tmp_url->old_size;
> >         }
> >         fprintf(stderr, "thread #%02d, size:(%6g/%6g), ret: %3ld, url: %s\n",
> >                 thread_num, tmp_url->old_size, size, retcode, tmp_url->url);
> >         curl_easy_cleanup(curl);
> >         tmp_url = get_next_url();
> >     }
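> >
> > (my_write_func isn't shown above; any CURLOPT_WRITEFUNCTION callback with
> > the standard prototype will do. A minimal version, for reference:
> >
> >     #include <stdio.h>  /* fwrite, FILE */
> >
> >     /* Write callback: must return the number of bytes it handled. */
> >     size_t my_write_func(void *ptr, size_t size, size_t nmemb, void *stream)
> >     {
> >         return fwrite(ptr, size, nmemb, (FILE *)stream);
> >     }
> >
> > The real callback may differ; this is just the expected shape.)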
> >
> > When the threads all exit, I (occasionally) get a segfault in curl_free().
> > It doesn't happen every time; I might need to run it several times before
> > I get a segfault, but it *does* happen. The source spawns 99 threads; you
> > can change that if you'd like, but keep the data file the same: shorter
> > data files segfault less often, so the longer data file should produce a
> > segfault sooner.
> >
> > How to reproduce:
> >
> > cd /tmp
> > wget http://badcheese.com/~steve/crawler.tar.gz
> > tar xzvf crawler.tar.gz
> > cd crawl
> > make
> > ./crawl
> >
> > Any help would be greatly appreciated!
> >
> > - Steve
> >
> > --
> > EMAIL: (h) steve@badcheese.com WEB: http://badcheese.com/~steve
> >
> >
>
--
EMAIL: (h) steve@badcheese.com WEB: http://badcheese.com/~steve

Received on 2006-02-14