curl-library
CURLOPT_TIMEOUT curl_easy_setopt
Date: Sat, 22 Feb 2003 22:56:01 -0500
Hi,
I'm writing a multithreaded web crawler; each thread does basically
the following:
1. create an easycurl handle
2. setopts _NOSIGNAL(1), _NOPROGRESS(1), _WRITEFUNCTION(myfunc),
_FAILONERROR(1), _FOLLOWLOCATION(1), _MAXREDIRS(3),
_USERAGENT("mycrawl+email"), and _TIMEOUT(3)
3. enter bounded download loop
4. select a URL from my list of known urls
5. setopt _FILE(memlocation), _URL(url to download)
6. call curl_easy_perform
7. parse content in memlocation to populate list of known urls
8. continue looping until bound max
9. outside the bounded loop, clean up the curl handle
Everything works great except when I set the _TIMEOUT option in step 2.
When I do that,
I get a core dump that always points to the domain name resolver somewhere
in the curl library.
However, if I don't set the timeout, the process does not crash, but my
worker threads will hang
in curl_easy_perform for upwards of 200 seconds.
Any ideas? This behavior usually starts to show up after 500+ downloads...
-todd
Received on 2003-02-23