curl-library
minor bug in url.c?
Date: Fri, 5 Oct 2001 11:28:17 -0700 (PDT)
I think I noticed a small bug in url.c:
case ENOMEM:
  failf(data, "Insufficient kernel memory was available: %d",
        errno);
  break;
default:
  failf(data, "errno %d\n");   <--------- this doesn't actually pass errno
                                          to failf, so what would you get
                                          for output?
} /* end of switch */
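Presumably the intent was to pass errno along, just like the ENOMEM case
above does; a corrected default case (a sketch, assuming failf takes
printf-style varargs as that case suggests) would be:

default:
  failf(data, "errno %d\n", errno);
  break;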
Also, I have two short questions (not really bugs).
First of all, I noticed BUFSIZE is set to 50K by default for each
curl_easy_init. That eats up quite a bit of RAM in a multithreaded
environment, especially given how ZThreads implements C++ multithreading
(i.e. all code lives in the task class, not in the worker thread itself). Is
there any huge disadvantage to lowering it to somewhere around 5K? A lot of
my requests are for the header only (see the sketch below), so I don't want
to waste all that RAM on buffers.
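For the header-only case, a request can skip the body entirely; here is a
minimal sketch using the documented easy-interface options (error handling
trimmed, and the function name is just for illustration):

#include <curl/curl.h>

/* Fetch only the response headers for a URL; no body is transferred. */
int fetch_headers_only(const char *url)
{
  CURL *handle = curl_easy_init();
  CURLcode res;

  if(!handle)
    return 1;

  curl_easy_setopt(handle, CURLOPT_URL, url);
  curl_easy_setopt(handle, CURLOPT_NOBODY, 1L); /* HEAD-style request, no body */
  curl_easy_setopt(handle, CURLOPT_HEADER, 1L); /* pass headers to the output */

  res = curl_easy_perform(handle);
  curl_easy_cleanup(handle);
  return (int)res;
}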
Secondly, I made the following hack to allow for a fixed limit on download
size, because I've found that some servers don't honor the Range: header and
I don't want to get stuck downloading a 300 MB file by accident. It's
currently implemented as shown below, but I think it would be much nicer to
expose this as a curl_easy_setopt option (a caller-side alternative using the
write callback is sketched after the patch):
if(data->set.no_body)
  return CURLE_OK;

conn->maxdownload = ABSOLUTE_DOWNLOAD_LIMIT;

if(!conn->bits.close) {
  /* If this is not the last request before a close, we must
     set the maximum download size to the size of the expected
     document or else, we won't know when to stop reading! */
  if(-1 != conn->size)
    conn->maxdownload = (conn->size > ABSOLUTE_DOWNLOAD_LIMIT ?
                         ABSOLUTE_DOWNLOAD_LIMIT : conn->size);

  /* If max download size is *zero* (nothing) we already
     have nothing and can safely return ok now! */
  if(0 == conn->maxdownload)
    return CURLE_OK;

  /* What to do if the size is *not* known? */
}
break; /* exit header line loop */
}
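Until something like that exists as a setopt, one caller-side workaround is
to count bytes in a write callback and abort the transfer once a cap is
exceeded: returning fewer bytes than libcurl handed to the callback makes the
transfer stop with CURLE_WRITE_ERROR. A minimal sketch, where MAX_DOWNLOAD
and the counter struct are made up for illustration (CURLOPT_WRITEDATA is the
newer name for what older releases call CURLOPT_FILE):

#include <curl/curl.h>

#define MAX_DOWNLOAD (1024*1024)   /* illustrative 1 MB cap */

struct capped {
  size_t received;                 /* bytes accepted so far */
};

/* Write callback: refuse data once the cap would be exceeded, which makes
   libcurl abort the transfer with CURLE_WRITE_ERROR. */
static size_t cap_write(char *ptr, size_t size, size_t nmemb, void *userp)
{
  struct capped *c = (struct capped *)userp;
  size_t bytes = size * nmemb;

  if(c->received + bytes > MAX_DOWNLOAD)
    return 0;                      /* short return => transfer aborted */

  /* ... write ptr[0..bytes) wherever the data should go ... */
  c->received += bytes;
  return bytes;                    /* tell libcurl everything was consumed */
}

int fetch_capped(const char *url)
{
  struct capped c = { 0 };
  CURL *handle = curl_easy_init();
  CURLcode res;

  if(!handle)
    return 1;

  curl_easy_setopt(handle, CURLOPT_URL, url);
  curl_easy_setopt(handle, CURLOPT_WRITEFUNCTION, cap_write);
  curl_easy_setopt(handle, CURLOPT_WRITEDATA, &c);

  res = curl_easy_perform(handle);  /* CURLE_WRITE_ERROR if the cap was hit */
  curl_easy_cleanup(handle);
  return (int)res;
}

The drawback compared with a hack inside url.c is that the limit is only
enforced as data is delivered to the application, so the connection may still
read a little past the cap before the abort takes effect.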
Comments? Thanks,
Lucas.