curl-users
Re: OS2 get multiple urls doesn't work
Date: Thu, 14 Sep 2000 09:34:22 +0200 (MET DST)
On Thu, 14 Sep 2000 datablitz_at_optusnet.com.au wrote:
> When I try to get multiple URLs (i.e. using []s) curl fails to retrieve
> them. Single URLs work perfectly. I have tried this with different
> sites etc., but have never got it to work.
I don't know what the problem might be, but the version string in your
requests shows you're using an ancient curl version. Try upgrading to a
version 7 release and retry.
I don't know if you'll find an available OS/2 binary for 7.2.1 though. Make
your own or get someone to build one for you.
I don't have the time to hunt or fix bugs in older releases.
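For reference, with a current curl the [] globbing looks like this (the
host and path below are just examples, and note the quotes that keep the
shell from eating the brackets):
  curl -o 'rfc#1.html' 'http://www.example.com/rfc/rfc[1-3].html'
That fetches rfc1.html, rfc2.html and rfc3.html, saving each one with the
#1 in the output name replaced by the current number from the range.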
[snip]
> RFC1.HTML as it appeared on my system:-
>
> <TITLE>Lookup Error</TITLE>
> <H1>Lookup Error</H1>
> Can't retrieve your request -
> '/export/1/ftp/mirror/internic/rfc/rfc0001,rfc,0001,rfc1*' not found
Hm, does that work if you retrieve it as a single URL then? That output
sounds as if there's something wrong on the server end.
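An easy way to check (made-up host again) is to request one of the URLs
the glob would generate, on its own:
  curl 'http://www.example.com/rfc/rfc1.html'
If that single fetch works but the []-version gets you that error page, it
suggests the pattern reaches the server unexpanded or mangled, which an
old client could very well be guilty of.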
> For a single URL, transfer speed is normally over 3000 bytes/sec. Note
> the slow speed after trying a multiple transfer.
The multiple URL thing is all in the 'curl' tool. libcurl has no special
support for multiple URLs; to it, every transfer is a single URL, so it
just can't behave differently whether you asked for one or many. What does
make a difference is the size of the transfer: smaller files are likely to
get transferred at a lower average speed than larger ones, since the fixed
per-request overhead weighs more on them.
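In other words, a globbed command line (example host once more):
  curl 'http://www.example.com/rfc/rfc[1-2].html'
is, as far as the server and libcurl are concerned, the same thing as two
plain single-URL invocations:
  curl 'http://www.example.com/rfc/rfc1.html'
  curl 'http://www.example.com/rfc/rfc2.html'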
> Maybe you can help me with what I was looking for in the RFCs. Is there
> any way to get a list of all URLs in a directory on a 'standard' website?
Yes I can. The answer is no, there's no way. HTTP has no "directory" concept.
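FTP is the protocol with a directory concept, so there curl can give you a
listing (made-up host again):
  curl 'ftp://ftp.example.com/rfc/'
Over HTTP, the nearest thing is whatever the server chooses to return for
a directory-looking URL, often an index page you'd have to parse yourself:
  curl 'http://www.example.com/rfc/'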
-- 
Daniel Stenberg - http://daniel.haxx.se - +46-705-44 31 77
ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol

Received on 2000-09-14