
RE: designing multiple file support in libcurl

From: Rich Gray <Rich.Gray_at_PlusTechnologies.com>
Date: Mon, 12 Feb 2001 13:15:47 -0500

Why not just have a "cached session" mode? Once set to this mode, libcurl
would maintain a session table, automatically adding an entry for each new
host/protocol combination a transfer uses. Sessions would not be closed
automatically at the end of a transfer; if a subsequent request goes back to
the same host/protocol, the session is simply re-used. Sessions would be
closed at the end of the curl run (or by an explicit command).

Thus if you
1) get an https document from host a
2) ftp it to host b
3) get another https doc from host a
4) ftp it to host b
5) ftp something from host b
6) http it to host c

The same session could then be re-used for steps 1 and 3 (https to host a)
and for steps 2, 4 and 5 (ftp to host b).
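To make the idea concrete, here is roughly how I picture the session table.
Every name below (session_cache, cached_session, find_or_open_session) is
made up purely for illustration; none of it is existing libcurl code:

#include <stdlib.h>
#include <string.h>

struct cached_session {
  char proto[16];            /* "https", "ftp", ... */
  char host[256];
  int  port;
  int  sockfd;               /* the connection kept open between transfers */
  struct cached_session *next;
};

struct session_cache {
  struct cached_session *head;
};

/* Look up a cached session for this protocol/host/port; add a new entry
 * to the table only if none is cached yet. */
static struct cached_session *
find_or_open_session(struct session_cache *cache,
                     const char *proto, const char *host, int port)
{
  struct cached_session *s;
  for(s = cache->head; s; s = s->next) {
    if(!strcmp(s->proto, proto) && !strcmp(s->host, host) && s->port == port)
      return s;              /* same host/protocol: simply re-use it */
  }
  s = calloc(1, sizeof(*s));
  if(!s)
    return NULL;
  strncpy(s->proto, proto, sizeof(s->proto) - 1);
  strncpy(s->host, host, sizeof(s->host) - 1);
  s->port = port;
  s->sockfd = -1;            /* connect lazily when the transfer starts */
  s->next = cache->head;
  cache->head = s;
  return s;
}

Closing the whole table at the end of the run (or on explicit command) would
then just be a walk over the list, closing each socket and freeing each entry.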

Seems like the only problem you are likely to run into here is one session
timing out while you are doing a transfer on another. Perhaps there could be
provision for an automatic re-connect.
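
For instance (a rough, Linux-flavoured sketch of one way to detect it;
nothing that exists in libcurl today), the cache could peek at a socket
before re-using it and reconnect if the server has already dropped it:

#include <errno.h>
#include <sys/types.h>
#include <sys/socket.h>

/* Hypothetical helper: returns 1 if a cached connection still looks
 * usable, 0 if the peer has closed it (e.g. after an idle timeout), so
 * the caller knows to reconnect before starting the next transfer. */
static int session_still_alive(int sockfd)
{
  char byte;
  ssize_t n = recv(sockfd, &byte, 1, MSG_PEEK | MSG_DONTWAIT);
  if(n == 0)
    return 0;                /* orderly close by the server */
  if(n < 0 && (errno == EAGAIN || errno == EWOULDBLOCK))
    return 1;                /* idle but still connected */
  return n > 0;              /* data already pending; still alive */
}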

There was a suggestion earlier that curl should sort multiple requests so as
to do them in an efficient order (all of a session's transactions together).
While I can't come up with a solid reason why I think this is a bad idea, it
seems far from goodness. There may be cases where there is a statefulness
involved in the transfers, and doing them out of order may screw that up. Let
the programmer drive the show: if the programmer wants things in a certain
order, let it be so. Perhaps sorting the transactions could be an option.
 
MHO,
Rich

> -----Original Message-----
> From: Daniel Stenberg [mailto:daniel_at_haxx.se]
> Sent: Monday, February 12, 2001 9:02 AM
> To: libcurl Mailing list
> Cc: Curl Mailinglist
> Subject: designing multiple file support in libcurl
>
>
> How would the interface work?
>
> (This is cross-posted to both the curl mailing lists, the discussion of
> this should probably be held in the libcurl list.)
>
> I've been toying with the idea (and some source code) to make libcurl
> capable of transferring any number of files using the same socket
> connection. I imagine I'd enable this at least for http and ftp, probably
> with ftp coming first.
>
> How would a libcurl-using dude want to add more than one URL? How should
> curl signal the application when there's another file coming?
>
> When adding more than one URL there's this chance that the following URLs
> aren't using the same server and then curl won't be able to use the same
> connection for them. I thought I'd let the library find that out itself, so
> that we can pass a long list of URLs and libcurl will download those in the
> list that are on the same server, just skipping the rest.
>
> Would it be enough to have a setopt() option that takes a linked list of
> URL strings? Or should I just allow CURLOPT_URL to be used multiple times?
>
> Would it be enough if I introduced another callback hook that gets called
> when a new file is being downloaded? It could have a few interesting
> parameters as well as the one passed to the write callbacks.
>
> We should also consider the effects on the reversed transfer, when
> uploading multiple files to the remote server...
>
> The mail is a bit chaotic, but so is my brain at the moment! ;-)
>
> --
> Daniel Stenberg -- curl project maintainer -- http://curl.haxx.se/
>
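
Just to make Daniel's two alternatives above concrete, this is roughly how I
imagine they could look from an application. CURLOPT_URLLIST,
CURLOPT_NEWFILEFUNCTION and struct curl_urlnode are names I have made up for
the sake of the example; none of them exists in libcurl:

#include <curl/curl.h>

/* Placeholder values so the sketch compiles; neither option exists. */
#define CURLOPT_URLLIST          ((CURLoption)9901)   /* hypothetical */
#define CURLOPT_NEWFILEFUNCTION  ((CURLoption)9902)   /* hypothetical */

struct curl_urlnode {
  char *url;
  struct curl_urlnode *next;
};

/* Alternative 1: a single setopt() that takes a linked list of URLs. */
static void setup_with_list(CURL *handle, struct curl_urlnode *urls)
{
  curl_easy_setopt(handle, CURLOPT_URLLIST, urls);
}

/* Alternative 2: allow CURLOPT_URL to be set several times, each call
 * appending another URL to the same transfer. */
static void setup_with_repeats(CURL *handle)
{
  curl_easy_setopt(handle, CURLOPT_URL, "ftp://host.b/file1");
  curl_easy_setopt(handle, CURLOPT_URL, "ftp://host.b/file2");
}

/* A possible "new file starting" hook next to the write callback, so the
 * application can e.g. open a separate output file per URL. */
static int new_file_cb(const char *url, int filenum, void *userdata)
{
  (void)url; (void)filenum; (void)userdata;
  return 0;                  /* non-zero could mean "skip this one" */
}

static void setup_callback(CURL *handle)
{
  curl_easy_setopt(handle, CURLOPT_NEWFILEFUNCTION, new_file_cb);
}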
Received on 2001-02-12