
curl-users

Re: Threading/forking support in curl

From: Daniel Stenberg <daniel-curl_at_haxx.se>
Date: Wed, 8 Sep 2004 18:53:21 +0200 (CEST)

On Wed, 8 Sep 2004, Seth Mos wrote:

> I am in search of threading/forking support for a download manager like curl
> or wget. However, this seems to be a missing option.
>
> If it does exist in curl, please point it out.

It does not. And I personally have no plans on adding such support either.

> The idea is to fetch the first page and then fork a separate process for
> fetching the sub-pages. This is of course a good way to kill off a DSL line,
> so it needs a config option to limit the number of forks to, say, 5 or some
> specified value.

Speaking HTTP over many simultaneous connections from the same client to the
same site is not considered goodness according to RFC 2616 (it says a
single-user client SHOULD NOT maintain more than two connections to any
server).

Besides, curl doesn't get sub-pages by itself.

> The need for this asynchronous fetching arises when the server hosting the
> page is on a relatively high-latency link (not necessarily low speed) and
> thus serializing the request queue turns into a huge traffic jam and takes
> really long.

That problem is generally solved by pipelining in HTTP land, which curl
doesn't support either! ;-)

> I already wrote a shell script utility for something else which has a
> process limiter, but I still have the "did we fetch this page yet?" problem,
> which for large sites is probably best solved using something like a db, I
> guess.

We provide libcurl for exactly this kind of purpose. Write your own client
that transfers stuff the way you want it: libcurl does all the transfer work,
you do the rest. A rough sketch of such a client follows below.
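
For what it's worth, here is a minimal sketch of what such a client could look
like, using libcurl's multi interface to keep at most five transfers going at
once. The URL list, MAX_PARALLEL and start_one() are made up for the example,
and error checking is left out:

#include <stdio.h>
#include <unistd.h>
#include <sys/select.h>
#include <curl/curl.h>

#define MAX_PARALLEL 5  /* the "limit it to, say, 5" idea from above */

/* made-up URL list; a real client would build this from the first page */
static const char *urls[] = {
  "http://example.com/page1.html",
  "http://example.com/page2.html",
  "http://example.com/page3.html",
};
#define NUM_URLS (sizeof(urls)/sizeof(urls[0]))

/* create one easy handle for a URL and hand it to the multi handle */
static void start_one(CURLM *multi, const char *url)
{
  CURL *easy = curl_easy_init();
  curl_easy_setopt(easy, CURLOPT_URL, url);
  /* data goes to stdout by default; set CURLOPT_WRITEFUNCTION and
     CURLOPT_WRITEDATA to store it somewhere sensible instead */
  curl_multi_add_handle(multi, easy);
}

int main(void)
{
  CURLM *multi;
  unsigned int next = 0;   /* next URL to start */
  int active = 0;          /* transfers currently in progress */
  int still_running;

  curl_global_init(CURL_GLOBAL_ALL);
  multi = curl_multi_init();

  /* prime the pump with at most MAX_PARALLEL transfers */
  while(next < NUM_URLS && active < MAX_PARALLEL) {
    start_one(multi, urls[next++]);
    active++;
  }

  while(active) {
    fd_set rd, wr, ex;
    int maxfd = -1;
    struct timeval timeout = { 1, 0 };
    CURLMsg *msg;
    int msgs_left;

    /* let libcurl move all transfers forward */
    while(curl_multi_perform(multi, &still_running) ==
          CURLM_CALL_MULTI_PERFORM)
      ;

    /* reap finished transfers and start queued URLs in their place */
    while((msg = curl_multi_info_read(multi, &msgs_left))) {
      if(msg->msg == CURLMSG_DONE) {
        CURL *done = msg->easy_handle;
        curl_multi_remove_handle(multi, done);
        curl_easy_cleanup(done);
        active--;
        if(next < NUM_URLS) {
          start_one(multi, urls[next++]);
          active++;
        }
      }
    }

    if(!active)
      break;

    /* wait until libcurl has something to do */
    FD_ZERO(&rd);
    FD_ZERO(&wr);
    FD_ZERO(&ex);
    curl_multi_fdset(multi, &rd, &wr, &ex, &maxfd);
    if(maxfd >= 0)
      select(maxfd + 1, &rd, &wr, &ex, &timeout);
    else
      usleep(100000); /* no fds yet (e.g. name resolving); nap briefly */
  }

  curl_multi_cleanup(multi);
  curl_global_cleanup();
  return 0;
}

The multi interface drives all transfers from a single process, so no forking
or threading is needed; the refill-on-completion loop is what gives you the
"at most N at a time" limit.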

-- 
      Daniel Stenberg -- http://curl.haxx.se -- http://daniel.haxx.se
       Dedicated custom curl help for hire: http://haxx.se/curl.html
Received on 2004-09-08