curl-and-python
Re: Function to retrieve multiple URLs asynchronously
Date: Thu, 10 Mar 2005 19:18:53 -0800 (PST)
Sorry for not being clear.
Currently, the function only returns one round of URL data (the last
one, I think). That is, if you ask it to retrieve 200 URLs using 40
connections, you only get back 40 pages. The rest are retrieved over
the wire, but are somehow never returned (I can't figure out why).
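For reference, here is roughly the shape I was aiming for. This is a
minimal sketch rather than my exact code, and fetch_urls plus the
buf/url attributes are names I made up for illustration; the point is
that each finished page has to be collected inside the info_read()
step, before its handle gets reused for the next URL:

import pycurl
from io import BytesIO

def fetch_urls(urls, num_conn=40):
    """Fetch every URL concurrently; return a (url, body) pair for each."""
    queue = list(urls)
    multi = pycurl.CurlMulti()
    handles = [pycurl.Curl() for _ in range(min(num_conn, len(queue)))]
    freelist = handles[:]
    results = []                 # grows by one entry per finished URL
    num_processed = 0
    total = len(queue)

    while num_processed < total:
        # Start a transfer on every idle handle while URLs remain.
        while queue and freelist:
            url = queue.pop(0)
            c = freelist.pop()
            c.buf = BytesIO()    # fresh buffer for this transfer
            c.url = url
            c.setopt(pycurl.URL, url)
            c.setopt(pycurl.FOLLOWLOCATION, 1)
            c.setopt(pycurl.WRITEFUNCTION, c.buf.write)
            multi.add_handle(c)

        # Run libcurl's state machine until it stops asking to be re-called.
        while True:
            ret, num_handles = multi.perform()
            if ret != pycurl.E_CALL_MULTI_PERFORM:
                break

        # Harvest finished transfers *here*, before the handles are
        # reused -- collecting c.buf only after the outer loop ends
        # leaves just the final batch, i.e. 40 pages out of 200.
        while True:
            num_q, ok_list, err_list = multi.info_read()
            for c in ok_list:
                results.append((c.url, c.buf.getvalue()))
                multi.remove_handle(c)
                freelist.append(c)
            for c, errno, errmsg in err_list:
                results.append((c.url, None))   # record failures too
                multi.remove_handle(c)
                freelist.append(c)
            num_processed += len(ok_list) + len(err_list)
            if num_q == 0:
                break

        multi.select(1.0)        # wait for activity before looping again

    for c in handles:
        c.close()
    multi.close()
    return results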
Also, I was having a problem with the loop getting stuck at the
select() call, so I crudely modified it. I'm sure someone knows the
right way to do it.
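For what it's worth, here is my rough understanding of how the wait
loop is supposed to look (again only a sketch, and wait_and_perform is
just an illustrative name). The idea is to treat select() returning -1
as "nothing to wait on yet" and back off briefly instead of blocking:

import time
import pycurl

def wait_and_perform(multi):
    """Drive a CurlMulti, waiting between perform() calls without hanging."""
    # Flush the state machine until libcurl stops asking to be re-called.
    while True:
        ret, num_handles = multi.perform()
        if ret != pycurl.E_CALL_MULTI_PERFORM:
            break
    while num_handles:
        # select() returns -1 when libcurl has no sockets to watch yet
        # (e.g. connections still being set up); sleep briefly rather
        # than spinning on it or blocking forever.
        if multi.select(1.0) == -1:
            time.sleep(0.1)
        while True:
            ret, num_handles = multi.perform()
            if ret != pycurl.E_CALL_MULTI_PERFORM:
                break

Corrections welcome if that's not the intended pattern.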
Also, please cc me on any responses, thanks.
--- Daniel Stenberg <daniel-curl_at_haxx.se> wrote:
> On Mon, 7 Mar 2005, gf gf wrote:
>
> > Has anyone written a function to retrieve multiple URLs
> > asynchronously? I tried to hack the retriever-multi.py to do so
> > (see below), but am having some trouble with it.
>
> ... which means?
>
> --
> Daniel Stenberg -- http://curl.haxx.se -- http://daniel.haxx.se
> Dedicated custom curl help for hire: http://haxx.se/curl.html
>