curl-and-python
RE: Function to retrieve multiple URLs asynchronously
Date: Fri, 11 Mar 2005 13:48:33 +0100
that code is part of an outer loop which subsequently calls info_read and
select. this whole procedure (perform, info_read, select) is repeated until
there is no more work to do.

one thing though -- the code does a select(..) without a timeout, so unless
there is activity on the file descriptors this will block indefinitely. is
this harmless, or does the multi api assume that select times out
periodically?
- kjetil
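The concern above can be demonstrated without curl at all: select() with an explicit timeout returns once the bound expires even when no descriptor becomes ready, which is what keeps a perform/select loop from blocking forever. A minimal sketch using a plain socketpair (nothing here is pycurl-specific):

```python
import select
import socket
import time

# two connected sockets; we never write to one end, so the other
# end never becomes readable
r, w = socket.socketpair()

start = time.monotonic()
# select with a 0.2-second timeout: with no activity on the
# descriptor, it returns an empty ready-list after the timeout
readable, _, _ = select.select([r], [], [], 0.2)
elapsed = time.monotonic() - start

r.close()
w.close()
```

With no timeout argument, the same call would block until activity arrives, which is the scenario the question is about.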
-----Original Message-----
From: curl-and-python-bounces_at_cool.haxx.se
[mailto:curl-and-python-bounces_at_cool.haxx.se] On Behalf Of Daniel Stenberg
Sent: 11 March 2005 13:08
To: curl stuff in python
Cc: 'gf gf'
Subject: RE: Function to retrieve multiple URLs asynchronously
On Fri, 11 Mar 2005, Kjetil Jacobsen wrote:
> while 1:
>     ret, num_handles = m.perform()
>     if ret != pycurl.E_CALL_MULTI_PERFORM:
>         break
>
> while(CURLM_CALL_MULTI_PERFORM ==
>       curl_multi_perform(multi_handle, &still_running));
>
> what am i missing here?
It might very well be me who's missing something here.

When that loop ends, it doesn't signify anything other than that the app
should wait for more action on the socket(s), or wait for a timeout, and then
do it all over again -- until 'still_running' turns zero.

If that is what the python code does, then I was wrong.
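For illustration, the control flow described here (an inner loop that calls perform while it asks to be re-called, then a bounded wait, repeated until nothing is still running) can be modeled with a scripted stand-in object. FakeMulti and its return values are invented for this sketch so the loop shape can run standalone; they are not part of pycurl:

```python
# toy constants mirroring the pycurl/libcurl return-code idea
E_CALL_MULTI_PERFORM, E_OK = -1, 0

class FakeMulti:
    """Stand-in for a multi handle, scripted to finish one
    transfer per outer cycle (hypothetical, for illustration)."""
    def __init__(self, transfers):
        self.transfers = transfers
        self._recall = True
    def perform(self):
        # first call in each cycle asks to be called again,
        # the second reports one transfer as completed
        if self._recall:
            self._recall = False
            return E_CALL_MULTI_PERFORM, self.transfers
        self._recall = True
        self.transfers -= 1
        return E_OK, self.transfers
    def select(self, timeout):
        # real code would select() on libcurl's descriptors here,
        # with a timeout so the wait is bounded
        pass

m = FakeMulti(3)
cycles = 0
while True:
    # inner loop: keep calling perform while it wants another call
    while True:
        ret, num_handles = m.perform()
        if ret != E_CALL_MULTI_PERFORM:
            break
    # outer loop ends only when no transfers remain
    if num_handles == 0:
        break
    m.select(1.0)
    cycles += 1
```

The inner loop ending is not completion; only num_handles (still_running) reaching zero is.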
-- Daniel Stenberg -- http://curl.haxx.se -- http://daniel.haxx.se
Dedicated custom curl help for hire: http://haxx.se/curl.html
_______________________________________________
http://cool.haxx.se/mailman/listinfo/curl-and-python
Received on 2005-03-11