curl-library
progressfunction called too much
Date: Thu, 30 May 2013 18:33:18 +0200
Hello,
I am using pycurl in a small script to download Coursera videos. It works fine, but the callback I installed to display a progress bar in the console was called 22188558 times in 70 seconds of execution, which keeps one core of the machine busy the whole time.
Given that the documentation says it should be called "roughly once per second or sooner", this seems far too often to me. This is the code I use to set it up; it's Python, but I think it's pretty readable:
curl = pycurl.Curl()
curl.setopt(curl.URL, url)
if rate_limit is not None:
    curl.setopt(curl.MAX_RECV_SPEED_LARGE, rate_limit)
file_store = open(filename, "wb")
curl.setopt(curl.WRITEDATA, file_store)
curl.setopt(curl.NOPROGRESS, 0)
curl.setopt(curl.PROGRESSFUNCTION, curl_progress)
curl.setopt(curl.FOLLOWLOCATION, 1)
# cookies
curl.setopt(curl.COOKIEJAR, cookies_filename)
curl.setopt(curl.COOKIEFILE, cookies_filename)
try:
    curl.perform()
except:
    import traceback
    my_logger.error(u"Error downloading file: %s" % traceback.format_exc())
# cleaning
curl.close()
file_store.close()
Even when I define curl_progress as nothing more than:
def curl_progress(dl_total, dl_now, ul_total, ul_now):
    return 0
it is still called so many times that it keeps one CPU core busy anyway.
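The only workaround I can think of so far is to throttle the console drawing inside the callback myself, something like this (just a sketch; draw_progress_bar stands for my actual display code):

import time

last_draw = [0.0]  # time of the last console update

def curl_progress(dl_total, dl_now, ul_total, ul_now):
    # Redraw the progress bar at most once per second, no matter how
    # often libcurl invokes the callback.
    now = time.time()
    if now - last_draw[0] >= 1.0:
        last_draw[0] = now
        draw_progress_bar(dl_total, dl_now)  # placeholder for my display code
    return 0

That keeps the drawing cheap, but as noted above, even a callback that only returns 0 is invoked often enough to keep a core busy.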
Maybe I am missing some option? Thanks.
Regards,
Miguel Angel.
-------------------------------------------------------------------
List admin: http://cool.haxx.se/list/listinfo/curl-library
Etiquette: http://curl.haxx.se/mail/etiquette.html
Received on 2013-05-30