curl-library
Re: Multiple Calls to Curl to do Multiple Downloads with each Call
Date: Fri, 21 Feb 2014 14:10:44 -0500
Hi Phil!
I do not have an answer to your specific question, but I do have a
design suggestion, below.
On Fri, Feb 21, 2014 at 12:35 PM, Phil Welch <philmorew_at_yahoo.com> wrote:
> I've used Curl in both C and PHP and was very pleased with its functionality. In addition, I have used it to process single files and an array of files a la 10-at-a-time. What I have not been able to do is to call curl so that I can download, say, 4000 images, then do a bunch of other unrelated processing [during which I am reloading the array], and then process the newly refilled array, even though I follow essentially the same process that works with a single file. In gdb it seems like a pointer failure that occurs after processing anywhere from as few as 1 to as many as 10, before abending in various places. Here's an example:
>
> *** glibc detected *** ./insertProducts: corrupted double-linked list: 0x0000000000fb2270 ***
>
> Although I am using a fair number of pointers, I am not doing anything with a double-linked list.
>
> Because I have about a thousand lines of C code that need to run between calls to Curl, embedding them within Curl [as I've done on occasion when processing 1 at a time] is a bit challenging to implement - not impossible, but difficult.
Rather than embedding your processing code within curl (i.e., as I believe you
mean, running that code from the curl callback), you could use curl to
download the images and post them to a producer-consumer queue from a
dedicated "curl" thread. Your large processing code would then run in a
separate processing thread. It would pluck images (or whatever) off the
producer-consumer queue and process them, as well as run whatever other
tasks it needs to attend to.
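Something along these lines might work for the queue (an untested sketch;
names like image_queue and QUEUE_CAP are just illustrative):

/* Minimal producer-consumer queue sketch: a fixed-size ring buffer
 * protected by a pthread mutex and two condition variables. */
#include <pthread.h>
#include <stdlib.h>

struct image {                    /* one completed download */
    char  *data;
    size_t size;
};

#define QUEUE_CAP 64

struct image_queue {
    struct image   *items[QUEUE_CAP];
    size_t          head, tail, count;
    pthread_mutex_t lock;
    pthread_cond_t  not_empty;
    pthread_cond_t  not_full;
};

void queue_init(struct image_queue *q)
{
    q->head = q->tail = q->count = 0;
    pthread_mutex_init(&q->lock, NULL);
    pthread_cond_init(&q->not_empty, NULL);
    pthread_cond_init(&q->not_full, NULL);
}

/* Called from the curl thread once an image is fully downloaded. */
void queue_push(struct image_queue *q, struct image *img)
{
    pthread_mutex_lock(&q->lock);
    while (q->count == QUEUE_CAP)             /* don't outrun the consumer */
        pthread_cond_wait(&q->not_full, &q->lock);
    q->items[q->tail] = img;
    q->tail = (q->tail + 1) % QUEUE_CAP;
    q->count++;
    pthread_cond_signal(&q->not_empty);
    pthread_mutex_unlock(&q->lock);
}

/* Called from the processing thread; blocks until an image is available. */
struct image *queue_pop(struct image_queue *q)
{
    pthread_mutex_lock(&q->lock);
    while (q->count == 0)
        pthread_cond_wait(&q->not_empty, &q->lock);
    struct image *img = q->items[q->head];
    q->head = (q->head + 1) % QUEUE_CAP;
    q->count--;
    pthread_cond_signal(&q->not_full);
    pthread_mutex_unlock(&q->lock);
    return img;
}

The condition variables let the processing thread block cheaply while the
curl thread is still downloading, and keep the curl thread from getting too
far ahead of the processing.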
If you use a curl callback in the curl thread, it would simply accumulate data
from the curl buffer until an image was completely downloaded, and then
post the image to the queue to be processed by your code in the processing
thread.
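The write callback could be as simple as something like this (also untested;
write_cb, download_buf and fetch_one are made-up names, and queue_push /
struct image come from the queue sketch above):

#include <curl/curl.h>
#include <stdlib.h>
#include <string.h>

struct download_buf {             /* bytes received so far for one image */
    char  *data;
    size_t size;
};

/* libcurl calls this repeatedly with chunks of the response body;
 * we just append them to the buffer. */
static size_t write_cb(char *ptr, size_t size, size_t nmemb, void *userdata)
{
    struct download_buf *buf = userdata;
    size_t bytes = size * nmemb;
    char *tmp = realloc(buf->data, buf->size + bytes);
    if (!tmp)
        return 0;                 /* a short return count aborts the transfer */
    buf->data = tmp;
    memcpy(buf->data + buf->size, ptr, bytes);
    buf->size += bytes;
    return bytes;
}

/* In the curl thread: download one URL, then hand the finished image
 * to the processing thread via the queue. */
void fetch_one(CURL *curl, const char *url, struct image_queue *q)
{
    struct download_buf buf = { NULL, 0 };
    curl_easy_setopt(curl, CURLOPT_URL, url);
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &buf);
    if (curl_easy_perform(curl) == CURLE_OK) {
        struct image *img = malloc(sizeof *img);
        img->data = buf.data;
        img->size = buf.size;
        queue_push(q, img);
    } else {
        free(buf.data);
    }
}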
It sounds like you're using Linux, so you could use pthreads, as in the
sketches above. (You could most likely use pthreads on Windows or other
Unix variants, as well.)
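Wiring it together would then be roughly this (again just a sketch;
curl_thread_main and processing_thread_main are placeholders for your own
loops):

#include <curl/curl.h>
#include <pthread.h>

static void *curl_thread_main(void *arg)
{
    struct image_queue *q = arg;
    /* create a CURL handle and loop over your 4000 URLs,
     * calling fetch_one(curl, url, q) for each one */
    (void)q;
    return NULL;
}

static void *processing_thread_main(void *arg)
{
    struct image_queue *q = arg;
    /* for each expected image: struct image *img = queue_pop(q);
     * process it, then free img->data and img */
    (void)q;
    return NULL;
}

int main(void)
{
    struct image_queue q;
    queue_init(&q);
    curl_global_init(CURL_GLOBAL_DEFAULT);   /* before starting any threads */

    pthread_t dl, proc;
    pthread_create(&dl, NULL, curl_thread_main, &q);
    pthread_create(&proc, NULL, processing_thread_main, &q);

    pthread_join(dl, NULL);
    pthread_join(proc, NULL);

    curl_global_cleanup();
    return 0;
}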
> So, before I move in that direction I thought I'd ask if there's any example code where Curl is called multiple times [and] processes an array of url downloads during each iteration. Again, 1 at a time is not a problem; many at a time is.
(Again, I don't have an answer for your specific question.)
> Any thoughts would be appreciated.
>
> Phil Welch
Good luck.
K. Frank
-------------------------------------------------------------------
List admin: http://cool.haxx.se/list/listinfo/curl-library
Etiquette: http://curl.haxx.se/mail/etiquette.html
Received on 2014-02-21