curl-and-php
Re: parallel execution
Date: Tue, 10 Jun 2003 08:39:35 -0500
On 6/10/03 5:24 AM, "Daniel Stenberg" <daniel_at_haxx.se> wrote:
>> How can I minimize the time it takes to get data from more than 10 sites?
>> Any ideas about parallel processing?
>
> This can be solved using the multi interface of libcurl (which I believe is
> slowly being incorporated into the PHP/CURL module as we speak), or by using
> threads and doing separate requests in individual threads. I don't know how
> well PHP supports threads.
PHP threading is minimal at this point. Some of the file commands have
callback abilities, but they either don't work sufficiently or exist only in
the CVS for version 5.
I've faced similar obstacles, not only from having to fetch from multiple
backends, but also from the backends' slow response times.
My solution was this:
- create a tmpdir
- write one or more curl config files, each with a distinct
  output and error file (going to the same dir)
- execute the curl binary in the background for each config file
- monitor the PIDs in a PHP loop
- start parsing results as soon as the processes go away
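For reference, one of those config files (passed to curl with -K) could look
something like this; the URL and file names here are made up:

```
# one request per config file; url and paths are hypothetical
url = "http://backend1.example.com/data"
output = "/tmp/fetch1234/backend1.out"
stderr = "/tmp/fetch1234/backend1.err"
max-time = 30
silent
```

Each config file gets its own output and stderr file, so each backend's
result can be parsed independently once its curl process exits.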
This also allowed me to give the user some feedback about progress and the
ability to cancel the processes in the web interface.
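The launch-and-poll pattern above is language-agnostic, so here is a minimal
/bin/sh sketch of it. sleep stands in for "curl -K <configfile>" so the
sketch runs without network access; in PHP you would do the same thing with
shell_exec() to background the processes and posix_kill($pid, 0) to poll
them (both names from the stock PHP API, the rest is illustrative):

```shell
#!/bin/sh
# Launch each "request" in the background and remember its PID.
# (Replace "sleep N" with: curl -K "$configfile" )
sleep 1 &
pid1=$!
sleep 2 &
pid2=$!

for pid in "$pid1" "$pid2"; do
  # kill -0 sends no signal; it only tests whether the process exists
  while kill -0 "$pid" 2>/dev/null; do
    sleep 1   # a PHP loop would usleep() here and report progress
  done
  echo "pid $pid finished - parse its output file now"
done
```

Canceling from the web interface then amounts to sending a real signal
(e.g. kill -TERM) to the recorded PIDs instead of just polling them.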
HTH,
Juergen
Received on 2003-06-10