curl-users
Sporadic Problem
Date: Sun, 21 Jan 2001 13:25:46 EST
Greetings.
I have a rather curious problem that curl keeps throwing at me. I'm trying to
retrieve a set of web pages by sending each of them a cookie. Imagine a set of 15
pages with individual stock information on them that each require the *exact*
same cookie to retrieve. Pretty simple, huh?
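Concretely, each fetch boils down to a single curl call that passes that cookie,
roughly along these lines (the cookie string and URL below are just placeholders;
in my script the cookie option and the URL live in variables, as shown further down):

curl -b "session=XXXXXXXX" -s -S -A "Mozilla/4.0" "http://example.com/quote?symbol=ABC"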
When I run the script, I get sporadic results. Occasionally, it will grab all
the data fine. Other times, curl will return no output for all the pages.
More frequently, however, curl will return no output for a couple of pages,
but get the rest. There is no pattern as to which pages it doesn't return. I
found that curl gets better results if I put a do-nothing for loop that iterates
500 times after the curl command. Here is some of the code I'm using.
# do-nothing delay loop; adding this seems to improve the odds of getting output
for ($i = 0; $i < 500; $i++)
{
    if ($i == 100)
    {
    }
}

# $curl, $url and $par (which carries the cookie) are set up earlier in the script
$command = "$curl \"$url\" $par -s -S -A \"Mozilla/4.0\"";
my @output = `$command`;

# dump whatever came back (normally commented out)
for ($i = 0; $i < @output; $i++)
{
    #print "out: $output[$i]\n";
}

if (!@output)
{
    print "no output\n";
}
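For what it's worth, a check along these lines (reusing $command and @output from
above) might at least tell an empty page apart from curl failing outright; the
retry count of 3 and the stderr file name are arbitrary:

# send curl's stderr to a file and look at its exit code;
# retry a couple of times when nothing comes back
for (my $try = 1; $try <= 3; $try++)
{
    @output = `$command 2>/tmp/curl.err`;
    my $exit = $? >> 8;    # exit code of the curl process
    last if (@output && $exit == 0);
    print "attempt $try: exit code $exit, ", scalar(@output), " lines of output\n";
}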
I'm sorry if this question is ambiguous. I'm at wit's end and I don't know of a
better way to describe it, because I can't find any pattern as to why it almost
"chooses" which pages to return each time.
Anybody ever experience anything like this?
Thanks.
Received on 2001-01-21