cURL / Mailing Lists / curl-users / Single Mail


GET request

From: Nick Chirca <>
Date: Mon, 26 Nov 2001 01:25:50 +0200 (EET)

I am having some Perl/curl problems. This may be a little off topic,
because it's not directly related to curl/libcurl, but someone on these
lists may have a solution for me. I am trying to get some information
out of a web site: I download a main page, extract the 10 (or fewer)
links from it, download each of those links one by one, extract the
info from each, then download another main page, and so on. The problem
is that at some point in this loop the page I am asking for never
arrives, even though the HTTP request has been sent, and the script
blocks. I had the same problem when using the libwww Perl module. I
don't know how to handle this; what can I do to prevent it from
happening? I had similar problems when I tried to extract info
(downloading pages one at a time, one by one) from another web site.
And it seems that some strange things happened. The sysadmin of the
domain deleted my account because of this script I left running in the
background. I managed to kill the main script when it had no activity
(when it couldn't download a page), but he said that my script damaged
the entire Internet traffic from that server. One of two things must
have happened:
1-my simple HTTP client used all the bandwidth (which seems impossible
to me), or
2-the server that my HTTP client was running on was somehow messed up
by my script.

I can't understand what happened, because the admin said that when he
stopped my Perl scripts, the Internet could be reached again. How could
my Perl script do that?
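One way a harvesting script can choke a shared server is by retrying failed fetches in a tight loop, hammering the network nonstop. Bounding the retries and sleeping between attempts keeps the script polite. A rough sketch, with a made-up URL (this is a guess at what went wrong, not a diagnosis):

```shell
#!/bin/sh
# Sketch: retry a fetch a bounded number of times, pausing between
# attempts, so a dead server never turns into a busy retry loop.
url="http://example.com/main.html"   # hypothetical target
attempts=0
until curl --max-time 30 -s -o main.html "$url"
do
  attempts=$((attempts + 1))
  if [ "$attempts" -ge 3 ]; then
    echo "giving up on $url" >&2
    break
  fi
  sleep 10   # back off before trying again
done
```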

If I didn't provide enough details, sorry. I could have attached the
code, but it's kind of big and messy. Any ideas?

"It's nice to be important, but it's more important to be nice!"
						SCOOTER
Received on 2001-11-26