curl-users
Re: GET request
Date: Mon, 26 Nov 2001 10:44:40 +0100 (MET)
On Mon, 26 Nov 2001, Nick Chirca wrote:
> I am having some Perl/curl problems. This may be a little off topic,
> because it's not directly related to curl/libcurl, but someone from these
> lists may have a solution for me.
NOTE: you mailed this to the curl-main mailing list and to the admin address
of the libcurl list. There's no point in mailing the admin address with
normal list posts.
[ getting lots of web pages explained ]
> And it seems that some strange things happened... What happened is that the
> sysadmin from the vl.ro domain deleted my account because of this script I
> left in the background. I managed to actually kill the main script when it
> had no activity (when it couldn't download a page), but he said that my
> script damaged the entire internet traffic from that server... one of two
> things happened:
> 1-my simple HTTP client used all the bandwidth (which is impossible to my
> knowledge)
> 2-the server that my HTTP client was running on actually got messed up,
> somehow, by my script.
>
> I can't understand what happened, because the admin said that when he
> stopped my perl scripts, the Internet could be reached again... How could
> my perl script do that?
>
> If I didn't provide enough details, sorry. I could have attached the
> code, but it's kind of big and messy. Any ideas?
Well, we can't really tell, since we don't know anything about your script,
your bandwidth, the site you're talking about or its bandwidth.
Many site owners consider it a bad thing to "suck" every web page from their
site, especially when done at very high speed (no pause between the fetches).
Also, if the remote site has limited bandwidth while yours is much better,
there's indeed the possibility that you starve out other visitors to the site
while you're downloading the (entire) site.
If the server was messed up by your script, then it surely must be a silly
server, but then there are no laws against that, and many a server out there
is truly stupid.
You might be more popular if you make sure your script doesn't work quite as
fast as it can. Most search engines, for example, only make one request per
second during harvesting.
-- Daniel Stenberg -- curl groks URLs -- http://curl.haxx.se/
Received on 2001-11-26