curl-and-php
Bypass robots.txt
From: Chris Gralike <Chris.Gralike_at_AMIS.nl>
Date: Tue, 25 Apr 2006 12:20:25 +0200
I think you are best off using the other PHP validation functions to
validate the robots.txt and parse it. Something that might look like:
$remote_site = 'http://www.remotesite.com/';
$file = 'robots.txt';
$full_path_file = $remote_site . $file;

// is_file() does not work on http:// URLs, so try to fetch the
// file instead (requires allow_url_fopen)
$robots = @file_get_contents($full_path_file);
if ($robots !== false) {
    // (validate and parse)
    // (Act on findings...)
} else {
    // (no robots.txt, then....)
}
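For the "(validate and parse)" step, something like the sketch below
could work. The helper is_path_allowed() and the agent name 'MyBot' are
made up for illustration, it only understands User-agent and Disallow
lines, and the fetch uses the curl extension since that is what this
list is about:

// Return true if $path is not disallowed for $agent by $robots.
function is_path_allowed($robots, $path, $agent = 'MyBot')
{
    $applies = false;
    foreach (explode("\n", $robots) as $line) {
        // strip comments and surrounding whitespace
        $line = trim(preg_replace('/#.*/', '', $line));
        if (stripos($line, 'User-agent:') === 0) {
            $ua = trim(substr($line, 11));
            // does this record apply to our agent?
            $applies = ($ua === '*' || stripos($agent, $ua) !== false);
        } elseif ($applies && stripos($line, 'Disallow:') === 0) {
            $rule = trim(substr($line, 9));
            if ($rule !== '' && strpos($path, $rule) === 0) {
                return false; // path matches a Disallow rule for us
            }
        }
    }
    return true; // no matching Disallow rule found
}

$ch = curl_init('http://www.remotesite.com/robots.txt');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FAILONERROR, true); // treat 404 as failure
$robots = curl_exec($ch);
curl_close($ch);

if ($robots !== false && is_path_allowed($robots, '/some/page.html')) {
    // the path is not disallowed, go ahead and fetch it
}

Note this ignores Allow, Crawl-delay and the rest of the robots.txt
grammar, so treat it as a starting point only.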
Kind Regards,
Chris Gralike
AMIS Services BV
Edisonbaan 15
3439 MN NIEUWEGEIN
Postbus 24
3430 AA NIEUWEGEIN
T: 030-601-6079
F: 030-601-6001
M: Chris.Gralike_at_AMIS.nl
W: http://www.AMIS.nl
_______________________________________________
http://cool.haxx.se/cgi-bin/mailman/listinfo/curl-and-php
Received on 2006-04-25