NAME
WWW::RobotRules - Parse robots.txt files
SYNOPSIS
  require WWW::RobotRules;
  my $robotsrules = WWW::RobotRules->new('MOMspider/1.0');

  use LWP::Simple qw(get);

  my $url = "http://some.place/robots.txt";
  my $robots_txt = get $url;
  $robotsrules->parse($url, $robots_txt);

  $url = "http://some.other.place/robots.txt";
  $robots_txt = get $url;
  $robotsrules->parse($url, $robots_txt);

  # Now we can check whether a URL may be fetched from those
  # servers whose "robots.txt" files we have obtained and parsed.
  if ($robotsrules->allowed($url)) {
      my $c = get $url;
      ...
  }
DESCRIPTION
This module parses /robots.txt files as specified in "A Standard for Robot Exclusion", described in <http://info.webcrawler.com/mak/projects/robots/norobots.html>. Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site.

The parsed file is kept in the WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used to parse multiple /robots.txt files.

The following methods are provided:

$rules = WWW::RobotRules->new($robot_name)
    This is the constructor for WWW::RobotRules objects. The first
    argument given to new() is the name of the robot.

$rules->parse($robot_txt_url, $content, $fresh_until)
    The parse() method takes as arguments the URL that was used to
    retrieve the /robots.txt file, and the contents of the file.

$rules->allowed($url)
    Returns TRUE if this robot is allowed to retrieve this URL.

$rules->agent([$name])
    Get/set the agent name. NOTE: Changing the agent name will clear
    the robots.txt rules and expire times out of the cache.
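For illustration, here is a minimal sketch of these methods in action; the host name and the robots.txt content are invented for this example:

  use strict;
  use warnings;
  use WWW::RobotRules;

  my $rules = WWW::RobotRules->new('MOMspider/1.0');

  # Normally this content would be fetched from the server, as in
  # the SYNOPSIS; it is inlined here to keep the example self-contained.
  my $robots_txt = join("\n",
      'User-agent: *',
      'Disallow: /private/',
      '');

  $rules->parse('http://some.place/robots.txt', $robots_txt);

  print $rules->allowed('http://some.place/index.html')
      ? "allowed\n" : "denied\n";   # prints "allowed"
  print $rules->allowed('http://some.place/private/secret.html')
      ? "allowed\n" : "denied\n";   # prints "denied"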
ROBOTS.TXT
The format and semantics of the "/robots.txt" file are as follows (this is an edited abstract of <http://info.webcrawler.com/mak/projects/robots/norobots.html>):

The file consists of one or more records separated by one or more blank lines. Each record contains lines of the form

  <field-name>: <value>

The field name is case insensitive. Text after the '#' character on a line is ignored during parsing. This is used for comments. The following <field-names> can be used:

User-Agent
    The value of this field is the name of the robot the record is
    describing access policy for. If more than one User-Agent field is
    present the record describes an identical access policy for more
    than one robot. At least one field needs to be present per record.
    If the value is '*', the record describes the default access policy
    for any robot that has not matched any of the other records.

Disallow
    The value of this field specifies a partial URL that is not to be
    visited. This can be a full path, or a partial path; any URL that
    starts with this value will not be retrieved.
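As a small sketch of how these rules behave in practice (the host name and record content are invented for the example), field names match case-insensitively and '#' comments are dropped before parsing:

  use strict;
  use warnings;
  use WWW::RobotRules;

  my $rules = WWW::RobotRules->new('MOMspider/1.0');

  # Mixed-case field names and trailing comments parse the same
  # as the canonical spellings.
  my $robots_txt = join("\n",
      'USER-AGENT: *          # same as "User-agent"',
      'disallow: /tmp/        # same as "Disallow"',
      '');

  $rules->parse('http://some.place/robots.txt', $robots_txt);

  print $rules->allowed('http://some.place/tmp/scratch')
      ? "allowed\n" : "denied\n";   # prints "denied"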
ROBOTS.TXT EXAMPLES
The following example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/" or "/tmp/":

  User-agent: *
  Disallow: /cyberworld/map/ # This is an infinite virtual URL space
  Disallow: /tmp/ # these will soon disappear

This example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/", except the robot called "cybermapper":

  User-agent: *
  Disallow: /cyberworld/map/ # This is an infinite virtual URL space

  # Cybermapper knows where to go.
  User-agent: cybermapper
  Disallow:

This example indicates that no robots should visit this site further:

  # go away
  User-agent: *
  Disallow: /
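To see how the second example above plays out, the following sketch (the robot names are invented for the example) parses the same file once for a generic robot and once for "cybermapper":

  use strict;
  use warnings;
  use WWW::RobotRules;

  my $robots_txt = join("\n",
      'User-agent: *',
      'Disallow: /cyberworld/map/',
      '',
      '# Cybermapper knows where to go.',
      'User-agent: cybermapper',
      'Disallow:',
      '');

  my $url = 'http://some.place/robots.txt';

  # A robot not named in any record falls back to the '*' record.
  my $anybot = WWW::RobotRules->new('AnyBot/1.0');
  $anybot->parse($url, $robots_txt);
  print $anybot->allowed('http://some.place/cyberworld/map/venus')
      ? "allowed\n" : "denied\n";   # prints "denied"

  # The cybermapper record has an empty Disallow, so nothing is off limits.
  my $mapper = WWW::RobotRules->new('cybermapper/1.0');
  $mapper->parse($url, $robots_txt);
  print $mapper->allowed('http://some.place/cyberworld/map/venus')
      ? "allowed\n" : "denied\n";   # prints "allowed"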
SEE ALSO
the LWP::RobotUA manpage, the WWW::RobotRules::AnyDBM_File manpage