Package : perl-WWW-RobotRules
Package details
Summary: Parse /robots.txt file
Description:
This module parses _/robots.txt_ files as specified in "A Standard for
Robot Exclusion" (<http://www.robotstxt.org/wc/norobots.html>). Webmasters
can use the _/robots.txt_ file to forbid conforming robots from accessing
parts of their web site.
The parsed files are kept in a WWW::RobotRules object, and this object
provides methods to check if access to a given URL is prohibited. The same
WWW::RobotRules object can be used for one or more parsed _/robots.txt_
files on any number of hosts.
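The typical workflow — construct a rules object with a robot name, feed it the contents of a fetched _/robots.txt_, then query individual URLs — can be sketched as follows. The robot name and host are hypothetical, and the robots.txt body is inlined here instead of being fetched over HTTP:

```perl
use strict;
use warnings;
use WWW::RobotRules;

# Create a rules object identified by our robot's User-Agent name
# (the name is matched against User-agent lines in robots.txt).
my $rules = WWW::RobotRules->new('MyBot/1.0');

# A sample robots.txt body; in practice this would be fetched from
# http://example.com/robots.txt (e.g. with LWP::Simple's get()).
my $robots_txt = <<'EOT';
User-agent: *
Disallow: /private/
EOT

# Register the parsed rules under the URL they came from; the same
# object can hold rules for any number of hosts.
$rules->parse('http://example.com/robots.txt', $robots_txt);

# Check individual URLs against the stored rules.
print $rules->allowed('http://example.com/index.html')
    ? "index: allowed\n" : "index: denied\n";
print $rules->allowed('http://example.com/private/secret.html')
    ? "private: allowed\n" : "private: denied\n";
```

Because the object keys its rules by host, one `WWW::RobotRules` instance can be reused across an entire crawl rather than re-parsing _/robots.txt_ per request.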
URL: https://metacpan.org/release/WWW-RobotRules
License: GPL+ or Artistic
Maintainer: nobody
List of RPMs
- perl-WWW-RobotRules-6.20.0-11.mga9.src.rpm (Mageia 9, x86_64 media, core-release)