libwww-robotrules-perl 6.02-ok1 source package in openKylin
Changelog
libwww-robotrules-perl (6.02-ok1) yangtze; urgency=medium

  * Build for openKylin.

 -- openKylinBot <email address hidden>  Mon, 25 Apr 2022 22:03:04 +0800
File | Size | SHA-256 Checksum
---|---|---
libwww-robotrules-perl_6.02.orig.tar.gz | 8.8 KiB | 46b502e7a288d559429891eeb5d979461dd3ecc6a5c491ead85d165b6e03a51e
libwww-robotrules-perl_6.02-ok1.debian.tar.xz | 1.7 KiB | 43fa443a52b06b0d05a6f13d90efbd425a52172acccf50b7f0a03ec770333482
libwww-robotrules-perl_6.02-ok1.dsc | 1.9 KiB | 5d185d7b2699396bef2610640a92db87721b157f98a348c6292ce60f5cd82a82
WWW::RobotRules parses /robots.txt files as specified in "A Standard for
Robot Exclusion", at <http://. Webmasters can use the /robots.txt file to
forbid conforming robots from accessing parts of their web site.

The parsed files are kept in a WWW::RobotRules object, and this object
provides methods to check if access to a given URL is prohibited. The same
WWW::RobotRules object can be used for one or more parsed /robots.txt files.
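The parse-then-check workflow described above can be sketched in a short Perl snippet. The robots.txt body, agent name, and URLs below are made-up examples; the method calls (`new`, `parse`, `allowed`) follow the WWW::RobotRules documentation.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::RobotRules;

# Create a rules object keyed to our robot's User-Agent string
# (the agent name here is a hypothetical example).
my $rules = WWW::RobotRules->new('ExampleBot/1.0');

# A hypothetical /robots.txt body; in practice you would fetch it
# over HTTP (e.g. with LWP::Simple) before parsing.
my $robots_txt = <<'EOT';
User-agent: *
Disallow: /private
EOT

# First argument is the URL the robots.txt was fetched from,
# second is its content.
$rules->parse('http://example.com/robots.txt', $robots_txt);

# Consult the parsed rules before fetching each URL.
for my $url ('http://example.com/index.html',
             'http://example.com/private/data') {
    print "$url: ", ($rules->allowed($url) ? 'allowed' : 'denied'), "\n";
}
```

The same object can hold rules for several hosts at once: call `parse` once per fetched /robots.txt, then `allowed` routes each query to the rules for that URL's host.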