libwww-robotrules-perl 6.02-ok1 source package in openKylin

Changelog

libwww-robotrules-perl (6.02-ok1) yangtze; urgency=medium

  * Build for openKylin.

 -- openKylinBot <email address hidden>  Mon, 25 Apr 2022 22:03:04 +0800

Upload details

Uploaded by:
openKylinBot
Sponsored by:
luzp
Uploaded to:
Yangtze V1.0
Original maintainer:
openKylin Developers
Architectures:
all
Section:
perl
Urgency:
Medium

Downloads

File Size SHA-256 Checksum
libwww-robotrules-perl_6.02.orig.tar.gz 8.8 KiB 46b502e7a288d559429891eeb5d979461dd3ecc6a5c491ead85d165b6e03a51e
libwww-robotrules-perl_6.02-ok1.debian.tar.xz 1.7 KiB 43fa443a52b06b0d05a6f13d90efbd425a52172acccf50b7f0a03ec770333482
libwww-robotrules-perl_6.02-ok1.dsc 1.9 KiB 5d185d7b2699396bef2610640a92db87721b157f98a348c6292ce60f5cd82a82

Binary packages built by this source

libwww-robotrules-perl: database of robots.txt-derived permissions

 WWW::RobotRules parses /robots.txt files as specified in "A Standard for
 Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters
 can use the /robots.txt file to forbid conforming robots from accessing parts
 of their web site.
 .
 The parsed files are kept in a WWW::RobotRules object, and this object
 provides methods to check if access to a given URL is prohibited. The same
 WWW::RobotRules object can be used for one or more parsed /robots.txt files
 on any number of hosts.
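The parse-then-query workflow described above is not unique to Perl; Python's standard-library urllib.robotparser exposes an analogous interface, which makes for a convenient self-contained sketch of the same idea. The robots.txt content and the "MyBot/1.0" user-agent string below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: allow everything except /private/
robots_txt = """\
User-agent: *
Disallow: /private/
"""

# Parse the rules once, then query them for any number of URLs,
# mirroring the WWW::RobotRules parse/check pattern.
rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("MyBot/1.0", "http://example.com/private/data.html"))  # False
print(rp.can_fetch("MyBot/1.0", "http://example.com/index.html"))         # True
```

In WWW::RobotRules the corresponding calls are new(), parse(), and allowed(); the shape of the API is the same: parse the file once per host, then consult the resulting object before each fetch.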