libwww-robotrules-perl 6.02-ok2 source package in openKylin

Changelog

libwww-robotrules-perl (6.02-ok2) yangtze; urgency=medium

  * Modify debian/control.

 -- denghao <email address hidden>  Thu, 15 Sep 2022 12:42:51 +0300

Upload details

Uploaded by: denghao
Sponsored by: Cibot
Uploaded to: Yangtze V1.0
Original maintainer: Openkylin Developers
Architectures: all
Section: perl
Urgency: Medium

Publishing

Series Pocket Component Section
Huanghe V3.0 proposed main perl
Huanghe V3.0 release main perl
Nile V2.0 proposed main perl
Nile V2.0 release main perl
Yangtze V1.0 release main perl
Yangtze V1.0 proposed main perl

Downloads

File Size SHA-256 Checksum
libwww-robotrules-perl_6.02.orig.tar.gz 8.8 KiB 46b502e7a288d559429891eeb5d979461dd3ecc6a5c491ead85d165b6e03a51e
libwww-robotrules-perl_6.02-ok2.debian.tar.xz 1.8 KiB b06e9ff5ab2a7ff9a2002de9a9d893474f729199cdd370961de0f2830f2902cb
libwww-robotrules-perl_6.02-ok2.dsc 1.8 KiB 702c04ca4e38bf7eb9450861dadfd6c616cff90e32523b40a308d5d884caa586

Binary packages built by this source

libwww-robotrules-perl: database of robots.txt-derived permissions

 WWW::RobotRules parses /robots.txt files as specified in "A Standard for
 Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters
 can use the /robots.txt file to forbid conforming robots from accessing parts
 of their web site.
 .
 The parsed files are kept in a WWW::RobotRules object, and this object
 provides methods to check if access to a given URL is prohibited. The same
 WWW::RobotRules object can be used for one or more parsed /robots.txt files
 on any number of hosts.
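
 As a brief illustration of the documented interface (not part of this
 package page), a conforming robot might use the module as sketched below;
 the robot name and URLs are hypothetical:

   use WWW::RobotRules;
   use LWP::Simple qw(get);

   # The constructor takes the robot's User-Agent name; rules in a
   # robots.txt file are matched against this name.
   my $rules = WWW::RobotRules->new('ExampleBot/1.0');

   # Fetch a site's /robots.txt and feed it to the rules object.
   my $robots_url = 'http://www.example.com/robots.txt';
   my $robots_txt = get($robots_url);
   $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

   # One object may hold parsed rules for any number of hosts;
   # allowed() consults the rules for the given URL's host.
   my $page_url = 'http://www.example.com/private/page.html';
   print $rules->allowed($page_url) ? "allowed\n" : "forbidden\n";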