NAME

NotRobotUA — specify user agents that will NOT be classified as crawler robots (search engines)

SYNOPSIS

useragent_string...

DESCRIPTION

The NotRobotUA directive defines a list of user-agent strings; clients whose user agent matches one of these patterns are never classified as crawler robots (search engines) visiting the site.

This directive has priority over RobotUA. If the user agent matches NotRobotUA, then the check for RobotUA is not performed and the client is not treated as an unattended robot.
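
For example, the following sketch (the patterns are illustrative) classifies crawler-like clients as robots with RobotUA, but exempts wget clients via NotRobotUA:

RobotUA <<EOR
  *crawler*
  *wget*
EOR

NotRobotUA <<EOR
  *wget*
EOR

A client sending a wget user agent matches both lists, but because NotRobotUA takes priority, it is not treated as a robot.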

DIRECTIVE TYPE AND DEFAULT VALUE

Global directive

Default value: empty (by default, no user agents are exempted; see SOURCE below)

EXAMPLES

Example: Defining NotRobotUA

NotRobotUA <<EOR
  *wget*
EOR
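
Note that the wildcard list is compiled into a single case-insensitive regular expression (see SOURCE below), so this pattern would match a user agent of Wget/1.21.3 as well as wget/1.12.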

NOTES

For more details regarding web spiders/bots and Interchange, see the robot glossary entry.

For more details regarding user sessions, see the session glossary entry.

AVAILABILITY

NotRobotUA is available in Interchange versions:

4.6.0-5.9.0 (git-head)

SOURCE

Interchange 5.9.0:

Source: lib/Vend/Config.pm
Line 487

['NotRobotUA',     'list_wildcard',      ''],

Source: lib/Vend/Config.pm
Line 3853 (context shows lines 3853-3857)

sub parse_list_wildcard {
    my $value = get_wildcard_list(@_,0);
    return '' unless length($value);
    return qr/$value/i;
}
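
parse_list_wildcard compiles the accumulated wildcard list into one case-insensitive regular expression, which Interchange then matches against the client's user agent. The following standalone Perl sketch illustrates the idea only; it is not the actual get_wildcard_list implementation, and it assumes each * simply maps to the regex .*:

use strict;
use warnings;

# Illustrative only: turn shell-style wildcards into one
# case-insensitive regex, roughly as parse_list_wildcard does.
my @patterns = ('*wget*');
my $value = join '|', map {
    my $p = quotemeta $_;  # escape regex metacharacters
    $p =~ s/\\\*/.*/g;     # restore * as "match anything"
    $p;
} @patterns;
my $re = qr/$value/i;

print "exempt from robot checks\n"
    if 'Wget/1.21.3 (linux-gnu)' =~ $re;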

AUTHORS

Interchange Development Group

SEE ALSO

RobotIP(7ic), RobotUA(7ic)
