
Filter Rules: Using the "deny" rule

The "deny" filter rule simply means that any web page affected should not be placed in the index.

The "deny" filter is equivalent to other exclusion methods like "robots.txt" or the robots meta tag, except that "deny"-type rules are always applied, even when the "crawler: rogue" setting is set to true. This is because an administrator's deny rules are always set up by him, whereas other robots exclusion rules might be set up by others and thus need to be overridden.

Another difference between "deny" rules and other exclusion methods is that a "deny" rule can be overridden by the related "always allow" rule. This lets administrators set up highly granular rules for which files to allow in the index. An administrator could, for example, deny all pages at "xav.com", but then override this for the single file "xav.com/contact.html" by adding that URL to an "always allow" rule.

If a web page is affected by both "deny" and "requires approval" rules, then it will be denied.
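
Taken together, the precedence among the three rule types can be sketched as follows. Again, this is an illustrative Python example rather than FDSE's code, and the prefix matching is an assumption; it shows only the decision order described above.

    # Sketch of rule precedence: "always allow" beats "deny",
    # and "deny" beats "requires approval".
    def classify(url, always_allow, deny, requires_approval):
        if any(url.startswith(p) for p in always_allow):
            return "index"             # always-allow overrides deny
        if any(url.startswith(p) for p in deny):
            return "denied"            # deny overrides requires-approval
        if any(url.startswith(p) for p in requires_approval):
            return "hold for approval"
        return "index"

    # Example from above: deny everything at xav.com, but always
    # allow the single contact page.
    classify("xav.com/contact.html",
             always_allow=["xav.com/contact.html"],
             deny=["xav.com"],
             requires_approval=[])     # -> "index"
    classify("xav.com/about.html",
             always_allow=["xav.com/contact.html"],
             deny=["xav.com"],
             requires_approval=[])     # -> "denied"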


    "Filter Rules: Using the "deny" rule"
    http://www.xav.com/scripts/search/help/1128.html