
Best practices for handling hostile probes

A hostile probe is a program, run by an aspiring hacker, that checks your web site for possible security holes. The typical probe makes sequential HTTP requests for all paths associated with known vulnerabilities. For example, the probe might first request "/_vti_bin/" to check for an open Front Page authoring system. Next it could check for "/default.ida?NN" and "/scripts/cmd.exe" to see whether the Code Red or sadmind/IIS security holes have been left unpatched. Then the probe could continue on to other paths associated with other vulnerabilities.

The problems with hostile probes are these:

  1. Even on well-maintained systems, there is always a time delay between the publishing of a security hole and the patching of the hole. During the window when the hole is known and unpatched, hostile probes can detect and compromise the system. Disrupting the probe helps to protect the system.

  2. Hostile probes are bad Internet citizens. Shutting them down or slowing them down will make the world a better place. Even if your web server is protected, other neighboring servers may not be.

  3. Probes generate a lot of traffic. If you have 404 email notification enabled, you may be flooded with hundreds or thousands of emails each time your web site is probed.

Guardian helps counter probes in two ways: the denial-of-service response and the blacklist response.

Note that some web hosting companies and legal departments will discourage these types of countermeasures. Even though their concerns are largely without merit, if you cannot afford to have your web host or legal department upset with you, then don't use these countermeasures.

Note also that Guardian will not have much of an effect on your site security in the greater scheme of things. The type of people who run Guardian are the perfectionists who track down every single mysterious HTTP request to their site, and such people always have the latest patches installed anyway. These responses are mostly available for fun, to make you feel that you are doing something to counter the probes. They can also cut down on the amount of email generated by the 404 email notification.

The blacklist and DOS responses are available in shareware mode but not freeware mode.

Using the DOS Response

The DOS response is a very limited denial-of-service response. The script returns an artificially high Content-Length, and then spoon-feeds content bytes back to the client at a rate of one byte per second, for the number of seconds given in the rule (for example, "dos: 60" drips bytes for sixty seconds). For many simple HTTP clients based on a single-threaded or fixed-threadpool model, this will hang all of their requests and render the probe inoperable.
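
Here is a minimal stand-alone sketch of the idea in Perl. This is not the actual Guardian source: the Content-Length and time values are made up for illustration, and whether each byte really reaches the client one second apart also depends on the web server's own output buffering.

#!/usr/bin/perl
# Sketch of a spoon-feeding DOS response (illustrative only).

$| = 1;              # disable Perl's output buffering so each byte is flushed
my $seconds = 15;    # keep the interval short; see the discretion note below

# Advertise far more content than will ever be sent. A simple client
# will wait around for the rest of the promised body.
print "Content-Type: text/html\r\n";
print "Content-Length: 500000\r\n\r\n";

# Drip one byte per second until the time limit expires.
for (1 .. $seconds) {
    print ".";
    sleep 1;
}
# On exit the server closes the connection, leaving the client with a
# response far shorter than the one it was promised.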

Use discretion. While the web server is serving a Guardian DOS response, it will keep the Perl CGI process resident for returning the bytes. Because you don't want too many lingering processes on your server, you should limit DOS responses to fairly short time intervals of ten to fifteen seconds, and you should not respond to all probe patterns with DOS. DOS works particularly well against the Front Page authoring software and as a one-time response to a client who is about to be hit with the blacklist response. Long-running DOS should not be used as the catch-all response to hostile activity because it will cause more problems on your web server than it solves.

Note: this response type does no damage to any software, computer or network, other than to hang the remote thread making the seemingly-hostile request. The response uses minimal bandwidth and processing power on both the client and server. It is analogous to the port 139 cloaking response of Windows 2000 when that operating system is run in secure mode. (With cloaking, Windows 2000 violates the TCP/IP standard by returning no information on a port query, forcing the caller to wait a few minutes before the query times out. This is used to slow down port scans and probes of Windows vulnerabilities. Similarly, the Guardian DOS violates the HTTP spec by returning an artificially high content-length, for purposes of slowing down scans.)

Using the Blacklist Response

The blacklist response only works on Apache or Zeus web servers that support the .htaccess file. When triggered, the blacklist rule adds a "deny from IP" directive to the .htaccess file, causing all further visits by that client to return "403 Access Denied". The blacklisting continues until you manually clear the deny directives out of the .htaccess file, which you should do every few days.
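
As a rough illustration of what happens behind the scenes, the following Perl sketch appends a deny directive for the current client. Again, this is not the actual Guardian code: the .htaccess path is the one used in the Examples section below, and real code would also lock the file and skip duplicate entries.

#!/usr/bin/perl
# Sketch of the blacklist response (illustrative only).

my $htaccess = '/usr/www/users/xav/.htaccess';  # edit to match your server
my $ip       = $ENV{'REMOTE_ADDR'};             # client address, set by the web server

if (defined $ip) {
    open(HTACCESS, ">>$htaccess") or die "cannot append to $htaccess: $!";
    print HTACCESS "deny from $ip\n";
    close(HTACCESS);
}

Each blacklisted client adds one "deny from" line, which is why the file needs to be cleared out by hand every few days.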

Once a client has been blacklisted, all further requests by that client will return the "403 Access Denied" response. If you have an "ErrorDocument 403" handler set up, it will be forced to deal with all of these denied requests. Since blacklisting is often done to get rid of probes that cause hundreds of server errors in a very short time (and a resulting email flood from Guardian), it is best not to configure a 403 CGI handler when using the blacklist response. Allow 403 errors to be handled quickly by Apache itself, and only offload 401, 404, and 500 errors to Guardian. Here is an example .htaccess file that is set up this way:

ErrorDocument 403 "Error 403 / Access Denied.
ErrorDocument 404 /guardian/ag.pl
ErrorDocument 500 /guardian/ag.pl

Other Responses

You can use the response "http-redirect: http://localhost/" to send the probe back against itself. This only works for probes that blindly follow redirects.
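
For example, a rule such as the following (the URL pattern is just an illustration, borrowed from the probes described on this page) would bounce matching probes back to their own machine:

==
url-pattern: (/scripts/cmd.exe)
http-redirect: http://localhost/
==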

Examples

Here is a set of filter rules which will block probes:

==
# losers who probe front page dirs and script vulnerabilities
url-pattern: (/_vti_bin|/cgi-bin/formmail)
blacklist: /usr/www/users/xav/.htaccess
==
url-pattern: (/_vti_bin|/cgi-bin/formmail)
dos: 60
==
# non-robots-compliant spider; causes lots of 404's, etc.
ua-substring: WebZIP
blacklist: /usr/www/users/xav/.htaccess
==
ua-substring: WebZIP
dos: 60
==
# Code Red and sadmind/IIS
url-pattern: (default.ida|cmd.exe|root.exe)
blacklist: /usr/www/users/xav/.htaccess
==

The .htaccess path (/usr/www/users/xav/.htaccess in the rules above) must be edited to match the path on your specific server.

Note that the blacklist rule is always applied first, followed by DOS. Because the DOS response can take a long time to complete, you want to execute all other applicable rules first. The DOS response is applied to the Front Page, formmail, and WebZIP probes, but not to the Code Red and sadmind/IIS probes, because those worms seem to be fairly immune to it.

Common Hostile Patterns

/_vti_bin/
Probe of Front Page authoring folders. Do not use the /_vti shortcut, because Microsoft Office applications will often make test requests to /_vti_inf.html when viewing an Office document on the site.

default.ida
Code Red checks for this file to exploit a buffer overflow in the IIS .ida handler.

cmd.exe, root.exe
The sadmind/IIS worm makes long requests to paths which include these strings.

Probes through Proxies

Guardian will abort countermeasure actions if the HTTP_VIA environment variable is present. This variable is set when a web request arrives via a proxy server. Guardian does not wish to harm proxies: they are innocent intermediaries, and they route valid traffic as well as invalid.
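
In CGI terms the check is trivial; a sketch of the idea (not the actual Guardian source) looks like this:

#!/usr/bin/perl
# A request that travels through a proxy carries a Via: header, which
# the web server exposes to CGI scripts as the HTTP_VIA variable.

if (defined $ENV{'HTTP_VIA'}) {
    # Innocent intermediary: note it in the error log and skip any countermeasure.
    warn "request arrived via proxy ($ENV{HTTP_VIA}); countermeasures skipped\n";
} else {
    # Direct connection: blacklist and DOS rules may apply.
}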


    "Best practices for handling hostile probes"
    http://www.xav.com/scripts/guardian/help/1013.html