Robots Exclusion Page

Applies To: Windows 7, Windows Server 2008, Windows Server 2008 R2, Windows Vista

The Robots Exclusion page lets you maintain a Robots.txt file for your Web site that informs search engines about the paths (locations) in your Web site that you want to exclude from or allow for indexing. If your Web site has subdomains, each subdomain must have its own Robots.txt file.

For each path, you specify an action (allow or disallow) and the search engine (user-agent, also called a “robot”) to which the preference applies. To specify all paths in your Web site, use “/”; to specify all user-agents, use “*”. Each combination of path, action, and user-agent creates a directive or “rule” that search engines may or may not honor. You can add multiple rules to the Robots.txt file. The Robots Exclusion page provides a way for you to conveniently create, modify, and delete the rules in the Robots.txt file from inside the IIS Manager interface.
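For example, a Robots.txt file containing the rules below (the /admin path and the ExampleBot user-agent name are hypothetical) disallows all robots from one path and disallows one specific robot from the entire site:

```
# Disallow all user-agents from the /admin path (hypothetical path)
User-agent: *
Disallow: /admin/

# Disallow a specific robot (hypothetical robot name) from the entire site
User-agent: ExampleBot
Disallow: /
```

Each User-agent line starts a group, and the Allow and Disallow lines that follow it apply to that user-agent.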

To filter the display of rules in the Robots.txt file, enter search text in the Filter box and click Go. To remove the filter, click Show All. To sort the list, click one of the feature page column headings. To group the Robots.txt entries, select User-Agent, Action, Path, or No Grouping from the Group by drop-down list.


UI Element List

The following tables describe the columns on the feature page and the tasks in the Actions pane.

Feature Page Columns

Column Name Description


Path

Displays a path in your Web site to be allowed or disallowed for indexing by the search engine (user-agent) that you specify.


Action

Displays whether the specified path is to be allowed or disallowed for indexing.


User-Agent

Displays the name of the search engine (“robot”) to which the rule applies.

Actions Pane Tasks

Element Name Description

Add Disallow Rules

Opens the Add Disallow Rules dialog box so that you can add disallowed paths to the Robots.txt file.

Add Allow Rules

Opens the Add Allow Rules dialog box so that you can add allowed paths to the Robots.txt file.

Open Robots.txt File

Opens the Robots.txt file in Notepad.

Add Location

Opens the Add Sitemaps dialog box so that you can add Sitemaps to the Robots.txt file.

Note: You cannot configure the Last Modified Date Sitemap attribute when the Add Sitemaps dialog box is accessed from the Robots Exclusion page.
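Adding a Sitemap location writes a Sitemap directive to the Robots.txt file. For example (the URL is hypothetical):

```
Sitemap: http://www.example.com/sitemap.xml
```

The Sitemap directive is independent of any User-agent group, so it can appear anywhere in the file.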

View in Browser

Displays, in the default browser, the Sitemap that you have selected in the Sitemap Locations pane.


Remove

Removes the selected rule or selected Sitemap from the Robots.txt file.

Sitemap Locations Pane Columns

Element Name Description


Location

Displays the location (URL) of each Sitemap that you have added to the Robots.txt file.

See Also


Add Allow Rules and Add Disallow Rules Dialog Boxes
Add URLs and Add Sitemaps Dialog Boxes
IIS SEO Toolkit User Interface (UI) Help