Utility Spotlight: Optimize and Analyze Your Web Site
If maintaining a Web site is part of your charge, you know it’s a challenge balancing ease of use with SEO initiatives. The IIS SEO Toolkit can help.
It’s a challenge to keep your Web site both user- and search engine-friendly. When you’re in charge of your organization’s Web site, you’re always looking for tools that can track down broken links, find missing content and ensure that your pages have the right search engine optimization (SEO) tags. If you’re using IIS to manage your site, Microsoft offers a free tool designed specifically to help with optimization and analysis.
The IIS Search Engine Optimization Toolkit integrates directly with IIS 7 or 7.5. It helps you take care of three basic tasks:
- Run a site analysis with reports and customizable queries
- Design and manage sitemaps to submit to search engines
- Maintain a robots.txt file to tell search engines which pages to ignore
You can download the IIS SEO Toolkit directly from its page at the Microsoft Download Center. There are two versions: IISSEO_x86.msi for 32-bit Windows and IISSEO_amd64.msi for 64-bit Windows.
After you download and install the file, open IIS Manager. In the Management group of modules, you’ll see a new icon called Search Engine Optimization. Double-click the SEO icon, and you’ll see options for those three basic tasks.
Site Analysis starts with an initial analysis of your site. Create a name for the report and enter your site’s full URL, including the “http.” Click on Advanced Settings to narrow the results by specifying the number of URLs to include and the maximum download size per URL. You can also set the level of authentication needed to access certain content.
The tool will then conduct a full scan of your Web site. The entire process will take several minutes or longer, depending on the size and depth of your site. After the scan is complete, it will generate a summary report showing you the total number of items downloaded, links scanned, violations found and other data. You can then drill down through some of the specific findings.
Click on the Violations tab to display any problems the tool has found, like improper HTML or CSS tags and missing or incomplete SEO information (see Figure 1). Double-click on a specific violation, and it will display details indicating the exact problem, the code in question and recommended actions. You can also view the pages with the most violations and the violations by category—content, SEO and HTML standards.
Figure 1 You can determine the nature of any violations within your Web site code.
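Many of the SEO violations the tool flags come down to missing or weak tags in a page’s head section. As an illustration only (the page title, description and keywords here are placeholders, not output from the tool), a page that passes these checks would include something like:

```html
<head>
  <!-- A descriptive, unique title is one of the most common SEO checks -->
  <title>Widget Catalog - Contoso</title>
  <!-- A description tag gives search engines summary text for results pages -->
  <meta name="description" content="Browse the full line of Contoso widgets, with specifications and pricing." />
  <!-- Keywords relevant to the page content -->
  <meta name="keywords" content="widgets, catalog, Contoso" />
</head>
```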
Click on the Content tab to break down pages on your site by type, such as text, PDF and image file. You can see specific problems, including duplicate files, broken links and missing content. Again, double-click on a specific item, such as a URL or content type, to bring up a host of details that can help you zero in on specific problems.
The Performance tab analyzes the speed of your pages, specifically the size of each page and how long it takes to load. Finally, the Links tab examines all the hyperlinks on your site—which pages have the most links and which ones are kept from search engines through a robots.txt file.
The tool automatically saves each report you generate, so you can easily return to any report and compare it with others. You can also export reports as CSV files. After you’ve generated a report, you can create any number of customized queries to find specific data. Building a new query is simple—just choose a field name, an operator and a value.
Let’s say you want to see how many JPEG files are on your site. Choose “Content Type” as the field name and “Equals” as the operator. Then type image/jpeg as the value (see Figure 2). If you need to know whether a particular SEO keyword is missing from any of your pages, you can create a query with “Keywords” as the field name. Choose “Not Equal” as the operator, and then type that keyword as the value.
Figure 2 You can save your queries and export them as CSV files.
Sitemaps and Robots
Besides running reports and queries, the IIS SEO toolkit lets you create an XML-formatted sitemap. You can submit this to the major search engines to make sure they crawl the right content on your site. When creating a sitemap, you tell the tool which URLs to include and how often to refresh the file based on the frequency of your content changes. You can create multiple sitemaps, as well as a sitemap index—a collection of individual sitemaps.
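The sitemaps the tool generates follow the standard sitemaps.org XML format. A minimal example—with placeholder URLs and dates, not actual tool output—looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The full URL of the page to crawl -->
    <loc>http://www.example.com/</loc>
    <!-- When the page last changed -->
    <lastmod>2011-05-01</lastmod>
    <!-- How often the content typically changes -->
    <changefreq>weekly</changefreq>
    <!-- Relative priority within your own site, from 0.0 to 1.0 -->
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/products.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```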
Finally, the tool also helps you build and maintain a robots.txt file. You can set this up on your site to tell search engines which URLs they shouldn’t crawl. You can easily create disallow and allow rules to determine which content gets indexed. You can also view the entire robots.txt file within Notepad.
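The robots.txt file itself is plain text. A simple example of the kind of disallow and allow rules you can build—the paths shown here are placeholders—might look like this:

```
# Rules apply to all crawlers
User-agent: *
# Keep administrative and temporary content out of the index
Disallow: /admin/
Disallow: /temp/
# Explicitly allow a public subfolder under a disallowed path
Allow: /temp/public/
# Point crawlers to your sitemap
Sitemap: http://www.example.com/sitemap.xml
```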
Overall, the IIS SEO Toolkit offers a nice array of features to help you manage and maintain your site with both your visitors and search engines in mind. Its smooth integration with IIS makes the tool both powerful and easy to use.
*Lance Whitney is a writer, IT consultant and software trainer. He’s spent countless hours tweaking Windows workstations and servers. Originally a journalist, he took a blind leap into the IT world in the early ’90s.*