The crawl errors section of Google Webmaster Tools has just undergone a revamp. Crawl errors are issues Googlebot encountered while crawling your site, so this revamp should be very useful (and long overdue!).
The basic changes include:
- New reports
- Changes to the level of detail
- Removing some errors
- Ability to mark something as fixed
The overall design looks pretty good, but it seems that not everyone is impressed. When I took a look at some of the more popular webmaster forums, the general consensus was that the revamped section is confusing. Many webmasters are asking where the old reports they relied on have gone. I guess it will just take some time to get used to, but I do hope Google is listening to this feedback.
This update also removed some functionality from Webmaster Tools. According to Search Engine Land, this includes:
- Ability to download all crawl error sources. Previously, you could download a CSV file that listed URLs that returned an error along with the pages that linked to those URLs.
- 100K URLs of each type. Previously, you could download up to 100,000 URLs with each type of error. Now, both the display and download are limited to 1,000.
- Redirect errors – Inexplicably, the “not followed” errors no longer seem to list errors like redirect loop and too many redirects. Instead it simply lists the response code returned (301 or 302).
- Specifics about soft 404s. The soft 404 report used to specify whether the URLs listed returned a 200 status code or redirected to an error page. But the status code column appears to be empty now.
- URLs blocked by robots.txt. Google says they removed this report because “while these can sometimes be useful for diagnosing a problem with your robots.txt file, they are frequently pages you intentionally blocked”. They say that similar information will soon be available in the crawler access section of Webmaster Tools.
- Specifics about site level errors. The previous version of these reports listed the specific problem (such as DNS lookup timeout or domain name not found).
- Specific URLs with “site” level errors. Google says you don’t need to know the URL if the issue was at the site level.
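If you miss the detail the old “not followed” report gave about redirect loops and overly long chains, you can approximate it yourself by walking a redirect chain and recording each status code. Here is a minimal sketch; `trace_redirects` and its `fetch` callable are illustrative names I made up (in practice `fetch` would issue an HTTP request and return the status code and `Location` header), not part of any Google tooling:

```python
def trace_redirects(url, fetch, max_hops=10):
    """Follow a redirect chain, recording each (url, status) hop.

    `fetch(url)` must return (status_code, location_or_None), so the
    traversal logic here is testable without any network access.
    Flags the two problems the old report surfaced: a redirect loop
    (a URL repeats) and a chain with too many redirects.
    """
    seen, chain = set(), []
    for _ in range(max_hops):
        if url in seen:
            chain.append((url, "redirect loop"))
            return chain
        seen.add(url)
        status, location = fetch(url)
        chain.append((url, status))
        # Only follow actual HTTP redirect status codes with a target.
        if status in (301, 302, 303, 307, 308) and location:
            url = location
        else:
            return chain
    chain.append((url, "too many redirects"))
    return chain
```

For example, feeding it a fake site where `/a` 301-redirects to `/b` and `/b` 302-redirects back to `/a` yields a chain ending in `"redirect loop"`, which is exactly the diagnosis the revamped report now collapses into a bare 301/302 response code.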
However, Google has said that these functions and reports are still available in the API-based version of Google Webmaster Tools.
What do you think of the new version of Webmaster Tools? Is there anything you wish Google had kept? Feel free to share your feedback below.