Why do working pages on my website appear as broken?
Occasionally, Site Audit may report some of your internal pages as broken even though they are actually up and running. While this does not happen often, it can be confusing.
Generally, when this happens it's because of a false positive. The most common reasons for a false-positive reading are:
- Our Site Audit crawler could have been blocked from some pages by robots.txt rules or noindex tags (see the example after this list)
- Hosting providers might block Semrush bots because they interpret the crawl as a DDoS attack (a massive number of hits during a short period of time)
- At the moment of the campaign re-crawl, the domain could not be resolved by DNS
- The website's server cache could be storing old data and serving it to crawler bots
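For example, a robots.txt rule like the hypothetical one below would keep crawlers out of an entire section of the site, so those pages could be reported as broken or blocked even though they open fine in a browser. A noindex directive in a page's meta tag or X-Robots-Tag header can have a similar effect on how the page is reported.

```
# Hypothetical robots.txt excerpt
# A rule like this blocks all crawlers (including the Site Audit bot)
# from everything under /blog/, even though visitors can open those
# pages normally.
User-agent: *
Disallow: /blog/
```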
If you believe it's happening because of a crawler issue, you can learn how to troubleshoot your robots.txt in this article.
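As a quick self-check, the minimal sketch below uses Python's standard urllib.robotparser to test whether a given URL is allowed for a crawler. The domain, page URL, and the "SiteAuditBot" user-agent string are placeholders for illustration; check your own server logs for the exact bot name you want to test.

```python
from urllib.robotparser import RobotFileParser

# Placeholder values - replace with your own domain, page, and the
# user agent you see in your server logs.
ROBOTS_URL = "https://www.example.com/robots.txt"
PAGE_URL = "https://www.example.com/blog/some-post/"
USER_AGENT = "SiteAuditBot"

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # downloads and parses the live robots.txt

if parser.can_fetch(USER_AGENT, PAGE_URL):
    print("robots.txt allows this URL - the false positive is likely caused by something else")
else:
    print("robots.txt blocks this URL for that user agent - adjust the rules and re-run Site Audit")
```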
You can also lower the crawl speed to avoid a large number of hits on your pages at one time. When your server blocks or throttles our bot, you can still open the page in your browser and see it working, but the bot cannot reach it and therefore reports a false-positive result.
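If you suspect the server treats crawler traffic differently from browser traffic, a small sketch like the one below can make that visible. It assumes the third-party requests package and uses placeholder URL and user-agent strings; your hosting provider's logs will show the exact bot signature being blocked.

```python
import requests  # third-party package: pip install requests

URL = "https://www.example.com/blog/some-post/"  # placeholder page reported as broken

USER_AGENTS = {
    "browser-like": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "crawler-like": "Mozilla/5.0 (compatible; SiteAuditBot/1.0)",  # placeholder bot signature
}

# If the browser-like request returns 200 while the crawler-like one
# returns 403/429/5xx, the server (or a firewall/CDN rule) is blocking bots.
for label, agent in USER_AGENTS.items():
    response = requests.get(URL, headers={"User-Agent": agent}, timeout=10)
    print(f"{label}: HTTP {response.status_code}")
```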
To solve a problem with the server cache, try clearing the cache and then re-running Site Audit; the new crawl will bring you updated results that include all your fixes.
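To see whether a cached copy is being served, you can inspect the cache-related response headers. The sketch below is one way to do that with the requests package; the URL is a placeholder, and the exact header names depend on your server or CDN.

```python
import requests  # third-party package: pip install requests

URL = "https://www.example.com/blog/some-post/"  # placeholder page reported as broken

response = requests.head(URL, timeout=10, allow_redirects=True)
print("Status:", response.status_code)

# Common cache-related headers; a large "Age" or an "X-Cache: HIT" value
# suggests the response came from a cache rather than the origin server.
for header in ("Cache-Control", "Age", "Expires", "Last-Modified", "X-Cache"):
    if header in response.headers:
        print(f"{header}: {response.headers[header]}")
```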