Google’s Mueller on diagnosing multi-domain crawling issues

Google Search Advocate John Mueller provided insights into diagnosing widespread crawling problems.

The guidance came in response to an incident Adrian Schmidt reported on LinkedIn: Googlebot had simultaneously stopped crawling several of his domains.

Despite the disruption, Schmidt found that live testing through Search Console continued to work without any error messages.

Schmidt’s analysis showed no increase in 5xx errors and no problems with robots.txt requests.
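
Server logs are the fastest way to run this kind of check yourself. Below is a minimal Python sketch, assuming a combined-format access log at a hypothetical path, that tallies the response codes served to Googlebot overall and to its robots.txt fetches in particular:

```python
# Minimal sketch: tally response codes served to Googlebot from a
# combined-format access log. The log path and format are assumptions;
# adapt the regex to your server's configuration.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path
# combined log format: ip - - [time] "METHOD path HTTP/x" status size "ref" "ua"
LINE_RE = re.compile(
    r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*"(?P<ua>[^"]*)"$'
)

status_counts = Counter()
robots_counts = Counter()

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        m = LINE_RE.search(line)
        if not m or "Googlebot" not in m.group("ua"):
            # Note: user agents can be spoofed; for a rigorous audit,
            # verify crawler IPs with a reverse DNS lookup.
            continue
        status_counts[m.group("status")] += 1
        if m.group("path").startswith("/robots.txt"):
            robots_counts[m.group("status")] += 1

print("Googlebot responses by status:", dict(status_counts))
print("robots.txt fetches by status:", dict(robots_counts))
```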

What could be the problem?

Mueller’s answer

Mueller addressed the situation and pointed to shared infrastructure as a likely cause:

“If it is shared across a number of domains and focuses on something like crawling, it is likely an issue with a shared part of the infrastructure. If it’s already recovering, at least it’s no longer urgent and you have some time to look at the latest changes/infrastructure logs.”

Investigating the infrastructure

All affected websites were using Cloudflare as their CDN, which raised some eyebrows.

When asked about debugging, Mueller recommended checking Search Console data to determine whether DNS or failed requests were causing the problem.

Mueller explained:

“The crawl stats in Search Console will also show a bit more, perhaps helping to decide between, say, DNS and failed requests.”

He also pointed out that the timing was a crucial clue:

“If everything was exactly at the same time, it wouldn’t be robots.txt and probably wouldn’t be DNS either.”
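
That timing check is easy to run against your own logs. The sketch below, assuming per-domain access logs in combined format at hypothetical paths, extracts the last Googlebot request seen for each domain; if those timestamps cluster within seconds of each other, a shared component (CDN, firewall, DNS provider) is the prime suspect:

```python
# Sketch of the timing check: find the last Googlebot request per domain
# and see whether crawling stopped everywhere at the same moment.
# Log paths and the combined log format are assumptions for illustration.
import re
from datetime import datetime

LOGS = {  # hypothetical per-domain access logs
    "example.com": "/var/log/nginx/example.com.access.log",
    "example.org": "/var/log/nginx/example.org.access.log",
}
TIME_RE = re.compile(r"\[(?P<ts>[^\]]+)\]")  # e.g. [10/Aug/2024:13:55:36 +0000]

for domain, path in LOGS.items():
    last_seen = None
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            m = TIME_RE.search(line)
            if m:
                last_seen = datetime.strptime(m.group("ts"), "%d/%b/%Y:%H:%M:%S %z")
    print(f"{domain}: last Googlebot hit at {last_seen}")
```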

Impact on search results

Regarding concerns about search visibility, Mueller reassured site owners that a short disruption of this kind would not cause problems:

“If this is from today and only lasted a few hours, I wouldn’t expect any visible problems in Search.”

Why this is important

If Googlebot suddenly stops crawling multiple websites at once, it can be difficult to determine the cause.

While temporary crawl interruptions may not immediately affect search rankings, they can delay Google’s discovery and indexing of new content.

The incident highlights a vulnerability that organizations may not realize they have, particularly those that rely on shared infrastructure.

How this can help you

When Googlebot stops crawling your sites:

  • Check if the problem affects multiple websites at the same time
  • First, look at your shared infrastructure
  • Use Search Console data to narrow down the cause
  • Don’t rule out DNS just because normal traffic looks good (a quick check is sketched after this list)
  • Keep an eye on your logs
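
For the DNS bullet, a quick resolution check across the affected domains can rule out the obvious failures, though it only tests your own vantage point, not Google’s resolvers. A minimal sketch, with placeholder domain names:

```python
# Minimal DNS sanity check: resolve each domain and report failures.
# Domain names are placeholders; substitute your own sites. Note that
# resolver caching means this can pass even while other resolvers fail.
import socket

DOMAINS = ["example.com", "www.example.com", "example.org"]

for domain in DOMAINS:
    try:
        infos = socket.getaddrinfo(domain, 443, proto=socket.IPPROTO_TCP)
        addrs = sorted({info[4][0] for info in infos})
        print(f"{domain}: resolves to {addrs}")
    except socket.gaierror as exc:
        print(f"{domain}: DNS lookup FAILED ({exc})")
```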

If you run multiple websites behind a CDN:

  • Keep good records and server logs
  • Pay attention to your crawl rates (a simple monitor is sketched below)
  • Know who to contact when things go wrong
  • Keep an eye on your infrastructure provider’s status
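
Watching crawl rates doesn’t require special tooling. A rough sketch, again assuming a combined-format access log at a hypothetical path, counts Googlebot hits per day and flags a sharp drop:

```python
# Rough crawl-rate monitor: count Googlebot hits per day in an access
# log and flag any day that falls below half of the previous day.
# The log path and combined log format are assumptions; adjust as needed.
import re
from collections import Counter
from datetime import datetime

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # e.g. [10/Aug/2024

hits_per_day = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        m = DATE_RE.search(line)
        if m:
            day = datetime.strptime(m.group(1), "%d/%b/%Y").date()
            hits_per_day[day] += 1

days = sorted(hits_per_day)
for prev, cur in zip(days, days[1:]):
    if hits_per_day[cur] < 0.5 * hits_per_day[prev]:
        print(f"WARNING: Googlebot hits fell from {hits_per_day[prev]} "
              f"({prev}) to {hits_per_day[cur]} ({cur})")
```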

Featured Image: PeopleImages.com – Yuri A/Shutterstock
