Ask any seasoned SEO, and they will tell you how important website crawls are for maintaining strong technical SEO.
But just how frequently should SEOs perform website crawls? And a better question — how often do SEOs actually routinely perform them?
In this post, we’ll discuss what SEO publications suggest as a “best practice” web crawling cadence, as well as the “actual-practice” cadence at which SEOs tend to perform them. Finally, I’ll wrap things up by outlining the benefits of ramping up the frequency of your routine website crawls (wherever your starting block is), using a small case study from Fox.com.
But First, What is a Website Crawl Anyways?
Using specialized tools such as Screaming Frog or Deep Crawl, SEOs are able to take a look “under the hood” of a website — much like a mechanic does when inspecting cars.
But instead of inspecting the mechanical parts of a car, SEOs are inspecting the optimizable elements of a website — including the quality of its metadata, XML sitemaps, response codes, and more. When something isn’t working as expected, SEOs will diagnose the problem and fix it.
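To make the "under the hood" idea concrete, here is a minimal sketch of the kind of metadata check a crawler performs on each page it fetches. This is an illustration only, not how Screaming Frog or Deep Crawl work internally; the thresholds and warning messages are my own assumptions, and it uses only the Python standard library so it runs on a static HTML string.

```python
from html.parser import HTMLParser


class MetadataAuditor(HTMLParser):
    """Collects the on-page elements a crawl typically inspects."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


def audit(html: str) -> list[str]:
    """Return simple metadata warnings for one page (illustrative rules)."""
    parser = MetadataAuditor()
    parser.feed(html)
    issues = []
    if not parser.title:
        issues.append("missing <title>")
    elif len(parser.title) > 60:  # rough, commonly cited title-length guide
        issues.append("title longer than ~60 characters")
    if not parser.meta_description:
        issues.append("missing meta description")
    return issues


page = "<html><head><title>Hi</title></head><body></body></html>"
print(audit(page))  # → ['missing meta description']
```

A real crawl tool runs checks like this (and many more) across every URL it discovers, then rolls the results up into the reports SEOs review.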
So What Do the Industry Pros Advise as a “Best Practice” Cadence for Performing Website Crawls?
There is very little chatter on the web discussing the “optimal frequency” that SEOs should perform website crawls specifically. However, industry publications seem to be in agreement that “mini” technical audits should be conducted on a monthly basis and “in-depth” technical audits should be conducted on a quarterly or semi-quarterly basis.
- “Since search algorithms and technology can change at a rapid pace, you want to perform mini-audits monthly.” – Neil Patel, Neil Patel Digital
- “Part of your ongoing SEO strategy includes regular audits to allow you to find and fix issues quickly (we recommend quarterly).” – Erika Varagouli, SEM Rush
- “It’s good practice to do an automated scan once a month. This will be often enough to bring up major issues, like any on-page errors in the form of broken links, page titles and meta-data or duplicate content.” – Digital Marketing Institute
- “I perform an SEO audit for my clients the first month, monthly (mini-audit), and quarterly (in-depth audit).” – Anna Crowe, Search Engine Journal
Though a site crawl and a technical audit are not the same thing (thank you for the clear separation of the two, Barry Adams), it’s fair to say that these publications would recommend running a website crawl at least as frequently as they run mini-audits: on a monthly basis.
And How Routinely Do SEOs Perform Them in “Actual Practice”?
Best practice is one thing, but how often do SEOs run website crawls for client sites in actual practice? To get an idea, I took to Twitter. (Yes, Twitter polls do have their obvious limitations — but it’s one of the simplest means to get some tangible data.)
Three days and 2,000 votes later, the SEOs had spoken:
Approximately 55% of SEOs who participated in my poll fell into the “monthly or longer” bucket, while 45% fell into the “weekly or shorter” bucket. In other words, we were all over the map.
SEOs reading this article (most of you) may not be too surprised by these poll results. After all, both ends of the spectrum could make complete sense, depending on the type and size of the websites you manage.
That said, I had an experience at FOX two months ago that made me thankful that we run weekly website crawls across our major domains. I’d like to share it with you all here — in case it encourages you to increase the cadence of your technical website crawls (wherever your starting block is).
How Weekly Website Crawls Helped FOX Take Swift Action on an allRoutes.json Bug
In late July 2020, the SEO team at FOX ran our routine weekly website crawl of FOX.com and discovered that 100% of our TV episode pages were now serving error status codes (due to a bug with the allRoutes.json file). Though the pages were displaying fine to users, they were throwing 404s to Googlebot — making them ineligible to appear in Google search results.
Not only is watch time one of our major KPIs for the site, but these pages also generate lots of ad revenue for the company. So needless to say, this was important for us to fix ASAP!
Though we were able to diagnose the problem quickly thanks to our crawl, the solution was a bit complex. And over the three weeks (July 23 – August 13) we worked on this item, we saw (unsurprisingly) steep declines in SEO clicks and impressions from this set of pages.
After fixing the bug in mid-August, click and impression numbers began trending back up to normal.
But I did have the thought — if we had waited several weeks to a month before running the crawl, the improper response codes would have done incremental damage to the site’s SEO traffic and ad revenue. Particularly so given that it took three weeks to fix.
While it would have been possible to diagnose this bug without a full website crawl, doing so would have made the task unnecessarily difficult.
The screenshot below illustrates that without the right filters set in the GSC “search results” report, the issue was somewhat veiled. The GSC “coverage” report was also slow (10 days!) to identify the watch page errors, and it only offered examples of affected URLs. Plus, as mentioned earlier in the article, this response code issue was not affecting the UX of the page.
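For a bug like this — pages that render fine to users but 404 to Googlebot — one lightweight diagnostic is to spot-check the same URLs with a browser-style and a Googlebot-style User-Agent and compare status codes. The sketch below is a hypothetical illustration, not FOX's actual tooling; the URLs and sample results are made up, and only the comparison logic runs here (the fetch helper is defined but not called).

```python
from urllib.error import HTTPError
from urllib.request import Request, urlopen

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")


def status_for(url: str, user_agent: str) -> int:
    """Fetch a URL with the given User-Agent and return the HTTP status."""
    req = Request(url, headers={"User-Agent": user_agent})
    try:
        with urlopen(req, timeout=10) as resp:
            return resp.status
    except HTTPError as err:
        return err.code


def bot_only_errors(results: dict[str, tuple[int, int]]) -> list[str]:
    """URLs that look fine to a browser (2xx/3xx) but error for Googlebot."""
    return [url for url, (user_status, bot_status) in results.items()
            if user_status < 400 <= bot_status]


# Hypothetical spot-check results: (browser status, Googlebot status).
sample = {
    "https://example.com/watch/ep-1": (200, 404),
    "https://example.com/watch/ep-2": (200, 200),
}
print(bot_only_errors(sample))  # → ['https://example.com/watch/ep-1']
```

Note that if pages are rendered client-side, a plain HTTP check like this can miss issues a rendering crawler would catch — another argument for running full crawls routinely.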
The point being: it turned out to be hugely beneficial for us to run these routine weekly crawls, rather than relying on a simpler, Google Search Console “eyeballing” approach.
While it’s not always necessary for SEOs to perform weekly website crawls, it could be beneficial to get in the habit of performing routine crawls at least on a monthly basis — rather than a more relaxed, “as needed” approach.
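If you want to make that habit automatic, crawls can be scheduled rather than remembered. As one possible setup (an assumption on my part, not something the article prescribes), Screaming Frog offers a headless command-line mode that can be driven from cron; verify the exact binary name and flags against your installed version's documentation before relying on this.

```shell
# Hypothetical crontab entry: crawl every Monday at 06:00 and save the
# results. Flag names follow the Screaming Frog CLI — confirm them with
# your version's --help output.
0 6 * * 1 screamingfrogseospider --crawl https://www.example.com \
    --headless --save-crawl --output-folder /var/crawls/example
```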
In the words of Barry Adams, “You have to constantly improve. You have to put the effort in, make small tweaks, make it faster, make it better, optimize every single part of it. Constantly. All the time.” And regular website crawls are a great tactic for doing exactly that.