Alexis Sanders of Merkle sat down with Martin Splitt of Google last year (before COVID) to talk about crawl budget. It may be one of the more informative videos in the SEO Mythbusting series to date.
[Embedded video: SEO Mythbusting episode on crawl budget]
Here is what was covered, with the timestamps, if you are interested in just scanning the video:
As an added bonus, here are some related questions Martin responded to on Twitter:
That pattern is kinda normal as Googlebot might zig-zag around the maximal reasonable crawl rate.
Crawl budget issues are when you see us discover but not crawl pages you care about for quite a while and the pages have no other issues.
— Martin Splitt (@g33konaut) July 15, 2020
It’s not a significant cost on our end
— Martin Splitt (@g33konaut) July 15, 2020
Either 404 ’em or keep ’em around.
— Martin Splitt (@g33konaut) July 15, 2020
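If you are curious what the "404 ’em" option looks like in practice, here is a minimal sketch (an Express handler with a hypothetical list of retired URLs, not anything Martin specifically endorsed) that returns an explicit 410/404 for pages you have decided to drop, rather than leaving thin placeholders around:

```typescript
import express from "express";

const app = express();

// Hypothetical list of URLs that have been retired on purpose.
const retiredPaths = new Set(["/old-campaign/", "/discontinued-product/"]);

app.use((req, res, next) => {
  if (retiredPaths.has(req.path)) {
    // 410 ("Gone") says the page was removed deliberately; a plain 404 works too.
    res.status(410).send("This page has been removed.");
    return;
  }
  next();
});

// ...the rest of the site is served as usual.
app.use(express.static("public"));

app.listen(3000);
```

The idea is simply that a removed page should say so with a real status code instead of a soft placeholder.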
That’d qualify as dynamic rendering but in general these setups are “footguns” – sounds good & might work, but turns out to introduce lots of unnecessary complexity that backfires eventually.
— Martin Splitt (@g33konaut) July 15, 2020
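For context, dynamic rendering is the kind of setup where the server sniffs the user agent and hands pre-rendered HTML to crawlers while regular visitors get the client-side app. Here is a rough sketch of that branching, assuming Express and a stubbed renderForBot helper (both the helper and the bot pattern are placeholders, not a recommended setup). Keeping two versions of every page in sync is exactly the kind of unnecessary complexity Martin is warning about:

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Illustrative only; real bot detection lists are longer and change over time.
const BOT_PATTERN = /googlebot|bingbot|duckduckbot/i;

// Hypothetical helper: a real setup would call a headless-browser renderer.
// Stubbed here so the sketch stays self-contained.
async function renderForBot(url: string): Promise<string> {
  return `<html><body><h1>Pre-rendered view of ${url}</h1></body></html>`;
}

app.use(async (req: Request, res: Response, next: NextFunction) => {
  const userAgent = req.headers["user-agent"] ?? "";
  if (BOT_PATTERN.test(userAgent)) {
    // Crawlers get static, pre-rendered HTML...
    res.send(await renderForBot(req.originalUrl));
  } else {
    // ...everyone else falls through to the normal client-side app.
    next();
  }
});

// Serve the regular JavaScript app for human visitors.
app.use(express.static("dist"));

app.listen(3000);
```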
If that’s something you’re concerned about, it might make sense. I don’t think it’s necessary normally, tho.
— Martin Splitt (@g33konaut) July 15, 2020
Correlation isn’t causation.
So in short: No.
— Martin Splitt (@g33konaut) July 15, 2020
It depends on how that drop down is implemented. If the links are valid links and in the rendered HTML, then the crawler can pick them up.
— Martin Splitt (@g33konaut) July 15, 2020
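To make the distinction concrete, here is a quick sketch (with a hypothetical category list) of a drop-down built from real <a href> links in the rendered DOM next to one that only navigates inside a JavaScript event handler. The crawler can follow the links in the first; there is nothing to follow in the second:

```typescript
// Hypothetical category list, just for illustration.
const categories = [
  { name: "Shoes", url: "/shoes/" },
  { name: "Bags", url: "/bags/" },
];

// Crawlable: real <a href> elements end up in the rendered HTML,
// so the crawler can pick the links up.
function buildLinkDropdown(): HTMLUListElement {
  const list = document.createElement("ul");
  for (const category of categories) {
    const item = document.createElement("li");
    const link = document.createElement("a");
    link.href = category.url; // a valid link in the DOM
    link.textContent = category.name;
    item.appendChild(link);
    list.appendChild(item);
  }
  return list;
}

// Not crawlable: navigation only happens inside a change handler,
// so there is no link in the rendered HTML for the crawler to follow.
function buildSelectDropdown(): HTMLSelectElement {
  const select = document.createElement("select");
  for (const category of categories) {
    const option = document.createElement("option");
    option.value = category.url;
    option.textContent = category.name;
    select.appendChild(option);
  }
  select.addEventListener("change", () => {
    window.location.href = select.value;
  });
  return select;
}

document.body.appendChild(buildLinkDropdown());
document.body.appendChild(buildSelectDropdown());
```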
Lots of webmasters give us unhelpful dates.
— Martin Splitt (@g33konaut) July 14, 2020
Forum discussion at Twitter.