On December 3rd, Google announced that they were rolling out the latest Core Update. Initially, the bulk of the impact seemed to arrive on that date, with MozCast spiking at 112.4°F:
We measured above-average ranking flux in the three days prior to the announcement, and for a few days after it, but the bulk of the flux occurred on the roll-out day. (The dotted line represents the 30-day average prior to December 3rd.)
While technically the third-largest named core update, Google’s December Core Update was very close in measured impact to the May 2020 Core Update and the August 2018 “Medic” Update.
Back in May, I came down pretty hard on winners-and-losers reports. I don’t want to discourage all core update analyses, but our rush to publish can produce misleading results, especially with multi-day updates. In May, I settled on a 7-day update analysis, comparing the full week before the update to the full week after. This better reflects multi-day roll-outs and also cleans up the noise of sites with naturally high flux, such as news sites (which often wax and wane on a weekly cycle).
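To make the method concrete, here’s a minimal sketch of the 7-day before/after comparison. The function and the visibility numbers are invented for illustration; they are not MozCast’s actual code or data.

```python
# Hypothetical sketch of a 7-day before/after comparison around a roll-out day.
# All numbers are invented for illustration.

def seven_day_change(daily_visibility, rollout_index):
    """Percent change between the mean visibility of the 7 days before
    the roll-out day and the 7 days after it (roll-out day excluded)."""
    before = daily_visibility[rollout_index - 7:rollout_index]
    after = daily_visibility[rollout_index + 1:rollout_index + 8]
    mean_before = sum(before) / len(before)
    mean_after = sum(after) / len(after)
    return 100.0 * (mean_after - mean_before) / mean_before

# 15 days of invented visibility for one domain; index 7 is the roll-out day.
series = [10, 11, 10, 9, 10, 11, 9, 12, 14, 13, 15, 14, 13, 14, 15]
print(round(seven_day_change(series, 7), 1))  # → 40.0
```

Averaging over full weeks means a site whose visibility naturally swings on a weekly cycle contributes roughly the same baseline to both windows, which is exactly the noise-reduction effect described above.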
Below are the top 20 overall winners in our MozCast data set, by percentage gain:
Compare the 1-day numbers (December 4th vs. December 2nd) to the 7-day numbers, and note in particular the orange values — five of our top 20 picked up considerably more gains after the bulk of the update hit. We also saw some reversals, but the majority of sites recorded their wins and losses early in this update.
Another challenge with winners-and-losers analyses is that large percentage gains and losses from small sites can easily overshadow large sites, where a smaller percentage shift may represent far more traffic and revenue. Here are the top 20 winners across the 100 largest sites in our tracking set:
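The filtering step above can be sketched in a few lines: restrict the candidate pool to the largest sites (by some visibility or traffic proxy) before ranking by percentage gain. The domains and numbers below are invented, and the cutoff is 100 in the actual analysis.

```python
# Hypothetical sketch: rank winners only among the largest sites, so that
# huge percentage swings on tiny domains don't dominate the list.
# All domains and numbers are invented.

sites = [
    # (domain, baseline_visibility, percent_gain)
    ("tiny-blog.example", 0.2, 250.0),
    ("bigstore.example", 40.0, 21.0),
    ("news.example", 35.0, 8.0),
    ("midsize.example", 5.0, 30.0),
]

TOP_N = 2  # the article uses the 100 largest sites
largest = sorted(sites, key=lambda s: s[1], reverse=True)[:TOP_N]
winners = sorted(largest, key=lambda s: s[2], reverse=True)
print([domain for domain, _, _ in winners])
# → ['bigstore.example', 'news.example']
```

Note that the invented tiny domain's +250% swing never makes the list, even though it's the largest percentage gain in the data.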
Note that New York Magazine picked up considerably more gains after December 4th. Of course, for any given site, we can’t prove these gains were due to the core update. While Apple’s App Store was the big winner here, a handful of big sites saw gains over +20%, and eBay fared particularly well.
We tend to focus on domain-level winners and losers, simply because grouping by domains gives us more data to work with, but we also know that many of Google’s changes work at the page level. So, I decided to try something new and explore the winners among individual pages in our data set.
I stuck to the top 100 most visible pages in our data set, removed home pages, and then looked only at the 7-day (before vs. after) change. Here are the top 10 winners, along with their 7-day gain (I’ve opted for a text list, so that you can click through to these pages, if you’d like to explore):
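The page-level filter described above — drop home pages, then rank by 7-day gain — can be sketched as follows. The URLs and gain figures are invented, and the home-page check shown here (empty or root path) is one simple heuristic, not necessarily the one used in the actual analysis.

```python
# Hypothetical sketch of a page-level winners list: remove home pages,
# then sort the remaining URLs by 7-day gain. All data is invented.
from urllib.parse import urlparse

def is_home_page(url):
    """Treat a URL with an empty or root path as a home page."""
    return urlparse(url).path in ("", "/")

pages = [
    # (url, 7-day percent gain) — invented examples
    ("https://example.com/", 18.0),
    ("https://example.com/mortgage-calculator", 12.5),
    ("https://example.org/rates", -3.0),
]

winners = sorted((p for p in pages if not is_home_page(p[0])),
                 key=lambda p: p[1], reverse=True)
print([url for url, _ in winners])
# → ['https://example.com/mortgage-calculator', 'https://example.org/rates']
```

Grouping at the page level rather than the domain level trades data volume for granularity, which is why the analysis sticks to only the 100 most visible pages.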
It’s interesting to note a number of shifts in financial services and especially around mortgage rates and calculators. Of course, we can’t speak to causality. It’s entirely possible that some of these pages moved up because competitors lost ground. For example, https://www.mortgagecalculator.org lost 23% of their visibility in the 7-day over 7-day comparison.
While it’s interesting to explore these pages to look for common themes, please note that a short-term ranking gain doesn’t necessarily mean that any given page is doing something right or was rewarded by the core update.
Now that the dust has mostly settled, are you seeing any clear trends? Are any specific types of pages performing better or worse than before? As an industry, analyzing Core Updates has a long way to go (and, to be fair, it’s an incredibly complex problem), but I think what’s critical is that we try to push a little harder each time and learn a little bit more. If you have any ideas on how to expand on these analyses, especially at a page level, let us know in the comments.