Google December 2020 Core Update Insights


Five search marketers contributed opinions on Google’s December 2020 Core Update. Their observations offer useful perspective on what may have happened.

In my opinion, Google updates have increasingly been less about ranking factors and more about improving how queries and web pages are understood.

Some have offered the opinion that Google is randomizing search results in order to fool those who try to reverse engineer Google’s algorithm.

I don’t share that opinion.

Certain algorithm features are hard to detect in the search results. It’s not easy to point at a search result and say it is ranking because of the BERT algorithm or Neural Matching.

But it is easy to point to backlinks, E-A-T, or user experience to explain why a site is or isn’t ranking, because those factors stick out, even when the actual reason may have more to do with something like BERT.

So the search engine results pages (SERPs) may appear confusing and random to those scrutinizing them for traditional, old-school ranking factors to explain why pages are ranking or why they lost rankings in an update.


Of course Google updates may appear inscrutable. The reasons why web pages rank have changed dramatically over the past few years because of technologies like natural language processing.

What if Google Updates and Nobody Sees What Changed?

It’s happened in the past that Google has changed something and the SEO community didn’t notice.

For example, when Google added an algorithm like BERT, many couldn’t detect what had changed.

Now, what if Google added something like the SMITH algorithm? How would the SEO community detect that?

SMITH is described in a Google Research paper published in April 2020 and revised in October 2020. SMITH makes it easier to understand a long page of content, and the paper reports that it outperforms BERT on long-form document matching.

Here is what it says:

“In recent years, self-attention based models like Transformers and BERT have achieved state-of-the-art performance in the task of text matching.

These models, however, are still limited to short text like a few sentences or one paragraph due to the quadratic computational complexity of self-attention with respect to input text length.

In this paper, we address the issue by proposing the Siamese Multi-depth Transformer-based Hierarchical (SMITH) Encoder for long-form document matching.

Our experimental results on several benchmark datasets for long-form document matching show that our proposed SMITH model outperforms the previous state-of-the-art models including hierarchical attention, multi-depth attention-based hierarchical recurrent neural network, and BERT.

Comparing to BERT based baselines, our model is able to increase maximum input text length from 512 to 2048.”
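
To make the abstract’s complexity point concrete, here is a back-of-the-envelope sketch of why full self-attention grows quadratically with input length and how a hierarchical, SMITH-style split into blocks reduces the cost. This is illustrative Python only, not Google’s code; the 256-token block size and the hierarchical_entries helper are simplifying assumptions, not details from the paper.

```python
# Illustrative only: counting pairwise comparisons in self-attention.

def attention_matrix_entries(num_tokens: int) -> int:
    """Full self-attention compares every token with every other token,
    so the score matrix holds num_tokens * num_tokens entries."""
    return num_tokens * num_tokens

print(attention_matrix_entries(512))    # 262,144 (BERT-style limit)
print(attention_matrix_entries(2048))   # 4,194,304 -- 16x more work

def hierarchical_entries(num_tokens: int, block_size: int) -> int:
    """A two-level, SMITH-like scheme: attend within each block,
    then attend across the block summaries."""
    num_blocks = num_tokens // block_size
    within_blocks = num_blocks * attention_matrix_entries(block_size)
    across_blocks = attention_matrix_entries(num_blocks)
    return within_blocks + across_blocks

print(hierarchical_entries(2048, 256))  # 524,352 -- roughly 8x cheaper
```

The exact numbers don’t matter; the point is that attending within blocks and then over block summaries keeps the comparison count manageable as documents get longer, which is how the paper gets from 512 tokens to 2048.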


I’m not saying that Google has introduced the SMITH algorithm (PDF) or that it’s related to the Passages Algorithm.

What I am pointing out is that the December 2020 Core Update has that same quality of seemingly non-observable change.

If Google added a new AI-based feature or updated an existing feature like BERT, would the search marketing community be able to detect it? Probably not.

And it is that quality of non-observability that may indicate the change has something to do with how Google understands web queries and web pages.

If that is the case, then instead of spinning wheels on the usual easily observed ranking factors (links from scraper sites, site speed, etc.), it may be useful to step back and consider that something more profound than the usual ranking factors has changed.

Insights into Google December 2020 Core Update

I thank those who took the time to contribute their opinions. They provided excellent information that may help you put Google’s December Core Algorithm Update into perspective.

Dave Davies (@oohloo)
Beanstalk Internet Marketing

Dave puts this update in the context of what Google has said was coming soon to the algorithm and how that might play a role in the fluctuations.

Dave offered:

“The December 2020 Core Update was a unique one to watch roll out. Many sites we work with started with losses and ended with wins, and vice-versa.

So clearly it had something to do with a signal or signals that cascade. That is, where the change caused one result, but once that new calculation worked its way through the system, it produced another. Like PageRank recalculating, though this one likely had nothing to do with PageRank.

Alternatively, Google may have made adjustments on the fly, or made other changes during the rollout, but I find that less likely.

If we think about the timing, and how it ties to the rolling out of passage indexing and that it’s a Core Update, I suspect it ties to content interpretation systems and not links or signals along those lines.

We also know that Core Web Vitals are entering the algorithm in May of 2021, so there may be elements to support that in the update. But those would not be producing the impact we’ve all been seeing presently, given that Web Vitals should technically be inert as a signal at this stage. So at the very least, there would be more to the update than that.

As far as general community reaction, this one has been difficult to gauge past “it was big.” As one can expect in any zero-sum scenario, when one person is complaining about a loss, another is smiling all the way up the SERPs.

I suspect that before the end of January it’ll become clear exactly what they were rolling out and why. I believe it has to do with future features and capabilities, but I’ve been around long enough to know I could be wrong, and I need to watch closely.”
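
As an aside, the cascading recalculation Dave describes is easiest to picture with an iterative algorithm like the original PageRank, where a change to one page’s score propagates through the link graph over successive passes, so intermediate results differ from where the system finally settles. Here is a minimal power-iteration sketch over a hypothetical three-page link graph (purely illustrative, unrelated to any real Google system):

```python
# Minimal PageRank power iteration over a made-up link graph,
# showing how a signal "cascades": each pass redistributes scores
# along links, so early rankings can differ from the settled ones.

links = {  # hypothetical graph: page -> pages it links to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}

damping = 0.85
scores = {page: 1.0 / len(links) for page in links}

for _ in range(20):  # iterate until scores stop changing much
    new_scores = {}
    for page in links:
        # Sum the share of score flowing in from each linking page.
        inbound = sum(
            scores[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_scores[page] = (1 - damping) / len(links) + damping * inbound
    scores = new_scores

print(scores)  # converged scores after the cascade works through the graph
```

Whatever signal actually changed in December, the same dynamic would explain sites that started with losses and ended with wins as the recalculation worked its way through the system.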


Steven Kang (@SEOSignalsLab)

Steven Kang, founder of the popular SEO Signals Lab Facebook group, notes that nothing appears to stand out in terms of commonalities or symptoms between the winners and losers.

“This one seems to be tricky. I’m finding gains and losses. I would need to wait more for this one.”

Daniel K Cheung (@danielkcheung)
Team Lead, Prosperity Media

Daniel believes it’s helpful to step back and view Google updates from the big-picture view of the forest rather than the tree of the latest update, and to put these updates into the context of what we know is going on in Search.

One example is the apparent drop in reports of manual actions in Google Search Console. The implication: does that mean Google is better at ranking sites where they belong, without having to resort to punitive manual actions?

This is how Daniel views the latest core algorithm update from Google:

“I think we as Search/Discoverability people need to stop thinking about Core Updates as individual events and instead look at Core Updates as a continuum of ongoing tests and ‘improvements’ to what we see in the SERPs.

So when I refer to the December core update, I want to stress that it is just one event of many.

For example, some affiliate marketers and analysts have found sites that were previously ‘hit’ by the May 2020 update to have recovered in the December rollout. However, this has not been consistent.

And again, here is the problem: we can’t talk about sites that have won or lost because it’s all about individual URLs.

So looking at pure visibility across an entire website doesn’t really give us any clues.

There are murmurs of 301 redirects, PBNs, low-quality backlinks and poor content being reasons why some sites have been pushed from page 1 to page 6-10 of the SERPs (practically invisible).

But these practices have always been susceptible to the daily fluctuations of the algorithm.

What’s been really interesting throughout 2020 is that there have been very few reports of manual penalties within GSC.

This has been eerily replaced with impression and click graphs jumping off a cliff without the site being de-indexed.

In my humble opinion, core updates are becoming less about targeting a specific selection of practices and more of an incremental opportunity for the algorithm to mature.

Now, I’m not saying that Google gets it right 100% of the time – the algorithm clearly doesn’t and I don’t think it ever will (due to humanity’s curiosity).”


Christoph Cemper (@cemper)
CEO, LinkResearchTools

Christoph Cemper views the latest update as having an impact across a wide range of factors.

Here is what he shared:

“High level, Google is adjusting things that have a global impact in core updates.

That is:

a) Weight ratios for different types of links, and their signals

I think the NoFollow 2.0 rollout from Sept 2019 is not completed, but tweaked. I.e. how much power for which NoFollow in which context.

b) Answer boxes, a lot more. Google increases their own real estate.

c) Mass devaluation of PBN link networks and quite obvious footprints of “outreach link building.”

Just because someone sent an outreach email doesn’t make a paid link more natural, even if it was paid with ‘content’ or ‘exchange of services.’”

Michael Martinez (@seo_theory)
Founder of SEOTheory

Michael Martinez offered these insights:

“Based on what I’ve seen in online discussions, people are confused and frustrated. They don’t really know what happened and few seem to have any theories as to why things changed.

In a general sense, it feels to me like Google rewrote a number of its quality policy enforcement algorithms.

Nothing specific in mind but other people’s sites I’ve looked at struck me as being okay, not great. Some of the sites in our portfolio went up, others went down.

Again, it just struck me as being about enforcement or algorithmic interpretation of signals mapped to their guidelines.

Not about punishing anything, but maybe about trying some different approaches to resolving queries.”


What Happened in Google December 2020 Core Update?

The perspectives on what happened in Google’s core algorithm update vary. Most observers seem to agree that no obvious factors or changes stand out.

And that’s an interesting observation, because it could mean that something related to AI or natural language processing was refined or introduced. But that’s just speculation until Google explicitly rules it in or out.
