Google December 2020 Core Update Insights

Five search marketers shared their opinions on Google’s December 2020 core update. Their observations provide interesting insight into what may have happened.

In my opinion, Google updates have become increasingly less about ranking factors and more about improving queries and understanding web pages.

Some people have expressed the opinion that Google is randomizing search results to fool those who try to reverse engineer Google’s algorithms.

I don’t share that opinion.

Some algorithm features are difficult to detect in search results. It is not easy to point at a search result and say that it is ranking because of the BERT algorithm or neural matching.

But it is easy to point to backlinks, E-A-T, or user experience to explain why a site is or is not ranking, even when the real reason may be something like BERT.

Search engine results pages (SERPs) can therefore appear confusing and disorganized to people who look for traditional, old-school ranking factors to explain why pages rank or why they lost rankings in an update.


Of course, Google updates can seem inscrutable. How web pages are ranked has changed dramatically over the years due to techniques such as natural language processing.

What if Google updated something and nobody noticed?

It has happened in the past that Google changed something and the SEO community did not notice.

For example, when Google added an algorithm like BERT, many did not know what had changed.

Now, what if Google has added something like the SMITH algorithm? How would the SEO community find out?

SMITH is described in a Google research paper published in April 2020 and revised in October 2020. What SMITH does is make it easier to understand long pages of content, an improvement over BERT.

Here is how the paper describes it:

“In recent years, self-attention based models like Transformers and BERT have achieved state-of-the-art results in the task of text matching.

However, these models are still limited to short text like a few sentences or one paragraph due to the quadratic computational complexity of self-attention with respect to input text length.

In this paper, we address the issue by proposing the Siamese Multi-depth Transformer-based Hierarchical (SMITH) Encoder for long-form document matching.

Our experimental results on several benchmark datasets for long-form document matching show that our proposed SMITH model outperforms the previous state-of-the-art models including hierarchical attention, multi-depth attention-based hierarchical recurrent neural network, and BERT.

Compared to BERT based baselines, our model is able to increase maximum input text length from 512 to 2048.”
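To put those numbers in perspective, here is a minimal back-of-the-envelope sketch of the complexity argument. It is my own illustration, not code from the paper: the block size of 32 and the one-summary-vector-per-block simplification are assumptions, though the paper’s encoder similarly attends within sentence blocks before attending across block representations.

```python
# Pairwise token comparisons in full self-attention versus a
# SMITH-style two-level hierarchy. Illustration of the complexity
# argument only, not the actual SMITH architecture; block_len=32
# is an assumed block size.

def full_attention_cost(seq_len: int) -> int:
    """Full self-attention compares every token with every token."""
    return seq_len * seq_len

def hierarchical_cost(seq_len: int, block_len: int = 32) -> int:
    """Attend within fixed-size blocks, then attend across one
    summary representation per block."""
    num_blocks = -(-seq_len // block_len)  # ceiling division
    within_blocks = num_blocks * full_attention_cost(block_len)
    across_blocks = full_attention_cost(num_blocks)
    return within_blocks + across_blocks

for n in (512, 2048):
    print(f"{n:>4} tokens: full={full_attention_cost(n):>9,} "
          f"hierarchical={hierarchical_cost(n):>7,}")
# ->  512 tokens: full=  262,144 hierarchical= 16,640
# -> 2048 tokens: full=4,194,304 hierarchical= 69,632
```

Quadrupling the input from 512 to 2048 tokens makes full attention 16 times more expensive, which is the quadratic blow-up the authors cite; the hierarchical scheme grows far more slowly, which is what makes the longer input practical.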


I am not saying that Google has introduced the SMITH algorithm (PDF) or that it is related to the passages algorithm.

What I am pointing out is that the December 2020 core update has the quality of not being reasonably observable.

If Google added a new AI-based feature or updated an existing feature like BERT, would the search marketing community be able to detect it? Probably not.

And it is that quality of non-observable change which can indicate that what changed may be something in how Google interprets search queries and web pages.

If so, it may mean that instead of focusing on general ranking factors that are easily observable (links from scraper sites, site speed, etc.), it is useful to step back and consider that what changed may be something much deeper than those general ranking factors.

Insights into the Google December 2020 core update

I thank those who took the time to offer their opinions; they provided excellent information that can help put Google’s December core algorithm update into perspective.

Dave Davies (@oohloo)
Beanstalk Internet Marketing

Dave discusses this update in reference to what Google has said is coming soon to the algorithm and how that could play a role in the fluctuations.

Dave offered:

“The December 2020 core update was a unique one to watch roll out. Many sites that we work with started with a loss and ended with a win, and vice versa.

So clearly it had something to do with a signal or signals that cascaded. That is, where the change resulted in one outcome, but once the new calculation worked its way through the system, it produced another. Like a PageRank recalculation, although this one likely had nothing to do with PageRank.

Alternatively, Google may have made adjustments on the fly during the rollout, or made other changes, but I consider that less likely.

If we think about the timing, how it rolled out across the index, and the fact that it is a core update, I suspect it is tied to content interpretation systems and not links or signals along those lines.

We also know that Core Web Vitals enter the algorithm in May of 2021, so the update may include elements to support that, but that alone would not produce the effects we are currently seeing, since Web Vitals are not yet a signal. At the very least, there has to be more to the update.

As far as the general community’s response, beyond ‘it was big’ it is difficult to gauge. As one might expect in a zero-sum scenario, while one person is complaining about a loss, another is smiling at their new SERPs.

I doubt it will be clear before the end of January what they rolled out and why. I believe it has to do with future capabilities and features, but I know I may be wrong, and I will be watching closely.”
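Dave’s PageRank analogy is worth unpacking. In an iterative link-graph calculation, scores take multiple passes to settle, so rankings observed mid-rollout can differ from the final ones. The toy power-iteration sketch below illustrates that cascade; the graph, damping factor, and iteration count are my own illustrative assumptions, not anything Google has published about this update.

```python
# Toy PageRank power iteration: rankings observed mid-calculation can
# differ from the rankings after the scores settle. The link graph is
# hypothetical and the damping factor is the conventional 0.85.

DAMPING = 0.85

# page -> pages it links out to (every page has at least one out-link)
links = {
    "a": ["b", "c"],
    "b": ["c", "d"],
    "c": ["a"],
    "d": ["c"],
}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # uniform starting scores

for step in range(1, 11):
    rank = {
        p: (1 - DAMPING) / len(pages)
        + DAMPING * sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        for p in pages
    }
    # Early iterations can order pages differently than later ones: a
    # page can "start with a loss and end with a win" simply because
    # the recalculation has not finished working through the graph.
    print(f"iteration {step}: {sorted(pages, key=rank.get, reverse=True)}")
```

In this toy graph, pages “a” and “c” trade the top spot across the early iterations before the scores converge, which is the kind of mid-rollout flip Dave describes.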


Steven Kang (@SEOSignalsLab)

Steven Kang, founder of the popular SEO Signals Lab Facebook group, says that there appear to be no common similarities or traits between winners and losers.

“This one is difficult to gauge. I am seeing both wins and losses. I will need to wait on this one.”

Daniel K. Cheung (@danielkcheung)
Team Lead, Prosperity Media

Daniel believes it is helpful to step back and view Google updates from the big-picture view of the forest rather than from the tree of the latest update, and to put these updates in the context of what we already know.

One example is the apparent decline in manual actions reported in Google Search Console. Does this mean that Google is getting better at ranking sites where they belong without resorting to punitive manual actions?

This is how Daniel sees the latest core algorithm update from Google:

“I think we need to stop thinking of core updates as individual events that shape search/discoverability and instead see core updates as continuous tests and ‘improvements’ to the SERPs we see.

So when I refer to the December core update, I want to emphasize that it is just one event of many.

For example, some affiliate marketers and analysts have found sites that had previously been ‘hit’ by the May 2020 update recovering in the December rollout. However, this has not been consistent.

And herein lies the problem: we cannot simply talk about sites that won or lost, because it is all about individual URLs.

So overall visibility of the whole website does not really give us a clue.

301 redirects, PBNs, low-quality backlinks, and poor content can wreak havoc, pushing some sites from page 1 to pages 6–10 of the SERPs (practically invisible).

But these practices have always been susceptible to the daily fluctuations of the algorithm.

What is really interesting during 2020 is that there have been very few reports of manual penalties within the GSC.

Instead, this has been replaced by impressions and clicks falling off a cliff on the graph, without the site being de-indexed.

In my humble opinion, core updates are less and less about targeting a specific set of practices, and are instead incremental opportunities for the algorithm to mature.

Now, I’m not saying that Google gets it 100% right all the time – the algorithm obviously doesn’t, and I don’t think it ever will (due to humanity’s curiosity).”


Christoph C. Cemper (@cemper)
CEO LinkResearchTools

Christoph Cemper sees the latest update as having an impact across a wide range of factors.

Here is what he shared:

“Typically, Google adjusts things that have a global impact in a core update.

I.e.:

A) The weighting of different types of links and their signals.

I do not think the NoFollow 2.0 rollout was completed back in September 2019; it has been tweaked since, i.e. in which contexts a NoFollow link carries power.

B) Answer boxes, once again. Google keeps increasing its own real estate.

C) Large-scale devaluation of PBN link networks and the fairly obvious footprints of ‘outreach link building.’

Just because someone sent an outreach email does not make the resulting paid link more natural, even if it was paid for with ‘content’ or an ‘exchange of services.’”

Michael Martinez (@seo_theory)
Founder of SEOTheory

Michael Martinez offered these insights:

“People are confused and disappointed, based on what I have seen in online discussions. They don’t really know what happened or why things changed, although some have theories.

In a general sense, it seems to me that Google has rewritten a number of its quality policy enforcement algorithms.

Nothing specific in mind, but the sites other people asked me to look at seemed fine to me, not great. Some sites in our portfolio went up, others went down.

So it struck me as a change to how enforcement works, or to the algorithms that enforce compliance with their guidelines.

Nothing punitive, but perhaps trying some different methods of resolving queries.”


What happened in the Google December 2020 core update?

Opinions vary about what happened in Google’s core algorithm update. Most observers seem to agree that no obvious factor or change stands out.

And that is an interesting observation, because it may mean that something related to AI or natural language processing was refined or introduced. But this is only speculation unless Google explicitly confirms it or rules it out.