| Google Algorithm Updates | Month |
| --- | --- |
| Broad Core Algorithm | March 2018 |
| Intrusive Interstitials | January 2017 |
| Local Results on Google ccTLDs | October 2017 |
| Meta Description Elongation | December 2017 |
Google changes its algorithms at regular intervals, sometimes twice a day depending on need. Accordingly, the company made changes to its algorithms in 2017 and 2018.
Here, we move ahead by discussing the most recent ones, i.e., the 2018 updates:
- 2018 Google Algorithm Update
- Clues for Broad Core Algorithm Update
- Why Does Google Focus on Transparency?
- The Algorithm Isn't Restricted to Targeting Low Quality
- Google Keeps Dismissing Phantom Update Speculation
- Google's Core Algorithm Gets Updated Twice a Day
- Google Doesn't Consider the General Phantom Update Assumption
- Takeaway 1
- Takeaway 2
- Takeaway 3
- Let’s talk about 2017 Updates now-
In its 2018 update, Google's Danny Sullivan said that the search engine isn't targeting low-quality sites. He further mentioned that there is nothing for sites with lowered rankings to fix.
This update brought multiple noticeable points related to different aspects of "low-quality signals".
What is a Broad Core Algorithm Update?
Google makes no announcements for updates to its core algorithm, because it updates that algorithm almost every day, probably even twice a day.
But the current update is different; it is not one of those usual daily changes.
This type of update occurs numerous times every year. Google calls it a Broad Core Algorithm Update.
Clues for Broad Core Algorithm Update
Google gave only a few clues, but here is what we know:
- The update was intended to offer better search results
- There is no issue with websites that have lost rankings
- There is no specific fix for sites that have lost rankings
- The changes relate to content, but that is not a big problem
Why Does Google Focus on Transparency?
The SEO niche has always assumed that Google's core algorithm updates target "low-quality" web pages.
The downside of this assumption is that it ignores the possibility that Google's core algorithm updates are meant to provide the best answers to search queries.
Therefore, if Google changed its core algorithm to process search queries better, the SEO niche would be proved wrong in assuming that sites which lost rankings had made mistakes.
The Algorithm Isn't Restricted to Targeting Low Quality
An algorithm targeting low-quality web pages is completely different from an algorithm that offers the best answers to search queries.
It's a common notion in the SEO industry that Google focuses on low quality while excluding improvements in answering search queries.
Google Keeps Dismissing Phantom Update Speculation
For the past few years, Google has denied all SEO-niche speculation that its algorithm particularly targets low-quality web pages. Google's spokespersons have even discouraged such assumptions.
In response to Phantom update chatter, Google said it had made no significant update; there were only daily changes to its core algorithm.
If this is true, then an SEO expert could pick any day of the week to announce an update and be right. But they could go wrong in stating that the update covered more and more hacks, and in assuming that the change focused on low-quality web pages. As we saw above, Google's update was about more than "targeting" low quality.
Google's Core Algorithm Gets Updated Twice a Day
A few years ago, Google released information about its algorithm that included the statement that it made "at least two core algorithm changes on a daily basis".
Google made 665 search improvements in 2012 alone. With that data in mind, you can see why Google always says there was no update.
Gary Illyes, a Webmaster Trends Analyst at Google, wasn't ready to name every daily core algorithm update; naming every change, or picking one update out of several in a month, would be absurd. But he did, somewhat jokingly, suggest a name after his pet fish: Fred.
Google Doesn't Consider the General Phantom Update Assumption
Google's Danny Sullivan denied all speculation about the update "targeting" low-quality websites.
Where Is Google Making Improvements?
He further said that he reads current research papers and patents; this research covers 22 areas, but none of it is about targeting low-quality web pages.
Here are the most SEO-related areas:
- Understanding user expectation
- Understanding content
Those are the areas that could be part of Google's Broad Core Algorithm Update. How would one optimize if Google improved its ways of understanding content?
Find the link to the latest research: "Learning for Efficient Supervised Query Expansion via Two-stage Feature Selection".
It is not possible to beat this type of algorithm with a thesaurus or with content stuffed with synonyms. This could be the reason Sullivan said it wasn't something to fix.
The Update Possibly Doesn't Focus on Low Quality
Sullivan again stated that the research was never about identifying low-quality web pages, because Google's researchers don't focus on that.
To find such details, one would need to go years back through research papers and patents associated with finding low-quality content. Most SEO-related information-retrieval research focuses on user needs and on understanding the content itself.
If these are the two points the search engine tries to cover, then it follows that core algorithm updates address ranking concerns rather than "targeting" low-quality web pages.
If we presume that Google, like SEOs, is focused on low-quality web pages, the fact remains that no research papers or patents justify this.
Follow this reasoning, and you'll understand what is actually known about "low-quality web page" algorithms.
On regaining rankings, Google says to wait for your content to rise compared with other pages. But that will only happen when your content is "awesome" and, more importantly, better than your competitors'.
Google's search results focus on offering the best answers to user queries. That means your content must solve the problems users have when they make a particular search query. So make content that offers a great user experience: how convenient it is, how easy it is to find an answer, how easy it is to compare products.
Let’s talk about 2017 Updates now-
Google has always focused on quality content, quality links, and making the web high-performing. Accordingly, there were some confirmed and unconfirmed updates to Google's algorithms in 2017.
Intrusive Interstitials–
Intrusive interstitials are popup ads that block most or all of a page, creating a bad user experience for both desktop and mobile users.
Google picked up the "interstitials" topic and came up with an update that included a fair warning and a confirmed penalty. Mobile sites showing interstitials that covered content were penalized for offering bad experiences to users viewing them on the web.
This update also covered sites containing low-quality content, but it mainly targeted pages inclined toward revenue generation rather than helping users.
Hawk Update–
Hawk, a local update, removed some of the changes introduced by Possum in 2016. Google axed a filter that stopped local businesses sharing an address or building from appearing in the same pack.
Local Results on Google ccTLDs–
This update focused on Google's country-code top-level domains (ccTLDs). After the update, searchers could no longer get international search results simply by changing the Google ccTLD. The change was also accompanied by a spike in AdWords ads and a drop in local packs in the SERPs.
Meta Description Length Elongated
After SEO professionals noticed longer snippets in the SERPs, Google confirmed that experts can now surface more of their content in lengthier search results. The new average meta description length is 160-230 characters.
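As an illustration only (not an official Google tool), here is a minimal sketch of checking whether a page's meta description falls within that 160-230 character range; the helper names and the sample HTML are hypothetical:

```python
from html.parser import HTMLParser

# Minimal sketch: pull the meta description out of raw HTML and
# check it against the expanded 160-230 character range.
class MetaDescriptionParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

def check_meta_description(html: str) -> str:
    parser = MetaDescriptionParser()
    parser.feed(html)
    desc = parser.description
    if desc is None:
        return "missing"
    n = len(desc)
    if n < 160:
        return f"too short ({n} chars)"
    if n > 230:
        return f"too long ({n} chars)"
    return f"ok ({n} chars)"

# Hypothetical page with a 180-character description
html = '<head><meta name="description" content="{}"></head>'.format("x" * 180)
print(check_meta_description(html))  # ok (180 chars)
```

This uses only Python's standard library, so it can be run as-is against any saved HTML source.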
Maccabees Update–
Maccabees focused on sites with doorway pages targeting numerous subcategories or locations through keyword permutations in their content.
The Maccabees update targeted dodgy links and thin content, which devalue a website in the eyes of Google's search engine. It mainly hit e-commerce sites, many of which reported rank drops; the update's main concern was shopping sites of varied natures of business.
Updates are an integral part of Google's core algorithm. The company keeps changing it on a daily basis, sometimes even twice a day.
Here are the major updates of 2017 and 2018 in collective form. With its 2018 update, Google said it was not targeting low-quality web pages, and it also squashed Phantom update speculation.
The 2017 updates, on the other hand, showed changes made to the major algorithm along with some unconfirmed updates.