We all know that the quality of your website is currently derived from signals external to the site itself, such as the trust, relevance and diversity of the links pointing to it.
However, according to New Scientist magazine, Google may soon value the accuracy of your content more than the quality of your backlinks.
A recent paper published by researchers within the company (PDF) shows that Google is working on a system that can determine the trustworthiness of a page not by who links to it, or how many incoming links it has, but by the number of accurate facts it contains.
The system is based on something called a Knowledge-Based Trust (KBT) score, which would be calculated by cross-referencing the 2.8 billion facts stored in Google’s Knowledge Vault with the content on a given webpage or domain. Pages found to be more accurate would be rewarded, presumably in the form of higher rankings. In cases where a single web page doesn’t contain enough facts, the paper proposes using additional pages from the same website to better determine trustworthiness.
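The paper itself describes a sophisticated probabilistic model over extractors and sources, but the core intuition can be sketched very simply: compare the facts a page asserts against a trusted fact store, and fall back to site-level facts when the page is sparse. The function names, the fact store and the threshold below are all hypothetical illustrations, not anything from the paper:

```python
# Toy sketch of the Knowledge-Based Trust idea (not the paper's actual model):
# score a page by the fraction of its facts confirmed by a trusted store.

MIN_FACTS = 5  # hypothetical threshold below which we borrow site-level facts

# A tiny stand-in for the Knowledge Vault: a set of (subject, predicate, object) triples.
knowledge_vault = {
    ("Barack Obama", "nationality", "USA"),
    ("Paris", "capital_of", "France"),
    ("Python", "designed_by", "Guido van Rossum"),
}

def kbt_score(page_triples, site_triples=None):
    """Return the fraction of a page's triples confirmed by the vault.

    If the page has too few triples, borrow the rest of the site's
    triples, as the paper proposes for fact-sparse pages."""
    triples = list(page_triples)
    if len(triples) < MIN_FACTS and site_triples:
        triples.extend(site_triples)
    if not triples:
        return None  # no facts at all: the score is undefined, not zero
    confirmed = sum(1 for t in triples if t in knowledge_vault)
    return confirmed / len(triples)

page = [("Paris", "capital_of", "France"),
        ("Barack Obama", "nationality", "Kenya")]   # one correct, one not
site = [("Python", "designed_by", "Guido van Rossum"),
        ("Paris", "capital_of", "Germany"),
        ("Barack Obama", "nationality", "USA")]

print(kbt_score(page, site))  # 3 of 5 triples confirmed -> 0.6
```

Note the fallback: the two-fact page alone would score 0.5, but because it falls below the threshold, the site's remaining facts are pooled in, which is roughly the domain-level aggregation the paper suggests.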
Interestingly, the authors say that the preliminary empirical tests of KBT have been promising:
“We applied it to 2.8 billion triples extracted from the web, and were thus able to reliably predict the trustworthiness of 119 million web pages and 5.6 million websites.”

(NB: the paper uses “triples” to describe the facts extracted from web pages, as each consists of a subject-predicate-object expression, such as (Barack Obama, nationality, USA).)
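If the triple format is unfamiliar, it is just a three-part statement of fact, and it maps naturally onto a simple data structure. A minimal illustration (the `Triple` type here is my own, not anything from Google's systems):

```python
from collections import namedtuple

# A "triple" is a subject-predicate-object expression: the unit of fact
# extracted from web pages. A lightweight representation:
Triple = namedtuple("Triple", ["subject", "predicate", "obj"])

fact = Triple("Barack Obama", "nationality", "USA")
print(fact.subject)    # Barack Obama
print(fact.predicate)  # nationality

# Triples hash and compare by value, so a plain set works as a simple
# fact store for membership checks:
store = {fact}
print(Triple("Barack Obama", "nationality", "USA") in store)  # True
```

Value-based equality is the useful property here: two independently extracted copies of the same fact count as the same triple, which is what lets billions of extractions be deduplicated and cross-referenced.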
Google has been building a massive database of known facts for years, and in 2012 introduced its Knowledge Graph. For those of you that don’t know, the Knowledge Graph is the source of those information boxes and carousels that show up in Google search results, usually for queries that are related to places, people, and other ‘known’ entities such as films and books.
However, I can envisage a few difficulties in applying the KBT concept uniformly across the internet, primarily for web pages that don’t exist to share facts (satirical websites such as The Onion, for one), or that aren’t about entities covered by a Knowledge Graph-style database (sites about new technology or recent discoveries, for example).
Perhaps anticipating such difficulties, the authors suggest that the KBT method of measuring trustworthiness “provides an additional signal for evaluating the quality of a website,” and could be used “in conjunction with existing signals such as PageRank” — not necessarily as a replacement.
If you look at the charts in the paper, you can see just how well PageRank and the KBT score complement each other.
This isn’t the first time Google has looked into reducing the role of links in its algorithm. In February 2014, Matt Cutts acknowledged that Google had tested its search results with link data turned off, but said at the time that the results would be “much much worse” if it actually did that in practice.
This suggests that external signals such as links are likely to remain part of Google’s algorithm for a while longer. However, instead of attempting to manipulate rankings through links, brands should treat links exactly as they would offline marketing and traditional PR: focusing on assets that are valuable to the audience they want to reach and in line with how the brand wants to be perceived.
*For anyone wondering what the image at the top of the article has to do with the content: absolutely nothing. Twentieth Century Fox has teamed up with Getty Images to create a set of stock photos to promote its new movie, Unfinished Business, starring Vince Vaughn alongside co-stars Tom Wilkinson, Dave Franco and others. It’s a great example of a different way of connecting with an audience, and you can see the rest of the images here.