Recently, Google released its Search Quality Evaluator Guidelines (hereinafter referred to as the Guidelines) for its raters, which reveal some of the hidden rules. And yes, beyond its algorithms, Google employs people to rate millions of websites and help adjust their rankings.
Besides the Guidelines, some SEO experts have compiled a list of more than 200 factors that Google's algorithm is believed to apply. Since all these factors are summarized and speculated from third-party research, tweets from senior Google staff, and webmasters' own experience, they may not be completely accurate. Still, they at least give us directions for optimization.
In this article, I'll explain some of Google's ranking factors so that you can understand how the search results come about. In addition, I'll introduce two tools commonly used by webmasters and developers to assess the quality and performance of a website.
The Guidelines describe Google's criteria for rating page quality, which is the extent to which a page achieves its purpose as the assessor understands it. The main objects raters evaluate are the main content, supplementary content, and ads. The main factors evaluators consider when rating a page are: the purpose of the page; expertise, authoritativeness, and trustworthiness; the quality and amount of the main content; website information; and website reputation.
The following quote suggests how vital it is to build a website people can actually trust: "For all other pages that have a beneficial purpose, the amount of expertise, authoritativeness, and trustworthiness (E-A-T) is very important." With such clear rating criteria, webmasters can make targeted adjustments when optimizing their websites and thereby improve their ranking in Google SERPs.
As for the list of over 200 ranking factors, it mainly covers domain factors, page-level factors, site-level factors, backlink factors, user interaction, special Google algorithm rules, brand signals, on-site webspam factors, and off-site webspam factors. Since there are too many rules to cover here, I'll attach a link to the list at the end of this article, and you can explore further if interested.
Webmasters, SEO experts, and digital marketing managers usually turn to authoritative tools to get a general picture of their own websites' and their competitors' positions. Alexa is one of them. Founded in 1996, Alexa is a global pioneer in analytical insight. Alexa's traffic estimates are based on data from its global traffic panel, a sample of millions of Internet users who use one of many different browser extensions. The global traffic rank measures how a website is doing relative to all other sites on the web over the past three months. The rank is calculated using a proprietary methodology that combines a site's estimated average daily unique visitors and its estimated number of page views over the past three months.
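To make the idea of a combined traffic rank concrete, here is a toy sketch. Alexa's actual methodology is proprietary and not public; this example simply assumes a combined score based on the geometric mean of the two estimates Alexa says it uses, and the site names and numbers are invented for illustration.

```python
# Toy illustration only: Alexa's real formula is proprietary.
# We assume a combined score as the geometric mean of estimated
# daily unique visitors and daily page views, then rank sites by it.
from math import sqrt

def combined_score(daily_visitors: float, daily_pageviews: float) -> float:
    """Hypothetical blend of the two traffic estimates."""
    return sqrt(daily_visitors * daily_pageviews)

# Hypothetical 3-month average estimates: (unique visitors, page views)
sites = {
    "site-a.example": (120_000, 480_000),
    "site-b.example": (300_000, 600_000),
    "site-c.example": (90_000, 900_000),
}

# Rank 1 goes to the site with the highest combined score
ranked = sorted(sites, key=lambda s: combined_score(*sites[s]), reverse=True)
ranks = {site: i + 1 for i, site in enumerate(ranked)}
print(ranks)
```

The point is not the specific formula but the principle: a site with moderate visitors but heavy page-view engagement can outrank one with more visitors, which is why two sites with similar audiences can hold very different ranks.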
Another tool is Moz, which created Domain Authority, a score that predicts a root domain's ranking potential relative to the other domains in Moz's index. Using this score, webmasters can compare how likely their sites are to rank above competitors. The score mainly reflects authoritativeness and trustworthiness, which follows exactly the lead of the Google Guidelines.
For example, looking at these third-party indicators, RetailMeNot's DA is 77 and its Alexa rank is 2434, and you can see why some sites beat their competitors for the top positions in the search results. Google's algorithm prefers websites that provide rich, useful content that users can trust. And that is the goal of our optimization: we strive to develop our website into an industry leader with the best quantity and quality of coupons to help our users save money and time.