Google’s Martin Splitt addressed a question about website trustworthiness and whether competitors can negatively impact it. He explained how Google assesses site trustworthiness and clarified why factors like links and site traffic don’t have a negative influence on Google’s perception of trustworthiness.
Trustworthiness
Googlers, research papers, and patents mention the trustworthiness of websites, but there is no actual trust metric in use at Google. Google confirmed long ago that multiple signals together indicate whether a site can be trusted, but that is not a trust algorithm; those are just signals.
When Googlers talk about whether a site is trustworthy, it's probably best not to overthink it; they are simply talking about whether a site is trustworthy.
Can A Competitor Create Negative Trustworthiness Signals?
The person asking the question was worried about a competitor that was sending bot traffic to their site in what they felt was an effort to make the site appear untrustworthy to Google's algorithm.
That might be a reference to the SEO idea that Google uses click metrics to rank web pages. However, most research papers about clicks describe using clicks to validate search results, not to rank web pages; it's generally a quality assurance measure.
This is the question that was asked:
“Do I have to be concerned about bad actors trying to make our site appear untrustworthy by sending spam or fake traffic to my site? Since site trustworthiness is binary.”
Binary means it's either one thing or the other. In this case, the person asking the question probably means that a site is either trustworthy or untrustworthy, with no gray area in between.
Martin Splitt downplayed the idea that trustworthiness is binary and outright denied that traffic could influence how Google sees a site.