A week after a House Judiciary Committee hearing on search engine bias, Donald Trump was still crying wolf on Twitter, alleging tech companies like Google and the very platform he tweets upon favor his political rivals.
According to data from analytics firm Brandwatch, Trump has tweeted about bias at least 25 times since taking office. And it was up to Google CEO Sundar Pichai to explain how the search giant processes 3.5 billion queries daily without a liberal agenda. Even though Pichai testified that he leads “this company without political bias and work[s] to ensure that [Google’s] products continue to operate that way,” Republicans remained skeptical.
Rep. Lamar Smith, R-Texas, for example, insisted, “Those who write the algorithms get the results they must want.” And Rep. Steve Chabot, R-Ohio, asked why negative news outranks positive coverage in searches for Republican legislative victories like the American Health Care Act or the Tax Cuts and Jobs Act.
The question of how exactly Google ranks search results has been hotly debated since the dawn of search engine optimization in the 1990s. And, to a degree, the only constant is change, as evidenced by a seemingly never-ending barrage of algorithm updates over the past 20 or so years with adorable names like Panda, Penguin, Hummingbird, RankBrain, Mobilegeddon and Fred.
But that’s the thing—Google isn’t going to simply hand over the recipe for the special sauce in the Big Mac that is its search results.
“Google’s dilemma is that they don’t want to reveal those inner workings, and they’re especially sensitive to any revelations around user signals impacting results, so they can’t reveal the evidence that would help disprove these conspiracy theories,” said Pete Meyers, marketing scientist at analytics firm Moz. “On a typical day, they’re more willing to take a minor PR hit than to reveal the inner workings of the algorithm.”
How Trump could be kind of, sort of right
The issue, however, is not black and white. In fact, Will Critchlow, CEO of online marketing firm Distilled, said there are multiple ways Google could, in fact, be biased.
What Trump, Smith and Chabot have referred to is active bias, meaning Google engineers are deliberately skewing the algorithm to align with their views. However, Critchlow noted another possible form: unconscious bias, meaning they are not intentionally trying to influence results, but the results they think are best happen to be those that align with their opinions.
“Another challenge is trying to tease apart deliberate [versus] accidental bias,” Meyers said. “Google could be training systems on biased data, for example, even if they don’t intend to.”
There’s also the issue of institutional bias, in which Google gives higher rankings to authoritative, trustworthy sites that are frequently cited and have long histories. And so Critchlow said Republicans may feel there is a bias against less prominent sites because they see results dominated by well-known sources like The New York Times and the Washington Post, which they also claim are biased.
“If you put the exact same article on The Washington Post and a new blog, the one on The Washington Post would outrank it,” Critchlow added. “That bias is there not for political reasons, but because it’s the best result over a broad range of queries.”
International SEO consultant Gianluca Fiorelli noted Google can also downgrade websites that have been flagged as untrustworthy if the flagging has been verified by fact checks marked up with the ClaimReview schema.
“The more … sites are flagged as not [trustworthy], the less they will be visible in organic search,” Fiorelli said.
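For context, ClaimReview is a schema.org structured-data type that fact-checking publishers embed in their pages so search engines can identify a fact check, the claim it reviews and the verdict. A minimal sketch of what such JSON-LD markup looks like—the URL, organization names and claim text here are hypothetical placeholders, not real fact checks:

```json
{
  "@context": "https://schema.org",
  "@type": "ClaimReview",
  "url": "https://example.com/fact-checks/sample-claim",
  "claimReviewed": "A sample claim that was checked",
  "itemReviewed": {
    "@type": "Claim",
    "author": { "@type": "Organization", "name": "Example Claim Source" }
  },
  "author": { "@type": "Organization", "name": "Example Fact Checker" },
  "reviewRating": {
    "@type": "Rating",
    "ratingValue": "1",
    "bestRating": "5",
    "worstRating": "1",
    "alternateName": "False"
  }
}
```

The markup describes the fact check itself; whether and how search engines act on it is up to them.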
Do reptiles make good pets?
In a blog post earlier this year, Danny Sullivan—the longtime search marketing reporter who now acts as a “search liaison” for the mother ship—addressed problems with featured snippets, which had resulted in answers like, “women are evil” and “Barack Obama is planning a coup.”
“We failed in these cases because we didn’t weigh the authoritativeness of results strongly enough for such rare and fringe queries,” Sullivan wrote.
In response, he said Google updated its Search Quality Rater Guidelines with better examples of low-quality webpages to help its raters flag inappropriate sites, such as those with misleading or offensive information or those with unsupported conspiracy theories.
And, he said, showing more than one featured snippet may eventually help when users ask about the same thing in different ways. Sullivan used the example of searches for “are reptiles good pets,” and “are reptiles bad pets,” because they are both trying to find out how reptiles rate as pets. However, the featured snippets for the individual queries differed because “sometimes our systems favor content that’s strongly aligned with what was asked.”
These reptile queries are fairly innocuous. Unfortunately, not all queries are.
“For example, if you search for, ‘Did the Holocaust happen,’ then you might get some good journalistic answers, but there have been points in time—especially if you phrased it with skeptical notes, like, ‘Is the Holocaust a hoax?’—when there was a good chance you went into that thinking it’s a possibility and you might be made most happy if the top result is something that confirms your bias,” Critchlow said.
And so if Google were simply optimizing to make all users happy no matter what, Holocaust denial sites would rank highest in these instances.
And so now Google has to balance its goal of quickly answering queries against delivering accurate information, even when that information is not what a given user wants to hear. (Google did not respond to a request for comment.)
Meanwhile, search results can also be manipulated by outside parties, such as when George W. Bush was president and a search for “miserable failure” delivered WhiteHouse.gov as the No. 1 result. (Or, as Rep. Zoe Lofgren, D-Calif., pointed out more recently, how a Google Image search of “idiot” results in a picture of Trump.)
“That wasn’t Google editorially saying [Bush was a miserable failure]; it was hoaxers, people outside manipulating search results to return something that was obviously politically motivated,” Critchlow added.
It’s just the algorithm doing its job
While noting it gets “really speculative really fast,” Meyers said it’s not impossible for Google to have bias against a party or to build it into the algorithm, but the examples politicians use are not typically proof.
“They’re just quirks of how the algorithm works, especially based on searcher interactions,” Meyers said.
And, Critchlow said, these politicians are also cherry-picking to a degree, as for every example of bias against one party, there are similar examples on the other side.
Indeed, Fiorelli said what seems like bias against a political party is actually the algorithm doing its job, such as personalizing results based on prior search history and other behaviors—as when Google surfaces news similar to stories a user has clicked on previously.
“It may happen that the politicians always reading articles against [Trump] and his party will end up having Google show them more results from those sites, and from sites linking [to] and mentioning [those articles]” he added.
But, Critchlow said, even if Google is biased, courts have ruled its algorithm is protected under the First Amendment and it has a right to determine how to order its search results. So until Trump repeals free speech, search engines should be OK.
Update: A Google rep said the search engine does not use the ClaimReview schema to downgrade websites and pointed to a Twitter thread in which Sullivan contradicted Fiorelli’s point about personalization: “The assumption is that results have been customized in some way based on information unique to an individual, such as search history. FYI: we do not personalize search results based on demographic profiles nor create such profiles for use in Google Search…”