Google Is More Evil Than You Think

It turns out that Google has been deceiving us about the level of human intervention in its search results:

Google, and its parent company Alphabet, has its metaphorical fingers in a hundred different lucrative pies. To untold millions of users, though, “to Google” something has become a synonym for “search,” the company’s original business—a business that is now under investigation as more details about its inner workings come to light.

A coalition of attorneys general investigating Google’s practices is expanding its probe to include the company’s search business, CNBC reports, citing people familiar with the matter.

………

Google’s decades-long dominance in the search market may not be quite as organic as the company has implied, according to The Wall Street Journal, which published a lengthy report today delving into the way Google’s black-box search process actually works.

Google’s increasingly hands-on approach to search results, which has taken a sharp upturn since 2016, “marks a shift from its founding philosophy of ‘organizing the world’s information’ to one that is far more active in deciding how that information should appear,” the WSJ writes.

Some of that manipulation comes from very human hands, sources told the paper in more than 100 interviews. Employees and contractors have “evaluated” search results for effectiveness and quality, among other factors, and promoted certain results to the top of the virtual heap.

One former contractor the WSJ spoke with described down-voting any search results that read like a “how-to manual” for queries relating to suicide until the National Suicide Prevention Lifeline came up as the top result. According to the contractor, Google soon after put out a message to the contracting firm that the Lifeline should be marked as the top result for all searches relating to suicide, so that the company’s algorithms would adjust to treat it as such.

Or, in another instance, sources told the WSJ, employees made a conscious choice about how to handle anti-vax messaging:

………

The company has since maintained an internal blacklist of terms that are not allowed to appear in autocomplete, organic search, or Google News, the sources told the WSJ, even though company leadership has said publicly, including to Congress, that the company does not use blacklists or whitelists to influence its results.

The modern blacklist reportedly includes not only spam sites, which get de-indexed from search, but also the type of misinformation sites that are endemic to Facebook (or, for that matter, Google’s own YouTube).

We already know that algorithms tend to reinforce, rather than mitigate, human bias and bigotry.

Now we know that there are discrete human fingers on the scales.

This is why we need real antitrust enforcement.
