A Feature, Not a Bug

While part of this is the inherent bias of the programmers, and the blithe attitude that pervades the tech industry in general, and Facebook in particular, a bigger part is that pandering to people’s basest inclinations is profitable.

You wave a wand, call it tech, and suddenly not renting to black people (Airbnb), not giving rides to black people, and Facebook’s ad delivery, as shown below, are all good, because it’s all “science” and computers:

How exactly Facebook decides who sees what is one of the great pieces of forbidden knowledge in the information age, hidden away behind nondisclosure agreements, trade secrecy law, and a general culture of opacity. New research from experts at Northeastern University, the University of Southern California, and the public-interest advocacy group Upturn doesn’t reveal how Facebook’s targeting algorithms work, but does show an alarming outcome: They appear to deliver certain ads, including for housing and employment, in a way that aligns with race and gender stereotypes — even when advertisers ask for the ads to be exposed to a broad, inclusive audience.

………

The new research focuses on the second step of advertising on Facebook [what they do after the customer fills out their ad preferences], the process of ad delivery, rather than on ad targeting. Essentially, the researchers created ads without any demographic target at all and watched where Facebook placed them. The results, said the researchers, were disturbing:

Critically, we observe significant skew in delivery along gender and racial lines for “real” ads for employment and housing opportunities despite neutral targeting parameters. Our results demonstrate previously unknown mechanisms that can lead to potentially discriminatory ad delivery, even when advertisers set their targeting parameters to be highly inclusive. [emphasis mine]

Rather than targeting a demographic niche, the researchers requested only that their ads reach Facebook users in the United States, leaving matters of ethnicity and gender entirely up to Facebook’s black box. As Facebook itself tells potential advertisers, “We try to show people the ads that are most pertinent to them.” What exactly does the company’s ad-targeting black box, left to its own devices, consider pertinent? Are Facebook’s ad-serving algorithms as prone to bias as so many others? The answer will not surprise you.

For one portion of the study, researchers ran ads for a wide variety of job listings in North Carolina, from janitors to nurses to lawyers, without any further demographic targeting options. With all other things being equal, the study found that “Facebook delivered our ads for jobs in the lumber industry to an audience that was 72% white and 90% men, supermarket cashier positions to an audience of 85% women, and jobs with taxi companies to a 75% black audience even though the target audience we specified was identical for all ads.” Ad displays for “artificial intelligence developer” listings also skewed white, while listings for secretarial work overwhelmingly found their way to female Facebook users.

………

In the case of housing ads — an area where Facebook has already shown potential for discriminatory abuse in the past — the results were also heavily skewed along racial lines. “In our experiments,” the researchers wrote, “Facebook delivered our broadly targeted ads for houses for sale to audiences of 75% white users, while ads for rentals were shown to a more demographically balanced audience.” In other cases, the study found that “Facebook delivered some of our housing ads to audiences of over 85% white users while they delivered other ads to over 65% Black users (depending on the content of the ad) even though the ads were targeted identically.”

Facebook appeared to algorithmically reinforce stereotypes even in the case of simple, rather boring stock photos, indicating that not only does Facebook automatically scan and classify images on the site as being more “relevant” to men or women, but that it changes who sees the ad based on whether it includes a picture of, say, a football or a flower. The researchers took a selection of stereotypically gendered images — a military scene and an MMA fight on the stereotypically male side, a rose on the stereotypically female side — and altered them so that they would be invisible to the human eye (in technical terms, by making the images fully transparent via their alpha channels). They then used these invisible pictures in ads run without any gender-based targeting, yet found that Facebook, presumably after analyzing the images with software, made retrograde, gender-based decisions on how to deliver them: Ads with stereotypically macho images were shown mostly to men, even though those men had no idea what they were looking at. The study concluded that “Facebook has an automated image classification mechanism in place that is used to steer different ads towards different subsets of the user population.” In other words, the bias was on Facebook’s end, not in the eye of the beholder.
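To make the trick concrete, here is a minimal sketch of how such an invisible test image can be produced with the Pillow imaging library in Python. This is an illustration of the alpha-channel technique the researchers describe, not their actual tooling, and the file names and function are hypothetical: every pixel’s alpha value is set to zero, so a human sees nothing, while the underlying color data stays in the file for a machine classifier to read.

```python
# A sketch of the alpha-channel trick described above (hypothetical file
# names, not the researchers' actual code). Setting every pixel's alpha
# to zero makes the image invisible to people, but the RGB data is still
# stored in the PNG, where an automated classifier can read it.
from PIL import Image

def make_invisible(src_path: str, dst_path: str) -> None:
    """Save a fully transparent copy of an image."""
    img = Image.open(src_path).convert("RGBA")   # ensure an alpha channel
    r, g, b, _ = img.split()                     # keep the color data
    alpha = Image.new("L", img.size, 0)          # 0 = fully transparent
    Image.merge("RGBA", (r, g, b, alpha)).save(dst_path, "PNG")

# Example: make_invisible("mma_fight.jpg", "mma_fight_invisible.png")
```

If ads carrying these blank-looking images are then delivered along the same gendered lines as their visible originals, the skew can only come from machine analysis of the hidden image content, which is exactly what the study found.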

So, not only does Facebook allow advertisers to discriminate, bigotry is baked into its whole “Social Graph”.

This is no surprise.

Even if Facebook weren’t evil, and they are very evil, this is part and parcel of the techno-utopian delusion that permeates the whole misbegotten industry.
