{"id":178002,"date":"2019-04-04T18:16:00","date_gmt":"2019-04-04T23:16:00","guid":{"rendered":"https:\/\/www.panix.com\/~msaroff\/40years\/2019\/04\/04\/a-feature-not-a-bug-8\/"},"modified":"2019-04-04T18:16:00","modified_gmt":"2019-04-04T23:16:00","slug":"a-feature-not-a-bug-8","status":"publish","type":"post","link":"https:\/\/www.panix.com\/~msaroff\/40years\/2019\/04\/04\/a-feature-not-a-bug-8\/","title":{"rendered":"A Feature, Not a Bug"},"content":{"rendered":"<div>A study has shown that <a href=\"https:\/\/theintercept.com\/2019\/04\/03\/facebook-ad-algorithm-race-gender\/\">Facebook&#8217;s advertising algorithms are a bigotry machine<\/a>.<\/div>\n<p>While a part of this is the inherent bias of the programmers, and a blithe attitude about the tech industry in general, and Facebook in particular, but a bigger part is because pandering to people basest inclinations is profitable.<\/p>\n<p>You wave a wand, and call it tech, and suddenly not renting to black people (Air BnB), not giving rides to black people, and Facebook&#8217;s ads as shown below, but (because it&#8217;s all &#8220;science&#8221; and computers) it&#8217;s all good:<\/p>\n<blockquote><p><span style=\"color: blue;\">How exactly Facebook decides who sees what is one of the great pieces of forbidden knowledge in the information age, hidden away behind nondisclosure agreements, trade secrecy law, and a general culture of opacity. <a href=\"https:\/\/arxiv.org\/abs\/1904.02095\">New research<\/a> from experts at Northeastern University, the University of Southern California, and the public-interest advocacy group Upturn doesn\u2019t reveal how Facebook\u2019s targeting algorithms work, but does show an alarming outcome: They appear to deliver certain ads, including for housing and employment, in a way that aligns with race and gender stereotypes \u2014 even when advertisers ask for the ads to be exposed a broad, inclusive audience. <\/p>\n<p>\u2026\u2026\u2026<\/p>\n<p>The new research focuses on the second step of advertising on Facebook, <span style=\"color: black;\">[what they do after the customer fills out their ad preferences]<\/span> the process of ad delivery, rather than on ad targeting. Essentially, the researchers created ads without any demographic target at all and watched where Facebook placed them. The results, said the researchers, were disturbing: <\/span><\/p>\n<blockquote><p><span style=\"color: blue;\">Critically, we observe significant skew in delivery along gender and racial lines for \u201creal\u201d ads for employment and housing opportunities despite neutral targeting parameters. <b><span style=\"font-size: 100%; font-variant: small-caps;\">Our results demonstrate previously unknown mechanisms that can lead to potentially discriminatory ad delivery, even when advertisers set their targeting parameters to be highly inclusive<\/span><\/b>. <\/span>[<i>emphasis mine<\/i>]<\/p><\/blockquote>\n<p><span style=\"color: blue;\">Rather than targeting a demographic niche, the researchers requested only that their ads reach Facebook users in the United States, leaving matters of ethnicity and gender entirely up to Facebook\u2019s black box. As Facebook itself <a href=\"https:\/\/www.facebook.com\/business\/news\/relevance-score\">tells<\/a> potential advertisers, \u201cWe try to show people the ads that are most pertinent to them.\u201d What exactly does the company\u2019s ad-targeting black box, left to its own devices, consider pertinent? 
Are Facebook’s ad-serving algorithms as prone to bias <a href="https://www.newscientist.com/article/2166207-discriminating-algorithms-5-times-ai-showed-prejudice/">as so many others</a>? The answer will not surprise you.</p>
<p>For one portion of the study, researchers ran ads for a wide variety of job listings in North Carolina, from janitors to nurses to lawyers, without any further demographic targeting options. All other things being equal, the study found that “Facebook delivered our ads for jobs in the lumber industry to an audience that was 72% white and 90% men, supermarket cashier positions to an audience of 85% women, and jobs with taxi companies to a 75% black audience even though the target audience we specified was identical for all ads.” Ad displays for “artificial intelligence developer” listings also skewed white, while listings for secretarial work overwhelmingly found their way to female Facebook users.</p>
<p>………</p>
<p>In the case of housing ads, <a href="https://www.propublica.org/article/facebook-lets-advertisers-exclude-users-by-race">an area in which Facebook has already shown potential for discriminatory abuse</a>, the results were also heavily skewed along racial lines. “In our experiments,” the researchers wrote, “Facebook delivered our broadly targeted ads for houses for sale to audiences of 75% white users, when ads for rentals were shown to a more demographically balanced audience.” In other cases, the study found that “Facebook delivered some of our housing ads to audiences of over 85% white users while they delivered other ads to over 65% Black users (depending on the content of the ad) even though the ads were targeted identically.”</p>
<p>Facebook appeared to algorithmically reinforce stereotypes even in the case of simple, rather boring stock photos, indicating that not only does Facebook automatically scan and classify images on the site as being more “relevant” to men or women, but it changes who sees the ad based on whether it includes a picture of, say, a football or a flower. The researchers took a selection of stereotypically gendered images, a military scene and an MMA fight on the stereotypically male side and a rose on the stereotypically female side, and altered them so that they would be invisible to the human eye (marking the images as fully transparent via their “alpha” channels, in technical terms; a sketch of the trick appears at the end of this post). They then used these invisible pictures in ads run without any gender-based targeting, yet found that Facebook, presumably after analyzing the images with software, made retrograde, gender-based decisions on how to deliver them: Ads with stereotypically macho images were shown mostly to men, even though the men had no idea what they were looking at.
The study concluded that “Facebook has an automated image classification mechanism in place that is used to steer different ads towards different subsets of the user population.” In other words, the bias was on Facebook’s end, not in the eye of the beholder.</p>
</blockquote>
<p>So, not only does Facebook allow advertisers to discriminate; bigotry is baked into their whole “Social Graph.”</p>
<p>This is no surprise.</p>
<p>Even if Facebook weren’t evil, and they are <b>very</b> evil, this is part and parcel of the techno-utopian delusion that permeates the whole misbegotten industry.</p>
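<p>As an aside, the invisible-image trick the researchers used is straightforward to reproduce. Below is a minimal sketch, assuming the Pillow imaging library; the file names are illustrative, not taken from the study. It zeroes an image’s alpha channel, so viewers render the picture as blank while the underlying RGB pixel data remains readable to any automated classifier:</p>
<pre><code># A sketch of the study's invisible-image technique, assuming the
# Pillow imaging library (pip install Pillow). File names are
# illustrative, not from the paper.
from PIL import Image

def make_invisible(src_path: str, dst_path: str) -> None:
    """Save a copy of an image with its alpha channel set to zero.

    Browsers and image viewers render the result as blank, but the
    original RGB pixel data is still present for any software that
    scans the file directly.
    """
    img = Image.open(src_path).convert("RGBA")
    img.putalpha(0)            # constant alpha of 0 = fully transparent
    img.save(dst_path, "PNG")  # PNG preserves the alpha channel

# e.g., a stereotypically "male" and a stereotypically "female" image
make_invisible("mma_fight.jpg", "mma_fight_invisible.png")
make_invisible("rose.jpg", "rose_invisible.png")
</code></pre>
<p>Saving as PNG matters here: a format like JPEG has no alpha channel, so the picture would come out visible again (Pillow will in fact refuse to save an RGBA image as JPEG).</p>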