{"id":177313,"date":"2019-10-24T18:43:00","date_gmt":"2019-10-24T23:43:00","guid":{"rendered":"https:\/\/www.panix.com\/~msaroff\/40years\/2019\/10\/24\/a-feature-not-a-bug-7\/"},"modified":"2019-10-24T18:43:00","modified_gmt":"2019-10-24T23:43:00","slug":"a-feature-not-a-bug-7","status":"publish","type":"post","link":"https:\/\/www.panix.com\/~msaroff\/40years\/2019\/10\/24\/a-feature-not-a-bug-7\/","title":{"rendered":"A Feature, Not a Bug"},"content":{"rendered":"<div>It turns out that an algorithm used by health care providers to determine who is in need of enhanced care and monitoring <a href=\"https:\/\/www.theverge.com\/2019\/10\/24\/20929337\/care-algorithm-study-race-bias-health\">discriminates against black people<\/a>.<\/div>\n<p>Call me a conspiracy theorist, but I continue to think that algorithmic discrimination is actually one of the goals of this sort of AI tech, just like Airbnb listings, Facebook employment ads, etc.:<\/p>\n<blockquote><p><span style=\"color: blue;\">A health care algorithm makes black patients substantially less likely than their white counterparts to receive important medical treatment. The major flaw affects millions of patients, and was just revealed in research published this week in the journal Science. <\/p>\n<p>The study does not name the makers of the algorithm, but Ziad Obermeyer, an acting associate professor at the University of California, Berkeley, who worked on the study says \u201calmost every large health care system\u201d is using it, as well as institutions like insurers. Similar algorithms are produced by several different companies as well. \u201cThis is a systematic feature of the way pretty much everyone in the space approaches this problem,\u201d he says. <\/p>\n<p>The algorithm is used by health care providers to screen patients for \u201chigh-risk care management\u201d intervention. Under this system, patients who have especially complex medical needs are automatically flagged by the algorithm. 
Once selected, they may receive additional care resources, like more attention from doctors. As the researchers note, the system is widely used around the United States, and for good reason. Extra benefits like dedicated nurses and more primary care appointments are costly for health care providers. The algorithm is used to predict which patients will benefit the most from extra assistance, allowing providers to focus their limited time and resources where they are most needed. <\/p>\n<p>To make that prediction, the algorithm relies on data about how much it costs a care provider to treat a patient. In theory, this could act as a substitute for how sick a patient is. But by studying a dataset of patients, the authors of the Science study show that, because of unequal access to health care, black patients have much less spent on them for treatments than similarly sick white patients. The algorithm doesn\u2019t account for this discrepancy, leading to a startlingly large racial bias against treatment for the black patients. <\/p>\n<p>The effect was drastic. Currently, 17.7 percent of black patients receive the additional attention, the researchers found. If the disparity was remedied, that number would skyrocket to 46.5 percent of patients. <\/span><\/p><\/blockquote>\n<p>I really do believe that this is a deliberate business decision.&nbsp; &#8220;It&#8217;s not racism, it&#8217;s just giving the customers what they want.&#8221;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>It turns out that an algorithm used by health care providers to determine who is in need of enhanced care and monitoring discriminates against black people. 
Call me a conspiracy theorist, but I continue to think that algorithmic discrimination is actually one of the goals of this sort of AI tech, just like Airbnb &hellip;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[],"tags":[618,460,398],"class_list":["post-177313","post","type-post","status-publish","format-standard","hentry","tag-artificial-intelligence","tag-public-health","tag-racism"],"_links":{"self":[{"href":"https:\/\/www.panix.com\/~msaroff\/40years\/wp-json\/wp\/v2\/posts\/177313"}],"collection":[{"href":"https:\/\/www.panix.com\/~msaroff\/40years\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.panix.com\/~msaroff\/40years\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.panix.com\/~msaroff\/40years\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.panix.com\/~msaroff\/40years\/wp-json\/wp\/v2\/comments?post=177313"}],"version-history":[{"count":0,"href":"https:\/\/www.panix.com\/~msaroff\/40years\/wp-json\/wp\/v2\/posts\/177313\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.panix.com\/~msaroff\/40years\/wp-json\/wp\/v2\/media?parent=177313"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.panix.com\/~msaroff\/40years\/wp-json\/wp\/v2\/categories?post=177313"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.panix.com\/~msaroff\/40years\/wp-json\/wp\/v2\/tags?post=177313"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}