The U.S. civil liberties organization Electronic Frontier Foundation has launched a petition titled “Don’t Scan Our Phones”. The background is Apple’s plan to search users’ phones for photos that depict child abuse or have resulted from it. In doing so, the company goes even further than Microsoft, which scans the cloud for such material. On its website, the organization writes: “Apple has abandoned its once-famous commitment to security and privacy. The next version of iOS will contain software that scans users’ photos and messages. Under pressure from U.S. law enforcement, Apple has put a backdoor into their encryption system. Sign our petition and tell Apple to stop its plan to scan our phones. Users need to speak up and say this violation of our privacy is wrong.” (Website Electronic Frontier Foundation) More information via act.eff.org/action/tell-apple-don-t-scan-our-phones.
Health Care Prediction Algorithm Biased against Black People
The research article “Dissecting racial bias in an algorithm used to manage the health of populations” by Ziad Obermeyer, Brian Powers, Christine Vogeli, and Sendhil Mullainathan has attracted considerable attention from both scientists and the media. It was published in the journal Science on 25 October 2019. From the abstract: “Health systems rely on commercial prediction algorithms to identify and help patients with complex health needs. We show that a widely used algorithm, typical of this industry-wide approach and affecting millions of patients, exhibits significant racial bias: At a given risk score, Black patients are considerably sicker than White patients, as evidenced by signs of uncontrolled illnesses.” (Abstract) The authors suggest that the choice of convenient, seemingly effective proxies for ground truth can be an important source of algorithmic bias in many contexts. The journal Nature quotes Milena Gianfrancesco, an epidemiologist at the University of California, San Francisco, as saying: “We need a better way of actually assessing the health of the patients.”
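The mechanism the authors identify is a proxy problem: the algorithm predicts future health care costs rather than illness itself, and because less money is spent on Black patients than on equally sick White patients, equal sickness does not translate into equal risk scores. The following toy simulation is a minimal sketch of that mechanism only, not the study’s data or the commercial algorithm; all numbers, thresholds, and group labels are invented for illustration.

```python
# Illustrative sketch of proxy bias (invented data, hypothetical groups A and B).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# True health need (higher = sicker), identically distributed in both groups.
group = rng.integers(0, 2, n)              # 0 = group A, 1 = group B (hypothetical)
need = rng.gamma(shape=2.0, scale=1.0, size=n)

# Observed cost: proportional to need, but group B receives only ~70% of the
# spending of group A at the same level of need (assumed access gap).
spending_factor = np.where(group == 1, 0.7, 1.0)
cost = need * spending_factor * 1000 + rng.normal(0, 200, n)

# A "risk score" trained on cost ranks patients by expected spending;
# here we simply use the cost itself as the proxy-based score.
score = cost

# Flag the top 3% by score as "high risk", as in a care-management program.
cutoff = np.quantile(score, 0.97)
flagged = score >= cutoff
for g, name in [(0, "group A"), (1, "group B")]:
    sel = flagged & (group == g)
    print(f"{name}: share flagged = {sel.mean():.3%}, "
          f"mean need among flagged = {need[sel].mean():.2f}")
# Typical output: group B is flagged less often, and those who are flagged are
# sicker on average -- at a given score, group B patients are sicker.
```

Both groups are simulated with the same distribution of health need; the disparity arises solely from training the score on a proxy (cost) that is systematically lower for one group, which is the pattern the abstract describes.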