Welcome to the EI weekly round-up: a curation of quality posts to help you cut through the noise and get right to the heart of the discussion on AI and Tech Ethics.
Every Tuesday we publish a list of links to articles and debates that have happened over the past week in the community, allowing you to stay as up-to-date as possible on developments and facts. We will often link to arguments from all sides of the debate, even if the opinions may be controversial. We would like to mention, however, that EI does not endorse any of the information published; all links are reflections of the authors' opinions and not those of Ethical Intelligence.
'Lost memories': War crimes evidence threatened by AI moderation
"From bombings and protests to the opening of a new health center, student journalist Baraa Razzouk has been documenting daily life in Idlib, Syria, for years, and posting the videos to his YouTube account."
"But this month, the 21-year-old started getting automated emails from YouTube alerting him that his videos violated its policy, and that they would be deleted. As of this month, more than a dozen of his videos had been removed, he said."
Why Not Lab Launches
A new web portal with tools, articles, and more, focusing on our rights, the need for governance, and on how we, as workers, can shape intelligent systems.
Who Is Responsible When Autonomous Systems Fail?
"While human oversight is an important step toward ensuring that future AI systems enhance human dignity and fundamental rights, it isn’t a straightforward solution; it matters a great deal how that human is positioned “in the loop” and whether they are empowered — or disempowered — to act. Figuring out the right ways to design for and certify human-AI collaboration will be one of the major challenges facing the responsible innovation and governance of AI systems."
The dangers of tech-driven solutions to COVID-19
"Recent events around the world and in the United States demonstrate that the threat of a slide into authoritarianism is real. But we think it is also clear that entrenched habits of deferring to private-sector “solutions” to collective problems have undermined our capacity for effective pandemic response. What’s more, failures to hold tech firms accountable for their uses of personal information have actually made us more vulnerable to prolonged, uncontainable outbreaks."
Sociotechnical Design for HIV
"Efforts to make intimate platforms work for HIV frequently focus on user-to-user interactions and disclosure of one's HIV status but elide both the structural forces at work in regulating sex and the involvement of the state in queer lives. In an effort to foreground these forces and this involvement, we analyze the approaches that intimate platforms have taken in designing for HIV disclosure through a content analysis of 49 current platforms. We argue that the implicit reinforcement of stereotypes about who HIV is or is not a concern for, along with the failure to consider state practices when designing for data disclosure, opens up serious risks for HIV-positive and otherwise marginalized people."
Now Is the Time to Dismantle Our Cities’ Invasive Surveillance Infrastructure
"Not all innovation deserves to exist — many surveillance and policing technologies should never have been created in the first place"
In a U.S. first, California city set to ban predictive policing
"As pressure mounts to address police brutality and racism, California's Santa Cruz is poised to become the first U.S. city to ban predictive policing - despite headquartering the firm that pioneered the technology."
A New Compact for Intimate Information
"Intimate life is under constant surveillance. Firms track people’s periods, hot flashes, abortions, sexual assaults, sex toy use, sexual fantasies, and nude photos. Individuals hardly appreciate the extent of the monitoring, and even if they did, little can be done to curtail it. What is big business for firms is a big risk for individuals. The handling of intimate data undermines the values that sexual privacy secures—autonomy, dignity, intimacy, and equality. It can imperil people’s job, housing, insurance, and other crucial opportunities. More often, women and minorities shoulder a disproportionate amount of the burden."
Why are Google and Apple dictating how European democracies fight coronavirus?
"A debate has been raging as to where the data from contacts is stored – either on the user’s phone, presumably guaranteeing privacy, or with the national health authority once a user tests positive for coronavirus and might have exposed others to it. This distinction has been labelled a conflict between centralised versus decentralised storage of contact information."
Researchers find racial discrimination in ‘dynamic pricing’ algorithms used by Uber, Lyft, and others
"A preprint study published by researchers at George Washington University presents evidence of social bias in the algorithms ride-sharing startups like Uber, Lyft, and Via use to price fares. In a large-scale fairness analysis of Chicago-area ride-hailing samples — made in conjunction with the U.S. Census Bureau’s American Community Survey (ACS) data — metrics from tens of millions of rides indicate ethnicity, age, housing prices, and education influence the dynamic fare pricing models used by ride-hailing apps."
The Ed-Tech Imaginary
"To borrow from artist Alisha Wormsley, "there are Black people in the future." Pay attention when an imaginary posits otherwise. To decolonize the curriculum, we must also decolonize the ed-tech imaginary."
"How much of ed-tech is, to use Ruha Benjamin's phrase, "the new Jim Code"? How much of ed-tech is designed by those who imagine students as cheats or criminals, as deficient or negligent?"
Collective Letter to Springer RE: "A Deep Neural Network Model to Predict Criminality Using Image Processing"
"Springer Nature plans to publish an article "A Deep Neural Network Model to Predict Criminality Using Image Processing" that revives long discredited physiognomist pseudoscience."
Great Twitter Thread From Rachel Thomas on the Myth Of Neutral Technology
"If the idea of tech not being neutral is new to you, or if you think of tech as just a tool (that is equally likely to be used for good or bad), I want to share some resources & examples in this thread."
Situated data analysis: a new method for analysing encoded power relationships in social media platforms and apps
"This paper proposes situated data analysis as a new method for analysing social media platforms and digital apps. An analysis of the fitness tracking app Strava is used as a case study to develop and illustrate the method. Building upon Haraway’s concept of situated knowledge and recent research on algorithmic bias, situated data analysis allows researchers to analyse how data is constructed, framed and processed for different audiences and purposes. Situated data analysis recognises that data is always partial and situated, and it gives scholars tools to analyse how it is situated, and what effects this may have."