Balancing AI fears: Why current risks outweigh the apocalypse
New research from the University of Zurich indicates that people are more concerned about current threats related to AI than distant apocalyptic catastrophes.
Researchers at the University of Zurich have shown that people worry more about the immediate threats posed by artificial intelligence (AI), such as misinformation and job loss, than about distant apocalyptic scenarios. More than 10,000 respondents from the UK and the USA took part in the study, whose results were published in the Proceedings of the National Academy of Sciences (PNAS).
What AI-related threats cause the most concern?
The study, conducted by a team of political scientists at the University of Zurich, comprised three large-scale online survey experiments. Participants were shown a variety of AI-related news headlines, ranging from catastrophic scenarios to present-day harms and potential benefits.
"In three preregistered, online survey experiments (N = 10,800), participants were exposed to news headlines that either depicted AI as a catastrophic risk, highlighted its immediate societal impacts, or emphasized its potential benefits," write the study's authors, Emma Hoes and Fabrizio Gilardi.
Should the public debate change direction?
The research suggests that even after being exposed to apocalyptic warnings, participants remained focused on current issues. "Our study shows that the discussion about long-term risks is not automatically occurring at the expense of alertness to present problems," Emma Hoes told SciTechDaily. Fabrizio Gilardi emphasized that "the public discourse shouldn't be 'either-or.' A concurrent understanding and appreciation of both the immediate and potential future challenges is needed."
The study provides important empirical evidence that can inform ongoing scientific and political debates about the societal implications of AI, underscoring the need to address its short-term and long-term risks in tandem.