- Title
Emerging trends: Unfair, biased, addictive, dangerous, deadly, and insanely profitable.
- Authors
Church, Kenneth; Schoene, Annika; Ortega, John E.; Chandrasekar, Raman; Kordoni, Valia
- Abstract
There has been considerable recent work in the natural language community and elsewhere on Responsible AI. Much of this work focuses on fairness and bias (henceforth Risks 1.0), following the 2016 best seller Weapons of Math Destruction. Two books published in 2022, The Chaos Machine and Like, Comment, Subscribe, raise additional risks to public health/safety/security such as genocide, insurrection, polarized politics, and vaccinations (henceforth, Risks 2.0). These books suggest that the use of machine learning to maximize engagement in social media has created a Frankenstein Monster that is exploiting human weaknesses with persuasive technology, the illusory truth effect, Pavlovian conditioning, and Skinner's intermittent variable reinforcement. Just as we cannot expect tobacco companies to sell fewer cigarettes and prioritize public health ahead of profits, so too it may be asking too much of companies (and countries) to stop trafficking in misinformation, given that it is so effective and so insanely profitable (at least in the short term). Eventually, we believe, the current chaos will end, like the lawlessness of the Wild West, because chaos is bad for business. As computer scientists, we summarize criticisms from other fields and focus on their implications for computer science; we do not attempt to contribute to those other fields. There is quite a bit of work in computer science on these risks, especially on Risks 1.0 (bias and fairness), but more work is needed, especially on Risks 2.0 (addictive, dangerous, and deadly).
- Subjects
ILLUSORY truth effect; CLASSICAL conditioning; PERSUASIVE technology; BIOTIC communities; COMPUTER science; SOCIAL media
- Publication
Natural Language Engineering, 2023, Vol. 29, Issue 2, p. 483
- ISSN
1351-3249
- Publication type
Article
- DOI
10.1017/S1351324922000481