How psychological and interpersonal processes are influenced by human-computer interactions
- This article includes a video of a 2 August 2024 interview with Ian Axel Anderson, a Caltech faculty researcher in applied social psychology.[1] It is posted here to invite others to contribute other perspectives, subject to the Wikimedia rules of writing from a neutral point of view citing credible sources[2] and treating others with respect.[3]
Ian Axel Anderson, a Caltech faculty researcher in applied social psychology,[1] discusses psychological and interpersonal processes that influence media use and human-computer interaction. Dr. Anderson has studied how Internet companies exploit social-psychological processes, such as habit formation, social learning, and attention span, in ways that threaten democracy and the rule of law. He is a member of the Coalition for Independent Technology Research.[4]
Dr. Anderson’s experiments have studied habits, online posting and scrolling behavior, hate speech, extremism, conspiracies, rumors, well-being, identity, stereotypes, and social media influence. He mentioned "The Facebook Papers", tens of thousands of internal Facebook documents that former employee and whistleblower Frances Haugen released to the Securities and Exchange Commission and The Wall Street Journal in 2021. These documents establish that Facebook executives knew their algorithms were creating problems for many users and others, including by inciting violence such as the genocide of Rohingya Muslims in Myanmar, but prioritized company income over the wellbeing of users and of society more generally.[5] Anderson said that Facebook’s content moderation policies in the US and Europe are much more friendly to users and society than those in many other countries.
Mark Twain observed, “How easy it is to make people believe a lie, and how hard it is to undo that work again!” Anderson discusses research supporting this observation. The problem is exacerbated by the “continued influence effect”: the tendency for misinformation to continue to influence memory and reasoning even after a person agrees that the information was erroneous.[6] A more subtle effect is that people who read only headlines on social media have less actual knowledge, while believing they know more, than people who watch a standard news broadcast or read a longer report in a standard newspaper (but not a tabloid). The actual knowledge of both groups may translate into increased civic participation, but the actions of those informed only by social media headlines are less likely to be constructive.[7]
For countering misinformation on social media, crowdsourcing trustworthiness, i.e., judgments of news source quality, seems to be effective.[8]
Anderson is interviewed by Karl Brooks[9] and Spencer Graves.[10]
The threat
More on these threats to democracy and world peace is summarized in Category:Media reform to improve democracy.
Discussion
Notes
- ↑ 1.0 1.1 Ian Axel Anderson, Wikidata Q128639294
- ↑ The rules of writing from a neutral point of view citing credible sources may not be enforced on other parts of Wikiversity. However, they can facilitate dialog between people with dramatically different beliefs.
- ↑ Wikiversity asks contributors to assume good faith, similar to Wikipedia. The rule in Wikinews is different: contributors there are told, "Don't assume things; be skeptical about everything." That's wise. However, we should still treat others with respect while being skeptical.
- ↑ Coalition for Independent Technology Research members, Wikidata Q128696184
- ↑ In defense of the decisions by executives of Facebook and Meta, they could be fired or sued if they prioritized the wellbeing of users over shareholder value.
- ↑ The "continued influence effect" is listed in a table in the section on "Other memory biases" in the Wikipedia article on "List of cognitive biases". See also Cacciatore (2021).
- ↑ Schäfer and Schemer (2024).
- ↑ Pennycook and Rand (2019).
- ↑ Karl Boyd Brooks, Wikidata Q128214400
- ↑ Spencer Graves, Wikidata Q56452480
Bibliography
- Dean Baker (15 October 2023). "No More Special Privileges for Social Media Giants: Reform Section 230". The Messenger. Wikidata Q127474286. https://web.archive.org/web/20231102140912/https://themessenger.com/opinion/no-special-privileges-social-media-reform-section-230-defamation-internet.
- Michael A Cacciatore (9 April 2021). "Misinformation and public opinion of science and health: Approaches, findings, and future directions". Proceedings of the National Academy of Sciences of the United States of America 118 (15): e1912437117. Wikidata Q128597690. ISSN 0027-8424. https://www.pnas.org/doi/pdf/10.1073/pnas.1912437117.
- Renée DiResta (2024). Invisible rulers: The people who turn lies into reality (in en). PublicAffairs. Wikidata Q127420033. ISBN 978-1-5417-0337-7.
- H. R. McMaster (2020). Battlegrounds: The Fight to Defend the Free World (in en). HarperCollins. Wikidata Q104774898. ISBN 978-0-06-289948-4.
- Gordon Pennycook; David G Rand (28 January 2019). "Fighting misinformation on social media using crowdsourced judgments of news source quality". Proceedings of the National Academy of Sciences of the United States of America 116 (7): 2521–2526. doi:10.1073/PNAS.1806781116. Wikidata Q61808560. ISSN 0027-8424. PMID 30692252. PMC 6377495. //www.ncbi.nlm.nih.gov/pmc/articles/PMC6377495/.
- Maria Ressa (2022), How to Stand Up To a Dictator, Harper, Wikidata Q117559286
- Svenja Schäfer; Christian Schemer (3 January 2024). "Informed participation? An investigation of the relationship between exposure to different news channels and participation mediated through actual and perceived knowledge". Frontiers in Psychology 14. Wikidata Q128709784. ISSN 1664-1078. https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2023.1251379/full.
- Yanni Chen; Candace Clement; Matt Wood (14 May 2024), What Is Section 230? Why Ending It Would Create Problems, Free Press, Wikidata Q128631734