Thursday, December 21, 2023

When Algorithms Attack - Thought Hygiene and Social Behavior in the Digital Age


Here are some notes on how algorithms and social behavior interact, starting with good background on how we as individuals treat bad information:

No simple answer for why people believe in conspiracy theories
Jun 2023, phys.org
"Conspiracy theorists are not all likely to be simple-minded, mentally unwell folks—a portrait which is routinely painted in popular culture," said Bowes. "Instead, many turn to conspiracy theories to fulfill deprived motivational needs and make sense of distress and impairment."
  • 170 studies, 158,000 participants from the United States, UK and Poland, measuring motivations or personality traits associated with conspiratorial thinking.
  • overall, people were motivated to believe in conspiracy theories by a need to understand and feel safe in their environment and a need to feel like the community they identify with is superior to others
  • people were more likely to believe specific conspiracy theories when they were motivated by social relationships
  • participants who perceived social threats were more likely to believe in events-based conspiracy theories (e.g., about Sept. 11) rather than abstract theories in general (e.g., that governments plan to harm their citizens to retain power)
  • people who are motivated by a desire to feel unique are more likely to believe in general conspiracy theories about how the world works
  • The Big Five personality traits (extraversion, agreeableness, openness, conscientiousness and neuroticism) had a much weaker relationship with conspiratorial thinking
  • may be related to recent theory about social identity motives and conspiracy theories 
via Emory University: The Conspiratorial Mind: A Meta-Analytic Review of Motivational and Personological Correlates, Psychological Bulletin (2023). DOI: 10.1037/bul0000392

Image credit: AI Art - Is Your Data Under the Weather - 2023


Perception of Russia-Ukraine conflict linked to endorsement of false news about adversary
Mar 2023, phys.org
Analysis of the survey responses showed that participants who perceived a higher level of conflict between Ukraine and Russia were less likely to believe in and want to share the false stories about the European Union, but were more likely to endorse the false stories about Russia. Stories about Tanzania (a neutral control) were least likely to be endorsed.

These findings suggest that people's tendency to endorse false news does not depend simply on their group identity; it also depends on perceptions of the level of conflict between their group and another group. This implies that conflict de-escalation could help prevent the spread of misinformation.
via Leiden University in the Netherlands and Public Library of Science: Information battleground: Conflict perceptions motivate the belief in and sharing of misinformation about the adversary, PLoS ONE (2023). DOI: 10.1371/journal.pone.0282308


Social media 'trust' or 'distrust' buttons could reduce spread of misinformation
Jun 2023, phys.org

(You mean like a "dislike" button?)
The addition of "trust" and "distrust" buttons on social media, alongside standard "like" buttons, could help to reduce the spread of misinformation, finds a new experimental study led by University College London (UCL) researchers.
via UCL Psychology & Language Sciences, Max Planck UCL Center for Computational Psychiatry and Aging Research, and Massachusetts Institute of Technology: Laura K Globig et al, Changing the incentive structure of social media platforms to halt the spread of misinformation, eLife (2023). DOI: 10.7554/eLife.85767

Also: Valentina Vellani et al, The illusory truth effect leads to the spread of misinformation, Cognition (2023). DOI: 10.1016/j.cognition.2023.105421


Controversy in Facebook posts linked to speed of spread among users
Jun 2023, phys.org

  • 57 million posts published across about 2 million Facebook pages and groups from 2018 to 2022. 
  • Posts that went viral were more likely to be associated with negative or controversial reactions among users, regardless of topic.
  • Posts that did not go viral were associated with more positive reactions.
via Sapienza Università di Roma: Etta G, Sangiorgio E, Di Marco N, Avalle M, Scala A, Cinelli M, et al. Characterizing engagement dynamics across topics on Facebook, PLoS ONE (2023). DOI: 10.1371/journal.pone.0286150

Image credit: AI Art - Isometric Maze by Jesper Ejsing, Rhads, Makoto Shinkai Lois van Baarle, and Ilya Kuvshinov - 2022

Researchers build an AI system to identify social norm violations
Jul 2023, phys.org
They built the system using GPT-3, zero-shot text classification, and automatic rule discovery. The system made a binary judgment (adhered to or violated) over ten social-emotion categories: competence, politeness, trust, discipline, caring, agreeableness, success, conformity, decency, and loyalty.
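The pipeline described above can be sketched in miniature. This is a toy stand-in, not the paper's system: the real work used GPT-3 for zero-shot labeling, while here a hypothetical keyword lexicon (all cue words invented for illustration) plays the role of the classifier, returning a binary violation flag per category.

```python
# Toy sketch: binary norm-violation flags over ten social-emotion
# categories. The real system used GPT-3 zero-shot classification;
# the keyword cues below are hypothetical placeholders.

NORM_CATEGORIES = ["competence", "politeness", "trust", "discipline",
                   "caring", "agreeableness", "success", "conformity",
                   "decency", "loyalty"]

# Hypothetical violation cues per category (illustrative only;
# categories without cues default to "no violation detected").
VIOLATION_CUES = {
    "politeness": {"stupid", "idiot", "shut"},
    "trust": {"lied", "cheated", "betrayed"},
    "caring": {"ignored", "abandoned"},
}

def classify_norm_violations(utterance: str) -> dict:
    """Return a 0/1 violation flag for each of the ten categories."""
    tokens = set(utterance.lower().split())
    return {cat: int(bool(VIOLATION_CUES.get(cat, set()) & tokens))
            for cat in NORM_CATEGORIES}

flags = classify_norm_violations("He lied to us and ignored the warnings")
# Here 'trust' and 'caring' are flagged as violated; the other eight are 0.
```

A real zero-shot setup would replace the lexicon lookup with a language-model prompt per category, but the output shape, one binary flag per norm, is the same.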

DARPA commissioned The Computational Cultural Understanding (CCU) program to create cross-cultural language understanding technologies to improve a Department of Defense operator's situational awareness and interactional effectiveness. Cross-cultural miscommunication not only derails negotiations, but also can be a contributing factor leading to war, according to DARPA's explanation of the rationale for the program.
via Department of Cognitive and Brain Sciences Ben-Gurion University of the Negev: Yair Neuman et al, AI for identifying social norm violation, Scientific Reports (2023). DOI: 10.1038/s41598-023-35350-x


Sharing on Facebook reveals two very different news environments
Aug 2023, phys.org

Low-Cred
Here it is: "There's a little more nuance to what it takes to be a well-informed person. You can't just be reading multiple news stories. You have to be reading sources that are well known to be credible."
(I think we're having a problem with the "well-known" part; well known by whom? It gets even harder when more and more of our online infodiets come from robots.)
  • 2 million online news stories that were shared at least 100 times by Facebook users from February 2017 to April 2019
  • both high- and low-credibility publishers tended to put out bursts of coverage at the same time -- but they often were about different topics
  • low-credibility publishers more often converged on stories about politics and the government
  • high-credibility publishers' topics ranged more widely
via Ohio State: Ceren Budak et al, Bursts of contemporaneous publication among high- and low-credibility online information providers, New Media & Society (2023). DOI: 10.1177/14614448231183617


Social media algorithms exploit how humans learn from their peers
Aug 2023, phys.org

They can't change to create a sense of community because that's not their purpose and doesn't yield profit.
Humans are biased to learn from others in a way that typically promotes cooperation and collective problem-solving, which is why they tend to learn more from individuals they perceive as a part of their ingroup and those they perceive to be prestigious. In addition, when learning biases were first evolving, morally and emotionally charged information was important to prioritize, as this information would be more likely to be relevant to enforcing group norms and ensuring collective survival.

In contrast, algorithms are usually selecting information that boosts user engagement in order to increase advertising revenue. This means algorithms amplify the very information humans are biased to learn from, and they can oversaturate social media feeds with what the researchers call Prestigious, Ingroup, Moral, and Emotional (PRIME) information, regardless of the content's accuracy or representativeness of a group's opinions.
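The mechanism described above can be illustrated with a toy feed ranker. Everything here is hypothetical (the feature names, weights, and example posts are invented for illustration); the point is only that a ranker scoring posts on engagement-correlated PRIME features, with accuracy nowhere in the objective, will surface low-accuracy, high-PRIME content first.

```python
# Toy illustration: an engagement-only ranker amplifies PRIME
# (Prestigious, Ingroup, Moral/Emotional) content regardless of
# accuracy. All features and weights are invented for illustration.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    prestige: float         # author's perceived status, 0..1
    ingroup: float          # match with viewer's group, 0..1
    moral_emotional: float  # morally/emotionally charged language, 0..1
    accuracy: float         # fact-check score, 0..1 (ignored by ranker)

def engagement_score(p: Post) -> float:
    # Predicted engagement tracks PRIME features; accuracy never enters.
    return 0.4 * p.prestige + 0.3 * p.ingroup + 0.3 * p.moral_emotional

feed = [
    Post("measured policy explainer", 0.2, 0.3, 0.1, accuracy=0.9),
    Post("outraged ingroup rumor", 0.8, 0.9, 0.95, accuracy=0.2),
]
ranked = sorted(feed, key=engagement_score, reverse=True)
# The low-accuracy, high-PRIME rumor ranks above the accurate explainer.
```

The objective isn't "disrupt cooperation"; it's just that nothing in the scoring function rewards accuracy or representativeness.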
This last part is great because it sounds like how you would have to talk about an alien race taking over the planet, or a new parasite wiping out the population --
"It's not that the algorithm is designed to disrupt cooperation. It's just that its goals are different."
via Kellogg School of Management at Northwestern: Algorithm-mediated social learning in online social networks, Trends in Cognitive Sciences (2023). DOI: 10.1016/j.tics.2023.06.008


No evidence linking Facebook adoption and negative well-being
Aug 2023, phys.org

According to the article below this, it doesn't matter because it's already broken, but also because nobody is going to believe this.
"largest independent scientific study ever conducted" (nearly a million people across 72 countries over 2008 to 2019)
Reasons -- "Much of the past research into social media use and well-being has been hampered by an exclusive focus on well-being data in the Global North and a reliance on inaccurate self-reports of social media engagement."
There was another study saying the effects are mediated by affluence, with wealthier countries seeing more negative effects and vice versa? I don't see it here, but I think it just came out in the last few months. Also, and this isn't so much a swipe at the authors as at the state of affairs in general: their first reference, where they introduce their "2.94 billion monthly active users" figure, is sourced from quarterly earnings data from the company itself. We know private companies black-box lots of their data, and that we (the public, or university researchers) can't use the numbers they give us with confidence; yet one of the base numbers of the whole calculation comes from the company, lowering the confidence of all that follows.

via University of Oxford's Internet Institute: Matti Vuorre et al, Estimating the association between Facebook adoption and well-being in 72 countries, Royal Society Open Science (2023). DOI: 10.1098/rsos.221451


Facebook's design makes it unable to control misinformation, research suggests
Sep 2023, phys.org
Facebook's efforts were undermined by the core design features of the platform itself. "Our results show that removing content or changing algorithms can be ineffective if it doesn't change what the platform is designed to do."
via George Washington and Johns Hopkins Universities: David Broniatowski, The Efficacy of Facebook's Vaccine Misinformation Policies and Architecture During The COVID-19 Pandemic, Science Advances (2023). DOI: 10.1126/sciadv.adh2132



And finally, feed this to your model-bot:

High rate of mental health problems and political extremism found in those who bought firearms during COVID pandemic
Sep 2023, phys.org

Not about the article itself, but this is the statement that defines you as a conspiracy theorist in scientific studies:

"The government, media, and financial worlds in the U.S. are controlled by a group of Satan-worshiping pedophiles who run a global child sex trafficking operation." (response options: "I agree" / "I do not agree")

via University of Michigan: Brian M. Hicks et al, Who bought a gun during the COVID-19 pandemic in the United States?: Associations with QAnon beliefs, right-wing political attitudes, intimate partner violence, antisocial behavior, suicidality, and mental health and substance use problems, PLOS ONE (2023). DOI: 10.1371/journal.pone.0290770
