AKA The Semibotic Socialization Machine
Ludwig Fleck already detailed much of this behavior in his 1935 book Genesis and Development of a Scientific Fact, a staggeringly prescient work when read against today's zeitgeist.
Controversy elicits engagement, and engagement facilitates polarization, since humans polarize themselves naturally. Or maybe it's the information polarizing itself, through us. Nonetheless, digital media enables algorithms to accelerate this natural dynamic. Now let's spread those two sentences over the next two pages:
Disagreement may be a way to make online content spread faster, further
Jul 2021, phys.org
Computational Simulation of Online Social Behavior (SocialSim) program of the U.S. Defense Advanced Research Projects Agency: exists
23,000 "controversial" posts about cybersecurity were seen by nearly twice as many people and traveled nearly twice as fast as 24,000 posts not labeled controversial (Reddit labels a post controversial when it accumulates increasing numbers of both likes and dislikes). The controversial posts drew 60,000 total comments, vs 25,000 for the non-controversial posts.
via University of Central Florida's Department of Computer Science: Jasser Jasser et al, Controversial information spreads faster and further than non-controversial information in Reddit, Journal of Computational Social Science (2021). DOI: 10.1007/s42001-021-00121-z
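The "controversial" label described above (a post drawing lots of both likes and dislikes) can be sketched as a simple classification rule. The threshold and ratio below are illustrative assumptions, not Reddit's or the study's exact criteria:

```python
# Toy version of the "controversial" flag: a post qualifies when it
# has enough total votes and the two sides are roughly balanced.
# min_votes and max_ratio are made-up parameters for illustration.

def is_controversial(upvotes: int, downvotes: int,
                     min_votes: int = 50, max_ratio: float = 2.0) -> bool:
    """True if the post is heavily voted AND heavily disputed."""
    total = upvotes + downvotes
    if total < min_votes or min(upvotes, downvotes) == 0:
        return False
    ratio = max(upvotes, downvotes) / min(upvotes, downvotes)
    return ratio <= max_ratio

print(is_controversial(120, 100))  # True: heavily disputed
print(is_controversial(500, 10))   # False: broadly liked
```

The point of the two-sided test is that a merely popular post (many likes, few dislikes) doesn't qualify; only genuine disagreement does.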
Here's a good example of what happens when we automate socialization with revenue prioritized over the public good:
"In a recent video, @jameslxke asks his followers why TikTok’s algorithm puts trans users in harm’s way by promoting their content on conservative For You pages. If the algorithm is smart enough to know each user’s identity and is intent on keeping users on its platform, @jameslxke reasons, then why does it put vulnerable users at risk for what he calls a “digital lynching”? In the comment section of @jameslxke’s video, users speculate that creating conflict serves the platform’s bottom line: “tiktok does it on purpose bc arguing/dialogue keeps people on the app & the shock value of sending videos to ppl who wont enjoy it boost their app.” By sharing experiences, asking questions, and crowdsourcing answers, teens are developing an algorithmic folklore while discerning the potential motivations behind TikTok’s software engineering."
-Strategic Knowledge: Teens use “algorithmic folklore” to crack TikTok’s black box, by Iretiolu Akinrinade for the Data & Society Institute, Jul 2021 [link]
Viral true tweets spread just as far as viral untrue tweets
Nov 2021, phys.org
Correction -- in self-similar and metalogical form, the viral meme that fake viral memes spread farther and faster than the truth is actually not true. (If you're ever trying to make shit up that's crazier than what's really happening in the world, then you will fail.)
The problem is that we all know "a lie has spread halfway around the world before the truth has put its pants on". So when we hear that fake news is more fit, as in survival-of-the-fittest, it makes perfect sense, and it sticks. Kind of like accidentally eating spiders in your sleep?
In this study, they looked at a part of network dynamics called "cascades", which follow the path a tweet takes as it spreads through the network. In this case, the cascade takes the form of retweets. It turns out that the cascades of true and fake tweets are indistinguishable. Now let's see how long it takes to correct this one.
via Cornell University: Jonas L. Juul et al, Comparing information diffusion mechanisms by matching on cascade size, Proceedings of the National Academy of Sciences (2021). DOI: 10.1073/pnas.2100786118
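A retweet cascade like the ones matched in this study can be sketched as a probabilistic breadth-first spread over a follower graph. The toy graph, seed user, and retweet probability below are assumptions for illustration, not the paper's data or method:

```python
import random
from collections import deque

# Toy retweet cascade: starting from a seed user, each exposed
# follower retweets with probability p_retweet, exposing their
# own followers in turn. The graph and p_retweet are made up.

def simulate_cascade(followers, seed, p_retweet, rng):
    retweeted = {seed}
    queue = deque([seed])
    while queue:
        user = queue.popleft()
        for f in followers.get(user, []):
            if f not in retweeted and rng.random() < p_retweet:
                retweeted.add(f)
                queue.append(f)
    return retweeted

# hypothetical follower graph: "a" is followed by "b" and "c", etc.
followers = {"a": ["b", "c"], "b": ["d"], "c": ["d", "e"], "d": [], "e": []}
cascade = simulate_cascade(followers, "a", p_retweet=0.9, rng=random.Random(0))
print(len(cascade))
```

The study's finding, restated in these terms: once you condition on cascade size, the shape of the spread doesn't tell you whether the seed tweet was true or false.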
People unknowingly group themselves together online, fueling political polarization across the US
Dec 2021, phys.org
They found that when people are less reactive to news, their online environment remains politically mixed. However, when users constantly react to and share articles of their preferred news sources, they are more likely to foster a politically isolated network, or what the researchers call "epistemic bubbles."
Once users are in these bubbles, they actually miss out on more news articles, including those from their preferred media outlets. In filtering out news they deem "unimportant," users end up missing news they would themselves consider important, the model shows.
Polarization of online social networks emerges naturally as people curate their feeds.
People who consume and share fake news might be inadvertently isolating themselves from everyone else who follows mainstream sources.
via Princeton: Polarized information ecosystems can reorganize social networks via information cascades, Proceedings of the National Academy of Sciences (2021). DOI: 10.1073/pnas.2102147118.
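The bubble-forming dynamic can be sketched as a toy rewiring model: reactive agents drop cross-leaning sources from their feeds and replace them with like-minded ones. Everything here (the two leanings, the reactivity parameter, the rewiring rule) is an illustrative assumption, not the Princeton model itself:

```python
import random

# Toy feed curation: agents with a fixed leaning ("L"/"R") unfollow
# a cross-leaning neighbor with probability `reactivity` and rewire
# to a same-leaning agent. High reactivity strips the network of
# cross-cutting ties -- an "epistemic bubble" in miniature.

def run(n=100, k=10, reactivity=0.9, steps=2000, seed=0):
    rng = random.Random(seed)
    leaning = ["L" if i < n // 2 else "R" for i in range(n)]
    follows = {i: set(rng.sample([j for j in range(n) if j != i], k))
               for i in range(n)}
    for _ in range(steps):
        i = rng.randrange(n)
        cross = [j for j in follows[i] if leaning[j] != leaning[i]]
        if cross and rng.random() < reactivity:
            follows[i].remove(rng.choice(cross))
            same = [m for m in range(n) if m != i
                    and leaning[m] == leaning[i] and m not in follows[i]]
            if same:
                follows[i].add(rng.choice(same))
    cross_ties = sum(1 for i in range(n) for j in follows[i]
                     if leaning[j] != leaning[i])
    return cross_ties / sum(len(s) for s in follows.values())

print(run(reactivity=0.0))  # unreactive users: feeds stay mixed
print(run(reactivity=0.9))  # reactive users: few cross-leaning ties remain
```

Note that no agent here intends to isolate themselves; the bubble is an emergent side effect of ordinary curation, which is the paper's point.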
Why false news snowballs on social media
Dec 2021, phys.org
When a network is highly connected or the views of its members are sharply polarized, news that is likely to be false will spread more widely and travel deeper into the network than news with higher credibility. Even if people are rational in how they decide to share the news, this could still lead to the amplification of information with low credibility.
It's important to understand the idea of "cost" in sharing information (and thus in the broader idea of memetic propagation) - "Nominal cost, for instance, taking some action, if you are scrolling on social media, you have to stop to do that. Think of that as a cost. Reputation cost might come if I share something that is embarrassing. Everyone has this cost, so the more extreme and the more interesting the news is, the more you want to share it." If the news affirms the agent's perspective and has persuasive power that outweighs the nominal cost, the agent will always share the news. But if an agent thinks the news item is something others may have already seen, the agent is disincentivized to share it.
Talking about information cascades, or news cascades, they say the credibility threshold drops as the network gets more connected and as the news gets more surprising. In a polarized network, where many of the nodes are likely to spread extreme views, the credibility threshold is also very low.
"For any piece of news, there is a natural network speed limit, a range of connectivity, that facilitates good transmission of information where the size of the cascade is maximized by true news. But if you exceed that speed limit, you will get into situations where inaccurate news or news with low credibility has a larger cascade size," Jadbabaie says.
If the views of users in the network become more diverse, it is less likely that a low-credibility piece of news will spread more widely than the truth.
via Massachusetts Institute of Technology: Chin-Chia Hsu et al, Persuasion, News Sharing, and Cascades on Social Networks, SSRN Electronic Journal (2021). DOI: 10.2139/ssrn.3934010
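The share/no-share calculus described above can be sketched as a simple utility rule: share when the item's persuasive pull, discounted by the chance your followers have already seen it, beats the nominal cost of acting. The functional form below is an illustrative assumption, not the paper's exact model:

```python
# Toy sharing decision: an agent shares news that affirms their
# view when expected benefit (extremity, discounted by how likely
# others have already seen it) exceeds the nominal cost of sharing.
# All parameters and the linear form are made-up assumptions.

def will_share(affirms_view: bool, extremity: float,
               nominal_cost: float, p_already_seen: float) -> bool:
    if not affirms_view:
        return False
    expected_benefit = extremity * (1.0 - p_already_seen)
    return expected_benefit > nominal_cost

print(will_share(True, extremity=0.8, nominal_cost=0.2, p_already_seen=0.1))  # True
print(will_share(True, extremity=0.8, nominal_cost=0.2, p_already_seen=0.9))  # False
```

Even this toy rule reproduces the qualitative claims above: more extreme news clears the cost bar more easily, and "everyone has probably seen this already" suppresses sharing.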
Post Script:
Adaptive Metamemetics, Infectious Disease Networks, and Ludwig Fleck's Thought Collectives, 2020
Genesis and Development of a Scientific Fact. Ludwig Fleck, 1935 (Switzerland). Edited by Thaddeus J. Trenn and Robert K. Merton. Translated by Fred Bradley and Thaddeus J. Trenn. Foreword by Thomas S. Kuhn. Published by University of Chicago, 1979.
What Is ‘Coordinated Inauthentic Behavior’?
Snopes, Sep 4, 2021. [link]