First the link to this week's complete list as HTML and as PDF.
Johnson et al. and Derzsy fall into two parts, a diagnosis and a cure, to be discussed separately. The first is a classic and Orwellian example of misleading words inducing muddled and misled thinking. Facebook and others are neither social nor networks but centralised commercial platforms. Their so-called content is the bait, whose only purpose is drawing in and engaging so-called users, i.e. the merchandise being sold to the platforms' customers.
Nothing could be further from the truth than Derzsy’s
Online social-media platforms are challenging to regulate [...] Efforts to ban and remove hate-related content have proved ineffective.
Nobody there is the least bit interested in content or in regulating it, as long as it fulfils its purpose. The only object of their moderation is to be visible and to be seen trying, in order to deflect political regulation and fines. As the authors note:
For example, when hate groups are removed by social-media platform administrators, the clusters rapidly rewire and repair themselves, and strong bonds are made between clusters.
Facebook and others employ highly intelligent and competent people, many more of them and better funded than Johnson et al. They don't need Johnson et al. to tell them how the results of their interventions turn out, and we may assume that those outcomes were their purpose all along. I shall admit once again to being hampered in understanding the workings of Facebook by failing to see how anyone can voluntarily participate there in the first place. But it is clear to me that stirring things up and making them worse is not an unintended consequence but the whole purpose of how they organize things: it draws visitors to their pages and makes them engage longer and more intensively.
Johnson et al.'s cure, as skilfully summarised by Derzsy, is a separate issue entirely. Let's begin with their analysis:
We uncover hate clusters of all sizes — the distribution has a high goodness-of-fit value for a power-law distribution. This suggests that the online hate ecology is self-organized, because it would be almost impossible to engineer this distribution using top-down control.
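As an aside on the statistics behind that quotation: a power-law claim of this kind is usually backed by a maximum-likelihood fit of the exponent rather than a straight line on a log-log plot. The following is only a minimal sketch on synthetic data, not the authors' actual pipeline; the sample size, exponent and cut-off are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
alpha_true, xmin, n = 2.5, 1.0, 10_000  # assumed values, for illustration only

# Inverse-transform sampling from a continuous power law p(x) ~ x^(-alpha), x >= xmin
u = rng.random(n)
x = xmin * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

# Maximum-likelihood (Hill) estimator of the exponent
alpha_hat = 1.0 + n / np.log(x / xmin).sum()
print(round(alpha_hat, 2))
```

With ten thousand samples the estimate lands very close to the true exponent; assessing whether real cluster-size data are actually power-law distributed further requires a goodness-of-fit test, as the quotation implies.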
Over the years, it has become apparent that effective solutions [...] will require a combined effort from technology companies, policy makers and researchers.
Disregarding the distraction of the value-laden term hate speech, what they find is this:
“These communities near-perfectly represent the ideals of freedom and democracy. They are as far from totalitarian control as they can get.”
“Abandon any idea of the division of power and dividing between the areas of private commerce and politics. Centralized single-party control is the way to go.”
And now the list of their four cures with apt paraphrases:
Target the smallest and weakest of your adversaries. Avoid any successful defence being seen.
Punish one, educate thousands.
Strike random targets.
Sow fear and confusion.
Establish a centrally controlled strike force.
Take the examples of the Red Guards and the Rotfrontkämpferbund.
Infiltrate to sow discord.
Ziel der Zersetzung ist die Zersplitterung, Lähmung, Desorganisierung und Isolierung feindlich-negativer Kräfte. [Translation: The goal of Zersetzung is the splintering, paralysis, disorganisation and isolation of hostile-negative forces. From the official Stasi handbook]
It is of course possible that Johnson et al. have independently arrived at these rules through their own analysis. Another, far more efficient way would have been to consult any handbook on Stalinist terror.
The theme of this blog is science, not politics, and I apologize to any who see this contribution as inappropriate, especially after I have criticized others for blurring that line. However, when a commentary concerns an article published in a leading science journal, Nature, and all comments are directly based on quotations from that study, it must be allowed.