How do you deal with communities that keep spreading false news? Facebook is still struggling with this question. Now it has emerged that the company treated the German "lateral thinking" (Querdenken) movement as an "experiment".
For years there was only one currency on Facebook and its subsidiary Instagram: if an article, post or video was discussed, shared and commented on particularly intensely, it ended up at the top of the feed. But in recent years it has become increasingly obvious that the company has a problem with fake news and extremist groups on its social networks. To get a grip on them, it has tried a variety of approaches. One of them: the "lateral thinking experiment".
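To make that "one currency" concrete, here is a minimal sketch of how purely engagement-driven feed ranking works in principle. The field names and weights are illustrative assumptions, not details of Facebook's actual system:

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    comments: int
    shares: int
    reactions: int

def engagement_score(post: Post) -> float:
    """Hypothetical score: the more intensely a post is discussed
    and shared, the higher it ranks. The weights are made up."""
    return 3.0 * post.comments + 2.0 * post.shares + 1.0 * post.reactions

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort purely by engagement -- the "one currency" of the feed.
    return sorted(posts, key=engagement_score, reverse=True)
```

The point of the sketch: a system like this rewards whatever provokes reactions, regardless of whether the content is true or harmful.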
This emerges from an internal document on the German movement of Covid skeptics, dated April 29 of this year. It became public through the numerous leaks by former employee Frances Haugen: the documents were part of her testimony to the US Congress and give extensive insight into the inner workings of the internet giant.
Between doing nothing and escalating
It was no secret that Facebook was acting against lateral thinking: in mid-September, the company announced that it would move against the movement on its platforms Instagram and Facebook, and took numerous groups, posts and videos offline. Behind the scenes, however, the battle had apparently been going on for some time.
Lateral thinking had been chosen by Facebook as a test case for its larger fight against the increasingly obvious problems. The experiment was a "proof of concept", i.e. a feasibility study, the documents say. "It's a good intermediate step between doing nothing and escalating to a point where we have to be tough," it reads.
Harmful community
Facebook was well aware of the lateral thinking problem. The movement had a "robust presence" on the company's own platform, the paper analyzes, with successful pages and corresponding reach. At the same time, the company knew of overlaps with other problematic movements such as QAnon and the Reichsbürger (Reich Citizens). The violence at some demonstrations and the surveillance of the movement by German authorities were also known to Facebook at the start of the experiment. Both the deeply ingrained conspiracy ideologies and the "offline violence" were listed as specific dangers.
At the same time, Facebook believed the connection to violence was not strong enough to justify an outright ban, as had happened with other radical political groups on the network. Instead, lateral thinking served as a prime example of a newly defined category within the company: the "Harmful Topic Community" (a community around a harmful topic), HTC for short. In contrast to political groups, HTCs are less about achieving specific goals, and their degree of organization is lower. Rather, they are communities that have grown organically around a problematic topic.
The lateral thinking experiment
That classification also changes how such groups are dealt with. Instead of taking tough action against individuals, the primary goal is to curb growth and prevent the problematic topics from becoming normalized among users. To do this, Facebook wanted to specifically reduce visibility. Concretely, this means: once a community is classified as an HTC, its posts appear less often in feeds, the groups are recommended less, and members show up less frequently as friend suggestions. Facebook's systems should no longer actively help the topics grow. Lateral thinking was to be Facebook's experiment to find out whether that works.
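A minimal sketch of what such a visibility reduction could look like, assuming a simple multiplicative demotion and a boolean HTC flag. The flag, the factor and the function names are illustrative assumptions; the leaked documents do not reveal the actual mechanism:

```python
from dataclasses import dataclass

HTC_DEMOTION = 0.5  # hypothetical down-ranking factor; the real value is not public

@dataclass
class Group:
    name: str
    is_htc: bool  # flagged as a "Harmful Topic Community"

def demoted_score(base_score: float, from_htc: bool) -> float:
    """Posts from HTC groups stay online but rank lower in feeds."""
    return base_score * HTC_DEMOTION if from_htc else base_score

def group_suggestions(groups: list[Group]) -> list[Group]:
    """HTC groups are excluded from recommendations, so the
    platform no longer actively helps them grow."""
    return [g for g in groups if not g.is_htc]
```

The design idea is soft intervention: nothing is deleted, but flagged content circulates more slowly and is no longer amplified by the platform's own recommendation systems.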
With the German elections coming up in September and the movement's close fit with the internal definition of an HTC, it was ideally suited for the test, the paper explains. The plan itself was simple: the community would be restricted over three weeks, then the measures would remain in effect for another three weeks. Facebook scheduled a final week to analyze the results.
Whether the plan was ultimately implemented is not clear from the paper. An update from May 11 names a start date three days later, after initial trials had looked promising. Facebook's handling of the lateral thinking network's removal in September argues against a resounding success of the experiment: the harmful behavior there originated from a core group of people who coordinated closely, Facebook explained in a blog post. And instead of soft measures such as reduced visibility, the company resorted to deletions and bans.
