BLOG POST | 3 Apr 2025
Reimagining digital spaces: A peacebuilder’s take on disinformation

As digital spaces become increasingly influenced by disinformation, peacebuilders play a crucial role in shaping a more informed online (and offline) world.
By Tessa Schindler
As a Gen Z social media user, I find myself scrolling through different platforms every day, trying to limit my screen time each week, and contemplating deleting accounts from platforms I no longer agree with at least once a month. It’s a constant cycle of balancing engagement and concern. As a peacebuilder, I worry about how disinformation spreads, who it impacts, and what we can do about it. How can we build a digital future that promotes truth and understanding instead of division and harm?
The Berghof Foundation, in collaboration with the Platform Peaceful Conflict Transformation, recently commissioned a study on peacebuilding and disinformation, which tackles exactly this question: how can peacebuilding address disinformation?
Understanding the motivation behind disinformation
Disinformation is not simply false or misleading information; by definition, it is information shared with the intent to harm. While technology plays a role in amplifying disinformation, peacebuilders must also focus on the human drivers behind the phenomenon.
Disinformation actors can be broken down into three categories:

Generators create or spread disinformation for financial, political, or ideological reasons.

Spreaders either knowingly pass disinformation on, driven by distrust in institutions or a desire for chaos, or do so unintentionally, out of habit, trust in their sources, or a need for social belonging.

Recipients are vulnerable to manipulation due to limited critical thinking, partisan bias, or repeated exposure to false narratives.

Understanding these dynamics helps us develop targeted strategies that address both technological tools and human behaviours.
Strengthening interdisciplinary exchange and collaboration
Disinformation is a complex issue that no single field can tackle alone. A collaborative, interdisciplinary approach is key to addressing this challenge effectively. Experts in cognitive science, media studies, political science, and peacebuilding must work together to develop meaningful solutions.
Strategic partnerships, particularly between researchers and peace practitioners, can lead to new approaches that go beyond fact-checking. However, developing these initiatives requires adaptive funding, given the highly dynamic nature of disinformation and social media platforms, where threats, narratives, and tactics constantly evolve. Adaptive funding is also crucial for supporting peacebuilders who are already embedded in communities, empowering them to lead local efforts to counter disinformation.
Technology companies shape the digital landscape, and their collaboration with peacebuilders can help develop tools that promote accurate information while limiting harm. Policymakers must balance freedom of expression with the regulation of harmful content, while educators have the critical role of teaching media literacy, equipping people with the skills to critically assess online information.
By fostering interdisciplinary dialogue and collaboration, we can move beyond reactive measures and work towards long-term strategies that promote a more informed and cohesive society.
Imagine a better (digital) future
While much attention is given to platform rules and content moderation, disinformation ultimately comes down to how we communicate and build social cohesion. During the online launch event of our study, Grace Connors, one of the authors, asked: “How do we talk to one another? How do we process information? And ultimately, how do we relate to one another online? And how does this contribute to building a socially cohesive society?”
It is important to criticise where social media platforms are going wrong, especially when fact-checking initiatives are sidelined and hate speech is justified in the name of free speech. However, we also need to think about the future we want for digital spaces. We can advocate for change within existing platforms, but we should also imagine new ones: platforms that control the harm caused by disinformation while promoting prosocial values, fostering meaningful dialogue, ensuring algorithmic transparency, and empowering users with tools to navigate information responsibly.
Media contact
You can reach the press team at:
+49 (0) 177 7052758