As Sebastiaan Crul stated in this article, we are experiencing a crisis of truth due to the ‘enshittification’ of our – often digitally constructed – facts and objectivity. Instead of the cultural-ethical responses that Sebastiaan outlined, this article takes a more pragmatic stance and looks at which technological innovations could actually help re-establish or maintain a shared body of objective facts that we all agree on and trust. Using our framework of the Stack, we explore what a ‘TrustTech Stack’ could look like. After that, we will reflect philosophically on fake news from the perspective of Bruno Latour’s network philosophy, and examine how we can arrive at a more positive conception of the digital construction of facts and objectivity.
At FreedomLab, we often use our model of the Stack “to understand the anatomy of digital systems and to unfold the complex nature of contemporary digital societies”, by understanding them as a layered structure of technological and non-technological components. For example, we can discern the technologies needed when taking a photo with a smartphone in the various layers: the interface (a touch screen and camera), the hardware components (e.g. the chips and batteries necessary to make it work), the algorithms that help to adjust the camera and optimize the images, and the type of digital society it creates (e.g. the 'selfie culture'). All these layers of the Stack cooperate to make the piece of digital technology work. Similarly, the phenomenon of the post-truth society and the ‘enshittification’ of digital media can also be analysed using the Stack: we remain anonymous behind our digital avatars (interface), AI can modify our content (intelligence), digital platforms don’t care about truth but mostly about selling profitable ads (application), etc. However, we can also imagine a Stack that helps to create and establish digital facts and objectivity.
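To make the layered logic of the Stack a bit more tangible, the sketch below renders the two analyses above, the smartphone photo and the post-truth diagnosis, as a simple layered data structure. It is purely illustrative: the layer names and components are taken from the examples in the text and are not an exhaustive rendering of the model.

```python
# Purely illustrative sketch: the Stack as a mapping from layers to components.
# Layer names and components follow the examples in the text; they are not an
# official or complete rendering of FreedomLab's model.

from dataclasses import dataclass, field


@dataclass
class StackAnalysis:
    """A phenomenon analysed as a set of components per layer of the Stack."""
    phenomenon: str
    layers: dict[str, list[str]] = field(default_factory=dict)

    def describe(self) -> str:
        lines = [f"Stack analysis of: {self.phenomenon}"]
        for layer, components in self.layers.items():
            lines.append(f"  {layer}: {', '.join(components)}")
        return "\n".join(lines)


smartphone_photo = StackAnalysis(
    phenomenon="taking a photo with a smartphone",
    layers={
        "interface": ["touch screen", "camera"],
        "hardware": ["chips", "batteries"],
        "intelligence": ["image-optimization algorithms"],
        "society": ["selfie culture"],
    },
)

post_truth = StackAnalysis(
    phenomenon="post-truth society / enshittified media",
    layers={
        "interface": ["anonymous avatars"],
        "intelligence": ["AI that modifies content"],
        "application": ["ad-driven platforms indifferent to truth"],
    },
)

print(smartphone_photo.describe())
print(post_truth.describe())
```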
Using the Stack, we can get a grip on a set of technological innovations that can help to combat the likes of fake news and deepfakes, and to maintain or re-establish the trust we have in a shared body of objective facts. Given that it has become much more difficult for human beings to discern truth from falsity, and fake from real, in the digital domain, solutions native to the digital realm can help to achieve this goal. Below, we set out what such a digital Stack to maintain trust in facts and objectivity, i.e. a ‘TrustTech Stack’, could look like.
All these solutions stem from a genuinely optimistic framing of the possible role of technology in establishing and maintaining digital facts and objectivity. However, as digital innovations provide both risks and opportunities, and are both threats and promises, they should be understood 'pharmacologically': AI, for instance, creates new fake news content but also helps to combat fake news. As such, this article stresses the positive side and should be read alongside Sebastiaan’s article on possible strategies for addressing the post-truth condition. However, we also need to be clear about what we mean by truths, facts, and objectivity if we want to maintain or digitally construct them. For this, the network philosophy of Bruno Latour is of great interest.
Bruno Latour (1947-2022) was a prominent French philosopher and sociologist, widely known for his development of Actor-Network Theory (ANT), a framework that reimagines the relationships between humans, objects, and concepts within encompassing networks. Latour first rose to fame when he applied this framework to the production, or social construction, of scientific facts, by giving a network analysis of 'laboratory life'. I will first describe Latour’s network philosophy and will then look at how it can be applied to the idea of the TrustTech Stack.
At its core, Latour's theory challenges the traditional view of knowledge as something that is independently produced by isolated actors—whether people or institutions—and instead sees knowledge as the product of complex networks involving various "actors." These actors can be human or non-human, interacting dynamically within networks. For Latour, reality is not simply "given" or objective; it is constructed through the interactions and associations within these networks. This relational perspective moves away from static notions of objectivity and fact, emphasizing instead that these are constructed, negotiated, and maintained through dynamic relationships.
Applying Latour's ANT to the TrustTech Stack opens up intriguing possibilities for exploring how facts and objectivity can be established and maintained in the digital realm. In Latourian terms, combating fake news would require a network of actors — journalists, fact-checkers, platforms, algorithms, and audiences — all interconnected in ways that reinforce each other's credibility and accountability. Rather than relying on isolated institutions or authorities to "declare" facts, a Latourian approach would involve systems in which diverse actors continuously interact to verify, support, and adjust information as it spreads. News could become more interactive, with networks of actors actively collaborating to expose inaccuracies in real time, creating a living, self-correcting network that generates facts and objectivity. That, however, is not what has happened in the past decades, as media, journalism and content became digital: the internet and digital platforms came with the promise of fostering freedom of speech and democratic dialogue, but have descended into fake news, monopolies, filter bubbles, echo chambers and the like. In fact, many now claim that the internet is broken and needs a fix or transition. From a Latourian perspective, this means that the internet as the network of networks does not properly network (as a verb) in order to produce facts and objectivity: there is too little transparency, there are too many asymmetries, and centralized powers can impose their will or vision instead of a decentralized mesh of actors collectively constituting objectivity.
Latour's ANT also emphasizes the active role of technological actors within these networks. Rather than viewing technology as a passive tool, Latour’s philosophy encourages us to see it as a participant in the creation and verification of knowledge. This aligns well with emerging technological innovations designed to combat misinformation, such as AI-based fact-checkers and algorithms that flag potential falsehoods. Within a Latourian network, these technologies do not merely “assist” humans but play an integral role in constructing reliable information by detecting inconsistencies, identifying sources, and counteracting biases. By integrating technological actors as active participants, a network-based approach becomes more robust, shifting from relying on human authority alone to a collective of human and technological actors working together to verify and update information.
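As a thought experiment, the sketch below renders this idea of a mixed collective of human and technological actors in a few lines of Python. The actors, their checks and the way their verdicts are aggregated are hypothetical stand-ins, not an existing system or API; the point is only that the status of a claim emerges from the whole collective rather than from a single authority.

```python
# Purely illustrative sketch of the Latourian idea that technological actors
# (e.g. an AI fact-checker or a flagging algorithm) participate in verification
# alongside human actors. Actor names, verdicts and the aggregation rule are
# hypothetical stand-ins.

from dataclasses import dataclass
from typing import Callable


@dataclass
class Verdict:
    actor: str          # who or what assessed the claim
    supports: bool      # does this actor's check support the claim?
    note: str           # short rationale, kept for transparency


# Each actor, human or technological, is simply a function from a claim to a verdict.
Actor = Callable[[str], Verdict]


def human_fact_checker(claim: str) -> Verdict:
    return Verdict("fact-checker", supports=True, note="matches primary sources")


def ai_consistency_check(claim: str) -> Verdict:
    # Stand-in for an AI model that detects internal inconsistencies.
    return Verdict("AI consistency check", supports=True, note="no contradictions found")


def source_identifier(claim: str) -> Verdict:
    # Stand-in for an algorithm that traces the claim back to its sources.
    return Verdict("source identifier", supports=False, note="original source unknown")


def networked_status(claim: str, actors: list[Actor]) -> str:
    """The claim's status emerges from the whole collective of actors,
    not from any single authority that 'declares' it true."""
    verdicts = [actor(claim) for actor in actors]
    support = sum(v.supports for v in verdicts)
    summary = "; ".join(f"{v.actor}: {v.note}" for v in verdicts)
    return f"{support}/{len(verdicts)} actors support the claim ({summary})"


print(networked_status("claim X", [human_fact_checker, ai_consistency_check, source_identifier]))
```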
Importantly, Latour's network philosophy also emphasizes transparency, allowing individuals to see how information flows and evolves within a network. In the fight against fake news, this transparency could offer the public insight into how information develops and changes as it circulates across platforms. This could foster a greater understanding of how news is formed, validated, and shared, revealing the processes that help legitimize or discredit certain claims. By showing the paths that information takes through these networks, Latour’s framework suggests a model where the transparency of information pathways itself becomes a tool for combating misinformation.

In addition, a Latourian perspective encourages a shift toward network-based accountability, where responsibility for information quality is shared across a network rather than concentrated in a single source. If audiences, for instance, are also participants in these networks, they can take on a more proactive role in questioning, verifying, and maintaining the integrity of shared information. This can create a form of resilience within the network, as misinformation would face increased scrutiny and verification from multiple actors. A resilient network weakens the spread of fake news by creating a diverse, well-connected web of verification points.
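Both points, transparent information pathways and accountability shared across actors, can be rendered in a schematic sketch: imagine every hop a claim makes through the network being recorded, so that any participant can inspect how it was formed, validated and shared, and where scrutiny was applied. The node names and trail format below are hypothetical illustrations, not an existing protocol.

```python
# Purely illustrative sketch of 'transparent information pathways': every hop a
# claim makes through the network is recorded, so its trajectory can be
# inspected by any actor. Node names and the trail format are hypothetical.

from dataclasses import dataclass, field


@dataclass
class TracedClaim:
    text: str
    trail: list[str] = field(default_factory=list)   # ordered record of hops

    def passes_through(self, actor: str, action: str) -> "TracedClaim":
        self.trail.append(f"{actor}: {action}")
        return self

    def pathway(self) -> str:
        return " -> ".join(self.trail) or "no recorded pathway"


claim = TracedClaim("claim Y")
claim.passes_through("news outlet", "published") \
     .passes_through("platform A", "recommended to users") \
     .passes_through("fact-checker", "flagged missing source") \
     .passes_through("platform A", "added context label")

# Any participant in the network can now see how the claim was formed,
# validated and shared, and where scrutiny was applied.
print(claim.pathway())
```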
Ultimately, Latour’s ideas challenge the traditional view of objectivity as a solitary, fixed standpoint. Instead, he suggests that objectivity can be understood as "networked trust": a quality that emerges from reliable networks of interlinked sources, cross-verification, and evidence. In the context of the TrustTech Stack, this networked trust implies shifting from a model of "trust in authority" to "trust in the network," where facts and objectivity become emergent properties of transparent, diverse, and verifiable connections. As such, facts and objectivity emerge from the networks that aim to create them. However, not all networks aim to create facts and objectivity.
In his work An Inquiry into Modes of Existence (2013), Latour aims to supplement his ANT with a modal theory, i.e. a theory of the different ways in which beings exist. From this perspective, networks are not just networks in the abstract, but are modally networked and have their own values, truths, and beings that they manifest. For Latour, there are fifteen different modalities, each aiming to establish and constitute a core value.
The modality that is most relevant to the TrustTech Stack is that of reference: the chains of reference that produce objectivity and facts. For Latour, this means that a fact is something that has been referenced by a network and all its actors. For example, the discovery of an elementary particle requires not only the teachings of mathematics and physics, highly technical instruments and mathematical innovations, but also HR offices giving out contracts to PhD students, presentations for politicians to maintain support for building particle accelerators, scientific papers with established peer review systems, computers doing calculations, national research funds that help to finance laboratories, a general interest in science, etc. All of these help in the construction of the fact of an elementary particle. The value of reference networks is that the beings transported by them can be verified by the actants, ensuring that the objective fact is something that is shared and constructed by all.
Latour’s next step is to state that construction is not something that undermines facts and objectivity; rather, proper facts need to be well constructed: one needs to see that the reference network works well and that facts are referenced throughout the whole network, instead of the being of reference being referenced by only a few actors, or political funding having too great a stake in the course of the research process. For Latour, this shows that facts and objectivity are not simply given, but need to be well constructed.
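To make this criterion of being well constructed somewhat concrete, the sketch below treats it as a question of coverage: through how many of the network's actants is a candidate fact actually referenced? The actant list and the threshold are hypothetical illustrations of the idea, not Latour's own formalization.

```python
# Purely illustrative sketch of Latour's 'chains of reference': a candidate
# fact counts as well constructed when it is referenced and verifiable
# throughout the network, not by just a few actors. The actant list and the
# coverage threshold are hypothetical.

NETWORK_ACTANTS = {
    "instruments", "mathematical models", "peer-reviewed papers",
    "laboratories", "funding bodies", "computing infrastructure",
}


def reference_coverage(referencing_actants: set[str]) -> float:
    """Share of the network's actants through which the claim is referenced."""
    return len(referencing_actants & NETWORK_ACTANTS) / len(NETWORK_ACTANTS)


def well_constructed(referencing_actants: set[str], threshold: float = 0.8) -> bool:
    # A claim referenced by only a few actants fails; one that circulates
    # through (nearly) the whole reference network passes.
    return reference_coverage(referencing_actants) >= threshold


particle_claim = {"instruments", "mathematical models", "peer-reviewed papers",
                  "laboratories", "funding bodies", "computing infrastructure"}
rumour = {"funding bodies"}

print(well_constructed(particle_claim))  # True: referenced throughout the network
print(well_constructed(rumour))          # False: referenced by only a few actants
```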
Furthermore, we can now distinguish objectivity and facts from other modal beings and their corresponding truths. For example, Latour also discusses the beings of fiction: the beings that transport us to other worlds, other times, other places. These beings do not find their truth in verification and reference, however, but in whether you are taken away by them: the truth of a good book or movie is not whether what happens in it is factually true, but whether it grasps you. Applying this to fake news means that one must have a modal understanding of the fake news item in question: its truth often does not lie in objectivity and facticity, but rather in a more ‘fictional’ quality, meaning that it wants to grasp people and give them an exciting or adventurous experience. We have written before about how conspiracy theories like QAnon should not be understood as purely epistemological projects but aim to bind readers by giving them interesting stories and experiences, and Latour’s philosophy helps to further substantiate these ideas. Given the limited space of this article, we cannot pass through all fifteen modes of existence that Latour distinguishes, but this example hopefully shows that it pays off to have a network and modal analysis of fake news and objectivity.