
Introduction

When we think or talk about technology, we often picture mechanical devices such as machines or robots: assembled by humans, built up from parts to a whole, with controllable and predictable characteristics. However, advances in AI, quantum technology, and synthetic biology are pushing these technologies towards a degree of complexity and uncertainty that humans can no longer fully comprehend, steer or control. The question is how we as humans can meaningfully interact with and understand these complex innovations, and whether we can control their effects.

Drivers

The Sociocultural Framework: Technology

Exponential technologies are complex by nature, and different countries hold different conceptions of these technologies and of what they can bring to our societies and economies; China, for example, conceives of AI differently than the West does, seeing it more as a natural phenomenon. Meaningful human control and risk reduction require transparency and openness in governance and in research and development models.

The Stack: Intelligence, Hard Infrastructure, Neo-governance

AI and quantum technology sit at the intelligence and hard infrastructure layers, respectively. The systemic risks of exponential technologies also demand new governance models that limit risk and safeguard meaningful human control.

The Deep Transitions Framework: Transformative Innovation

Mission-driven innovation will likely channel substantial investment into exponential technologies, as they hold the promise of addressing some of society’s biggest problems and strategic ambitions (e.g. climate change, military superiority).

The Hegemonic Framework: Technology

Exponential technologies will become an increasingly important vector of geopolitical interest, given their political, military and economic implications.

Relevance

An increasing number of technologies transcend meaningful human control and understanding, and thus pose risks for their future application and development. Synthetic biology, for example, aims to create synthetic organisms with new, unpredictable and complex interactions with the environment and other organisms (e.g. this could lead to new viruses or diseases); AI could reach a point of singularity at which its intelligence exceeds human understanding (e.g. AI could become adversarial to human goals); smart algorithms can be so complex that they function as a ‘black box’; and quantum computers work on such different computational principles that their calculations are considered incomprehensible to the human mind (e.g. stock market crashes we cannot explain, autonomous systems running wild). Nonetheless, given the huge expected benefits – both economic and strategic – it is likely that their development will continue. As such, we need a new understanding of and relationship with these technologies to ensure they contribute to a better future.