Issue 9: the Terminator effect
Bison, bio-labs, brain-computer interfaces, Leviathan and science in war
Welcome to this week’s issue of the Anti-Apocalyptus newsletter. Each week I send you five links about some of the most important challenges of our time: climate change, weapons of mass destruction, emerging technologies, mass causes of death and great power wars. If you haven’t done so yet, feel free to subscribe at the button below, hit the heart button or share this email with anyone who could be interested.
This newsletter is about things that could either kill a lot of people, derail life on earth or even completely wipe out humanity. It deals with black swans (although I’m not such a fan of Taleb) and grey rhinos such as pandemics, nuclear war, great power war or climate change.
There are a bunch of psychological reasons why humans have a hard time dealing with these risks. A range of biases stand in the way of us acting seriously on long-term, low-likelihood, low-frequency risks that might have a very high impact. On top of that, there are often powerful groups whose interests would be harmed by taking meaningful action against these risks (I’m looking at you, climate change).
Yet I would also argue that we are over-exposed to these types of risks, particularly in works of fiction. Books, and especially movies, about the end of the world saturate our entertainment sphere. A quick search on Netflix will yield everything from alien invasions, zombie apocalypses and supervolcanoes to the masterwork Children of Men, in which humanity has become incapable of reproducing.
According to Wikipedia, 79 disaster films appeared in the 2010s. 2020 already saw a major Hollywood film about a comet impact on Earth. Wikipedia also lists 118 films about viral outbreaks.
Even as we discount deep threats to ourselves, we are constantly inundated with them in fiction. This yields an interesting juxtaposition: what if our constant dramatisation of unlikely, high-impact threats is one of the reasons we discount them?
You see this, for example, in discussions about a malevolent superintelligent AI. This is a long-term, relatively unlikely event that a lot of smart people (like Nick Bostrom or Stuart Russell) think we should prepare for. Yet when you first broach this subject with people working on AI or technology, their first reaction is a sigh combined with comic disbelief. They will then explain to you that a Terminator scenario isn’t very likely (which isn’t the argument in the first place). The pop-culture imprint of Arnold Schwarzenegger hinders us when looking at the real-world threat superintelligence might pose.
Similarly, when you talk to regular people about existential risk (besides maybe climate change), their first image is of over-the-top science-fiction movies. Most of the argument then consists of explaining to people that these are real things, not just some crazy sci-fi invention.
So however enjoyable movies about existential risks may be, maybe they are overdone in our current culture, and stop us from really thinking about these questions.
1. Climate change
National Review - Bring Back the Bison: 'Megafauna Nationalism' in America
Conservationists and Native American groups have long wanted to reintroduce bison to the US in large numbers. Now they might get an unlikely ally: US right-wingers who see the animal as a symbol of nationalism. This interesting article describes how this coalition could work, and it comes at a time when the (genetically driven) reintroduction of (partly) extinct species is speeding up. A Przewalski’s horse was cloned last week, and attempts are being made to clone woolly mammoths. These species in turn could help maintain carbon-sequestering ecosystems.
2. Weapons of mass destruction
South China Morning Post - The labs where monsters live
Great feature discussing Biosafety Level 4 laboratories: labs that handle extremely dangerous pathogens, such as those that cause Ebola, anthrax and smallpox. These labs have been key in our fight against COVID-19, and they can help us prevent and respond to deadly (human-engineered) pandemics. But they also pose threats. There have been repeated safety breaches at these labs all across the world, and some have been used to make bio-weapons.
3. Emerging technologies
Bulletin of the Atomic Scientists - The brain-computer interface is coming and we are so not ready for it
There might be a lot of hype around Elon Musk’s foray into brain-computer interfaces, but this piece digs deep into the state of the art of the field (without, perhaps thankfully, mentioning Neuralink). In theory, brain-computer interfaces offer great possibilities, for example in helping disabled people. Yet the threats are real too, from ubiquitous surveillance to brain hacking.
4. Mass causes of death
London Review of Books - Leviathan in Lockdown
A slightly older article I recently came across. It turns out that the famous cover of Thomas Hobbes’ Leviathan not only shows the composite body of Sovereignty towering over the land, it also depicts an empty city with plague doctors walking through it. The article uses this interesting detail to explore the relationship between state formation and epidemics. As some say: there are no libertarians in a pandemic.
5. Great power war
The Quarterly Journal of Economics - Frontier Knowledge and Scientific Production: Evidence from the Collapse of International Science
This paper looks at scientific cooperation during World War I, and finds that knowledge flows (predictably) fell during this time, and that top-level basic research was particularly affected. Much has, of course, been written about how war can incentivise states to invest in science and technology. Yet this research suggests that war also causes major productivity losses in some areas.
I hope you enjoyed this newsletter. Feel free to send me comments or remarks by responding to this email. If you haven’t done so yet, please subscribe at the link below, hit the heart button or forward this email to anyone who could be interested.