We know that complexity is the worst enemy of security, because it makes attack easier and defense harder. This becomes catastrophic as the effects of that attack become greater.
In A Hacker’s Mind (coming in February 2023), I write:
Our societal systems, in general, may have grown fairer and more just over the centuries, but progress isn’t linear or equitable. The trajectory may appear to be upwards when viewed in hindsight, but from a more granular point of view there are a lot of ups and downs. It’s a “noisy” process.
Technology changes the amplitude of the noise. Those near-term ups and downs are getting more severe. And while that might not affect the long-term trajectories, they drastically affect all of us living in the short term. This is how the twentieth century could—statistically—both be the most peaceful in human history and also contain the most deadly wars.
Ignoring this noise was only possible when the damage wasn’t potentially fatal on a global scale; that is, if a world war didn’t have the potential to kill everybody or destroy society, or occurred in places and to people that the West wasn’t especially worried about. We can’t be sure of that anymore. The risks we face today are existential in a way they never have been before. The magnifying effects of technology enable short-term damage to cause long-term planet-wide systemic damage. We’ve lived for half a century under the potential specter of nuclear war and the life-ending catastrophe that could have been. Fast global travel allowed local outbreaks to quickly become the COVID-19 pandemic, costing millions of lives and billions of dollars while increasing political and social instability. Our rapid, technologically enabled changes to the atmosphere, compounded through feedback loops and tipping points, may make Earth much less hospitable for the coming centuries. Today, individual hacking decisions can have planet-wide effects. Sociobiologist Edward O. Wilson once described the fundamental problem with humanity as the fact that “we have Paleolithic emotions, medieval institutions, and godlike technology.”
Technology could easily get to the point where the effects of a successful attack could be existential. Think biotech, nanotech, global climate change, maybe someday cyberattack—everything that people like Nick Bostrom study. In these areas, as everywhere else in past and present society, the technologies of attack develop faster than the technologies of defending against attack. But suddenly, our inability to be proactive becomes fatal. As the noise due to technological power increases, we reach a threshold where a small group of people can irrecoverably destroy the species. The six-sigma guy can ruin it for everyone. And if they can, sooner or later they will. It’s possible that I have just explained the Fermi paradox.