Consider the past few years. A global pandemic revealed how quickly a biological threat can spread through tightly coupled economies and fragile health systems. At the same time, ransomware attacks—in which data is encrypted and access restored only upon payment of a ransom—repeatedly disrupted hospitals, pipelines, and public services, demonstrating how small groups operating in the digital domain can impose widespread disruption. These crises did not occur in isolation. They exposed a deeper reality: The infrastructures that sustain modern life are simultaneously vulnerable to biological risk and digital exploitation.
Yet biosecurity measures meant to prevent or mitigate human-made biological risks remain largely anchored in conceptual frameworks inherited from the nuclear era, such as arms control agreements and export controls. These frameworks treat threats as episodic, visible, and bounded—something to be deterred or prohibited. That framing has helped sustain a powerful taboo against biological weapons. But it is increasingly misaligned with the kinds of challenges emerging from rapid advances in biotechnology and the broader technological environment.
The governance problems now emerging in the biological domain do not relate simply to preventing misuse of a single material, facility, or piece of equipment; they are about managing risk in a world in which biological capability is distributed across scientific networks, digital infrastructures, and global supply chains. These challenges look less like the nuclear deterrence standoff and more like those long familiar in the cybersecurity realm: persistent intrusion, ambiguous actor identities and motivations, and the need to manage risk rather than eliminate it. Cyber policy has spent decades grappling with conditions that biosecurity is only beginning to confront.
Persistent risks. Both cyber and biological risks are characterized by lower barriers to entry than in traditional military domains, although those barriers are far from trivial. Developing a nuclear weapon requires access to fissile material, specialized industrial infrastructure, and state-scale resources. By contrast, a sophisticated cyber intrusion can be conducted with commercially available computing equipment and technical expertise, and elements of biological experimentation increasingly rely on widely distributed laboratory tools and digital design platforms. These characteristics do not make biological misuse easy, but they present lower structural barriers than nuclear weapons development poses. Expertise, organizational capacity, and access still matter, and neither the cyber nor the biological domain is as accessible as popular images of lone hackers or rogue scientists suggest.
Importantly, the empirical record differs across the two domains. Cyber incidents occur daily and are widely documented, ranging from criminal ransomware campaigns to state-linked espionage operations. By contrast, confirmed cases of deliberate biological attacks remain extremely rare. Aside from a small number of incidents—most notably the 2001 anthrax letter attacks in the United States—there are few documented examples of non-state actors successfully deploying biological agents.
Still, the comparison between cyber and biological risks holds at the level of structural features. Both software development and life sciences research underpin modern economies and biomedical progress, and both create openings that can be exploited. Small groups can produce disproportionate effects in either domain, although to date this dynamic has manifested far more clearly in cyberspace.
Perhaps most consequentially, attribution is inherently difficult in both areas. Determining responsibility requires piecing together incomplete evidence and navigating political sensitivities. The fraught national and international investigations into the origins of the COVID-19 pandemic illustrate how challenging it can be to establish responsibility with high confidence. Likewise, major cyber incidents such as the SolarWinds attack of 2020 required months of forensic analysis before investigators attributed the operation against US-made software to Russian intelligence-linked actors with high confidence. Even then, attribution relied on probabilistic assessments combining technical artifacts, behavioral patterns, and intelligence reporting rather than definitive proof.
These conditions complicate deterrence, accountability, and response.
From weapons to capabilities. Over time, cybersecurity practitioners have largely abandoned the idea that their security focus revolves around discrete “cyber weapons.” Instead, they focus on capabilities, access, vulnerabilities, and effects. What tools and resources (capabilities) do actors have access to, what vulnerabilities do they exploit, and with what cognitive, behavioral, and political effects? This shift allows policymakers to grapple with digital activities that fall below the threshold of armed conflict—espionage, influence operations, cybercrime, supply-chain compromises, and persistent network exploitation.
In contrast, biosecurity discourse remains wedded to the concept of weaponization. While this focus reinforces important norms, it can narrow the policy aperture. Many plausible forms of misuse do not fit neatly into traditional definitions of biological weapons: manipulation or poisoning of biological data sets, disruption of biomanufacturing, or exploitation of surveillance systems for political control.
Adopting a capabilities-based framing would not weaken existing prohibitions in the biological realm. Rather, it would broaden the conversation, enabling policymakers to address how biological knowledge, data, and infrastructure might be used to produce strategic effects across a spectrum of scenarios.
Intermediate risks. Cybersecurity experience underscores the importance of distinguishing between different levels of threat sophistication. At one end of the spectrum are ransomware campaigns that rely on widely available malware kits and opportunistic targeting of vulnerable organizations. At the other end are highly tailored, sophisticated operations like the Stuxnet cyber-physical attack on Iranian nuclear facilities, which required extensive intelligence, specialized engineering knowledge, and coordination among highly capable actors.
Biosecurity discussions often oscillate between extreme scenarios and benign research. Yet many concerns—laboratory accidents, misuse of widely available techniques, erosion of safety norms—occupy a middle ground of intermediate risks. Historical cases illustrate how biomedical research can open up avenues for misuse that fall short of a biological weapons program or attack. The 2001 anthrax letter attacks were carried out using material from a research laboratory. They caused relatively few casualties but generated nationwide disruption and political panic. More recently, debates surrounding research with pandemic potential have highlighted how work intended for legitimate scientific purposes can raise concerns about accident, misuse, and governance gaps.
Cybersecurity has developed relatively mature practices for operating under uncertainty. Analysts routinely combine technical, behavioral, and contextual evidence, attributing behaviors to adversaries in terms of confidence levels rather than definitive judgments. Cyber-threat assessments frequently express findings in probabilistic language. For example, a US intelligence assessment of the SolarWinds intrusion concluded that Russian actors were “likely” responsible for the operation, reflecting a high but not absolute level of confidence based on available evidence. Investigations proceed despite incomplete data, acknowledging gaps while still informing decision-making. Biosecurity governance has historically sought clearer lines of proof, reflecting the legal sensitivities surrounding biological weapons. But as biosecurity risks diversify and move into the “gray zone”—the potential misuse of genomic databases, pathogen surveillance networks, or biomanufacturing infrastructure—insisting on certainty may hinder timely responses.
Developing shared frameworks for probabilistic assessment could improve transparency while enabling more agile governance. Cybersecurity experience also highlights the limits of traditional deterrence models. States and non-state actors engage in continual activity below the threshold of armed conflict, probing systems and gathering intelligence without triggering escalation. Biological risks may increasingly take similar forms: incremental erosion of norms, competition over technological advantages, or strategic exploitation of vulnerabilities. Deterrence by punishment—so central to nuclear thinking—may prove less effective than biosecurity strategies focused on denial, resilience, and norm reinforcement.
Moving beyond silos. The growing interdependence of digital and biological infrastructures further complicates governance. Cybersecurity vulnerabilities can affect laboratory operations, supply chains, and health systems. Biological data and tools are embedded in digital ecosystems that themselves face persistent cyber threats. The pandemic illustrated how disruptions in one domain cascade across others, affecting economies, information systems, and geopolitical stability. As technologies converge, risks will increasingly cross traditional boundaries, demanding governance approaches that reflect this interconnected reality.
None of this implies abandoning arms control, export controls, or the norms that underpin biological nonproliferation. These remain essential pillars of global security. But they are not sufficient on their own to address an evolving landscape of biological risk shaped by rapid technological change and persistent competition.
The challenge is not simply to prevent the next crisis but to build governance frameworks capable of navigating a world in which disruption is persistent and uncertainty unavoidable. In that task, the experience of the cyber domain offers a valuable guide.
Source: The Bulletin