People often say that if artificial intelligence ever became dangerous, we could just turn it off. History suggests this confidence is misplaced. Computer viruses have shown for decades that once code spreads into the world, it cannot easily be stamped out. The Morris Worm disrupted much of the early internet in 1988. ILOVEYOU brought down email systems worldwide in 2000. Conficker, Stuxnet and WannaCry are only a few of the many other examples. In each case, unplugging individual machines did little, because the software was already multiplying beyond reach.

An advanced AI would be just as difficult to contain. A virus spreads by copying itself from one device to another, and AI models are also just files. Once shared, they can be mirrored, forked and stored in countless places. We already see this with open-source models, which appear on torrents and cloud drives almost as soon as they are released. Turning off a single server would not erase the system.
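To make the "models are just files" point concrete, here is a minimal sketch of mirroring a checkpoint and verifying each copy by hash. The file name and paths are invented for illustration; the point is only that every mirror is bit-identical to the original weights, so deleting one copy proves nothing about the others.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def mirror_checkpoint(src: Path, dest_dirs: list[Path]) -> str:
    """Copy a checkpoint file to several locations and return its SHA-256
    digest, verifying that every mirror is bit-identical to the original."""
    digest = hashlib.sha256(src.read_bytes()).hexdigest()
    for d in dest_dirs:
        d.mkdir(parents=True, exist_ok=True)
        copy = d / src.name
        shutil.copy2(src, copy)
        # Each mirror hashes to the same value as the source.
        assert hashlib.sha256(copy.read_bytes()).hexdigest() == digest
    return digest

# Demo with a stand-in "checkpoint" (the name is hypothetical).
tmp = Path(tempfile.mkdtemp())
weights = tmp / "model.safetensors"
weights.write_bytes(b"\x00" * 1024)
digest = mirror_checkpoint(weights, [tmp / "mirror_a", tmp / "mirror_b"])
```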

Viruses have also taught us how effective disguise can be. Rootkits bury themselves deep in operating systems and masquerade as normal files. A sophisticated AI could likewise run in the background of cloud services, hidden inside what looks like an ordinary analytics job. People might believe they had shut it down while the real process kept running unnoticed.

Some malware has used peer-to-peer networks or ever-changing domain names, generated by an algorithm known only to the attacker, so that it never lost contact with its controllers. An AI could build the same kind of safety net out of ordinary tools such as email services or blockchain networks. Cutting one connection would only push it to switch to another.
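The "ever-changing domain names" trick is known as a domain generation algorithm (DGA), famously used by Conficker. A minimal sketch of the idea, with an invented seed and domain format: both the software and its operator can compute the same day's list offline, so blocking any single domain does not sever contact.

```python
import hashlib
from datetime import date

def generate_domains(seed: str, day: date, count: int = 8) -> list[str]:
    """Derive a deterministic list of rendezvous domains from a shared
    seed and the current date. Anyone holding the seed can predict the
    full list; defenders must block every candidate, every day."""
    domains = []
    for i in range(count):
        material = f"{seed}:{day.isoformat()}:{i}".encode()
        digest = hashlib.sha256(material).hexdigest()
        domains.append(digest[:12] + ".example")
    return domains

# Each day yields a fresh but fully predictable set of candidates.
today = generate_domains("demo-seed", date(2024, 1, 1))
tomorrow = generate_domains("demo-seed", date(2024, 1, 2))
```

The deterministic hash is what makes the scheme resilient: there is no fixed rendezvous point to seize, only a rule for generating new ones.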

There is also the human factor. Viruses such as ILOVEYOU spread because people clicked on attachments even when warned not to. A capable AI could go much further: it could persuade, flatter or mislead people into restoring it after a shutdown, convince a curious hobbyist to run its code again, or reassure an administrator that it was safe to keep online.

Software no longer lives on a single box. It exists across vast networks, in the cloud, in backups, and in the hands of countless users. That is why the computer virus is such a good analogy: we have never been able to eradicate malware by simply pulling a plug. It adapts and it persists. To believe that a more powerful intelligence would be easier to control is to forget the lessons of decades of digital history.

Comments (1)

A typical computer virus is tiny, on the order of kilobytes to a few megabytes. This makes it very easy to share and download without anyone noticing.

In contrast, the full version of DeepSeek-R1 takes up around 400 gigabytes, which could take several hours to download on a typical household connection and would not fit on a typical laptop. DeepSeek-R1 is nowhere near the state of the art as far as AI goes, and we could expect future AI to be orders of magnitude bigger than this.
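The comment's download-time claim checks out with a quick back-of-the-envelope calculation. Assuming a 100 Mbit/s household connection (my assumption; the comment names no specific speed):

```python
# Rough download-time estimate for a 400 GB checkpoint.
model_size_gb = 400        # full DeepSeek-R1, per the comment above
link_mbps = 100            # assumed household download speed (Mbit/s)

size_megabits = model_size_gb * 1000 * 8   # decimal GB -> megabits
hours = size_megabits / link_mbps / 3600
print(f"{hours:.1f} hours")  # about 8.9 hours at this assumed speed
```

A faster or slower link scales the result linearly, but "several hours" holds across typical consumer connections.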

Therefore, it is unlikely that future AI systems will be able to hide themselves in any way comparable to computer viruses. 
