The recent attack from China against Microsoft has left many nations on edge over security. Could the future of defending against cyberattacks lie in the use of entirely custom operating systems and processors?
The past 30 years have seen the computer industry move towards unified architectures and standards across all devices. While there are different operating systems and processors available to users, all of these can generally work with each other. For example, a Windows user can write a Word document, email it to a macOS user, and that user can open the file with no errors. Another example is a user who chooses an AMD processor instead of an Intel one for their Windows machine; the switch may require a reinstall, but Windows runs fine on both processors.
While different operating systems can interoperate, transferring data and sharing files on a common drive, their modes of operation are fundamentally different. These differences are why most viruses and malware have to be targeted at a specific operating system to work correctly. Malware written for Windows can rarely infect a macOS machine, and vice versa.
When many users rely on a single platform or operating system, it becomes increasingly rewarding to target those users. To make matters worse, when the platform under attack relies on a widespread operating system to which almost everyone has access, it becomes increasingly hard to defend against attacks. Any weakness in the underlying operating system is exposed to anyone using that system, allowing cybercriminals to refine their attacks locally before attempting them against a real target.
Overall, unified architectures and standards that enable any computer to talk to any other computer also allow attackers to target many systems regardless of their hardware configuration. To make matters worse, any security vulnerability in a unified architecture or standard instantly exposes all systems that use that architecture or standard (see the OpenSSL Heartbleed bug).
For a moment, imagine a world where unified architectures and standards never took hold and computers sold by different manufacturers were unique to each other. Such a world would look much like what computer users of the 80s had to deal with: many different computers from many different manufacturers, none of which could share files or data. A game written for a BBC Micro would not run on a ZX Spectrum, and additional hardware for the ZX Spectrum would not work with any other system.
While this caused significant headaches for computer users, it did provide a barrier of protection against viruses. Any malware written for a mainframe or server would be unlikely to infect other devices if their hardware and software were fundamentally different. Sure, in theory, a ZX Spectrum could connect to a modem and transfer data to and from a remote server. But malware on a ZX Spectrum (if that is even possible) could not attack a remote server unless it was explicitly programmed to start on a ZX Spectrum and then reconfigure itself to run on the server.
This leads to the question: could modern technology create a similar environment? Is it possible for compilers and advanced code-development platforms to create a unique solution each time a project is compiled?
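As a rough illustration of the idea (this is a toy sketch, not a real compiler; the function names and the trivial "source" strings are invented for demonstration), a build tool could take the same set of functions and emit them in a seed-dependent layout. Every build would then have a different binary fingerprint even though behaviour is identical:

```python
import hashlib
import random


def diversified_build(source_functions, seed):
    # Toy model of compiler-driven diversity: the same functions are
    # emitted in a build-specific order, so each build has a different
    # binary layout (and hash) while behaving identically.
    rng = random.Random(seed)
    layout = list(source_functions)
    rng.shuffle(layout)
    image = "\n".join(layout)
    return image, hashlib.sha256(image.encode()).hexdigest()


funcs = ["def read(): ...", "def write(): ...",
         "def auth(): ...", "def log(): ..."]
image_a, digest_a = diversified_build(funcs, seed=1)
image_b, digest_b = diversified_build(funcs, seed=2)

# Both builds contain exactly the same functions; only the layout
# (and therefore the binary fingerprint) may differ between seeds.
assert sorted(image_a.splitlines()) == sorted(image_b.splitlines())
```

Real compiler-diversity research applies the same principle at a far lower level, shuffling basic blocks, register assignments, and padding so that a memory-corruption exploit tuned for one build fails on another.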
In the distant future, it may turn out that the best way to defend against cyberattacks is to create a system so unique that the only industrial standard followed is the method of communication (i.e. using TCP/IP and HTML for external communication). The process could start with hardware construction, whereby a customer can pick and choose different hardware modules that define how the system operates. Instead of having one choice of memory management unit and one choice of bus protocol, many options could be mixed and matched. These units would be fundamentally designed to be different yet produce the same output, meaning that hardware attacks become harder to conduct (different wiring, different transistor layouts, different instructions, etc.).
The next stage for a customer would be a mix-and-match, unique operating system. Some readers may immediately think of Linux, but while Linux systems can be uniquely configured, they fundamentally use the same kernel. Instead, a truly unique operating system would generate its own functions and methods for accomplishing tasks. For example, a unique OS could flip the order in which bytes are read, use a pseudo-random method for generating addresses, and integrate different methods for inter-process communication. Such an operating system would require an attacker to gain intimate knowledge of how it operates before they could even attempt to attack the system.
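One way to picture a per-instance operating system is a kernel whose system-call numbering is drawn randomly for each installation (a hedged sketch: the `UniqueKernel` class, its service names, and the 0x100-entry number space are all invented for illustration). Shellcode that hardcodes the syscall numbers of one machine would then fail on every other machine:

```python
import random


class UniqueKernel:
    """Toy model of a per-instance kernel: each instance draws its own
    random system-call numbering, so an exploit that hardcodes one
    machine's syscall numbers fails everywhere else."""

    SERVICES = ["open", "read", "write", "close", "spawn", "ipc_send"]

    def __init__(self, seed):
        rng = random.Random(seed)
        numbers = rng.sample(range(0x100), len(self.SERVICES))
        self._table = dict(zip(numbers, self.SERVICES))

    def syscall(self, number):
        # Dispatch only succeeds with this instance's private numbering.
        if number not in self._table:
            raise PermissionError(f"unknown syscall {number:#x}")
        return self._table[number]


# Two instances built from different seeds expose the same services,
# but behind their own private numberings.
a, b = UniqueKernel(seed=1), UniqueKernel(seed=2)
assert sorted(a._table.values()) == sorted(b._table.values())
```

The same trick generalises: randomised byte ordering, randomised IPC endpoints, or randomised address generation all force an attacker to reverse-engineer each target individually before any payload can work.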
There is no doubt that such a project would be a massive undertaking, but the development of pre-designed modules, code, and functional units could accelerate it. Furthermore, such an operating system would not be targeted at individual users but at those who operate large data centres holding private personal information.
In a world where cyberattacks are increasingly common, researchers and security experts may need to consider the advantages of unique systems. Just as door locks each take a unique key, developing servers that use unique operating systems could provide an additional layer of protection against remote cyberattacks. For example, a pseudo-random address generator would confuse an attacker as they wonder why a CPU executing instructions decided to jump from address 0xC100 to 0x5E44.
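The pseudo-random address idea can be sketched in a few lines (a toy model under invented assumptions: a tiny "program" of placeholder instruction strings and a 16-bit address space). A per-boot secret permutation scatters consecutive instructions across memory; execution order is unchanged, but the visited addresses look random to an outside observer:

```python
import random


def place_instructions(program, seed, address_space=0x10000):
    # Toy model: a per-boot secret permutation scatters consecutive
    # instructions across the address space. The CPU appears to jump
    # between unrelated addresses, yet execution order is unchanged.
    rng = random.Random(seed)
    addresses = rng.sample(range(address_space), len(program))
    memory = dict(zip(addresses, program))
    return memory, addresses  # addresses[i] holds instruction i


program = ["load r1", "add r1, r2", "store r1", "halt"]
memory, order = place_instructions(program, seed=42)

# Following the secret ordering reproduces the program exactly, even
# though the address trace looks arbitrary from the outside.
trace = [memory[addr] for addr in order]
assert trace == program
```

Without the seed (the "key" to this particular machine), an attacker watching the bus cannot tell which address holds the next instruction, much as someone without the right key cannot open a uniquely pinned lock.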