If you run a major American business you can be sure that somewhere, probably in Russia or China, expert computer hackers have you in their sights and it is going to cost you – and ultimately your customers – a lot to limit their damage.
For example, Jamie Dimon, president of J.P. Morgan Chase, told shareholders earlier this year that the bank will spend $250 million on cyber security this year and employ 1,000 people in the effort. “It is going to be a continual and likely never-ending battle to stay ahead of it – and, unfortunately, not every battle will be won,” he wrote in his annual letter to shareholders.
Not long after, in August, J.P. Morgan disclosed that it had been targeted by hackers starting in June. This week it said the attackers had obtained contact information on 76 million households and about 7 million small businesses. While those numbers make the data breach one of the largest on record, the bank said that no financial information was compromised.
That makes the J.P. Morgan break-in, which the FBI has traced to Russia, potentially less damaging than the attacks on, for example, Target and Home Depot, where credit cards were compromised.
But it should focus attention on the causes of such computer vulnerabilities and raise a larger question: Why does computer software have to be so unsafe?
For every hacker trying to break in to corporate and government computer systems there is probably one trying to find and fix vulnerabilities before they can be exploited. But it is an uphill battle, as shown by the recent discovery of a flaw that affects potentially hundreds of millions of computers and devices.
“Shellshock” is the name given to a newly revealed vulnerability in Bash, a command interpreter built into the basic architecture of operating systems derived from UNIX. These include the Linux operating system and the Mac OS X system used by all Apple computers since 2002.
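The flaw can be demonstrated with the widely circulated one-line self-test, sketched here in Python (a minimal sketch, assuming a Unix-like system with bash installed; a real check would be run directly at a shell prompt):

```python
import subprocess

# Bash imports function definitions from environment variables; a
# vulnerable version also executes any code smuggled in after the
# function definition ("echo vulnerable" below).
result = subprocess.run(
    ["bash", "-c", "echo this is a test"],
    env={"x": "() { :;}; echo vulnerable", "PATH": "/usr/bin:/bin"},
    capture_output=True,
    text=True,
)

# A patched bash prints only "this is a test"; a vulnerable one prints
# "vulnerable" first, proving that an environment variable ran code.
print(result.stdout, end="")
```

On a system patched after September 2014, only the test phrase appears; on an unpatched one, the extra line is the telltale sign.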
Before Shellshock there was “Heartbleed,” a flaw discovered earlier this year in OpenSSL, encryption software that secures much of the Internet’s traffic. Although a corrective patch is now available, experts estimate that many of the several hundred thousand services vulnerable to Heartbleed have yet to be protected. The bug allows hackers to steal usernames, passwords and other supposedly encrypted data.
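The essence of Heartbleed was a missing bounds check on a client-supplied length field in the “heartbeat” echo feature. The following is an illustrative sketch in plain Python (OpenSSL itself is written in C; the names and data here are invented for the example):

```python
# Pretend server memory: a 9-byte heartbeat payload followed by secrets
# that happen to sit next to it in memory.
MEMORY = b"HEARTBEATsecret_password=hunter2"
PAYLOAD_LENGTH = 9  # the client really sent only 9 bytes

def vulnerable_echo(claimed_length: int) -> bytes:
    # Bug: trusts the client's claimed length, so an inflated claim
    # reads past the payload into adjacent memory.
    return MEMORY[:claimed_length]

def patched_echo(claimed_length: int) -> bytes:
    # Fix: silently discard heartbeats whose claimed length exceeds
    # what the client actually sent.
    if claimed_length > PAYLOAD_LENGTH:
        return b""
    return MEMORY[:claimed_length]
```

Asking the vulnerable version to echo 32 bytes returns the adjacent “secret” along with the payload; the patched version returns nothing.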
The appropriately named Shellshock, on the other hand, not only affects a larger number of computers but also allows hackers to take full control of them. Major network servers and security cameras are among the many vulnerable devices.
Both Heartbleed and Shellshock are flaws in the most basic building blocks of larger computer systems. It is as though the design for a basic component of, say, an automobile steering system had not been tested for reliability and resistance to abuse before being installed in millions of cars. Fortunately, there are federal standards to prevent that.
The lesson ought to be that such flawed products should not have been put on the market in the first place. Taken seriously, that lesson would lead to better software development and testing standards. If the model of automobile safety standards is any guide, Congress may need to step in and tell the computer industry what standards it must meet.