Opinion

Computer complexity

May 02, 2018

One of the most remarkable features of the computers that now dominate the lives of virtually everyone in the developed world is that at their core they work with just two binary digits, or “bits”, of information: “1” and “0”. In 1965, the US engineer Gordon Moore, later a co-founder of Intel, predicted that the power of the chips which manipulate this binary code would double roughly every two years, and so it has proved. It was later added that over the same period the cost of those chips would halve, which has also turned out to be correct.

It is equally remarkable that computers go wrong so infrequently. And generally, when they do fail, it is the result of human error rather than a technology malfunction. For the past two weeks, customers of a UK bank have been living with the consequences of just such a mistake, after their records were transferred to a new system run by the Spanish bank that had acquired it. There have been similar glitches around the world, more often than not when data are migrated from one mainframe to another.

This is one reason why modern banking systems around the world still run some of the oldest code within their enormous computer setups. The programs have grown incrementally. Ideally, as banking products and internal systems evolve, IT managers would like to start again from scratch. But the costs of such an exercise are formidable, and it would likely fall foul of another almost predictable law of computing: new systems nearly always cost at least twice as much as expected, take twice as long to complete and, even then, do not always work.

This gives an advantage to new “challenger” financial institutions, which can start with a clean sheet and avoid the legacy systems that cause such headaches and expense for established banks and corporations. One has only to consider the massive cost of auditing and updating virtually every existing computer system ahead of the date change at midnight at the turn of this century. The so-called Y2K problem, which earned consultants billions in fees, arose because programmers in the 1970s and 1980s had recorded years with only two digits and had not factored in the century change, assuming wrongly that within 30 years the world would be using completely new systems. The panic to update legacy software was all the greater because most of the original programmers had long since retired and understanding of the code they had written was limited. That, at least, was what the Y2K emergency consultants insisted, and they profited hugely from their warnings.
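By way of illustration only, here is a minimal Python sketch, with a purely hypothetical function name, of how recording years as two digits breaks simple date arithmetic once the century turns:

    # Illustrative only: many legacy systems stored "98" for 1998 to save memory.
    def years_between(two_digit_start, two_digit_end):
        # Naive legacy-style subtraction using only the stored two digits.
        return two_digit_end - two_digit_start

    print(years_between(65, 98))  # 33 -- correct within the same century
    print(years_between(98, 0))   # -98 -- wrong once 2000 is stored as "00"; the true gap is 2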

Now new alarms are being sounded about the networking equipment produced by Chinese firms that connects computers to one another. Both ZTE and Huawei are suspected of selling equipment containing secret devices that would allow Beijing to spy on the sensitive data passing through them. In Western capitals, there is outrage that such a trick could be played. It must, however, be wondered whether non-Chinese suppliers of similar equipment have not, at the very least, come under governmental pressure to install comparable devices, for instance as part of “the war against terrorism.”

There can surely be no doubt that a new international war is being fought covertly in cyberspace. The Internet has brought the world closer together and has also helped to reveal its secrets.

