My last article discussed the possibility of an IT 9/11 and indicated that Alan Turing had left us a solution. A future article on “bloatware” will discuss the horrendous software mess in more detail, but a few comments are appropriate here, remarks that should in no way be construed as criticism of John von Neumann, who was a brilliant scientist and is rightly regarded as the “father” of the computer.
He was, however, not interested in software and even opposed the development of assemblers, believing that programming could be done at the machine language level. His main interest was atom bomb-related numeric processing, and he did not foresee the use of the computer for symbol processing (such as text). He was not alone in this; even one of his contemporaries said publicly that he could not conceive of the same computer solving equations and at the same time being used for business data processing.
This meant that software developed without any planning. Assemblers were developed, then several programming languages that needed compilers, then operating systems to keep track of compiling and application running, then database systems, then graphics, then telecommunications, then computer games, then mobile telephony, then middleware and so on, until we arrived at the fragmented mess we have today, which, if unchecked, will destroy or severely cripple the industry that spawned it.
Alan Turing, regarded as the “father” of software, was more far-seeing. He said all applications could be defined with a finite set of “states of mind,” which I will refer to as functions (later research by others has shown that the finite number is quite small). Moreover, these functions could be linked together to form an application, with functions being able to call themselves (now known as recursion). Some feeble attempts to follow through on these ideas were made with things like “functional programming,” “intentional programming,” and, more recently, some aspects of Java. While reasonably useful, these all involve machine code generation and are open to intrusions (viruses, hacking, etc.).
The latest research following up on Turing’s ideas is far more realistic and versatile, giving a system that can be used to define any application and that dispenses completely with the ubiquitous operating system.
The research has identified a number of functions that are used as building blocks. Rather than being in machine code, these functions are given a numeric code, which could even be encrypted if desirable. An easily expandable development language is associated with the numeric codes and can grow without affecting anything previously developed. In addition, an analysis has been made of what machine language elements are needed to respond to the numeric codes, and these turn out to be remarkably few, taking only a few hundred bytes. (Compare one version of Windows, which needs close to 700 million bytes.)
The numerically coded functions are placed in a hierarchical table, which can be augmented without affecting what has previously been developed, giving a readily expandable system without having to go through repeated updates. It is only at the bottom of the table that any reference is made to machine language processing. The few machine language elements needed are themselves static and no new machine code can be introduced, removing the ability for nefarious individuals to create viruses and worms, hack and do other nasty things.
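To make the idea concrete, here is a minimal sketch, in Python, of how such a table-driven scheme might look. It is purely illustrative and assumes nothing about the actual research: the names PRIMITIVES, FUNCTION_TABLE and run are hypothetical, the primitives stand in for the few hundred bytes of fixed machine language, and composite entries simply name other numeric codes, so the table can grow without touching existing entries or introducing new executable code.

# Hypothetical sketch only: numeric function codes dispatched through a
# hierarchical table, with a small, fixed set of primitives at the bottom.

# The fixed "machine language" layer: a small, static set of primitives.
PRIMITIVES = {
    1: lambda stack: stack.append(stack.pop() + stack.pop()),  # add
    2: lambda stack: stack.append(stack.pop() * stack.pop()),  # multiply
    3: lambda stack: print(stack.pop()),                       # output
}

# The hierarchical function table: each numeric code is either a "leaf"
# (a reference to one primitive) or a "composite" (a list of other codes).
# New entries can be appended without affecting anything already defined.
FUNCTION_TABLE = {
    100: ("leaf", 1),                # ADD
    101: ("leaf", 2),                # MULTIPLY
    102: ("leaf", 3),                # PRINT
    200: ("composite", [100, 102]),  # add two values, then print the result
}

def run(code, stack):
    """Interpret a numeric function code against a data stack."""
    kind, body = FUNCTION_TABLE[code]
    if kind == "leaf":
        PRIMITIVES[body](stack)      # only leaves ever reach the fixed primitives
    else:
        for sub in body:             # composites merely name other codes,
            run(sub, stack)          # so functions can call functions (recursion)

# "Application" 200 adds two numbers and prints the sum.
run(200, [2, 3])                     # prints 5

Because the table holds only numbers and the primitives are fixed, extending the system means adding rows rather than shipping new machine code, which is the property the approach relies on for expandability and intrusion resistance.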
The research has also shown that multiple applications can be included in the Turing-like scenario. There is no reason why the IT industry cannot proceed to this approach immediately, in a phased development process. Without detriment to existing legacy systems, it could be introduced to smart mobile phones, handling all the activity currently done or planned for such phones, including intrusion-free e-mail. It could be readily adapted to smart cards and embedded systems. This alone would involve over 90 per cent of all installed computers. Following this verification of the concept, it could then start to take huge bytes out of the current PC software market, targeting database systems (easily duplicated), high-speed network control, graphics and other major applications.
It’s funny to think that Turing proposed the basics for this concept way back in 1936.
–Hodson is a theoretical physicist, speaker and writer. He can be reached at bernie@genetix.ca.