HAL's parallelism

HAL has been designed from scratch to exploit massive parallelism. Parallelism is desirable because it leads to increased performance.

Genetic algorithms are typically well placed to exploit parallel computers where these are available. Organisms commonly have their fitnesses evaluated largely independently of one another; information transfer between them is only required when they come to breed.
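As an illustrative sketch (not HAL's own code), independent fitness evaluation parallelises naturally across a process pool; the bitstring genomes and toy `fitness` function here are invented for the example:

```python
# Sketch: in a genetic algorithm, fitness evaluation is independent
# per organism, so it parallelises trivially with a process pool.
from multiprocessing import Pool

def fitness(genome):
    # Toy fitness: number of set bits in a bitstring genome.
    return sum(genome)

def evaluate_population(population, workers=4):
    # Each organism is scored without reference to the others;
    # information transfer only happens later, during breeding.
    with Pool(workers) as pool:
        return pool.map(fitness, population)

if __name__ == "__main__":
    population = [[0, 1, 1, 0], [1, 1, 1, 1], [0, 0, 0, 1]]
    print(evaluate_population(population))  # -> [2, 4, 1]
```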

HAL takes the degree of possible parallelism to extremes. Many "artificial life" genetic techniques have historically taken as their domain the space of computer programs in a variant of machine code (usually designed to be as robust as possible). This has made for a good match with execution on today's serial computers, as it mirrors their serial instruction stream.

HAL is not designed to be run on the serial machines common in the modern world. Its performance is miserable in such a confined environment. It has been designed for execution on a completely different type of machine.

The prevalence of parallelism

Unfortunately, many of the problems organisms face are amenable to significantly more parallelism than traditional von Neumann architecture computers can employ, and solutions on such computers are correspondingly slow.

Processes which may fruitfully be dealt with by parallel processing techniques are often characterised by a large volume of input data, a large volume of output data, or both. Problems with these characteristics often involve similar, largely independent processing steps which may be applied to the data.
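For instance, a per-element transform such as thresholding pixel values involves no dependencies between elements, so the input can be split into chunks and processed concurrently. This sketch uses Python's standard library; the function names are invented for the example:

```python
# Sketch: a per-element transform over a large data volume, where
# each output value depends only on the matching input value, so
# the input can be split into chunks and processed concurrently.
from concurrent.futures import ProcessPoolExecutor

def threshold_chunk(chunk, cutoff=128):
    # Independent per-element step: no communication between elements.
    return [255 if v >= cutoff else 0 for v in chunk]

def threshold(pixels, workers=4):
    # Split the input into roughly equal chunks, one per worker.
    n = max(1, len(pixels) // workers)
    chunks = [pixels[i:i + n] for i in range(0, len(pixels), n)]
    with ProcessPoolExecutor(workers) as ex:
        out = []
        for result in ex.map(threshold_chunk, chunks):
            out.extend(result)
    return out
```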

Problems which would benefit from more parallelism than can be delivered by traditional serial CPUs include such fundamental computer-science techniques as sorting, fast Fourier transforms, encryption and decryption, and compression and decompression.

In terms of real-world applications, the availability of greater parallelism has the potential to improve image and video processing; speech, handwriting and optical character recognition; machine game playing; artificial intelligence; and the rendering of virtual environments.

Problems writing parallel programs

One reason why parallelism is not yet very widespread is that writing parallel programs is genuinely difficult. Humans, however, have particular problems writing parallel programs, for a number of reasons.

Programs are often based on written languages which are, by their nature, serial. This verbal characterisation of computer-science problems is an aspect of a widespread western malaise: allowing one's thought processes to be dominated by a singular "inner voice", identified with the self.

The same apparatus evolution intended to be used in solving social problems gets applied to problems where the serial, step-by-step thinking imposed by the brain's language processing machinery is a liability.

In addition to the mental block imposed by excessive verbal thinking, there is another problem, which again stems from applying apparatus evolution intended for solving social problems to questions of a more general nature.

Humans have a tendency to understand processes by identifying themselves with parts of them, a tendency which often leads to interpreting distributed systems as being under the control of a single, particular agent.

This is the phenomenon behind what Mitch Resnick has termed "the centralised mindset". Drawing examples from the behaviour of flocks of birds, ants' nests and traffic flows, he shows how people's intuition frequently, and falsely, leads them to conclude that systems exhibiting complex behaviour are best understood as being controlled by a central agent.

While the ability to place yourself in the position of a component of a system is undoubtedly a useful aid to understanding, if that component is an important one it can often lead to the illusion that it is 'in charge' of the dynamics of the system, through, if you like, a misapplied sense of self-importance.

Lack of parallel hardware

Finally, since the problems above have resulted in few programs able to exploit parallel systems, there has been a corresponding scarcity of hardware capable of providing a suitable environment for parallel programs to execute in.

Increasing the number of processing units available increases the cost of systems more than it increases their performance. If the number of programs able to exploit the additional parallelism is low then there can appear to be little incentive to build parallel computers, except in the case of extremely performance-critical applications.

Additional components also introduce engineering problems, such as difficulties in maintaining a synchronous clock signal over a larger area, increased failure rate due to increased component count, and problems associated with an increased demand for power and greater need for cooling structures.

Inherently slow serial hardware also persists as a hangover from the early days of computing - in much the same way as the x86 architecture persists, despite its well-documented architectural problems.

Lack of parallel hardware means that few have access to machines capable of running parallel programs, which reduces demand for, and production of, parallel programs. Lack of parallel programs means that money spent on parallelism in hardware is often money largely wasted, which means fewer parallel machines are produced. The result is a vicious circle.

HAL's output

HAL does not output parallel programs; its output is more like an electronic circuit. HAL targets rectangular, computationally universal cellular automata.
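As a minimal illustration of the kind of substrate this means (not HAL's actual target), Conway's Game of Life is a rectangular cellular automaton known to be computationally universal: every cell's next state depends only on its local neighbourhood, so in principle the whole grid updates in parallel.

```python
# Sketch: one synchronous update step of Conway's Game of Life,
# a rectangular, computationally universal cellular automaton.
# Every cell's next state depends only on its eight neighbours,
# so all cells can be updated in parallel.
def life_step(grid):
    rows, cols = len(grid), len(grid[0])

    def live_neighbours(r, c):
        # Count the eight neighbours, wrapping at the edges (a torus).
        return sum(grid[(r + dr) % rows][(c + dc) % cols]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))

    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            n = live_neighbours(r, c)
            # Conway's rules: birth on 3 neighbours, survival on 2 or 3.
            new[r][c] = 1 if n == 3 or (grid[r][c] and n == 2) else 0
    return new
```

The serial double loop here is only a simulation convenience; nothing in the update rule itself is serial, which is the point of targeting such automata.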

By working closer to the level on which machines are actually constructed, HAL bypasses any artificial limitations imposed on the form of solutions by the serial nature of languages.

Given the resemblance of HAL's output to electronic circuit designs, it is interesting that HAL's target hardware resembles what human circuit designers actually use: FPGAs (Field Programmable Gate Arrays).


tim@tt1.org | http://www.alife.co.uk/