Heterogeneous Element Processor

The Heterogeneous Element Processor (HEP) was introduced by Denelcor in 1982 as the world's first commercial MIMD computer. A HEP system, as the name implies, was pieced together from many heterogeneous components -- processors, data memory modules, and I/O modules. The components were connected via a switched network.

A single processor in a HEP system (up to sixteen could be connected) was rather unconventional: via a "program status word (PSW) queue," up to fifty processes could be maintained in hardware at once. The eight-stage instruction pipeline allowed instructions from eight different processes to proceed simultaneously, but only one instruction from a given process could be present in the pipeline at any point in time. The full processor throughput of 10 MIPS could therefore be achieved only when eight or more processes were active, and no single process could achieve throughput greater than 1.25 MIPS. This form of fine-grained multithreading classifies the HEP as a barrel processor.
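
The effect of this interleaving can be illustrated with a short simulation (a minimal sketch in C, not HEP software; the constants and the simple round-robin issue order are assumptions made for the example):

```c
/* Illustrative sketch of barrel-processor-style instruction interleaving.
 * NUM_PROCESSES and PIPELINE_DEPTH are assumptions for the example. */
#include <stdio.h>

#define PIPELINE_DEPTH 8   /* eight-stage pipeline, as described above */
#define NUM_PROCESSES  8   /* full throughput needs >= 8 active processes */

int main(void) {
    /* one program counter per hardware process context */
    int pc[NUM_PROCESSES] = {0};

    /* Issue one instruction per cycle, rotating among the contexts.
     * With eight contexts rotating, a process issues at most once every
     * eight cycles, so it never has two instructions in the eight-stage
     * pipeline at the same time. */
    for (int cycle = 0; cycle < 24; cycle++) {
        int proc = cycle % NUM_PROCESSES;      /* round-robin selection */
        printf("cycle %2d: issue instruction %d of process %d\n",
               cycle, pc[proc], proc);
        pc[proc]++;                            /* advance that process */
    }
    return 0;
}
```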

Processes were classified as either user-level or supervisor-level. User-level processes could create supervisor-level processes, which were used to manage user-level processes and perform I/O. Processes of the same class were grouped into tasks, with each processor supporting up to seven user tasks and seven supervisor tasks.

Each processor, in addition to the PSW queue and instruction pipeline, contained instruction memory, 2,048 64-bit general-purpose registers, and 4,096 constant registers. Constant registers were differentiated by the fact that only supervisor processes could modify their contents. Interestingly, the processors themselves contained no data memory; instead, data memory modules could be separately attached to the switched network.

The HEP implemented a type of mutual exclusion in which all registers and locations in data memory had associated "empty" and "full" states. Reading from a location set the state to "empty," while writing to it set the state to "full." A programmer could specify that a process wait when attempting to read from an empty location or write to a full one, thereby enforcing critical sections.
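
This behaviour can be emulated with ordinary locks and condition variables. The following is a minimal sketch in C using POSIX threads; the cell structure and function names are illustrative assumptions, not HEP instructions:

```c
/* Illustrative sketch: emulating full/empty-bit semantics for one cell. */
#include <pthread.h>
#include <stdio.h>

typedef struct {
    long            value;
    int             full;     /* 1 = "full", 0 = "empty" */
    pthread_mutex_t lock;
    pthread_cond_t  changed;
} tagged_cell;

/* Writing waits while the cell is full, then stores and marks it full. */
static void write_full(tagged_cell *c, long v) {
    pthread_mutex_lock(&c->lock);
    while (c->full)
        pthread_cond_wait(&c->changed, &c->lock);
    c->value = v;
    c->full  = 1;
    pthread_cond_broadcast(&c->changed);
    pthread_mutex_unlock(&c->lock);
}

/* Reading waits while the cell is empty, then loads and marks it empty. */
static long read_empty(tagged_cell *c) {
    pthread_mutex_lock(&c->lock);
    while (!c->full)
        pthread_cond_wait(&c->changed, &c->lock);
    long v  = c->value;
    c->full = 0;
    pthread_cond_broadcast(&c->changed);
    pthread_mutex_unlock(&c->lock);
    return v;
}

static tagged_cell cell = { 0, 0, PTHREAD_MUTEX_INITIALIZER,
                            PTHREAD_COND_INITIALIZER };

static void *producer(void *arg) {
    (void)arg;
    for (long i = 1; i <= 3; i++)
        write_full(&cell, i);          /* each write waits for "empty" */
    return NULL;
}

int main(void) {
    pthread_t t;
    pthread_create(&t, NULL, producer, NULL);
    for (int i = 0; i < 3; i++)        /* each read waits for "full" */
        printf("consumed %ld\n", read_empty(&cell));
    pthread_join(t, NULL);
    return 0;
}
```

As in the sketch, a producer and a consumer alternate automatically: each write must find the cell empty and each read must find it full.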

The switched network between elements resembled, in many ways, a modern computer network. On the network were sets of nodes, each of which had three links. When a packet arrived at a node, the node consulted a routing table and attempted to forward the packet closer to its destination. If a node became congested, incoming packets were passed on without routing. Each time a packet was deflected in this way, its priority level was increased; when several packets vied for a single node, packets with higher priority levels were routed before those with lower priority levels.
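
The deflection-and-escalation behaviour can be sketched as follows (an illustrative C simulation; the packet structure, node capacity, and function names are assumptions for the example, not the Denelcor switch design):

```c
/* Illustrative sketch: a node routes as many contending packets as it has
 * free output links, highest priority first, and deflects the rest while
 * raising their priority. */
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    int id;
    int priority;   /* raised each time the packet is deflected */
} packet;

static int by_priority_desc(const void *a, const void *b) {
    return ((const packet *)b)->priority - ((const packet *)a)->priority;
}

/* Route up to free_links packets; deflect and escalate the rest. */
static void service_node(packet *p, int n, int free_links) {
    qsort(p, n, sizeof *p, by_priority_desc);
    for (int i = 0; i < n; i++) {
        if (i < free_links) {
            printf("packet %d (prio %d) routed toward destination\n",
                   p[i].id, p[i].priority);
        } else {
            p[i].priority++;   /* deflected: passed on without routing */
            printf("packet %d deflected, priority now %d\n",
                   p[i].id, p[i].priority);
        }
    }
}

int main(void) {
    packet contending[] = { {1, 0}, {2, 2}, {3, 1}, {4, 0} };
    service_node(contending, 4, 2);   /* node with two free output links */
    return 0;
}
```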

Although it was known to have poor cost-performance, the HEP received attention due to what were, at the time, several revolutionary features that facilitated parallel programming. HEP systems were purchased by Los Alamos National Laboratory, Argonne National Laboratory, the Ballistic Research Laboratory, and Germany's Messerschmitt. Messerschmitt was the only customer to put the HEP into use for "real" applications; the other customers used it to experiment with parallel algorithms.

See also

* multithreading
* hyperthreading

