11.001001000011111101101010100010001000 Arithmazium

What is Paranoia?

Prof. William Kahan of the University of California at Berkeley developed Paranoia in standard Basic on the original IBM PC, a product of IBM's development office in Boca Raton, Florida.

Paranoia arose from Kahan's experience with computers ranging from the earliest commercial mainframes of the 50s to the minicomputers of the 70s. For thirty years, Kahan had worked with colleagues at Toronto, Cambridge, Urbana-Champaign, Stanford, and Berkeley to understand why innocent-looking mathematical algorithms often failed so miserably. Most of the codes he studied were in Fortran, the era's language of science and engineering, but he chose standard Basic for Paranoia. He wanted it to run anywhere with minimal change, including the new microprocessor-powered machines that were just starting to multiply.

It's one thing to home in on anomalies like a value that changes slightly when multiplied by \( 1.0 \), or a value that behaves like zero even though it compares not equal to zero. Programmers find these problems and fix them, or they devise strategies to avoid them altogether. Such strategies contribute to the lore of mystery surrounding numerical computation.

It's another matter to port a working program from one computer to another with completely different arithmetic. As explored in the Dinosaur Gallery, designs varied widely even across a single manufacturer's product lines. This is the challenge Kahan had in mind when he wrote Paranoia.

The programmer's dilemma is not obvious. In everyday affairs, arithmetic is just arithmetic, backed by mathematical principles we learn at an early age. But computer users and programmers at every level of exposure sense that there is something suspicious about computer arithmetic.
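A small, modern taste of that suspicion, shown here in Python on ordinary IEEE 754 double precision (an illustration of the general phenomenon, not an example from Paranoia itself): decimal fractions such as 0.1 have no exact binary representation, so even the simplest sum picks up a rounding error.

```python
# 0.1, 0.2, and 0.3 are not exactly representable in binary floating
# point; each is rounded on entry, and the sum rounds once more.
total = 0.1 + 0.2
print(total)          # 0.30000000000000004
print(total == 0.3)   # False
```

The discrepancy is tiny, about \( 2^{-54} \), but it is exactly the kind of surprise that separates computer arithmetic from the arithmetic of everyday affairs.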

Computers support floating point arithmetic, which is explored throughout this site. We have all experienced some variation of floating point. We use scientific notation in school to represent Avogadro's Number \(6.02252 \times 10^{23} \), and we see the condensed form 6.02252E23 on pocket calculators. This is a decimal representation, convenient for human use. Computers may use binary digits 0..1, octal digits 0..7, or hexadecimal digits 0..9 and A..F, scaled by \( 2^{k} \), \( 8^{k} \), or \( 16^{k} \), respectively. The decimal point becomes a radix point, and places to its right are powers of a half, an eighth, or a sixteenth.
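The binary scaling is easy to inspect in most languages. A brief sketch (Python here, purely illustrative) decomposes Avogadro's Number into a significand and a power-of-two exponent, the machine's own view of the number the paragraph describes:

```python
import math

avogadro = 6.02252e23

# math.frexp splits a float into m * 2**e with 0.5 <= m < 1,
# exposing the base-2 scaling a binary machine actually stores.
m, e = math.frexp(avogadro)
print(m, e)                   # significand and base-2 exponent
print(m * 2**e == avogadro)   # the decomposition is exact: True

# float.hex shows the stored hexadecimal digits and binary exponent.
print(avogadro.hex())
```

The decomposition is exact because scaling by a power of two moves only the exponent, never the digits of the significand.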

This site is devoted to exposing the subtleties of arithmetic, tracing anomalies back to the earliest machines and trying to collect the ideas that will play into designs of the next decades.

Paranoia's job in 1982 was to psyche out a machine's arithmetic. Here is a list of some of the questions explored:

The final question refers to IEEE Standard 754 for Binary Floating-Point Arithmetic. The standard was formally approved in 1985, years after Paranoia's introduction, but its development dated from 1978. Paranoia arrived just as the world was about to improve dramatically for numerical programmers.

Paranoia's job was to characterize the arithmetic on a computer, so that programmers porting code from one computer to another would have some idea of the numerical challenges they faced. In principle, a programmer would run Paranoia just once on each machine in question, then devise a strategy for modifying what worked in one place to work in the next.
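Paranoia's own source is standard Basic, but its probing style can be sketched in a few lines of Python. The fragment below, in the spirit of Malcolm's classic 1972 technique (a hedged illustration, not Paranoia's actual code), discovers a machine's radix and precision using nothing but arithmetic the machine itself performs:

```python
# Probe the floating-point radix and precision purely by arithmetic,
# in the style of Malcolm's algorithm (a sketch, not Paranoia's code).

# 1. Grow a until adding 1.0 is no longer exact: a has reached
#    radix**precision, the edge of the exactly representable integers.
a = 1.0
while ((a + 1.0) - a) - 1.0 == 0.0:
    a *= 2.0

# 2. Find the smallest b whose addition registers against a;
#    the resulting gap is the radix.
b = 1.0
while (a + b) - a == 0.0:
    b *= 2.0
radix = int((a + b) - a)

# 3. Count how many radix digits survive exact addition of 1.0.
precision = 0
a = 1.0
while ((a + 1.0) - a) - 1.0 == 0.0:
    precision += 1
    a *= radix

print(radix, precision)   # 2 53 on IEEE 754 double precision
```

On the wildly varied hardware of Paranoia's day, the same few lines would report radix 16 on an IBM System/360, radix 2 with differing precisions on CDC and DEC machines, and so on, which is precisely the kind of self-characterization a porting programmer needed.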
