Was von Neumann wrong about randomness?

Let’s look at what von Neumann actually said:

[Image: von Neumann’s quotation on arithmetical methods of producing random digits]

Ref. von Neumann, J., “Various Techniques Used in Connection with Random Digits”, notes by G. E. Forsythe, National Bureau of Standards Applied Mathematics Series, 12 (1951), pp. 36-38. Reprinted in von Neumann’s Collected Works, 5 (1963), Pergamon Press, pp. 768-770.

He was entirely reasonable to suggest that a deterministic computer should not be able to generate truly random numbers, as those are non-deterministic; that would be a direct contradiction of the very definition of a programmable computer. However, consider the era. He said his famous lines in 1951, when the computers he knew of were the EDVAC and the UNIVAC I. These were made up of 3,563 and ~5,000 vacuum tubes respectively. Ref. http://www.computinghistory.org.uk/cgi/computing-timeline.pl accessed 15 October 2017. Ref. https://babel.hathitrust.org/cgi/pt?id=mdp.39015002095639;view=1up;seq=519 re. EDVAC details, accessed 15 October 2017. Ref. https://en.wikipedia.org/wiki/UNIVAC_I accessed 15 October 2017.

The AMD processor that produced this random-looking image has about 1.2 billion transistors. There are also another couple of billion transistors making up the continually refreshed DRAM memory, and a vacuum tube is roughly equivalent to a single transistor. Von Neumann could not have remotely envisaged the complexity of a modern computer.

The sample code executes within a self-optimising Java virtual machine running atop a multi-tasking operating system. In addition to executing the Java instructions, the processor is dealing with hard and soft interrupts from many hardware circuits. It’s reading from and writing to the hard disc, updating the video display and streaming psy-trance music via the Ethernet card. And those funky packets arrive subject to the vagaries of the internet and whatever your neighbours are downloading at the time. There is also hardware memory management, an autonomous Java garbage collector, several caches, pipelining and speculative instruction execution. Plus a lot more that would bore most to death. In short, the inside of a computer is complex. Very complex indeed, bordering on chaos.

Whilst a microprocessor will reliably and predictably execute one instruction after another, that only covers the microscopic case. Von Neumann’s algorithm will operate exactly as designed at this level. Taking a holistic view of an entire computer and considering the macroscopic perspective, it becomes much harder to predict with exact certainty what a microprocessor will be doing from one nanosecond to the next. So if you were to ask it for, say, the time, the computer will respond quickly, but not necessarily within the nanosecond in which you asked. Or perhaps not even within the next few, as it performs necessary housekeeping.
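You can get a feel for this with a minimal sketch along the following lines (ours, not part of the original experiment). It simply prints the gaps between consecutive reads of Java’s nanosecond timer:

public class JitterDemo {
    public static void main(String[] args) {
        long previous = System.nanoTime();
        for (int i = 0; i < 20; i++) {
            long now = System.nanoTime();
            // The gap between successive reads is rarely constant; interrupts,
            // cache misses and the scheduler all perturb it.
            System.out.println("delta = " + (now - previous) + " ns");
            previous = now;
        }
    }
}

The println itself adds latency, of course, but the point is the spread of the deltas rather than their absolute size: the machine is always busy with far more than your request.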

Therein lies the possibility that he was wrong in his statement. What appears simple and predictable on a very small scale can easily become very unpredictable on the very large scale of highly integrated circuitry. We believe that it is possible that the chaotic behaviour inside a modern computer can lead to non-determinism on a nanosecond timescale. A roulette wheel, a double pendulum and an electronic Chua circuit are three examples of such chaotic behaviour.
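For a flavour of how quickly tiny differences explode in such systems, here is a minimal sketch of the last of those examples. It is ours, it uses commonly quoted textbook parameters for the dimensionless Chua equations rather than measurements of any real circuit, and it integrates two trajectories whose starting points differ by one part in ten million:

public class ChuaDivergence {
    // Commonly quoted textbook parameters for the dimensionless Chua equations
    // (double-scroll regime); assumptions for illustration only.
    static final double ALPHA = 15.6, BETA = 28.0, M0 = -1.143, M1 = -0.714;

    // Piecewise-linear Chua diode characteristic.
    static double f(double x) {
        return M1 * x + 0.5 * (M0 - M1) * (Math.abs(x + 1) - Math.abs(x - 1));
    }

    // One forward-Euler integration step; dt is kept small for stability.
    static void step(double[] s, double dt) {
        double dx = ALPHA * (s[1] - s[0] - f(s[0]));
        double dy = s[0] - s[1] + s[2];
        double dz = -BETA * s[1];
        s[0] += dx * dt;
        s[1] += dy * dt;
        s[2] += dz * dt;
    }

    public static void main(String[] args) {
        double[] a = {0.7, 0.0, 0.0};
        double[] b = {0.7000001, 0.0, 0.0};   // differs by one part in ten million
        double dt = 0.001;
        for (int i = 1; i <= 50_000; i++) {
            step(a, dt);
            step(b, dt);
            if (i % 10_000 == 0) {
                System.out.printf("t = %2.0f   |x1 - x2| = %.6f%n",
                        i * dt, Math.abs(a[0] - b[0]));
            }
        }
    }
}

Within a few tens of time units the two trajectories bear no resemblance to each other. Back to the computer: the following program reads Java’s nanosecond timer and writes its least significant byte to a file, and it produced the data behind the clock jitter images further down the page.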

import java.io.DataOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class SECLOCK {
    public static void main(String[] args) throws IOException {
        // Deliberately unbuffered: every byte goes straight out to the file.
        try (DataOutputStream dos = new DataOutputStream(
                new FileOutputStream(
                        new File("/tmp/nanoimage.data")))) {
            // Write the least significant byte of the nanosecond timer, 250,000 times.
            for (int i = 0; i < 250_000; i++) {
                dos.writeByte((byte) (System.nanoTime() & 0xff));
            }
        }
    }
}

Java fans will note that there is no buffered output stream. This is deliberate, and sauce for the goose. The code should produce a 250,000-byte binary file from the least significant byte of Java’s internal nanosecond timer.
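To turn that raw file into a picture like the ones below, something along these lines will do. This is our own sketch, not the site’s rendering pipeline; the 500 x 500 layout and the output path are assumptions, chosen because 250,000 bytes fill a 500 x 500 greyscale image exactly.

import java.awt.image.BufferedImage;
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
import javax.imageio.ImageIO;

public class NanoImage {
    public static void main(String[] args) throws Exception {
        final int SIZE = 500;   // 250,000 bytes map exactly onto 500 x 500 pixels
        byte[] data = Files.readAllBytes(Paths.get("/tmp/nanoimage.data"));
        BufferedImage img = new BufferedImage(SIZE, SIZE, BufferedImage.TYPE_BYTE_GRAY);
        for (int y = 0; y < SIZE; y++) {
            for (int x = 0; x < SIZE; x++) {
                int grey = data[y * SIZE + x] & 0xff;   // one of 256 shades of grey
                img.setRGB(x, y, (grey << 16) | (grey << 8) | grey);
            }
        }
        ImageIO.write(img, "png", new File("/tmp/nanoimage.png"));
    }
}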

These images are greyscale, where each pixel is represented by one of 256 shades of grey. This is important to know.

[Image: Clock jitter]

The data compresses to 14.9% of the raw original size, or an equivalent entropy of 1.19 bits/byte.

[Image: Zoomed clock jitter]

The very fact that the highly compressed PNG file showing you this image occupies 232 KB is an indication of its substantial entropy content. You are welcome to download these images and measure their entropy for yourself. Or better still, copy and execute the code above and make your own pretty pictures. That’s the concept that this site is founded upon.
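If you would rather not install a dedicated entropy tool, a first-order Shannon estimate over the byte values is only a few lines of Java. This is a minimal sketch that assumes you point it at the raw /tmp/nanoimage.data file; note that it ignores correlations between bytes, so it will usually read higher than a compression-based figure such as the 1.19 bits/byte quoted above.

import java.nio.file.Files;
import java.nio.file.Paths;

public class ByteEntropy {
    public static void main(String[] args) throws Exception {
        // Path is an assumption; point it at the raw data file you generated.
        byte[] data = Files.readAllBytes(Paths.get("/tmp/nanoimage.data"));

        // Histogram of the 256 possible byte values.
        long[] counts = new long[256];
        for (byte b : data) {
            counts[b & 0xff]++;
        }

        // First-order Shannon entropy, H = -sum p * log2(p), in bits per byte.
        double entropy = 0.0;
        for (long c : counts) {
            if (c == 0) continue;
            double p = (double) c / data.length;
            entropy -= p * (Math.log(p) / Math.log(2));
        }
        System.out.printf("First-order entropy = %.3f bits/byte%n", entropy);
    }
}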