Entropy Analysis

Here we test a raw 10MB sample, which incidentally is available for download at the bottom of the page…

The following test shows that entropy is being generated at an essentially constant rate throughout the sampling process: a $ \chi^2 $ test across the 10 chart bars gives $ \chi^2 = 1.0 $, consistent with the bars varying only randomly about a constant rate.

Kolmogorov complexity test for entropy creation by the Zenerglass.
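The chart above can be reproduced in spirit with a short sketch: split the sample into 10 chunks, use compressed size as a crude Kolmogorov-complexity proxy for the entropy created in each chunk, and compute $ \chi^2 $ across the chunks. This is only an illustration under assumed details (zlib as the compressor, seeded demo bytes standing in for the real 10MB sample), not necessarily the exact tool that produced the chart:-

```python
import random
import zlib

def entropy_rate_chi2(sample: bytes, bars: int = 10) -> float:
    """Chi-squared statistic over per-chunk compressed sizes.

    Compressed size is a crude Kolmogorov-complexity proxy; a constant
    entropy rate should make every chunk equally incompressible, so the
    statistic should be small.
    """
    chunk = len(sample) // bars
    sizes = [len(zlib.compress(sample[i * chunk:(i + 1) * chunk], 9))
             for i in range(bars)]
    expected = sum(sizes) / bars
    return sum((s - expected) ** 2 / expected for s in sizes)

random.seed(42)                    # deterministic demo data
demo = random.randbytes(100_000)   # stand-in for the real 10MB sample
print(entropy_rate_chi2(demo))
```

For genuinely random input the per-chunk compressed sizes differ only by a few bytes, so the statistic lands very close to zero.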

From `ent`:- the serial correlation coefficient $(R)$ is 0.000042, far below the commonly accepted threshold for IID data of $10^{-3}$ (totally uncorrelated = 0.0). Since `ent`'s serial correlation coefficient is only measured for a lag of $n=1$, here is a chart of the autocorrelation up to lag = 100:-

Autocorrelation of samples from the Zenerglass.
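The quantity plotted above can be sketched with the standard sample autocorrelation formula; seeded demo bytes stand in for the real sample, so this illustrates the calculation rather than reproducing the Zenerglass figures:-

```python
import random

def autocorrelation(data: bytes, lag: int) -> float:
    """Sample autocorrelation coefficient of the byte values at a given lag."""
    n = len(data) - lag
    mean = sum(data) / len(data)
    cov = sum((data[i] - mean) * (data[i + lag] - mean) for i in range(n))
    var = sum((b - mean) ** 2 for b in data)
    return cov / var

random.seed(7)
demo = random.randbytes(20_000)    # seeded stand-in for the real sample
r1 = autocorrelation(demo, 1)      # the single value that ent reports
r_max = max(abs(autocorrelation(demo, k)) for k in range(1, 101))
print(r1, r_max)
```

For uncorrelated data of length $N$, the coefficients scatter around zero with a standard deviation of roughly $1/\sqrt{N}$, which is why the whole lag range stays inside a narrow band.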


Results of our fast IID test on samples from the Zenerglass.

Results of our slow IID test on samples from the Zenerglass.

Results of NIST's IID test on samples from the Zenerglass.
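NIST's IID tests (SP 800-90B) are built on permutation testing: compute a statistic on the data, recompute it on shuffled copies (which are IID by construction), and fail if the original lands in the extreme tails of the shuffled distribution. Here is a heavily simplified sketch of that idea using a single statistic (directional runs) and a toy cut-off, not the full 90B battery:-

```python
import random

def directional_runs(data: bytes) -> int:
    """Count of maximal increasing/decreasing runs, one 90B-style statistic."""
    signs = [data[i + 1] > data[i] for i in range(len(data) - 1)]
    return 1 + sum(signs[i] != signs[i - 1] for i in range(1, len(signs)))

def permutation_iid_test(data: bytes, shuffles: int = 200, seed: int = 0) -> bool:
    """Simplified permutation test: IID remains plausible unless the
    original statistic is more extreme than (almost) every shuffled copy."""
    rng = random.Random(seed)
    t0 = directional_runs(data)
    pool = list(data)
    lower = higher = 0
    for _ in range(shuffles):
        rng.shuffle(pool)            # a shuffled copy is IID by construction
        t = directional_runs(bytes(pool))
        lower += t < t0
        higher += t > t0
    return lower > 1 and higher > 1  # toy two-sided rank cut-off

random.seed(11)
demo = random.randbytes(2_000)       # seeded stand-in for the real sample
print(permutation_iid_test(demo))
```

The real 90B procedure uses 10,000 shuffles, eleven statistics and a precise rank threshold per tail; this sketch only shows the mechanism.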

Therefore, on the basis that all of the above tests were passed with flying colours (but mainly purple), we can confidently conclude that the Zenerglass output is very much IID. So, in accordance with our Golden Rule #2 and its corresponding Golden Equation #2, $H_{\infty} = - \log_2(p_{max})$, measured at an internal temperature of 27.4°C:-

Sample waveform and histogram from the Zenerglass.

So $H_{\infty} \approx - \log_2(0.0204) = 5.6$ bits/byte. Or, simply and more accurately(!), read it straight from the NIST IID test: $H_{\infty} = 5.604857$ bits/byte.
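Golden Equation #2 is easy to apply mechanically: estimate $p_{max}$ from the byte histogram and take $-\log_2$ of it. A sketch with seeded, uniform demo bytes (so it lands near 8 bits/byte rather than the Zenerglass's 5.6, since the Zenerglass byte distribution is far from uniform):-

```python
import math
import random
from collections import Counter

def min_entropy_bits_per_byte(sample: bytes) -> float:
    """Golden Equation #2: H_inf = -log2(p_max), with p_max estimated
    as the relative frequency of the most common byte value."""
    counts = Counter(sample)
    p_max = max(counts.values()) / len(sample)
    return -math.log2(p_max)

random.seed(3)
demo = random.randbytes(1_000_000)   # uniform demo; the real figure uses the 10MB sample
print(min_entropy_bits_per_byte(demo))
```

With the Zenerglass's measured $p_{max} = 0.0204$ this returns the 5.6 bits/byte quoted above.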

Rounding down conservatively, let’s call that 5.5 bits/byte.