…of TRNG design.

Follow these three simple rules, and you’ll find that designing and building an entropy source for your own TRNG isn’t all that difficult. Certainly not as difficult as *They* say. But *They* would say that, wouldn’t they?

Make sure that your entropy source is stable, both in the short and long term. By stability, we don’t mean invariance to temperature effects. We mean don’t try to use sources like atmospheric noise, raindrops, wired-up chaotic jerk circuits, aquarium fish or Chua circuits. These sources are not stable from day to day, making their entropy rates indeterminate. The Chua circuit will require constant tweaking of variable components, and atmospheric noise is more likely to be Lady Gaga’s latest ditty or a passing police car. Similarly, floating analogue-to-digital converter pins are unreliable and very susceptible to the weather or wandering paws.

The cliché reverse-biased transistor also falls into the don’t-use category. Electrical ageing effects accumulate rapidly at $V_{BE}$ as low as -8 V[1]. For example, the common USB form factor ChaosKey TRNG drives 2N3904 transistors at $V_{BE} = -20$ V. Ouch! Other USB-format devices operate at similarly high reverse voltages. The damage can manifest itself as long-term noise drift: exactly the type of problem repeatedly experienced by Rob Seward with his early TRNG designs.

Peter Drucker said *“If you can’t measure it, you can’t improve it.”* Very true. So you have to measure the entropy being generated by the source. Given the shenanigans with NIST SP 800-90B and BSI AIS 20/31, and the real-world inappropriateness of the ubiquitous (and often misunderstood) Shannon log formula, strong compression is the way forward. Inevitable biases and autocorrelations are automatically incorporated into the calculation. Compress a sample data set generated by the source, divide the compressed size in bits by the number of samples, and further divide by two as a safety factor. For example, 500,000 samples compressing to 250 kB (2,000,000 bits) gives an entropy rate of 4 bits/sample. Dividing by two results in a very conservative 2 bits/sample. Not too bad if you can sample at 50 kSa/s, producing a decent 100 kbps of entropy.

For any TRNG to be worthy of the name, it must satisfy the most important rule of TRNG design, namely that entropy generated > output length. Or more formally, when a randomness extractor is in place, as $ \operatorname{Ext}: \{0,1\}^n \to \{0,1\}^m $ where $n \gt m$, and typically $\frac{n}{2} \gt m$. ID Quantique, a leading TRNG manufacturer[2], somewhat controversially sets the IO ratio uncommonly high at $ m = \frac{3}{4}n $ but still obeys the rule. Cases of $ m \gt n $ imply a simple pseudo-random number generator and a loss of information-theoretic security. This of course requires measuring the rate of entropy generation, which can be one of the most challenging aspects of TRNG design and validation. The issue most commonly manifests itself during debates over whether `/dev/random` is ‘better’ than `/dev/urandom`.
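The $n \gt m$ rule is easy to enforce in code. A minimal sketch, assuming SHA-256 as an illustrative extractor (not ID Quantique’s specific construction) and the conservative $\frac{n}{2} \gt m$ ratio:

```python
import hashlib

def extract(raw: bytes, m_bits: int = 256) -> bytes:
    """Compress n raw input bits down to m output bits, m < n/2.

    SHA-256 stands in for a proper randomness extractor; the point
    is the ratio check, which rejects any attempt to output more
    bits than the raw entropy budget allows.
    """
    n_bits = len(raw) * 8
    if not n_bits / 2 > m_bits:
        raise ValueError("rule of thumb violated: need n/2 > m")
    return hashlib.sha256(raw).digest()[: m_bits // 8]

# e.g. 1024 raw bits measured at ~2 bits/sample of entropy
# comfortably cover a 256-bit output block.
```

Feeding it fewer than $2m$ raw bits raises an error rather than silently degrading into the $m \gt n$ pseudo-random case described above.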

References:-

[1] N. Toufik, F. Pilanchon and P. Mialhe, *Degradation of junction parameters of an electrically stressed npn bipolar transistor*, c.e.f., University of Perpignan, 52 av. de Villeneuve, F-66860 Perpignan.

[2] M. Troyer and R. Renner, *A randomness extractor for the Quantis device*, ID Quantique Technical Paper on Randomness Extractor, September 19, 2012.