Thursday, July 7, 2011


One of the more interesting talks at Africacrypt this year was given by Daniel Loebenberger on a paper with Michael Nüsken (University of Bonn) entitled "Analysing Standards for RSA Integers". The premise of this paper is that the algorithms used in standards to generate RSA moduli might lead to "insecure" generation. The analysis is carried out by setting up a theoretical framework for calculating the entropy of the output distribution of algorithms for RSA modulus generation, such as those found in IEEE 1363, FIPS 186-3 and ISO 18033-2.

It is perhaps unsurprising that many of these standards do indeed produce "secure" RSA moduli; however, a closer inspection of some of the algorithms gives surprising results. For example, if we consider RSA integers generated by the GNU Crypto library, it turns out that the second prime q is only chosen after p, and the product must be of a fixed bit length. Hence, if p is large there are far fewer choices for q. One might expect the resulting loss in entropy to be large, but in fact this is not the case, and the output modulus pq is still suitably random.
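To make the constraint concrete, here is a minimal sketch (not the GNU Crypto implementation itself, and using toy 32-bit parameters purely for illustration) of how fixing the bit length of pq pins q into a p-dependent interval, which shrinks as p grows:

```python
# Sketch of the fixed-length constraint: once p is chosen, q must lie in
# an interval that depends on p for the product pq to have exactly n bits.
# Toy bit sizes for illustration only; real RSA moduli are 2048+ bits.

def q_interval(p, n):
    """Return [lo, hi] such that lo <= q <= hi makes pq an n-bit integer."""
    lo = -((-(1 << (n - 1))) // p)   # ceil(2^(n-1) / p)
    hi = ((1 << n) - 1) // p         # floor((2^n - 1) / p)
    return lo, hi

n = 32
# Two primes at opposite ends of the 16-bit range:
for p in (32771, 65521):
    lo, hi = q_interval(p, n)
    print(f"p = {p}: {hi - lo + 1} candidate values for q")
```

Running this shows the candidate interval for q roughly halving as p moves from the bottom to the top of its range, which gives an intuition for why the entropy loss, while real, turns out to be modest.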

This rather unusual and interesting work provides a theoretical framework for analysing practical cryptography, and it would be good to see other standardised algorithms come under similar scrutiny. This could both confirm the quality of some standards and expose weaknesses in others.