The first invited talk of Eurocrypt 2014 was by Jeff Hoffstein on "A mathematical history of NTRU and some related cryptosystems". This history spans about 20 years and includes both successes and disasters, and the title of this blog post was how Jeff introduced his own talk.

It all started in '94, with a lecture by Goldfeld that Jeff attended, about a one-way function from number theory due to Damgård in '88. This one-way function was based on claw-free permutations; Jeff noted that it was related to certain L-series, and he began wondering what you could achieve with such constructions. Could you achieve public-key encryption, or maybe just key exchange? He still does not know the answer to this day, but such open questions eventually led to the NTRU cryptosystem.

He began thinking about restricted polynomials mod q and how to prove knowledge of a polynomial without revealing the polynomial itself. Linear algebra shot down most of his initial ideas, but eventually he began thinking about 'short' polynomials, whose coefficients are all bounded. With these 'short' polynomials, the problem of breaking the scheme turns into the Closest Vector Problem. At this point lattice basis reduction comes into play, and at the time it was believed that LLL was deadly for most applications involving lattices in some shape or form.

The next step was to compactify the ring structure, because "it seemed like the right thing to do". Instead of just taking polynomials mod q, you also take them mod X^N - 1, and you get the ring structure of NTRU. Now multiplication of polynomials corresponds to a convolution, which in turn naturally corresponds to the Fourier transform. Based on the idea that 'short times short equals short', Jeff and some co-authors turned the idea of 'proof of knowledge of a short polynomial' into a digital signature scheme. However, it leaked information, like many of the subsequent lattice schemes, and this was not resolved until Vadim Lyubashevsky showed in 2009 how to fix the leakage issue in general using rejection sampling.
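To make the convolution structure concrete, here is a toy sketch in Python (with made-up tiny parameters N and q, nowhere near a real NTRU parameter set): reducing mod X^N - 1 wraps X^N back around to 1, so multiplying polynomials becomes a cyclic convolution of their coefficient vectors.

```python
# Toy sketch of the NTRU ring Z_q[X]/(X^N - 1). Parameters are
# illustrative only and far too small for any security.

N, q = 7, 41

def ring_mul(a, b):
    """Multiply coefficient vectors a and b in Z_q[X]/(X^N - 1):
    a plain cyclic convolution, reducing indices mod N and values mod q."""
    c = [0] * N
    for i in range(N):
        for j in range(N):
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c

# Multiplying by X simply rotates the coefficients, since X^N = 1:
print(ring_mul([1, 2, 3, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0, 0]))
# -> [0, 1, 2, 3, 0, 0, 0]
```

The wrap-around index `(i + j) % N` is exactly the reduction mod X^N - 1, which is why this multiplication is a convolution and connects naturally to the Fourier transform.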

But now we get to NTRU. The next required observation was that for most polynomials f you can find an inverse in your ring, and that if you compute g/f for short polynomials g and f, the result 'looks random'. With this in hand, everything was ready to create NTRU. Jeff teamed up with Jill Pipher and Joe Silverman, and they began thinking about how to quantify the security and select parameters. They understood the combinatorial approach to breaking the scheme pretty well, and Odlyzko pointed out a meet-in-the-middle attack, but the big issue was the lattice attack: lattice basis reduction algorithms like LLL and BKZ performed well in practice, but there was no clear understanding of why.
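A toy sketch of this key-generation step (my own illustration with invented, insecure parameters, not the real NTRU parameter sets or key-generation procedure): pick short f and g, invert f in the ring, and publish h = g/f. Here q is chosen prime so that the inverse can be computed by simple Gaussian elimination on the circulant matrix of f.

```python
# Toy sketch of NTRU-style key generation in Z_q[X]/(X^N - 1).
# Illustrative parameters only; real NTRU uses much larger N and
# specific coefficient distributions.

N, q = 7, 41  # q prime here, so elimination mod q works

def ring_mul(a, b):
    """Multiply in Z_q[X]/(X^N - 1): a cyclic convolution mod q."""
    c = [0] * N
    for i in range(N):
        for j in range(N):
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c

def ring_inv(f):
    """Invert f in Z_q[X]/(X^N - 1) by solving M u = (1,0,...,0) mod q,
    where M is the circulant matrix of 'multiply by f'."""
    M = [[f[(i - j) % N] % q for j in range(N)] for i in range(N)]
    b = [1] + [0] * (N - 1)
    for col in range(N):  # Gauss-Jordan elimination mod prime q
        piv = next(r for r in range(col, N) if M[r][col])
        M[col], M[piv] = M[piv], M[col]
        b[col], b[piv] = b[piv], b[col]
        inv = pow(M[col][col], -1, q)
        M[col] = [x * inv % q for x in M[col]]
        b[col] = b[col] * inv % q
        for r in range(N):
            if r != col and M[r][col]:
                t = M[r][col]
                M[r] = [(M[r][k] - t * M[col][k]) % q for k in range(N)]
                b[r] = (b[r] - t * b[col]) % q
    return b

f = [1, 1, 1, 0, 0, 0, 0]     # short secret polynomial 1 + X + X^2
g = [1, 0, -1, 1, 0, 0, 0]    # another short secret polynomial
h = ring_mul(g, ring_inv(f))  # public key g/f: 'looks random' mod q

assert ring_mul(f, ring_inv(f)) == [1, 0, 0, 0, 0, 0, 0]
assert ring_mul(h, f) == [c % q for c in g]  # h * f gives back g mod q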

At the rump session of Crypto '96, Jeff presented their ideas for the NTRU scheme as he would have at any mathematical conference: he showed the audience what he had so far, hoping for advice or input on the security implications and on how to improve the scheme or select parameters. However, he found that some cryptographers were irritated: why would they present these unfinished ideas without having done a proper security analysis themselves?

At Eurocrypt '97 Don Coppersmith and Adi Shamir presented their lattice attack on NTRU, which appears to be the only time that the cryptanalysis of a cryptosystem was published before the cryptosystem itself. They claimed that LLL would easily solve the associated lattice problems for any remotely practical parameters. They also added the important observation that any other short vector in the NTRU lattice would allow an adversary to decrypt, so one would not have to recover the exact short polynomial f. As a result of this attack, NTRU was rejected from Crypto '97, because why would you accept a broken scheme?

However, Jeff was not deterred by this attack. In fact, he was convinced that the success of LLL was an artifact of the low lattice dimensions. Since NTRU had the efficient ring structure, the lattice dimension could be increased at a relatively low cost in key size. So Jeff and his co-authors began testing various parameters against Victor Shoup's NTL implementation of BKZ to see what could be broken and what could not. During this testing phase a number of papers were published on NTRU that either sped up the basis reduction or attacked some other aspect of the scheme, such as the possibility of decryption errors. Eventually they put out some challenges in '97/'98, and to the best of Jeff's knowledge only the warm-up challenge was ever solved.

This is when their problems began: they attempted to construct a signature scheme. As mentioned before, all these signature schemes had the same issue: they leaked information in a way that allowed an attacker to obtain the secret key given enough signatures. The first scheme, NSS, was broken by an attack of Gentry and Szydlo that uses reversal polynomials and a powerful way to obtain the secret polynomial f from the resulting data. Fortunately, the crypto community was very forgiving (NOT). The next attempt at a signature scheme based on NTRU was broken by Nguyen and Regev in '06, and the final attempt, which added perturbations, was broken by Nguyen and Ducas in '12. However, these schemes could be fixed by using the rejection sampling ideas of Lyubashevsky.
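The rejection-sampling fix can be illustrated with a one-dimensional toy model (my own simplification, with invented parameters; real schemes work with vectors over lattices and often with Gaussian distributions): the signer restarts whenever the candidate signature falls outside a fixed region, so the accepted distribution no longer depends on the secret.

```python
import random

# One-dimensional toy model of the rejection-sampling idea.
# A naive signer outputs z = y + c*s with a uniform mask y in [-B, B]:
# z then ranges over an interval centred on the secret-dependent value
# c*s, so extreme values of z leak where c*s lies.

B, K = 1000, 30  # mask range, and a public bound with |c*s| <= K

def sign_naive(cs, rng):
    return rng.randint(-B, B) + cs  # distribution depends on cs: leaks

def sign_rejected(cs, rng):
    # Restart until z lies in a fixed interval [-(B-K), B-K]. The
    # acceptance test does not involve cs, and the accepted z is
    # uniform on that interval for every |cs| <= K, so signatures
    # carry no information about the secret.
    while True:
        z = rng.randint(-B, B) + cs
        if abs(z) <= B - K:
            return z

rng = random.Random(0)
samples = [sign_rejected(cs, rng) for cs in (-K, 0, K) for _ in range(500)]
assert all(abs(z) <= B - K for z in samples)
```

The price is a small restart probability (here roughly K/B per attempt), which is the trade-off rejection-sampling schemes accept in exchange for leak-free signatures.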

In the meantime Gentry, Peikert and Vaikuntanathan introduced Gaussian sampling, and after some improvements to this notion Stehlé and Steinfeld showed that slightly modifying NTRU allows you to prove its security based on the Learning With Errors problem. Around the same time Gentry introduced the first fully homomorphic encryption scheme based on ideal lattices, and it turns out that NTRU is itself a somewhat homomorphic scheme due to its ring structure and can be used to construct FHE schemes as well. So even though NTRU has had its fair share of problems in the past, there is currently an explosion of interesting research in related areas. Jeff ended his presentation by saying that he could talk about this for hours, but that he would not.

Finally, when asked by session chair Phong Nguyen whether he thought NTRU would be accepted at Eurocrypt today if it had not already existed, Jeff responded: probably not.

