Hi all,
> 3. The only implementation that mitigates decryption failures completely, killing information leaks to adversaries.
This is clearly a nice-to-have feature, but it comes with a tradeoff. To remove decryption failures you need to increase the parameter q, and this affects size (and hence performance) in two ways. First, the key and ciphertext are arrays of integers mod q, so increasing log_2(q) directly increases key and ciphertext size. Second, increasing q makes lattice reduction attacks more effective, so you also need to increase the dimension parameter N to keep the same level of lattice security. On the other hand, it's not difficult to calculate upper bounds on decryption failure probabilities, so it's straightforward to find a q that gives less than a 2^-k chance of a decryption failure. There's no particular need for a decryption failure probability that's lower than the security level of the rest of the cryptosystem.
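To make the "straightforward to find a q" point concrete, here is a minimal sketch of the kind of back-of-the-envelope estimate involved. This is not the ntru-crypto code and not how the standardized parameters were actually derived; it assumes a simplified model in which each coefficient of the decryption noise p*r*g + f*m is approximately Gaussian, with variance derived from the weights of the ternary polynomials, and then union-bounds over the N coefficients. The function names and the example weights are mine, purely for illustration.

import math

def per_coeff_sigma(N, p, d_r, d_g, d_f, d_m):
    # CLT heuristic: a coefficient of r*g is a sum of N products r_i * g_j,
    # each factor being +/-1 with probability d/N each and 0 otherwise, so
    # Var(r*g coefficient) ~= N * (2*d_r/N) * (2*d_g/N) = 4*d_r*d_g/N,
    # and similarly for f*m; the two products are treated as independent.
    var_rg = 4.0 * d_r * d_g / N
    var_fm = 4.0 * d_f * d_m / N
    return math.sqrt(p * p * var_rg + var_fm)

def failure_bound(N, q, sigma):
    # Union bound over N coefficients of the two-sided Gaussian tail beyond q/2:
    # P(|X| > q/2) = erfc((q/2) / (sigma * sqrt(2))) for X ~ N(0, sigma^2).
    return N * math.erfc((q / 2.0) / (sigma * math.sqrt(2.0)))

def smallest_q(N, sigma, k):
    # Smallest power-of-two modulus q whose estimated failure probability
    # is below the 2^-k target.
    q = 2
    while failure_bound(N, q, sigma) > 2.0 ** (-k):
        q *= 2
    return q

# Illustrative (made-up) weights, not a standardized parameter set:
N, p = 443, 3
sigma = per_coeff_sigma(N, p, d_r=143, d_g=143, d_f=143, d_m=143)
print(smallest_q(N, sigma, k=128))   # lands on 2048 under this model

With these illustrative numbers the search ends up at q = 2048, which is the right ballpark; a real derivation would of course work with the exact coefficient distributions rather than a CLT heuristic and a union bound.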
Just wanted to explain why the standardized NTRUEncrypt parameter sets (from https://github.com/NTRUOpenSourceProject/ntru-crypto) are chosen the way they are, i.e. with a nonzero decryption failure probability. We could have chosen larger q and N but didn't think the tradeoff was worth it. Obviously the other point of view is legitimate too.
Cheers,
William