Thursday, March 22, 2012
TCC, Wednesday PM
On the afternoon of the last day there were two sessions, on encryption with extended functionality and on pseudorandomness. The talks dealing with randomness introduced interesting concepts. For instance the one given by B. Fuller, 'A Unified Approach to Deterministic Encryption: New Constructions and a Connection to Computational Entropy' (the longest title in the conference), explained how to use the message to be encrypted as a source of randomness in deterministic encryption. Security is defined by requiring the message source M to have high min-entropy H(M)=u, so that the probability that M outputs any particular message m is at most 1/2^u. This means that even the most likely message is hard to guess.
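To see why this captures guessing hardness, min-entropy is just the negative log of the probability of the most likely outcome. A minimal Python sketch (the distribution below is a toy example of my own choosing, not from the paper):

```python
import math

def min_entropy(probs):
    """Min-entropy of a distribution: -log2(max_m Pr[M = m])."""
    return -math.log2(max(probs))

# Toy message distribution over four equally likely messages (illustrative only).
dist = [0.25, 0.25, 0.25, 0.25]
u = min_entropy(dist)           # u = 2 bits here
# Even the adversary's best single guess succeeds with probability <= 2^-u.
assert max(dist) <= 2 ** -u
```

The assertion is exactly the guarantee in the definition: no single message, not even the most likely one, can be guessed with probability better than 2^-u.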
Nevertheless, the talks I enjoyed the most were the first two.
Subspace Learning with Errors (SLWE) was presented by K. Pietrzak (IST Austria). LWE is a cryptographic primitive related to lattice problems, introduced by Regev in 2005. Since then it has been widely studied and used as a building block in several schemes; most of the newest FHE schemes base their security on it. It works as follows: we have two worlds. In the ideal world we are given samples (a,b), where a is a vector of n coordinates in Z_q, and b is a value sampled uniformly from Z_q. In the real world b is replaced by <a,s>+e, where <,> denotes the inner product, s is a secret vector and e is a small value in Z_q sampled from some error distribution. The goal of the attacker is to distinguish between the two worlds. Regev gave a quantum reduction to well-known hard lattice problems (there also exists a classical reduction, but it leads to less practical schemes). LWE is a generalisation of LPN (learning parity with noise), and their hardness is polynomially equivalent. SLWE differs from LWE in that an attacker trying to learn the secret s can adaptively query for modified LWE samples: besides a, the new oracle receives as input two Z_q-affine transformations A, B, and outputs <A(a),B(s)>+e. The matrices defining A and B must have sufficiently high rank to avoid trivial attacks. Not surprisingly, if SLWE is hard then so is LWE (just query the oracle with the identity map). What is more interesting, and what Pietrzak proved in his paper, is the converse. He also gave two applications. The first one is an improvement of this protocol, which only achieves security against passive adversaries: using SLPN instead of LPN he managed to show that one can get security under active attacks. Most importantly, he noted that any secure scheme based on LWE remains secure under related-key attacks (where the adversary can query not only for decryptions under the secret key s but also under keys s'=f(s), for some function f within a given family).
More precisely, schemes are RKA-secure with respect to affine functions of s.
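To make the two worlds concrete, here is a minimal Python sketch of LWE sample generation. The parameters are toy-sized and the error is a ±1 value rather than a proper discrete Gaussian; both are my own simplifications for illustration.

```python
import random

q, n = 97, 8          # toy parameters; real LWE uses much larger dimensions
s = [random.randrange(q) for _ in range(n)]   # secret vector in Z_q^n

def lwe_sample():
    """Real-world sample: (a, <a,s> + e mod q) with a small error e."""
    a = [random.randrange(q) for _ in range(n)]
    e = random.choice([-1, 0, 1])             # stand-in for the error distribution
    b = (sum(ai * si for ai, si in zip(a, s)) + e) % q
    return a, b

def ideal_sample():
    """Ideal-world sample: (a, b) with b uniform in Z_q."""
    a = [random.randrange(q) for _ in range(n)]
    return a, random.randrange(q)

# In SLWE the oracle would first apply the adversary's affine maps A and B
# to a and s; taking both maps to be the identity recovers plain LWE samples.
```

The distinguishing game is then: given many samples from one of the two functions above, decide which one produced them.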
The second talk was given by David A. Wilson (MIT), presenting this paper. He explained how bounded-collusion IBE (BC-IBE) schemes can be constructed using homomorphisms. The main ideas are the following. 1) Key homomorphism: assume secret keys are tuples of elements in some ring R. Keys are homomorphically structured if any R-linear combination of key tuples gives rise to a valid secret key whose public key is efficiently derivable from the public keys of the operands. 2) Identity map compatibility: a function mapping n user ID tags to n linearly independent vectors in R^n. 3) Linear related-key semantic security: a security model where semantic security holds even if the challenge secret key is a linear combination of fixed secret key values. The attacker can query for any other linear combination of secret key values, as long as it is independent of the challenge. My understanding is that this (somewhat artificial) new security notion is introduced to deal with homomorphic images of fixed secret keys. Any cryptographic scheme satisfying these three properties can be used to construct BC-IBE schemes. He gave an explicit construction based on this framework, whose underlying hardness assumption is the Quadratic Residuosity problem.
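To illustrate property 1), here is a toy Python sketch of key homomorphism in a discrete-log-style setting. This instantiation is my own illustrative choice (Wilson's actual construction is based on Quadratic Residuosity, not on this group), but it shows the structure: a linear combination of secret keys is again a valid secret key, and its public key can be computed from the operands' public keys alone.

```python
# Toy group: exponentiation base g modulo a prime p (parameters are
# illustrative only and far too small for real security).
p = 1_000_003
g = 5

def keygen(x):
    """Secret key x in the ring Z, public key g^x mod p."""
    return x, pow(g, x, p)

sk1, pk1 = keygen(123)
sk2, pk2 = keygen(456)

# An R-linear combination of secret keys (coefficients 3 and 7) is itself
# a valid secret key...
c1, c2 = 3, 7
sk3 = c1 * sk1 + c2 * sk2
# ...and its public key is derivable from pk1, pk2 without knowing sk1, sk2:
pk3 = (pow(pk1, c1, p) * pow(pk2, c2, p)) % p
assert pk3 == pow(g, sk3, p)
```

The assertion holds because g^(c1*sk1 + c2*sk2) = (g^sk1)^c1 * (g^sk2)^c2 mod p; this is exactly the "public key efficiently derivable from the public keys of the operands" requirement.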