The distinguished IACR lecture at this year's Asiacrypt was given by Phillip Rogaway, who chose to talk about the political implications of cryptographic work rather than the technology itself. This was certainly refreshing to see.
Phil started his talk by highlighting some historical events in the relationship between ethics and science: the nuclear bomb, the Nazi doctors, and the environmental movement. There is now a general understanding that scientists should not only avoid harming society, but should actively contribute to the social good as an obligation of their professional role. This manifests itself in various ethical codes and organizations; even the IACR bylaws oblige it to serve the public welfare. According to the talk, however, these values are in decline. The military has no problem recruiting scientists, and it provides more and more funding for science. At the same time, cryptography, like any technology, is used as a tool of power, a fact recognized much more readily in popular culture than by cryptographers themselves.
An older generation of cryptographers seems to be more aware of this, for example David Chaum, who wrote about large-scale data collection, and its possible effects on behavior, as early as the 1980s. Comparing the citations of David Chaum's most-cited paper, on electronic mail, with Goldwasser and Micali's on probabilistic encryption, one can see that only the latter is picked up by the usual cryptography conferences. Phil argues that this split is more political than anything else and that cryptographic primitives do not inherently favor either individuals or powerful entities. Conventional encryption, for example, can be used both to protect one's information and to take control away from the user, as in trusted computing.
All these issues gained much more attention through the Snowden leaks in the summer of 2013. It was revealed that mass surveillance is rife, obscured both by secrecy and by complexity. Unsurprisingly, there is significant disagreement between government agencies and scholars of surveillance studies. While the former argue that cryptography destroys the balance between security and privacy, the latter show that surveillance is simply an instrument of power that makes people conform (or gets them killed by drones). Furthermore, they argue that security and privacy are not necessarily mutually exclusive. There is historical evidence of unsavory uses of political surveillance, from the FBI's letter to Martin Luther King Jr. attempting to drive him to suicide, to the totalitarian regimes of today.
Considering all this, Phil claimed that while cryptography for security is a success, cryptography for privacy is not, and moreover, that cryptography is becoming more and more self-serving ("crypto-for-crypto"). To counter this, he presented a few suggestions for interesting problems such as xMail and big-key cryptography. The former is about sending messages via an untrusted server without allowing the server to link sender and receiver; the latter assumes that an attacker has already subverted the machine holding a key, but has only limited bandwidth to exfiltrate information.
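To make the big-key idea a bit more concrete, here is a minimal sketch (the function names, sizes, and use of SHAKE as a probe generator are my own illustrative choices, not details from the talk): the long-term key is made deliberately huge, and each short subkey is derived from a handful of pseudorandomly probed positions, so an attacker with only bounded exfiltration bandwidth likely misses some probed bytes.

```python
import hashlib
import secrets

BIG_KEY_BYTES = 1 << 20   # 1 MiB "big key"; real proposals use far more
NUM_PROBES = 64           # positions sampled per derived subkey

def derive_subkey(big_key: bytes, nonce: bytes) -> bytes:
    """Derive a 32-byte subkey by hashing a few pseudorandom probes
    into the big key; probe positions are expanded from the nonce."""
    # Expand the nonce into NUM_PROBES 32-bit probe positions
    # (assumption: SHAKE-256 used as the pseudorandom probe generator).
    stream = hashlib.shake_256(nonce).digest(NUM_PROBES * 4)
    h = hashlib.sha256()
    h.update(nonce)
    for i in range(NUM_PROBES):
        pos = int.from_bytes(stream[4 * i:4 * i + 4], "big") % len(big_key)
        h.update(big_key[pos:pos + 1])  # absorb one probed key byte
    return h.digest()

big_key = secrets.token_bytes(BIG_KEY_BYTES)
nonce = secrets.token_bytes(16)
k1 = derive_subkey(big_key, nonce)
k2 = derive_subkey(big_key, nonce)
assert k1 == k2  # derivation is deterministic for a fixed nonce
```

The point of the design is that an attacker who exfiltrated only a fraction of the big key cannot answer all probes for a fresh nonce, so freshly derived subkeys retain entropy even after partial leakage.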
The last part of the talk consisted of twelve suggestions for cryptographers, essentially calling for a more holistic view of our work. The suggestions cover quite a range, from thinking twice about military funding to no longer drawing cute little devils for the adversary when it is in fact a large state-sponsored agency. The most interesting suggestion, in my opinion, is that we should taste our own medicine, that is, use privacy tools ourselves and improve them where necessary. However, there was also the suggestion to write fewer but more relevant papers, which runs counter to the current incentives in science.
Phil concluded with the quote "Just because you don't take an interest in politics doesn't mean politics won't take an interest in you."
There is an accompanying essay on his website.