The third and final panel discussion here in Tenerife at Crypto for 2020 was entitled 'Crypto for security and privacy', covering the current and expected future challenges faced by cryptographers working on protocols and methods for secure, private and/or authenticated communication on the Internet. We heard from Nick Mathewson of the Tor Project, Zooko Wilcox-O'Hearn, Dan Bernstein and Matthew Green. This is a diverse and multi-faceted topic, so it was interesting to hear some wide-ranging arguments.
The primary theme of the discussion (also quite pervasive throughout the workshop in general) was the divergence between the conceptualization of the problems as understood by theorists designing or analysing protocols and the reality faced by practitioners looking to implement and deploy schemes in the real world.
In terms of specifics, and somewhat related to Kenny Paterson's talk yesterday, the panel touched on how backwards compatibility, poor communication between organisations, standards bodies and academics, and commercial or economic pressures result in broken or outdated primitives remaining in standards. As always in such talks, references to good old TLS were common, but beyond that the panel made two particular points: that we as cryptographers need to frame the language we use to describe solutions in a context more useful for implementors, and that it may no longer be sufficient to keep patching up these protocols as they age and become outdated. One particularly weighty point was that there has been very little academic involvement in the standardisation of many of the major widely deployed cryptographic protocols, and there was a general consensus that engagement between both sides of the equation needs to be both more frequent and more productive.
Matthew Green framed some of this as the problem of having to support users who do not use any cryptography alongside those who do, within the same protocol. The canonical example is that even the basic starting point of a user establishing an HTTPS connection after hitting Enter in the URL bar is non-trivial - we've seen some progress in this area with HSTS and certificate pinning, but there is certainly more work to be done.
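As a rough illustration of the HSTS mechanism mentioned above (my own sketch, not something shown by the panel), the snippet below checks whether a site opts in by sending the Strict-Transport-Security header over HTTPS; the host name is just a placeholder.

```python
# Minimal sketch: does this site send an HSTS policy? (host is a placeholder)
import urllib.request

def hsts_policy(host):
    """Fetch the site over HTTPS and return its HSTS header, if any."""
    with urllib.request.urlopen("https://" + host) as response:
        return response.headers.get("Strict-Transport-Security")

if __name__ == "__main__":
    policy = hsts_policy("example.com")
    if policy:
        print("HSTS enabled:", policy)  # e.g. "max-age=31536000; includeSubDomains"
    else:
        print("No HSTS header sent; the first connection still trusts the network")
```

Of course, this only helps from the second visit onwards (or via preload lists), which is exactly why the panel saw it as progress rather than a solution.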
We also heard that we should keep in mind how, in many cases on the Internet, cryptography is not used by the majority of the userbase. Dan Bernstein discussed the potential for building new software to fill these gaps, citing DNSCrypt as an example. One suggestion was that developing software for secure, authenticated email may be low-hanging fruit; the barrier to entry is low, and currently doing this is a non-trivial task. Tied into the above was a discussion on the usability and accessibility of security within the Internet ecosystem - the complexity of configuring TLS within Apache was referenced as an example of how the barrier to entry for a non-expert user to use cryptography correctly hinders the uptake of crypto protocols.
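To give a flavour of that barrier to entry, here is a minimal sketch (my own, not from the discussion, with a placeholder host and port) of making a correctly verified TLS connection in Python: even with sensible library defaults, the non-expert still has to know about CA stores, certificate verification and hostname checking to get it right.

```python
# Minimal sketch of a verified TLS client connection; HOST/PORT are placeholders.
import socket
import ssl

HOST = "example.com"
PORT = 443

# Context backed by the system CA store; verification and hostname checking
# are the defaults here, but spelled out to show what "correct" requires.
context = ssl.create_default_context()
context.check_hostname = True
context.verify_mode = ssl.CERT_REQUIRED

with socket.create_connection((HOST, PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        print("Negotiated", tls_sock.version(), "using", tls_sock.cipher()[0])
```

Getting an equivalent server-side configuration right (ciphersuites, certificate chains, protocol versions) is the harder half of the problem the panel was pointing at.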
The workshop is entitled 'Crypto for 2020', so we heard thoughts from each of the panel members on the state of play and guesses at the challenges to be faced in the Internet sphere in seven years' time, and how we as cryptographers should be dealing with them. Zooko Wilcox-O'Hearn claimed that the rate of change is accelerating as more and more major players re-evaluate how much they value security. We're seeing what are possibly nation-states involved in large-scale cyber security operations such as Flame, Stuxnet and the recent Red October, and these operations may have been running for years without the wider community noticing. An open question that follows is how, or whether, this 'arms race' and increased focus on security will change how we have to use and analyse crypto. Nick Mathewson pointed out that the design of crypto protocols in practice lags behind academia by a good number of years, and as such the protocols and schemes we are working on now may be the ones in use in 2020.