You may remember my blog post from last year's Real World Crypto about how the conference spent much of its time lamenting the downfall of theoretically secure cryptographic primitives at the hands of practical implementation issues. Well, this year at Real World Crypto things have gone in a similar direction. A good example is Bernstein's talk on Wednesday, entitled “Error-prone cryptographic designs.” Much of the talk has already been described in a previous post by David B here, which helpfully gives us the “Bernstein principle”; I'll follow up on it, perhaps from a more applied point of view.
As the title suggests, the talk was about error-prone cryptography, but not in the way you might expect. It was not a frustrated outburst about how the practical implementation side of things is letting the side down and needs to get its act together, a tone you may be familiar with from other talks with similar titles. The talk did contain many of the ingredients of that sort of talk: a look at how the problems with cryptosystems always seem to come from the implementation of secure primitives rather than the primitives themselves, backed up by a number of horror stories, for instance.
But this talk had one major twist.
The main point of the presentation was made near the beginning, in one of the horror stories (see David's post), when he referred to the breaking of the cryptography used by the Sony PlayStation. Relating to this, he observed how the blame for the attack got distributed. The attack was a practical one, and so naturally the designers of the primitives remarked on how their secure designs had been made vulnerable by the terrible implementers, who seem to keep getting things wrong and making all their hard work meaningless. But wait a minute, asked the implementers: if we keep getting your designs wrong, perhaps the problem is with you? Perhaps the designs of the primitives are so hard to implement properly that the designers are really to blame, for not taking implementation practicalities seriously enough in their design choices.
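If memory serves, the PlayStation 3 break in question was the now-famous ECDSA nonce reuse: the firmware-signing code used the same supposedly random nonce for every signature, and two signatures made with the same nonce are enough to recover the private key. The following is only a toy sketch of that algebra in Python, not real ECDSA over a curve; the modulus is the P-256 group order but the key, nonce, r value and message hashes are all made up for illustration.

# Toy illustration of why a repeated ECDSA nonce is fatal.
# Not real ECDSA: no elliptic curve here, just the signing algebra
# s = k^-1 * (h + r*d) mod n, with made-up numbers.

n = 0xFFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC632551  # P-256 group order
d = 0x1234567890ABCDEF  # "secret" signing key (made up)
k = 0x0F0F0F0F0F0F0F0F  # the nonce -- wrongly reused for both signatures
r = 0xCAFEBABE          # in real ECDSA r comes from k*G; its origin doesn't matter to the algebra

def sign(h):
    """Return s = k^-1 * (h + r*d) mod n, with the nonce k fixed (the bug)."""
    return pow(k, -1, n) * (h + r * d) % n

h1, h2 = 0x11111111, 0x22222222   # hashes of two different messages
s1, s2 = sign(h1), sign(h2)

# The attacker sees (r, s1, h1) and (r, s2, h2) with the same r, hence the same k:
#   s1 - s2 = k^-1 * (h1 - h2)   =>   k = (h1 - h2) / (s1 - s2) mod n
#   s1 * k  = h1 + r*d           =>   d = (s1*k - h1) / r mod n
k_rec = (h1 - h2) * pow(s1 - s2, -1, n) % n
d_rec = (s1 * k_rec - h1) * pow(r, -1, n) % n

assert k_rec == k and d_rec == d
print(hex(d_rec))  # the "private" signing key falls out

The design's demand for a fresh, secret, uniformly random nonce on every single signature is exactly the kind of requirement that is easy to state on paper and easy to get wrong in practice, which is the talk's point in a nutshell.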
So who is to blame for this and the many other security slip-ups that have occurred through implementation errors? The answer remains unresolved, but the fact that the discussion is taking place at all throws a different kind of light on the practice of designing cryptosystems. The current arrangement of “designer: make something secure; implementer: implement the design securely” may well be where many of the problems are coming from, and should perhaps be replaced by something more like “designer: make something that's secure and easy to implement; implementer: … you can't really go wrong!”
Although unlikely to have won the agreement of the whole audience, the talk didn't feel like a dig at designers so much as an appeal to them to wake up to their responsibility to implementers by producing designs that are easy to implement. Nor did it have the air of trying to get implementers off the hook. Rather, it asked why, in a world where secure systems break because of practical attacks against implementations rather than against the security of the primitives, the primitives are never singled out for blame for being hard to implement, instead of the people given the task of implementing them.
It's true that designers have a hard problem on their hands in making sure that what they design is secure, but, often coming from a more mathematical, theoretical background, have they tended to overlook the practical aspects of what they are concocting? Many designers are highly skilled in the art of theoretical design, but is this at the expense of knowing the real engineering world their designs must live in as well as they should?
The talk examined these and other similar questions, and was summed up with an appeal to designers to “think of the children, think of the implementers”, ending with the words given as the title of this post.