"The Cloud" is supposed to let you run a virtual machine while abstracting away details such as which physical processors or disks lie beneath. But we can't hope that our adversaries will be nice and stick to this abstraction ...
In the first talk we saw some examples of what can go wrong, security-wise, when you ignore what lies beneath. If two virtual machines run on the same physical machine, we call them co-resident, and if an adversary can get his VM co-resident with ours we may be in trouble. If he can take over the physical machine, for example by exploiting a bug in the hypervisor (the software layer that runs the VMs), he gains complete control of our VM. Even if he is confined to his own VM, he can measure the latency of operations and potentially obtain a side channel from the fact that his code shares the physical caches with ours. Using Amazon's cloud as an example, the speakers showed that it is surprisingly easy to get co-resident with a target of your choice and to detect whether this was successful.
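To see why shared caches leak information, here is a toy simulation of the classic prime+probe idea (an abstract model I'm using for illustration, not real hardware timing): the attacker fills the cache, lets the victim run, then re-accesses his own data and treats slow accesses as evidence that the victim touched that cache set.

```python
# Toy prime+probe simulation: one-entry-per-set "cache" with abstract
# hit/miss latencies. Not real hardware -- just the logic of the attack.
NUM_SETS = 8
HIT, MISS = 1, 100  # abstract latency units

class ToyCache:
    def __init__(self):
        self.owner = [None] * NUM_SETS  # who last touched each set

    def access(self, who, s):
        latency = HIT if self.owner[s] == who else MISS
        self.owner[s] = who             # accessing a set claims it
        return latency

def prime(cache):
    # Attacker fills every cache set with his own data.
    for s in range(NUM_SETS):
        cache.access("attacker", s)

def probe(cache):
    # Attacker re-accesses his data and records the latencies.
    return [cache.access("attacker", s) for s in range(NUM_SETS)]

cache = ToyCache()
prime(cache)
# Victim (e.g. a crypto routine whose memory accesses depend on its key)
# touches sets 2 and 5, evicting the attacker's data there.
for s in (2, 5):
    cache.access("victim", s)
leaked = [s for s, lat in enumerate(probe(cache)) if lat == MISS]
print(leaked)  # → [2, 5]
```

The attacker never reads the victim's memory; the victim's access pattern alone is what leaks, which is exactly why co-residence matters.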
Another nice feature of VMs is that they can be suspended to a disk image and restarted. A less nice feature is that the same image can be restarted several times and given a different query each time, while using the same randomness/PRG state - practical examples were presented, as our favourite browsers seed the PRG for TLS and co. when they start up. In other words, we can actually apply the "forking lemma" in practice! For a server VM, we saw practical examples of stealing the TLS master key this way.
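The talks targeted TLS, but the randomness-reuse principle is easiest to show with DSA: if a reset VM signs two different messages with the same nonce k, anyone can recover the private key. A sketch with tiny illustrative parameters (not real DSA sizes):

```python
# Why a restarted VM image must not replay its randomness: two DSA
# signatures under the same nonce k leak the private key.
# Toy parameters for illustration only -- far too small for real use.
p, q, g = 607, 101, 64          # q divides p-1; g has order q mod p
x = 57                          # private key
k = 23                          # nonce, reused because the VM was reset

def sign(h, k):
    r = pow(g, k, p) % q
    s = (pow(k, -1, q) * (h + x * r)) % q
    return r, s

h1, h2 = 10, 42                 # hashes of two different messages
r, s1 = sign(h1, k)
_, s2 = sign(h2, k)             # same r, because the nonce repeats

# Attacker: s1 - s2 = k^-1 (h1 - h2) mod q, so k falls out, then x.
k_rec = ((h1 - h2) * pow(s1 - s2, -1, q)) % q
x_rec = ((s1 * k_rec - h1) * pow(r, -1, q)) % q
print(k_rec, x_rec)  # → 23 57
```

The same effect famously broke other systems whenever "fresh" randomness was replayed, which is exactly what restarting a suspended image does.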
Another security issue arises in cloud-based backup systems like dropbox. They use a trick called deduplication: if two users back up the same file, the service stores it only once, and the second user doesn't even have to upload it again (the service can compare file hashes). It's pretty much the equivalent of UNIX hard links. Deduplication is estimated to save 95%-99% of the space in online backup systems, but even so we're still talking about petabytes.
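The mechanics can be sketched in a few lines (a minimal model of hash-based deduplication, not dropbox's actual protocol): the client hashes the file, asks the server whether that hash is already stored, and only uploads the bytes if it isn't.

```python
# Minimal hash-based deduplication sketch (illustrative server model).
import hashlib

class BackupServer:
    def __init__(self):
        self.store = {}                      # sha256 hex digest -> bytes

    def has(self, digest):                   # client asks before uploading
        return digest in self.store

    def upload(self, data):
        self.store[hashlib.sha256(data).hexdigest()] = data

def backup(server, data):
    digest = hashlib.sha256(data).hexdigest()
    if server.has(digest):
        return "deduplicated"                # stored once, no re-upload
    server.upload(data)
    return "uploaded"

server = BackupServer()
print(backup(server, b"holiday_photo.jpg bytes"))   # → uploaded
print(backup(server, b"holiday_photo.jpg bytes"))   # → deduplicated
```

Note that the "has this hash?" question is answered for any client, which is precisely what the attacks below exploit.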
Deduplication gives anyone an oracle to test whether a file has been backed up before, and to brute-force which version of a file was backed up if only a small portion of it is unknown. Another problem is that knowing the hash of someone's file allows you to retrieve it - just claim you have a file with that hash, and deduplication gives you access to it without any further questions. As an aside, dropbox claims to encrypt all files with "military-grade encryption" (AES is mentioned too), but for deduplication to work at all, it seems they must encrypt everyone's files under the same key.
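The brute-force variant is worth spelling out (again a hypothetical server model, with a made-up file format): if the attacker knows a victim's file up to one low-entropy field, he can enumerate candidates and let the deduplication check confirm the right one.

```python
# Deduplication as a confirmation oracle: brute-forcing the one unknown
# field of an otherwise-known file. Server model and file format are
# made up for illustration.
import hashlib

# The victim has backed up a file the attacker knows except for one number.
stored = {hashlib.sha256(b"salary: 73000 EUR").hexdigest()}

def dedup_check(data):
    # The oracle deduplication exposes: "is this file already stored?"
    return hashlib.sha256(data).hexdigest() in stored

# Attacker enumerates all plausible values of the unknown field.
leaked = next(n for n in range(200_000)
              if dedup_check(b"salary: %d EUR" % n))
print(leaked)  # → 73000
```

Since each query is just a hash comparison, the only limit is how many candidates the attacker can afford to try, which for fields like salaries, dates, or passwords is no limit at all.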
With problems like these out there, what can we do? Two main approaches were presented, both dealing with the fact that the client cannot fully trust the cloud provider. The first is based on secure hardware (mostly TPMs), and several talks gave scenarios for using hardware-based trust. The second approach is software-based cryptography - mostly some variation of MPC, although we heard some good arguments why general MPC won't make it into the real world. However, special cases (voting and linear programming were mentioned) offer good opportunities for efficient MPC-like constructions. The principle of using two (or more) clouds also appeared in several talks.