I attended INTRUST 2010 in Beijing, China. The conference covers all aspects of trusted systems, such as trusted modules, platforms, services and applications. This year the program was split into two parts: the first two days held the technical sessions, where the audience could choose talks from topics such as Hardware Security, Security Analysis, Software Protection and Mobile Trusted Systems, while the third day was reserved for a workshop of invited talks with keynote speakers including Andrew Yao, Moti Yung, Ahmad-Reza Sadeghi and Liqun Chen.
The best paper award, sponsored by Singapore Management University (US$1000), went to "Seamless Integration of Trusted Computing into Standard Cryptographic Frameworks" by Andreas Reiter, Georg Neubauer, Michael Kapfenberger, Johannes Winter and Kurt Dietrich from IAIK, Graz. This was the first presentation of the conference, and in this talk the authors presented a novel design for a Trusted Software Stack (TSS) - the interface between applications and the Trusted Platform Module (TPM). The proposed TSS can be easily integrated into existing security frameworks and reuses the application programming interfaces (APIs) of well-known frameworks. The presented stack has nice features such as dynamically loading components over the network, adding, updating or replacing functionality even after deployment, and supporting multiple TPMs. The last feature is especially nice for mobile devices and for systems with many virtual TPMs. The proof of concept was done with the BouncyCastle security framework, but according to the authors further enhancements might include integration into the Java Cryptography Extension and CryptoAPI.
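To make this concrete, here is a minimal sketch (in Java, assuming the BouncyCastle provider jar, bcprov, is on the classpath) of an application written purely against the standard JCA/JCE API. The appeal of the authors' approach is that a TPM-backed provider could be registered through the same Security.addProvider call, leaving application code like this unchanged; the software-only BouncyCastle provider ("BC") is used here only as a stand-in, and this is my own illustration rather than the authors' actual code.

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Security;
import java.security.Signature;

import org.bouncycastle.jce.provider.BouncyCastleProvider;

public class ProviderAgnosticSigning {
    public static void main(String[] args) throws Exception {
        // Register a JCA/JCE provider; here the software-only BouncyCastle provider ("BC").
        // In the TSS design described in the paper, a TPM-backed provider would be
        // registered in the same way, so nothing below would need to change.
        Security.addProvider(new BouncyCastleProvider());

        // Generate a key pair; with a TPM-backed provider the private key could stay in hardware.
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA", "BC");
        kpg.initialize(2048);
        KeyPair keyPair = kpg.generateKeyPair();

        // Sign and verify through the standard Signature API.
        byte[] message = "hello trusted world".getBytes("UTF-8");
        Signature signer = Signature.getInstance("SHA256withRSA", "BC");
        signer.initSign(keyPair.getPrivate());
        signer.update(message);
        byte[] sig = signer.sign();

        Signature verifier = Signature.getInstance("SHA256withRSA", "BC");
        verifier.initVerify(keyPair.getPublic());
        verifier.update(message);
        System.out.println("signature valid: " + verifier.verify(sig));
    }
}
```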
Just after the first session I gave a talk on "Hardware Trojans for Inducing or Amplifying Side-Channel Leakage of Cryptographic Software", in which I presented a novel concept of micro-architectural Trojan side channels.
The last day of the conference was definitely the best one. Many keynote speakers were invited to give talks: Andrew Yao - "Some Perspectives on Complexity-Based Cryptography"; Moti Yung - "Embedding Cryptography to General IT Engineering System/Project"; Liqun Chen - "Security and Privacy in Vehicular Systems - Case Study: Trusted Anonymous Announcement"; Ahmad-Reza Sadeghi - "Trusted and Secure Computing in Practice: Where are We Now!"; and many others.
During this session a very interesting talk was given by DongHoon Lee from Korea University on security issues in the smart grid. He highlighted security problems in smart grids, such as the privacy of smart meter users and attacks on smart meters. According to Wikipedia, a smart meter is an advanced meter that records consumption in intervals of an hour or less and communicates that information at least daily over some communications network back to the utility for monitoring and billing purposes. If this consumption data is not protected, it might reveal, for example, a user's lifestyle pattern, which can be considered a privacy violation. The speaker also presented a list of possible attacks on smart meters. The last part of the talk was dedicated to security requirements for smart meters and the need for security standards in this field.
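To illustrate why interval data is sensitive, here is a small, self-contained Java sketch; the hourly readings and the 0.4 kWh baseline threshold are made up for illustration and were not part of the talk. Even this crude rule separates the hours when someone is at home and active from the hours when the house is empty or asleep, which is exactly the kind of lifestyle inference the speaker warned about.

```java
import java.util.Map;
import java.util.TreeMap;

public class MeterPrivacySketch {
    public static void main(String[] args) {
        // Hypothetical hourly consumption of one household over a day (kWh).
        double[] hourlyKwh = {
            0.2, 0.2, 0.2, 0.2, 0.2, 0.3,   // 00:00-05:59 asleep, fridge only
            1.1, 1.4, 0.3, 0.2, 0.2, 0.2,   // 06:00-11:59 breakfast, then house empty
            0.2, 0.2, 0.2, 0.2, 0.2, 1.8,   // 12:00-17:59 occupant returns around 17:00
            2.2, 1.9, 1.0, 0.6, 0.3, 0.2    // 18:00-23:59 cooking, TV, then bed
        };
        double baseline = 0.4; // assumed always-on load (fridge, standby devices)

        // Anything clearly above the baseline suggests someone is home and active.
        Map<Integer, String> inferred = new TreeMap<>();
        for (int hour = 0; hour < hourlyKwh.length; hour++) {
            inferred.put(hour, hourlyKwh[hour] > baseline ? "occupied/active" : "away or asleep");
        }
        for (Map.Entry<Integer, String> e : inferred.entrySet()) {
            System.out.printf("%02d:00  %.1f kWh  -> %s%n",
                    e.getKey(), hourlyKwh[e.getKey()], e.getValue());
        }
    }
}
```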
The last talk of the conference was given by Claire Vishik from Intel. She briefly introduced "Direction in Hardware Security R&D in Government, Academic and Industrial Research". The audience got to hear about state-of-the-art security issues from the industry, academia and government perspectives, the advantages and disadvantages of each, and their respective points of interest in terms of security research. The last part of the talk focused on Intel's work and vision - its goals and the future.
Wednesday, December 15, 2010
Highlights of ACSAC 2010
I just returned from this year's Annual Computer Security Applications Conference (ACSAC) in Austin, Texas. Unfortunately, due to the adverse weather conditions and resulting flight delays, I missed out on the first day of the technical program.
ACSAC is dedicated to work on practical security applications and this year its program encompassed such diverse topics as detection of misbehaving network entities (e.g. spammers, malware-infected machines/botnets), malware analysis/mitigation, practical authentication, hardware security, secure OS (components), security in mobile/wireless devices, and social engineering.
The best paper went to Ang Cui and Salvatore Stolfo from Columbia University, for their assessment of the vulnerability of network-enabled embedded devices. Basically, they performed a scan of the whole Internet in order to find embedded devices (typically gateways or routers) whose network management interface was accessible via the manufacturer's default credentials. They identified a large number of vulnerable devices, especially gateways in ISP networks which typically act as the network point of entry for private home (NAT) networks. All these devices are potentially up for the taking by uploading malicious firmware, which could make them part of a botnet. This is especially alarming, as a botnet composed of such gateways could be much more potent than traditional botnets built from infected PCs. In terms of workload, such a gateway is just as good as any infected machine residing behind it - in the end what counts is the rate at which packets (e.g. spam) can be sent out, and that is limited by the gateway's capabilities anyway. Moreover, most gateways are always switched on, giving rise to much higher availability of the bots. Also, infections on PCs usually get detected at some point through abnormal behavior like unusually high processor or network loads or weird disk access patterns. On the other hand, it seems much harder for a "normal" user - who never even bothered to change the default password - to start suspecting the gateway to be infected with malware.
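For a flavour of what such an assessment looks like on a single device, below is a minimal Java sketch that checks whether a web management interface still accepts a few factory-default logins over HTTP Basic authentication. The target address and credential list are placeholders of my own, not the authors' scanning infrastructure (which worked at Internet scale and handled many device types); obviously this should only be run against devices you own or administer.

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;

public class DefaultCredentialCheck {
    // A few common factory-default username/password pairs (placeholders for illustration).
    private static final String[][] DEFAULT_CREDENTIALS = {
        {"admin", "admin"}, {"admin", "password"}, {"root", "root"}
    };

    public static void main(String[] args) throws IOException {
        // Default to a typical home-gateway address; pass your own device's URL as an argument.
        String target = args.length > 0 ? args[0] : "http://192.168.1.1/";

        for (String[] cred : DEFAULT_CREDENTIALS) {
            HttpURLConnection conn = (HttpURLConnection) new URL(target).openConnection();
            conn.setConnectTimeout(3000);
            conn.setReadTimeout(3000);

            // Send the credentials via HTTP Basic authentication.
            String token = Base64.getEncoder()
                    .encodeToString((cred[0] + ":" + cred[1]).getBytes("UTF-8"));
            conn.setRequestProperty("Authorization", "Basic " + token);

            // HTTP 200 means the management interface accepted the default login.
            int status = conn.getResponseCode();
            System.out.printf("%s / %s -> HTTP %d%n", cred[0], cred[1], status);
            conn.disconnect();
        }
    }
}
```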
Another interesting talk, given by Jonathan Valamehr, involved the potential use of 3D chip integration for attaching cryptographic coprocessors to regular embedded processors. 3D chip integration works by stacking several chip dies on top of each other and connecting them by intra-chip vias or similarly fine connections, allowing for much denser integration than would otherwise be possible. The contribution of the teams from UC Santa Barbara and UC San Diego is the development of special connector circuits which allow another die with matching landing points to be added optionally. Their example was an embedded processor which would work normally as a single die and could be stacked with a cryptographic coprocessor die for improved processing of security workloads.
In my talk - which immediately followed the 3D talk - I presented a detailed concept and an FPGA prototype of a side-channel resistant embedded processor, which has been developed in the context of the Power-Trust project at my former university (Graz University of Technology, Austria). An ASIC prototype is currently under evaluation in a joint effort of Graz and Bristol.
Thomas Longstaff gave a very interesting invited talk on the lack of the scientific method in many works of the applied security community. He argued that certain pressures and realities of the present academic world lead researchers to adapt experiments to fit their hypotheses instead of the (proper) other way around. He also noted that many papers lack a sufficient description of their methodology, which he considers one of the most important parts of any scientific paper. Finally, he pleaded for more care in the choice of program committee members and argued that PCs should contain at least a minimum number of members with formal scientific training, as opposed to practitioners in certain fields.
The very last talk of the conference, by Trajce Dimkov from the University of Twente, discussed methodologies for evaluating the threat of social engineering attacks. The presentation compared the practical application of two different methodologies for penetration testing, in which testers are sent into an organization to try to breach security via social engineering. In this case the goal of the penetration tester was to "steal" the laptop of a specific employee. The talk was accompanied by video footage taken during some of the testers' "coups". The two methodologies differed in the number of people involved and in who knows what about the test. In the end there are several conflicting requirements, and one must choose the methodology which creates the least disturbance to the tested organization (e.g. disruption of work or shattering of existing trust relations).