Facebook’s a mess, but that doesn’t mean backdoors are the answer
It’s been a tough year for Facebook. It has faced international scrutiny, from its role in elections to the potential regulation of its cryptocurrency Libra. However, perhaps the most contentious debate facing the social media giant, and one sure to rage on into 2020, is also among its longest fought: how to protect the privacy of users on its messaging platforms while navigating the demands of governments that want backdoor access. Which should we prize more: consumers’ privacy or national security?
Currently, the solution proposed by governments internationally is a backdoor into messaging platforms, one that, in my opinion, is highly unsatisfactory. A backdoor invites abuse by the government agency in question, but, more worryingly, that same backdoor can be found and exploited by others. On the other hand, my own experience with members of the Islamic State has shown that absolute privacy of communications can be dangerous in the wrong hands.
The solution doesn’t lie in an open door for anyone with the right tools to climb through. What we need, in a trustless environment, is a pre-agreed, cryptographically secure, and verifiable way to access certain data sources, one that brings tech companies and governments together.
An emergency entrance, with access granted through a consensus voting mechanism among a pre-agreed group, is the way forward.
Take it from me, end-to-end encryption can benefit the wrong people
Facebook has made its stance on the issue of data privacy pretty clear. With further encryption of its video and calling systems being tested in October 2019, not to mention its current very public lawsuit against the NSO Group, its dedication to end-to-end encryption is plain to see.
It’s a position I once took myself. In 2014, my firm developed the world’s first ‘quantum safe’ instant messaging system, with us having “zero knowledge” of the contents. Its encryption was so advanced that not even a mature quantum computer, let alone the technology available at the time, would be able to break it and gain access. We were elated.
It was a much-needed victory for privacy, in an age when it was widely agreed that the misuse and exploitation of user data was getting out of control. We took the decision to make the solution available to all through the Apple App Store as an easy-to-download application. We never would have predicted that it would end up on a list of technical tools recommended by Islamic State.
We had created a tool that protected a fundamental right to online privacy. But, in doing so, we had given an abhorrent group the ability to benefit from unfettered, untraceable communications. This sparked a period of great debate for our team. Creating government backdoors in what we had claimed was a fully encrypted, privacy-protecting service felt counter-intuitive.
However, we simply couldn’t reconcile the idea that an organization such as Islamic State might be able to cause great harm using our technology. We felt we were left with no choice but to withdraw the messaging system altogether. Today, we only provide it to companies and governments for carefully selected and compliant use.
Splitting the keys: a cryptographic solution to the problem
In this scenario, it might be easy to argue that a government backdoor would have been appropriate. But we must remember that a backdoor for one is a backdoor for all. Anyone can walk through it, whether that’s the government agency it was intended for, a hacker, or even a hostile nation state. Facebook, for all its flaws, is right to object to this on behalf of its users.
This is why I believe that governments should consider the creation of an emergency entrance, or side-door. Whatever you call it, these are metaphors for a process where pre-agreed access to data is enabled within a trustless environment.
In this scenario, the government agency, the social media provider, and a neutral third party, such as a court, would each safekeep a fragment of the cryptographic key; when enough fragments are combined to meet a voting threshold, they could unlock sanctioned, pre-agreed access to messaging data. To remove any anxiety about the government keeping the data, both the data and the key management could be hosted by the social media companies.
In a way, this idea, known as ‘threshold cryptography’, is similar to a Swiss bank safe deposit box, which can only be opened if both the bank and the customer are present. Except these cryptographic key fragments could not be replicated, and companies could even use blockchain to create an immutable record of how, when, and why the data was accessed.
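To make the key-splitting idea concrete, here is a minimal sketch of a 2-of-3 threshold scheme using Shamir’s secret sharing, the standard construction behind this kind of design. The party names (agency, provider, court), the example key, and the choice of prime field are illustrative assumptions for this sketch, not details from any deployed system.

```python
# Minimal 2-of-3 Shamir secret sharing sketch (illustrative only).
# The secret key is embedded as the constant term of a random degree-1
# polynomial over a prime field; each party holds one point on the curve.
# Any two points determine the line and hence the secret; one point alone
# reveals nothing about it.
import random

PRIME = 2**127 - 1  # a Mersenne prime; all arithmetic is done modulo this

def split_secret(secret, threshold=2, shares=3):
    """Split `secret` into `shares` fragments, any `threshold` of which
    suffice to reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    points = []
    for x in range(1, shares + 1):
        y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        points.append((x, y))
    return points

def recover_secret(points):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789  # stand-in for a symmetric message key
agency, provider, court = split_secret(key)

# Any two of the three fragment-holders can jointly reconstruct the key;
# no single holder can.
assert recover_secret([agency, court]) == key
assert recover_secret([provider, court]) == key
```

In a real deployment the fragments would protect a proper encryption key, the reconstruction would happen inside audited infrastructure, and verifiable secret sharing would be used so that no party can submit a bogus fragment; the point here is only that “two of three must agree” is an ordinary, well-understood cryptographic primitive, not an exotic demand.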
It would significantly limit the ability of rogue actors to stroll through a backdoor uninvited. And because no ‘golden key’ would be kept by a social media company, the insider threat to security and privacy would be removed, even if governments weren’t pushing for a way in.
Facebook has a responsibility to find a solution to this ongoing debate. It can shout about respecting its users’ privacy from the rooftops, and in doing so defend its decision to continue with end-to-end encryption, but that argument only holds true when lives and liberty are not being endangered by the secrecy its messenger applications allow.
It is a government’s prerogative to keep its people safe, but if governments think backdoors are the prize, I believe they are mistaken. Under this scheme, the data is not even kept by the government. Nor should the social media companies complain: the telco industry already has to comply with lawful intercept warrants.
There is common ground to be found here in the form of key-splitting, something that has been sadly absent from the privacy debate thus far.
Published January 2, 2020 — 10:00 UTC