Privacy is the issue that won’t go away, a testament to the fact that it is a fundamental aspect of how we expect the internet to behave. Just in the past week, the European Court of Justice killed a data sharing deal with the U.S. (Safe Harbor) because of privacy concerns, FBI Director Comey testified on Capitol Hill that dozens of terror suspects hide from Law Enforcement by using encryption, California adopted the strongest digital privacy law in the country, Apple removed some apps from the App Store due to privacy concerns, and the Consumer Financial Protection Bureau (CFPB) is considering banning arbitration clauses in credit card agreements and similar contracts that prohibit class-action lawsuits (as might be appropriate in a data breach). What the heck is going on?
All of this can be interpreted as part of the very active, ongoing discussion in the U.S. (and elsewhere) about privacy. This round, what people are calling CryptoWars 2.0 (CryptoWars 1.0 took place back in the 1990s), heated up in earnest when Apple and Google announced they would include strong encryption in their newest products that even they would not be able to decrypt (since they would not hold the decryption keys). The FBI in particular worries about trends, whether legal or technical, that make communications supporting criminal or terrorist activities immune to warrants. But skeptics want to know how big an issue this actually is in practice; hence Director Comey’s testimony of “dozens”.
Europe has generally been considered far ahead of the U.S. when it comes to protecting the privacy of its citizens. Last year’s ruling on the “Right to be Forgotten”, and the new ruling on Safe Harbor that prevents EU citizen data gathered by companies from being routinely transferred to U.S. servers, are two prime examples. Interestingly, EU rules seem to focus more on corporate behavior, at a time when Governmental surveillance seems to be a fact of life. The strongest opposition to EU rulings consistently comes from U.S. companies whose business models depend heavily on monetizing personal data through methods such as selling that data to advertisers.
As a result of the Snowden revelations about U.S. surveillance, the domestic focus here has been the opposite: heavy emphasis on preventing Government surveillance, while giving companies more of a pass. The California law is the first in the country to enact a single framework covering both individuals and companies, protecting location data, content, metadata, and devices from searches without a warrant. Our federal and state Governments have a harder time reining in commercial activities that rely on personal data, one reason the tech industry’s leadership on strong encryption is so notable, but also a reason to cheer the ongoing pressure from Europe.
Where does all of this leave us? The privacy discussion has generally been very divisive, framed as if you must be either “for” or “against” privacy. This is nonsense: the argument that the Government should never have access to any communications under any circumstances is as absurd as the argument that everyone should be allowed to listen in because law-abiding citizens have nothing to hide. We have a legitimate national interest in ensuring strong privacy for people in the United States, as well as a legitimate national interest in a functional law enforcement methodology supported by aspects of due process such as court-ordered warrants. The discussion is really about the right balance between these potentially conflicting priorities, and then the technical mechanisms to implement those policy decisions.
Ironically, it may be the Government that leads the way to some of the future technology solutions. While Europe has taken a policy approach of drawing bright lines, here in the U.S. we have embraced the challenge head on in a search for solutions. At the excellent CSM/Passcode event on October 8, The Cutting Edge of Cybersecurity Research, Dr. Prabhakar of DARPA spoke about changing the tradeoff between privacy and security through technologies such as fine-grained control of information sharing for owners of private information, better authentication, and better methods for proving systems cannot be hacked. Dr. Vanderlinde of IARPA mentioned homomorphic encryption, a way, for example, for researchers to make secure queries to databases (“how many people in this hospital have disease x”) without ever decrypting the query or the individual database records required to respond to that query.
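To make the idea concrete, here is a minimal sketch of additively homomorphic encryption using a toy version of the Paillier cryptosystem. This is purely illustrative (the primes are tiny and the scheme as written is not secure, and this is my own sketch rather than anything IARPA described): a server can multiply ciphertexts together, which adds the hidden plaintexts, so it can answer the “how many patients have disease x” question without ever seeing any individual record.

```python
import math
import random

# Toy Paillier cryptosystem (tiny primes, NOT secure -- for illustration
# only). Key property: multiplying two ciphertexts yields an encryption
# of the SUM of their plaintexts.

def keygen(p=101, q=103):
    n = p * q
    assert math.gcd(n, (p - 1) * (q - 1)) == 1
    lam = math.lcm(p - 1, q - 1)   # Carmichael's function of n
    g = n + 1                      # standard simple choice of generator
    mu = pow(lam, -1, n)           # modular inverse of lambda mod n
    return (n, g), (lam, mu, n)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(1, n)     # random blinding factor, coprime to n
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(priv, c):
    lam, mu, n = priv
    x = pow(c, lam, n * n)
    L = (x - 1) // n               # the Paillier "L function"
    return (L * mu) % n

pub, priv = keygen()

# Hospital example from the post: each record contributes 1 if the
# patient has disease x, else 0. The server only ever sees ciphertexts.
records = [1, 0, 1, 1, 0]
ciphertexts = [encrypt(pub, bit) for bit in records]

total = 1
for c in ciphertexts:
    total = (total * c) % (pub[0] ** 2)  # ciphertext product = plaintext sum

print(decrypt(priv, total))  # -> 3 (the count, revealed only to the key holder)
```

Only the holder of the private key learns the aggregate count; the party doing the tallying works entirely on encrypted values. Real deployments use the same algebra with cryptographically large keys, or fully homomorphic schemes that also support multiplication.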
So, privacy remains a thorny issue, with lots of heavily vested stakeholders. In the near term, the best approach for users is to keep restricting how much information you share online and with whom you share it. However, this area is changing very quickly, and the robust current discussion will likely result in changes that significantly enhance your privacy over the next couple of years. One additional point: much of this discussion depends on encryption as the fundamental protection mechanism. Stay tuned later this week for another Daveknology post discussing encryption, and whether it is capable of meeting all these expectations.