Hacking is no longer something that most of us only hear about in movies. It’s a weekly occurrence that affects everyone. Whether your credit card information was part of the huge Target breach or your personal data was leaked by the OPM, Experian or Home Depot hacks, you’re no longer a bystander; you’re the target.
And it’s not just large companies with shoddy security that are at risk either: Hackers are also after the treasure trove of data on individual people’s smartphones. We’ve gone from rooting for Matthew Broderick in WarGames and giggling at the hilarity of the 1990s movie Hackers (hack the planet!) to wondering how long before we have to change our passwords and replace our debit cards.
Meanwhile, tech companies like Apple and Google are in a constant battle to keep ahead of these hackers. That means fortifying their software and hardware with ever-increasing levels of encryption and security. That work protects not only your information but also their business. No one wants to buy a device that spills her secrets right out of the box.
So when elected officials and law enforcement start railing against encryption, insisting that it’s an uncrackable tool for criminals and terrorists, they’re ignoring the security benefits for individuals and businesses. Defeating tech company protections hurts US citizens and businesses; it doesn’t stop crime.
On several occasions government officials have floated the idea of making Apple and Google keep encrypted applications out of their app stores. The ramifications of this would be disastrous. In addition to creating a certification headache for the purveyors of those digital marketplaces, it would also discourage innovation.
If a company can’t sling its security wares in the United States, it’ll offer up its application in other countries. Worst-case scenario, it’ll move its entire operation out of the US, taking those jobs and tax revenue with it.
Plus, making an app unavailable in the United States won’t stop criminals from downloading it from foreign stores. You and I won’t go to the trouble of downloading something from overseas, but you can bet anyone planning a crime will be happy to figure out how to sideload an application.
The pinnacle of government shortsightedness is the bill introduced last month by Senators Richard Burr and Dianne Feinstein. The Compliance with Court Orders Act of 2016 would require companies to hand over data in an “intelligible format” or risk fines.
I could spend all night listing the various ways that Feinstein-Burr is flawed & dangerous. But let’s just say, “in every way possible.”
— matt blaze (@mattblaze) April 8, 2016
It’s a fancy way of saying that tech companies need to be able to decrypt any data on any device at the behest of the courts. That would require intentionally building weaknesses into hardware and software, just in case a device turns out to be used in the course of a crime.
Bad actors (hackers and nation-states with less-than-ideal human rights records) live for zero-day exploits. They poke and prod at hardware and software, hoping they can find a way in. If the Compliance with Court Orders Act of 2016 passes, their jobs will get much easier, because they’ll know that every product has an exploitable weakness. The law requires it.
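The "intelligible format" problem can be made concrete with a toy sketch (a one-time-pad XOR, not a real cipher, and all names here are illustrative): with sound encryption, ciphertext is unintelligible to anyone without the key, so the only way a company can guarantee decryption on demand is to keep a copy of the key or weaken the cipher.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy one-time pad: XOR each byte with a key byte.
    # Applying it twice with the same key recovers the original.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at the usual place"
key = secrets.token_bytes(len(message))   # known only to the device owner
ciphertext = xor_cipher(message, key)

# With the key, decryption recovers the message.
assert xor_cipher(ciphertext, key) == message

# Without the key, the company holds only ciphertext -- there is no
# "intelligible format" to hand over unless it also escrowed the key.
print(ciphertext.hex())
```

The sketch is the whole argument in miniature: a court order can compel the company to produce what it has, but if the design is sound, what it has is the hex gibberish on the last line.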
While these officials may be well-meaning, their ignorance of security is troubling. It’s easy to write up a bill or tell the Senate that encryption is used by terrorists and criminals and is therefore bad. It’s tougher to take the time to talk to experts in the field.
But if your job is to understand security, it might be in your best interest (or at least the interest of the people who voted you into office) to actually learn how it works. Hacking is an ongoing threat, and encryption lessens the damage caused by it.
If you’re a government or law enforcement official who can’t wrap your head around that, maybe it’s time to retire.