Silicon Valley’s Developing Conscience: It’s Called Apple
Silicon Valley has a problem. In our quest to build better products and to meet the world's need for information, we built, as a byproduct, the most amazing system for effortless government surveillance ever assembled. It is now incumbent on Silicon Valley to remedy this situation.
Forcing tech companies to weaken their products by compelling them to create backdoors would be a massive step backwards.
Whatever the power of search engines or social networks, it’s really the smartphone that is the most incredible tool for tracking our every move and activity. With access to the information collected by a person’s smartphone, it’s probably straightforward to figure out everything important about that person. Who they love. What religion they profess. Their ethnicity. What drugs (legal or illegal) they consume. What content they read or watch. What laws they violate. Every secret.
And without encryption of this information, the makers of smartphones have effectively handed those secrets to governments. Not just the U.S. government. Just about every government. And at very little expense compared to other ways of gathering secrets.
Over the last couple of years, Apple figured out the implications of this expanded surveillance. They decided that their value proposition to smartphone users did not include making it easy for governments (or others) to collect everybody’s secrets.
As a society, Americans have frequently decided to put limits on our government's powers, because the nation was founded in a period when government abused its powers extensively. We don't allow our police to torture suspects for confessions. We throw out evidence gathered through illegal searches. The government does not, and should not, have automatic access to every secret.
The battle between Apple and the FBI is one of those crucial limit-setting moments, and Silicon Valley understands it as such a moment for the tech industry generally. If the FBI can force Apple to construct a backdoor into one iPhone for the U.S. government, we techies understand that this sets a dangerous precedent, one that opens the way to extensive surveillance in the U.S. and around the world.
This is not a theoretical problem; we have seen it here in the United States and around the world. My nonprofit creates the Martus software, which human rights activists use to securely store their sensitive information via encryption. That information may be documentation of atrocities they plan to use in later advocacy, or simply items like current membership lists. When we called an LGBT organization in Africa last year for a regular check-in, they took the call from the backyard of their offices. They were burning all of their records because they had a tip that their government was going to raid them. Luckily, their records were already safely stored in Martus, without a backdoor for that government, or any other government for that matter.
As a society, we should not make it easy for governments or other interests to get lists of all of the gay people, or Christians, or Muslims, or rape survivors, or HIV-positive people, or supporters of the opposition. We need to make it harder to uncover our sensitive personal information, whether it's our medical records or the fact that our 11-year-old child is home alone. And encryption without backdoors is how we secure that information against attackers of all stripes. A backdoor is an open door for anyone who is willing to try hard enough to gain entry.
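To make that technical point concrete, here is a minimal sketch of what "encryption without a backdoor" looks like in practice. It uses the Fernet recipe from the open-source Python cryptography package purely as an illustration; it is not Martus's actual design, and the record contents are invented. The point is simply that the only way in is to hold the key, and there is no second, built-in way in for anyone else.

# Illustration only: client-side encryption with no backdoor,
# using the Python "cryptography" package (not Martus's actual design).
from cryptography.fernet import Fernet, InvalidToken

# The key is generated and held by the user; it never leaves their hands.
key = Fernet.generate_key()
record = b"hypothetical example: membership roster, witness statement"

# This ciphertext is all that an adversary who seizes the data ever sees.
ciphertext = Fernet(key).encrypt(record)

# Only the key holder can recover the plaintext.
assert Fernet(key).decrypt(ciphertext) == record

# Anyone without the key, whether a raiding government or a thief,
# gets nothing useful back.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("wrong key: the records stay sealed")

A mandated backdoor would amount to a second key that always works. Once that second key exists, anyone who obtains, steals, or reproduces it can open every record.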
That is why we, and so much of the technology sector, stand with Apple today. This is not a tradeoff between security and privacy, as this issue is so often portrayed. It is a tradeoff between the security of our sensitive information and surveillance. Making it easier to surveil us by weakening the technical protections on our private information also makes it possible for governments, especially repressive ones, and other attackers to exploit a user's or organization's vulnerabilities.
We should not be able to compel software developers to sabotage security protections that they carefully built for excellent reasons. We should not compel them to work against our interests as their users.