Monday, November 28, 2011

Why I’m Scared of the SOPA bill

Benetech is a leading nonprofit organization based in Silicon Valley. We write software for people with disabilities as well as for human rights and environmental groups. We’re against piracy, and we have made commitments to authors and publishers to encourage compliance with copyright law.

So, we shouldn’t have anything to fear from a bill entitled “Stop Online Piracy Act,” right? Unfortunately, that’s not the case.

We’re getting very worried that our organization and the people we serve (people with print disabilities, i.e., those who are blind or severely dyslexic, and human rights groups) will be collateral damage in Hollywood’s attempt to break the Internet in its latest effort to squash “piracy.” And, if we’re worried, a lot of other good organizations should start getting worried! Let me give two specific examples that came up in my first conversation with a lawyer about the proposed bill:

1. Stopping fundraising and subscription revenue for Bookshare, the largest online library for people who have print disabilities.

Bookshare is an online library for people who can’t read standard print books. We provide accessible ebooks that can be spoken aloud, or turned into Braille or large print. We serve over 150,000 students with disabilities alone, with free online services funded by the Department of Education (however, nothing contained in this post has anything to do with our funders). We also have thousands of adults with disabilities who pay a $50-a-year subscription to be able to download all the books and newspapers they can read. Ironically, many of these users might buy commercial ebooks, but the anti-piracy technology built into many ebook systems is not compatible with the technology these users employ to get the books in Braille or synthetic speech.

Bookshare is legal in the United States because our copyright law includes an exception that allows nonprofit organizations like Benetech to make accessible versions of books for people with print disabilities without requesting permission or paying a royalty.

We frequently get emails or letters from authors, agents or publishers who don’t know much about people with disabilities or about Section 121 of the copyright law, calling us pirates and asking us to cease and desist from making their books available on the Internet. Often, these communications come in the form of what’s called a DMCA take-down notice. Now, we have a nice little letter thanking them, explaining that we only help people with bona fide disabilities, that what we do is legal, that we’ve worked with the big publishing associations and with authors’ groups, and asking wouldn’t they like to help us in the future by adding more of their books voluntarily to our collection. Most of the time, that works great, and we end up making a new friend after they dig a little and find out that we are closer to Florence Nightingale than the Dread Pirate Roberts.

Sometimes, we have to spend time talking a newbie lawyer down from high dudgeon, explaining that there really are such things as exceptions and limitations in copyright, and asking whether they really want their client to be the first author to attack the right of blind people to get Braille. And then they go away. Because that’s a lawsuit they are unlikely to win, and it would be a professional error to waste their client’s money attacking a library doing legal things.

However, SOPA apparently has shoot-first, ask-questions-later provisions. If any single publisher or author of any one of the more than 130,000 accessible books in our library gets antsy, they can send a notice to Visa and MasterCard and say: stop money from going to Benetech and Bookshare. No more donations to our charity. No more subscriptions from individual adults with disabilities.

No need to send us a letter. Or file a DMCA notice. Or do any real research. Just send out a bunch of notices and get all those pirates! Except, we’re not pirates. But now the burden of proof has shifted to us: we’re presumed guilty, and we have to spend time and money defending ourselves. Sounds kind of un-American, doesn’t it?

Now, apparently, we can file a counter-notice. But my guess is that the credit card companies are going to play it safe and stay away from turning “pirates” back on, and we’d end up in court arguing to restore our ability to receive funds for our socially beneficial work, not only helping people with disabilities but also helping environmental and human rights groups.

Yet another example of a bill written to catch criminals that does very little to stop them, but ends up screwing up law-abiding organizations.

2. Endangering Human Rights Activists.

Benetech is one of the largest developers of software for human rights activists around the world. We develop free and open source software to help groups capture the stories of human rights abuse, and store and back them up securely in another country. Wonderful stuff. We work all over the world, and our Martus software has been translated into Spanish, French, Russian, Arabic, Khmer and other languages.

The U.S. Department of State just funded us to help LGBT groups in Uganda securely capture documentation of abuses against those communities (again, our funders are not responsible for this post). We work in North Africa, Latin America, and Asia: most of the places where large-scale human rights abuses are going on. And, in many of these places, we’re helping the activists avoid censorship and surveillance by the government. It’s also crucially important to be able to assure the confidentiality of witnesses and victims, both to protect their privacy (e.g., victims of sexual violence) and their safety (do you want the police to know that you have testified to an illegal killing by the police?).

So, another example of potential collateral damage from SOPA. The problem is that we provide technology that allows for security, privacy and circumvention. We do it for human rights groups. But when asked whether there are “pirated” copyrighted materials in what our users store, we can’t say. Because, if we make software that promises to keep your life-or-death sensitive information secret to the best of our abilities, we won’t build in a back door for Syria, or China, or the U.S. government, or even (heavens!) Hollywood.
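To make the dilemma concrete: once a record is encrypted on the user’s own machine with a key only the user holds, there is simply nothing for us, a payment processor, or the Attorney General to inspect. Here is a minimal sketch of that pattern in Python. It is only an illustration of client-side encryption in general, not Martus’s actual cryptography; the use of the cryptography package’s Fernet recipe and the function names are assumptions for the example.

```python
# Illustrative sketch only, not Martus's actual design. The point: the record is
# encrypted on the activist's machine and the key never goes to the server, so
# neither the backup host nor the software developer can inspect the contents.
# Uses the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet


def encrypt_record(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt a testimony record with a key held only by the user."""
    return Fernet(key).encrypt(plaintext)


def decrypt_record(ciphertext: bytes, key: bytes) -> bytes:
    """Only someone holding the key can read the record back."""
    return Fernet(key).decrypt(ciphertext)


if __name__ == "__main__":
    key = Fernet.generate_key()  # stays on the user's machine; no server copy
    testimony = "witness statement ...".encode("utf-8")
    sealed = encrypt_record(testimony, key)
    # "sealed" is all a backup server ever sees: opaque bytes that cannot be
    # distinguished from any other encrypted content, pirated or not.
    assert decrypt_record(sealed, key) == testimony
```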

Apparently, one of the provisions of SOPA is that technology and servers and websites that can be used for evading controls on piracy can be shut down by the Attorney General. Unfortunately, safeguarding human rights information can’t be distinguished from piracy, if the contents are encrypted. So, our software, and the TOR network servers we and others operate, and other similar technologies, can get shut down in the name of protecting Hollywood.

Let's Not Do This Stupid Thing, and Avoid Breaking the Internet

In conclusion, I can’t imagine that breaking the Internet, making charities waste money fighting thoughtless and careless allegations, and making it easier for repressive governments to suppress human rights groups is what was intended when this bill was drafted. Our concerns are just one set out of many. Engineers have described this bill as “breaking the Internet,” because complying with it requires major (and not good) changes in how the Internet works today. Most tech companies think this is the most counter-productive, job-killing and innovation-killing bill they’ve seen in years. And tons of human rights groups have protested against the U.S. starting to act more like China than the home of the free. The costs and impacts far outweigh any (unlikely) benefit Hollywood would receive. Let’s not do this as a country.

Background information:

The Electronic Frontier Foundation has great information on SOPA and related bills, including this piece: SOPA: Hollywood Finally Gets A Chance to Break the Internet. If you're so moved, here's where EFF points you to take action by contacting your elected representatives: Take Action | Electronic Frontier Foundation.

Monday, November 14, 2011

Amnesty International at 50

I’m thinking a great deal these days about human rights and about doing more for the field. Today, I gave a presentation on human rights in DC, with a focus on our work with truth commissions. I recently spoke at the Silicon Valley Human Rights Conference, where I talked about technology for human rights defenders. Our human rights team is expanding and taking on new and exciting challenges. It makes me think about one of the giants of our field.

Earlier this year, I spoke at the 50th Anniversary Annual General Meeting of Amnesty International (AI). I stuck around for the main closing meeting, where the history and future of AI were presented. I was amazed to learn about the ways in which AI has transformed itself over the first half century of its existence, as one of the preeminent human rights groups of our time.

AI was founded in 1961 on the inspiration of British lawyer Peter Benenson, whose article “The Forgotten Prisoners” launched the first Prisoners of Conscience campaign, which ignited overwhelming global support and marked the birth of AI. The newly formed organization initially based its demands on select parts of the Universal Declaration of Human Rights and on the Prisoners of Conscience campaign. In particular, it focused on freedom of thought, conscience and religion, freedom of opinion and expression, humane treatment of prisoners and the right to a fair trial. This set of principles was incorporated in AI’s mandate: the set of rules establishing the organization's goals and action parameters, or what it and its local groups can and cannot do.

Over the years, AI has altered and expanded its mandate to address new human rights issues and to ease the creative tension between the demands of its grassroots membership and its organizational policy. In fact, the adopted changes are at the heart of the organization’s identity and suggest that every generation reinvents AI. In the 1970s, AI widened its mandate to cover work against torture, extra-judicial killings and disappearances. Many other human rights and social issues were added in the following decades: ending the death penalty, advancing women’s and children’s rights, upholding the rights of refugees, migrants and asylum seekers, and protecting LGBT rights. In 2001, its 40th anniversary year, AI expanded its mandate considerably to incorporate economic, social and cultural rights, as well as the body of international law governing armed conflict, thus committing itself to advancing all human rights enshrined in the Universal Declaration of Human Rights.

At times, the changes in AI’s charter and organizational policies involved controversial decisions, arousing both internal and external debates: should people who use or advocate the use of force in opposing oppressive regimes be recognized as Prisoners of Conscience? What about people imprisoned solely due to their sexual orientation? Is it okay to allow AI members to not belong to a local group? Even more broadly: can AI be transformed rapidly enough to meet the many new challenges to human rights? Can it make the changes that are required while remaining true to all it has stood for in the past?

As an organization that monitors the changing conditions of human rights, AI – like other human rights organizations – undoubtedly must also change. And, clearly, the changes are a source not only of debate, but also of innovation.

At the 50th Anniversary Annual General Meeting, AI’s Secretary General, Salil Shetty, linked the expansion of the mandate with AI’s evolution and with its future directions. He pointed to four disturbing paradoxes affecting human rights at present that AI needs to address: massive increase in both wealth and inequality; tremendous reduction of war in tandem with a rise of global insecurity; huge influence of media and new technology contrasted with low accountability and justice; and increase of democracies coupled with distrust of leadership.

Mr. Shetty described AI’s need to explore in greater depth the conditions underlying human rights problems and to implement proactive, “upstream” strategies for their resolution, in addition to the more reactive approaches reflected in the organization’s established methods. In the coming years, he said, AI must therefore enlarge its footprint in developing countries and address its members’ concerns with poverty, international economic injustices and lack of corporate accountability as major sources of human rights violations. I especially heard strong interest in building the Amnesty movement in Latin America and Asia.

In fact, some of these new directions are already being implemented: as part of its 50th anniversary celebrations, AI is running global actions throughout 2011 with a focus on reproductive rights, international justice and stopping corporate abuse. It will be exciting to observe how AI continues to transform itself as the challenges to human rights evolve.

As our human rights team grows and expands its impact, and as we launch a new project to help LGBT groups, I have found myself thinking about AI’s evolution and what lessons it has to teach us. How can we remain true to our commitment to the truth, to using technology and science to better defend the defenders of human rights, and to advancing global respect for human rights?

Thursday, November 03, 2011

One very long weekend in New York City for Megan Price

Guest Beneblog by Megan Price

New York City has many attractions – people often visit Times Square, the Statue of Liberty, Central Park, among many other sights. Me? I go to New York City to spend the weekend staring at my computer screen.

Data Without Borders’ kickoff Data Dive is what tempted me across the country, and after a much-longer-than-expected day of travel I found myself surrounded by fellow nerds (data scientists, as this particular group prefers to be called). The group included statisticians, epidemiologists, computer scientists, engineers, political scientists, journalists, and ‘data wranglers.’ We were all there thanks to the efforts of Drew Conway, Jake Porway, and Craig Barowsky (Data Without Borders’ founders), who had the crazy idea of bringing together well-intentioned data analysts and non-profits with data in need of analysis.

This particular weekend we divided into teams and tackled projects from the New York Civil Liberties Union (NYCLU), MiX Market, and UN Global Pulse. I joined the NYCLU team, where we worked on data collected by the New York Police Department about its “stop and frisk” practices. “Stop and frisk” is the common name for the practice of police stopping a pedestrian; not all such stops actually result in a search or arrest.

Sara LaPlante, NYCLU’s data and policy analyst, laid out two clear goals for us. First, provide data visualizations to help average New Yorkers contextualize their own personal experiences: for example, in which precinct do the most pedestrian stops occur? On what days and at what times of day do the most pedestrian stops occur? The second goal was a much tougher question: is there a racial bias in pedestrian stops? Researchers have been tackling this question, specifically in NYC, for well over a decade. Most recently, Andrew Gelman, Jeffrey Fagan, and Alex Kiss published a rather complex analysis of similar data in the Journal of the American Statistical Association.
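Before getting to that harder second goal, here is a rough sketch of the kind of tabulation the first goal called for. The file name and column names (precinct, timestop) are assumptions for illustration; the real NYPD extract may label its fields differently.

```python
# Rough sketch of the first goal: counting pedestrian stops by precinct and by
# hour of day. File and column names are assumed for illustration only.
import pandas as pd
import matplotlib.pyplot as plt

stops = pd.read_csv("stop_and_frisk_2010.csv")  # hypothetical file name

# Which precincts see the most pedestrian stops?
stops_by_precinct = stops["precinct"].value_counts()
print(stops_by_precinct.head(10))

# At what time of day do the most stops occur? (assume timestop is "HHMM")
hour = pd.to_numeric(stops["timestop"], errors="coerce") // 100
stops_by_hour = hour.value_counts().sort_index()
stops_by_hour.plot(kind="bar", title="Pedestrian stops by hour of day")
plt.xlabel("Hour of day")
plt.ylabel("Number of stops")
plt.show()
```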

We were not able to make significant progress on this second goal in a mere 48 hours. But we were able to provide NYCLU with some useful data visualizations, a dataset ready for analysis (typos corrected, locations translated to latitude and longitude for mapping, etc.), and some good ideas for next steps. The most challenging next step is acquiring disaggregated crime statistics for NYC, something we were surprised and frustrated to find is not readily available online.
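The location cleanup was the fiddliest of those preparation steps, so here is a minimal sketch of the idea. It assumes the raw stop locations are recorded in New York–Long Island State Plane coordinates (EPSG:2263), a common convention for NYC datasets but an assumption here, and it uses the pyproj library; the file and column names are likewise hypothetical.

```python
# Sketch of one cleaning step: projecting stop locations to latitude/longitude
# so they can be mapped. Assumes the raw file stores New York-Long Island State
# Plane coordinates (EPSG:2263, feet); that and the column names are
# assumptions for this example.
import pandas as pd
from pyproj import Transformer

stops = pd.read_csv("stop_and_frisk_2010.csv")  # hypothetical file name

to_wgs84 = Transformer.from_crs("EPSG:2263", "EPSG:4326", always_xy=True)
lon, lat = to_wgs84.transform(
    stops["xcoord"].astype(float).values,
    stops["ycoord"].astype(float).values,
)
stops["longitude"] = lon
stops["latitude"] = lat

stops.to_csv("stops_with_latlon.csv", index=False)
```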

Several of us plan to remain involved in NYCLU’s proposed analyses and look forward to staying in contact with Sara and the other members of the team. Descriptions of our project, plus the projects with MiX and Global Pulse, can be found on the Data Without Borders wiki.

The next Data Dive will be held here in San Francisco on Nov. 4-6 (tomorrow!), and I’ll switch from my statistician hat to my non-profit hat for that one – the human rights team is looking forward to supplying some of our own data and recruiting analytical assistance.