RightsCon, March 3-5, San Francisco

Further to all our coverage of the use and abuse of digital technologies, I commend RightsCon to our readers—a gathering of the Silicon Valley (and beyond) tech community, digital rights activists worldwide, journalists, and icons of the human rights community being held March 3-5 in San Francisco.  Themes include:

  • Measuring and preventing risk in the tech sector
  • Tech solutions for human rights challenges
  • Innovations in digital rights
  • Internet governance reform
  • Restoring rights in the age of surveillance

The latest program is here.  I am hosting a “fireside chat” (RightsCon eschews classic “panels” in favor of non-traditional formats such as workshops and scrums) on the way in which technology can be harnessed for the imperative of atrocities prevention.  Participants include:

  1. Sarah Mendelson, Deputy Assistant Administrator, U.S. Agency for International Development, Bureau for Democracy, Conflict and Humanitarian Assistance (see here for her recent Congressional testimony).
  2. Charles Brown, Senior Advisor, Atrocity Prevention and Response, U.S. Department of Defense.
  3. Jim Finkel, The Leonard and Sophie Davis Genocide Prevention Fellow, United States Holocaust Memorial Museum, Center for the Prevention of Genocide.
  4. Susan Benesch, Faculty Associate, Berkman Center for Internet and Society, Harvard University.
  5. Simona Cruciani, Political Affairs Officer, U.N. Office on Genocide Prevention and the Responsibility to Protect.

The session will cover the following:

In 2012, President Obama established a high-level, interagency Atrocities Prevention Board (APB) to create and implement a whole-of-government approach to atrocities prevention and response.  The APB aims to strengthen the tools available to the government, and to our partners within the United Nations, to launch robust policy responses in at-risk situations.  The chat will focus on ways that the tech sector can support the work of the Board and other early warning efforts by enabling them to better anticipate atrocity situations, address and defuse potential atrocity triggers and escalators, create effective sanctions programs against responsible individuals and groups, and identify perpetrators and collect evidence for potential prosecutions in U.S. or foreign/international courts.  Conversely, the discussion will consider ways that atrocity prevention efforts can better tap into the tech sector’s expertise in the areas of social media mobilization and documentation; data encryption, security, organization, and storage; digital authentication; and geospatial positioning.

Panelists will be invited to ruminate on such topics as:

  1. The imperative of atrocities prevention hinges on effective early warning – what existing technological limitations continue to hinder the ability of the international community to accurately anticipate where and when atrocities may occur?
  2. The APB and the UN have yet to tap into the full potential of the private sector (including human rights organizations and tech companies, both big and small) to inform their work, including by enhancing the ability of governmental personnel and implementing partners in the field to feed real-time information to policymakers.  The tech sector does not necessarily have easy access to those policymakers with the power to direct government action and assistance in the face of atrocities.  Are there avenues of communication that you think could be developed or better utilized?
  3. How can government funding, for example through USAID’s Tech Challenge, encourage the development of technologies that can be deployed for the purpose of atrocities prevention and response?
  4. How might the work of tech companies focused on digital documentation, encryption, coding, storage, analysis, and geospatial positioning be put in the hands of chronically under-resourced human rights organizations and U.N. bodies that are gathering information and testimony on the frontlines?
  5. The freedoms of speech, assembly, and conscience are central to the human rights pantheon.  At the same time, incendiary speech—including acts of incitement, hate speech, threats, and other forms of dangerous speech—can play a crucial role in triggering and fomenting violence and in cowing victims into submission.  How can technology help to identify, suppress, or otherwise counter dangerous speech in ways that do not run afoul of human rights protections?  Are there examples of how this has been done to good effect in the past?
  6. Individual accountability must be a feature of any atrocities prevention regime, because chronic impunity sends the message that human rights violations will be tolerated and thus can continue.  The system of international criminal justice depends heavily on witness testimony; witnesses, however, are under constant threat—including through nefarious abuses of social media—and many have withdrawn from accountability processes because the risks to themselves and their loved ones are simply too high.  Are there technologies that could be developed to produce reliable and probative evidence to enable prosecutors to pursue criminal charges without relying so heavily on witness testimony?
  7. How can tech companies otherwise participate in a global system of international justice?
  8. Are there elements of the global system of atrocities prevention that are not amenable to a technological solution?  Is there a risk that we over-rely on technology in this sector?

Other high-level ex-USG participants include

I am also participating in a session organized by the Berkeley Human Rights Center and featuring the cyber investigations team of the International Criminal Court to think about how the tech sector can contribute to the ICC’s investigations.

Hope to see you there! 

About the Author(s)

Beth Van Schaack

Leah Kaplan Visiting Professor of Human Rights, Stanford Law School; Former Deputy to the U.S. Ambassador-at-Large for War Crimes Issues in the U.S. State Department. All views are her own. Follow her on Twitter (@BethVanSchaack).