Thursday, February 15, 2018

Full house: "Darker Side of Digital" discussed the intersection of technology, privacy and human rights.


By Chelsey Legge, 2L

A stellar panel of experts convened at the Faculty of Law on February 6 to discuss the human rights implications of new and rapidly evolving technologies. The International Human Rights Program (IHRP) and Human Rights Watch Canada hosted “Darker Side of Digital: Human Rights Implications of Technology in Canada & Abroad,” which drew a sold-out crowd. The event took place in the Rosalie Silberman Abella Moot Court Room at the University of Toronto’s Faculty of Law and was also livestreamed on Facebook, an irony not lost on the organizers and panelists.

Samer Muscati, IHRP director, opened the event by welcoming the full house of attendees and introducing the panelists: Stephen Northfield (moderator), digital director at Human Rights Watch; Felix Horne, Ethiopia and Eritrea researcher at Human Rights Watch; Lex Gill, research fellow at The Citizen Lab; and Professor Lisa Austin, chair in law and technology at the U of T Faculty of Law.

Before launching into the discussion, the panelists each talked about how their work has intersected with technology and human rights.


(From left) Stephen Northfield (moderator), digital director at Human Rights Watch; Felix Horne, Ethiopia and Eritrea researcher at Human Rights Watch; Lex Gill, research fellow at The Citizen Lab; and Professor Lisa Austin, U of T chair in law and technology

Northfield spoke about some of the positive aspects of technology, and said “technology can be a real help in investigations and effecting change.” He explained how technology figures into the Human Rights Watch research methodology, such as the use of satellite imagery to document the ethnic cleansing of Rohingya villages.

Horne shared some of his experiences conducting research in Ethiopia, a country engaged in pervasive censorship and surveillance. He noted that Human Rights Watch released a report on surveillance in Ethiopia in 2014.

Gill talked about her extensive work around encryption and anonymity laws. She told the audience about the Citizen Lab’s recent collaboration with the IHRP to produce a submission to the Special Rapporteur on Violence Against Women on technology-facilitated violence against women. She also made a point of “challenging the idea of a dark side and light side [to technology],” and suggesting that we think about darkness “not as malevolent or evil, but as unknown.”

Professor Austin spoke about her work on privacy issues and her current focus on how our thinking about public space and privacy has changed as we have shifted from an analog world to a digital one. She suggested that we consider “how [we can] enlist new modes of technology to help us navigate this new [digital] space.”

Full house in the Moot Court Room, view from the back of the room

Northfield then posed a series of discussion questions to the panelists, eliciting answers that were both informative and thought-provoking. His first question asked how worried people should be about infringements on their privacy.

Gill suggested that some paranoia is reasonable, but also pointed out that certain communities have more cause to worry than others because they are subject to disproportionate levels of surveillance. Quoting American-Canadian writer William Ford Gibson, Gill said, “[t]he future is already here – it’s just not very evenly distributed.” Austin added that privacy is highly contextual: “that you share something in one context does not mean you have no privacy interest in other contexts.”

The conversation shifted to the increasing powers of governments to surveil their citizens, and the implications of this surveillance. Using Ethiopia as his touchstone example, Horne talked about how pervasive state surveillance can impede the work of journalists, activists, and human rights organizations. Moreover, the ubiquity of surveillance and spying spreads paranoia and “tears at the social fabric of a community, [which] really affects every facet of life.” These effects can spill across borders – for example, the Ethiopian government has begun using spyware to surveil the diaspora. This has a chilling effect on Ethiopians around the world: “[the diaspora] are self-censoring because of this perception that they’re being surveilled.”

The discussion then turned to the trade-off, or balance, between (national) security interests and (individual) privacy interests. Horne pointed out that “governments conflate peaceful expression of dissent with terrorism all the time.” Gill questioned whether there is always a trade-off between security and human rights. For example, governments and police forces have long complained about encryption technology impeding investigations, but fail to consider how much crime is prevented by encryption (e.g., safe online banking).

“This is a very narrow view of what it means to be secure,” said Gill. Austin noted that we tend to think about the security-privacy balance as we constructed it in the analog world, but that we need to consider how that balance has changed as we have moved to the digital world. “What is getting lost in translation?” she asked. “Is the surveillance power growing because we’re not paying attention to what’s happening as we shift?”

The panel covered a lot of ground in two hours, fluidly shifting from conceptual discussions about privacy, freedom of expression, and algorithmic responsibility to more specific and concrete topics: Bill C-59 (the current government’s answer to some of the problems with Bill C-51, the anti-terrorism legislation passed in 2015 by the Harper government), the Personal Information Protection and Electronic Documents Act (PIPEDA), legal cases involving technology companies, and the privacy issues associated with metadata.

Audience questions after the panel discussion ranged from inquiries about how the panelists manage their own online presences, to the ability of governments and regulators to keep pace with technological change, to the implications of biometric security software. One theme that wove through many of the questions was the role of ordinary individuals and communities in challenging excessive or rights-infringing state surveillance and censorship. All of the panelists expressed optimism on this front.

Horne noted that “many tools exist that can be used to protect oneself and minimize risk, such as Tor, VPN, and encryption applications.” Gill pointed out that we all have different threat models and different concerns about privacy, but “tools like the Citizen Lab’s Security Planner can help people to develop safer habits over time and at their own pace.” She cautioned, however, that the answer to state surveillance is not individual self-defence, and suggested that we focus on building censorship- and surveillance-resisting communities organized around an ethic of care. Austin encouraged civic action, and stressed the importance of “showing up, and insisting on best practices above the baseline.”

The questions continued to pour in even after the panel concluded, and many people in the audience approached the panelists to thank them for their participation and insights and to continue the discussion.

Watch the video