How do we regulate artificial intelligence? Munk School students seek answers to big questions

For many, artificial intelligence (AI) is synonymous with a distant future, where self-driving cars are the norm and humans work side-by-side with robots. In reality, we don’t have to look that far ahead to see artificial intelligence at work. AI already underpins many of the products and systems Canadians interact with today – from the voice of Siri to Netflix suggesting what we should watch next.

But what happens if AI harms rather than helps people? What if the algorithms behind a product, for example, discriminate against people based on their gender? Or race? And who should be in charge of regulating AI?

These are some of the questions four Master of Global Affairs (MGA) students are trying to answer. As part of their second-year capstone project, MGA students Tim Dutton, Vanessa Ko, Gabrielle Lim and Shamim Shamdani are working closely with Global Affairs Canada’s (GAC) Digital Inclusion Lab to develop a set of policy recommendations to regulate AI. The students’ insights will be shared with Canada’s private sector to help ensure AI innovation can flourish without putting Canadians in harm’s way.

“There’s a huge focus on AI and its impact on our future. What we don’t hear a lot about is how we can regulate and mitigate the societal impact AI has today,” says Dutton. “That’s what our research project focuses on.”

Artificial intelligence and its regulation pose challenges for the tech industry, governments, civil rights groups, lawyers and global affairs practitioners. AI is unique in that it is a technology capable of developing itself, and its applications cut across industries around the world. Given the global scope – and potential harm – of AI applications, the lack of an official regulatory body is all the more concerning, says Ko.

“There isn’t any regulatory umbrella for AI yet – not in Canada or any other country. It’s a huge governance gap that is currently filled by the private sector trying to self-regulate – and obviously, there are a lot of concerns connected to that.”

On top of giving students deep insights into complex policy issues, like regulating AI, capstone projects also serve as a valuable exercise in real-life interaction with policymakers and government officials, says Todd Foglesong, assistant professor and professor of global practice at the Munk School of Global Affairs and the project lead for this year’s MGA capstone on AI governance.

“It’s a chance for students to dig deeply into the tangled problem they are asked to give advice on,” says Foglesong. “I also think that negotiating the scope of their research directly with government officials helps students understand what it’s like to work within the confines of law and bureaucracy – and hopefully develop an appreciation for the methodical way in which some solutions to complex global problems are developed.”

In addition to this year’s capstone on AI governance, Foglesong is spearheading four others, tackling a wide range of policy challenges, from improving transparency in national security to examining the future of organized crime in the cannabis market.

This year’s other MGA capstone projects touch on innovation, global health, sustainability and financial risk. A team of students led by John Robinson, professor at the Munk School and the School of the Environment, is reviewing the City of Toronto’s fossil fuel divestment strategies in order to inform the city’s new investment policy. Another team, led by Munk School Associate Professor Shiri Breznitz, is working with the MaRS Data Catalyst to develop the first shared mobility knowledge hub for the Greater Toronto and Hamilton Area.

“So far it’s been an incredible learning experience,” says second-year MGA student Gabrielle Lim. “We learn how to narrow down our focus of research and how to work with clients outside of the university. It’s a really good opportunity to gain experience in an area you actually want to work in after graduation.”