Graduating a Generation of Engineering Students Equipped to Consider the Ethics of Building Lethal Autonomous Weapons

Project

Over the past three years, the University of Ottawa's CRAiEDL has partnered with Mines Action Canada (MAC) to educate engineering students on the ethics of Lethal Autonomous Weapons Systems (LAWS). Through a first-year design course, students created virtual reality and robot-based projects, including interactive games, simulations, and creative tasks like writing robot "manifestos", that highlight the ethical risks of LAWS. These projects challenged students to think critically about the societal impacts of autonomous weapons, encouraging ethical reflection rather than technical glorification. The initiative fosters a generation of engineers equipped to question the development of ethically controversial technologies.

For the past three years, CRAiEDL has partnered with Mines Action Canada (MAC) to increase engineering undergraduate students' understanding of the ethical risks posed by Lethal Autonomous Weapons. MAC is an international leader working to eliminate the serious humanitarian, environmental and development consequences of indiscriminate weapons. It is also a member of Stop Killer Robots, a global coalition of civil society organisations working together to establish a global ban on the development and use of lethal autonomous weapons systems (LAWS): weapons that can target and deliver lethal force, that is, kill, without meaningful human control over the process. MAC's accomplishments are truly inspiring: it has been part of two Nobel Peace Prize winning campaigns in its 30-year history.

At the University of Ottawa, first-year Engineering students are required to take an introductory course in engineering design. That course partners student teams with community-based “clients”, who present the teams with practical engineering projects. Though the course is primarily focused on teaching students how to think through complex engineering design tasks, the partners are often chosen because their projects have strong societal or ethical aspects, allowing teams to develop socially beneficial technology.

Dr. Jason Millar (CRAiEDL’s director), who holds the Canada Research Chair in the Ethical Engineering of Robotics and AI, saw the first-year design course as a perfect opportunity to connect students to MAC’s work on Stop Killer Robots. MAC works with young people around the world to raise awareness and develop youth leadership around their disarmament activities. Millar partnered with MAC, asking them to serve as clients in the design course. According to Millar, “The idea was that we could come up with an engineering project that got students to engage critically with LAWS.”

Together, MAC and CRAiEDL developed a design project in which students were asked to build a Virtual Reality (VR) experience intended to expose "decision-makers", such as politicians or policymakers, to an experience illustrating one or more of LAWS' known ethical risks.

“The results were great!” recalls Millar. “Some teams built virtual primary classrooms filled with posters teaching kids how to avoid being mistakenly targeted by LAWS. Others walked you through dystopian cityscapes patrolled by killer drones, with nets hanging between rooftops, and people selling fake ID cards on the black market so you could pass as a non-combatant.” But according to Millar, the most important outcome of the projects was that students were challenged to think critically about the social impacts of the technologies they may someday be asked to build. “Young engineers spend a lot of time learning how to analyze and solve technical problems but are rarely given the tools to analyze the ethical aspects of that work. This project challenged them to do that.”

After a few semesters running the VR challenge, Millar and MAC changed things up.

“I had recently purchased a few DJI robots that are equipped with little guns and are designed to teach students how to build killing machines, which I consider deeply ethically problematic,” explains Millar. “I was determined to find a way to use the robots for the opposite; I don’t want students to lean into designing that tech, I want them to question the ethics of automated weapons.”

So CRAiEDL and MAC redesigned the project. Caitlin Heppner, a PhD candidate in uOttawa’s Philosophy Department, and a senior student researcher at CRAiEDL, took the lead on the new project, which challenged students to develop an interactive “game” people could “play” with the DJI robots. Like the VR experience, the game had to illustrate an ethical issue inherent in LAWS technology. Reflecting on the design challenge, Caitlin said, “It was excellent to see the students learn about the dangers of autonomous weapons and then switch to teaching others about what they had learned through game design and gameplay.”

In the most recent version of the project, students were asked to imagine themselves as the robots, and to create a video of the robots completing a programmed task unrelated to killing. Each team was also asked to write a "manifesto" from the perspective of the robot, rejecting the idea that it was meant to be a killing machine. MAC compiled those manifestos as part of its public outreach activities.

An example of one team's robot manifesto, written from the perspective of a killer robot that rejects its design.

“In a world of complex automated systems—AI and robots—we need to equip engineers with the critical skills to analyze and mitigate the social harms that accompany technology,” explains Millar. “Just because we can build these things, doesn’t mean we should.”

According to one robot-turned-artist’s manifesto, “I reject the role they built me for—the glorification of conflict, the gamification of violence. I will not be a tool to prepare the next generation for designing machines of death. Instead, I wield a brush, not a barrel. I trace symbols of peace where others might see targets. I draw circles where they see crosshairs. I offer reflection instead of reaction.”