“A robot is a sophisticated toaster—why would you ask it for mortgage advice?”

Megan Wright
Posted: 12/04/2017

“What Fairtrade did for coffee, we aim to do for robotics,” said Aimee van Wynsberghe, co-director of the Foundation for Responsible Robotics.


From an early career in cell biology to the world of robot ethics, Aimee van Wynsberghe has always had a passion for big-picture thinking.

A relative newcomer to the space, van Wynsberghe admits she first had the idea to combine technology and ethics during her work as a research assistant at a robotics institute in Canada (CSTAR). She recalls thinking to herself, “Maybe I should find a way to ask these questions about technology.”

Fast forward a decade and van Wynsberghe is pioneering research as a global robotics ethicist, having recently been named by Robohub as one of 25 women in robotics you need to know about in 2017.

She’s also the co-founder and co-director of the Foundation for Responsible Robotics—a not-for-profit organization focused on shaping the future of responsible design, development, use, regulation and implementation of robotics.

This month, as we welcome van Wynsberghe to the Artificial Intelligence and Intelligent Automation Advisory Board, we sat down to discuss all things robot ethics…

AIIA Network (AIIA): Ethics in artificial intelligence is a very popular topic of discussion at the moment, primarily because people are recognizing the need for this dialogue to exist. How have you noticed this conversation evolving over the last ten years?

Aimee van Wynsberghe (AVW): Before 2006, when the term robot ethics was really coined, there was no real organization to it. You had people asking some questions here and there but there was no term to actually talk about it.

I’ve noticed this huge boom between then and now—we went from nobody having any idea how to organize themselves, to everyone wanting to get into it now.


If you’re trying to program a robot so that it can think in an ethical way, that feeds into the philosophical discussion. So people are now learning that robots are a really good tool to study applied ethics, investigating the philosophical questions that we’ve asked for such a long time.

AIIA: You’re a co-founder of the Foundation for Responsible Robotics—what inspired this organization, and what exactly is responsible robotics?

AVW: We are a not-for-profit organization, established in the Netherlands in 2015, and our mission is to promote this concept of responsible robotics—referring to the humans that are making the robots. It’s really about looking at design, development, implementation, use and regulation of robots, so it really spans the entire lifecycle of the product.

This means looking at codes of conduct for engineers and at how a company handles the environmental consequences of its products. If you’re in an academic setting, you’re looking at how you use human subjects in your research protocol, asking questions like: are you going through an ethics committee and asking for informed consent? And how is the data stored?

The questions that fall under the umbrella of responsible robotics are incredibly numerous.

As an organization we focus on three particular areas. The first is networking: hosting workshops to bring academics together with industry. Academics are studying what industry is doing, and industry needs to know what academics are doing, but there are very few opportunities for them to get together and learn about each other’s research.

Secondly, we write reports or consultation documents. These are non-opinion documents where we’re trying to bring a variety of research together—polls, empirical research, ethical, legal and societal research—on a topic that we think the public needs to know about. This is really to raise public awareness and to help educate policymakers.


Our third area of focus is facilitating transparency between the company and the consumer. What Fairtrade did for coffee, we would like to do for robotics. We want to create a certification program so that when you’re buying a robot there is a stamp on it that says: this company, in making this robot, has a data protection policy; the robot has been tested rigorously for security; the company has a diversity policy; and there are environmental considerations at play here.

AIIA: We’ve seen some incredible advances in artificial intelligence recently, and of course there’s ongoing debate about the increasing role of robots in the home and workplace. What are the ethical implications of creating decision-making robots?

AVW: The Foundation’s mission is about reinforcing the idea that a robot should never be delegated any kind of ethical decision-making role or ethical decision-making skills. A robot is a sophisticated toaster or fridge, so why would you leave your children with those? Why would you have your toaster making decisions about your healthcare plan or giving mortgage advice?

We’re really trying to drive home the idea that this is a technology, and we have to maintain the connection between the technology and the humans making it: the humans are responsible for the consequences of the technology.

AIIA: You mentioned diversity as an ethical consideration. It’s a conversation that’s getting a lot of airtime in the tech industry at the moment, especially in the robotics sector as so many AI bots are being developed by men to have female personas. What are the implications of this? 

AVW: There are two issues here: the fact that the robots are being developed by men, and, especially if the robot has AI, the fact that the AI requires training data in order to learn. If you don’t have diversity in the workplace that’s contributing to the training data, you can imagine that the AI is not going to be very successful at interacting with women. And I should say that diversity isn’t just about having more women in the workplace; it’s also about giving a voice to various cultures and backgrounds (sexual orientation, race and so on).


The other aspect is that the robots we’re making, the nurses and the secretaries, often fill female-dominated roles. There seems to be an inclination to feminize the robot, to make a robot secretary or a robot nurse, which can be traced back to ancient myths about making automatons in the female form.

This isn’t necessarily something that’s new, but definitely something that we should reflect on. We must ask ourselves if creating robots in this way is a minimization of the roles traditionally attributed to women in our society (i.e. that they can be reduced to code) or a valuation of women’s roles and perhaps a sign of appreciation. The answer to this question will help pave the way towards acceptable design and development practices.
