
How Humans Judge Machines

13 Jan, 2021
18h00
Livestream on NOVA IMS' Facebook page

Conference: How Humans Judge Machines

Are you afraid of losing your job to a machine? What would be your reaction to a discriminatory AI system? And how would that reaction compare to your reaction to a discriminatory human? What about an algorithm that creates a blasphemous piece of art?

About the Event

How Humans Judge Machines compares people's reactions to actions and decisions taken by humans and machines. Drawing on the answers of thousands of participants, the book reveals the biases that affect the way people see and judge machines in scenarios involving natural disasters, labor displacement, policing, privacy, algorithmic bias, and much more.

Website of the book: https://www.judgingmachines.com

Lecturer

Diana Orghian

Invited professor at NOVA IMS and UX Researcher at Outsystems


Diana is currently an Invited Professor at NOVA IMS and a User Experience Researcher at Outsystems. She holds a Ph.D. in Psychology. Previously, she was a researcher at the MIT Media Lab and the University of Lisbon. Her research investigates the cognitive processes underlying social perception and memory, as well as the way people perceive artificial agents. She was also a visiting scholar at NYU (in 2015) and at Harvard University (in 2016/2017).

Event Wrap Up

This conference was held last January 13th and livestreamed on NOVA IMS' Facebook page. Professor Pedro Saraiva, the Dean of NOVA IMS, introduced the theme and welcomed Professor Diana Orghian. Professor Diana was the keynote speaker at the conference, presenting the book "How Humans Judge Machines", of which she is a co-author. The book compares people's reactions to actions and decisions taken by humans and machines and, drawing on the answers of thousands of participants, reveals the biases that affect the way people see and judge machines in scenarios involving natural disasters, labor displacement, policing, privacy, algorithmic bias, and much more.