OnQ Blog

Nexi the “Expressive Robot” Provides Clues About Human Trust

Sep 20, 2012

Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.

What makes us trust one another? Research from Cornell University suggests that body language and facial expressions are key. Researchers at Northeastern University are using an innovative robot to help identify the nonverbal “cues” that help drive complex human interactions. 

Developed at MIT, Nexi is a four-foot-tall, blue-eyed robot capable of mimicking human behaviors through visual cues. Northeastern’s scientists are using the robot to study whether these cues might cause test subjects to question the trustworthiness of this creepy-cool machine.

Building on the methodology of a 1993 Cornell study, the new research team, headed by Northeastern University psychology professor David DeSteno, introduced study participants to Nexi in a brief, one-on-one conversation. Participants were then asked to play a low-stakes gambling game designed to measure whether they would cooperate -- by being generous and nice -- or betray their robotic opponent by going for the jugular and selfishly trying to make as much money as possible.

The outcome was influenced by the pivotal upfront conversation, which involved everyday questions like, “Where are you from?” For some participants, Nexi sat passively, showing no body language. For others, the robot (controlled by hidden researchers) was highly animated, exhibiting body language associated with deception or “bad intent.”

“When Nexi touched its face and hands during the initial interview, or leaned back or crossed its arms, people did not trust it to cooperate in the game and kept their money to themselves,” Northeastern news writer Angela Herring explained in a summary of the results. The findings will soon be published in the journal Psychological Science.

Research like this may help answer the question, “How do nice people manage to survive in the world?’’  

“The issues in our project are to try and understand the signals people rely on to decide whom to trust. . . . What’s interesting to me is how mechanical the process of interacting with another being turns out to be,’’ Cornell economist Robert H. Frank recently told Carolyn Johnson of the Boston Globe.

Alex Deixler, a sophomore at Northeastern, who participated in the study and was also interviewed by the Boston Globe, described her own feelings on the subject of human-robot interaction. “At first it was kind of a little weird, obviously, because you’re talking to a piece of metal,’’ Deixler said. But she said things began to feel more natural, and the robot seemed a lot like what you would expect from science fiction.

All in all, researchers found that humans base trust at least partially on body language and facial cues -- whether those cues come from a person or a machine.

(H/T to The Verge, News@Northeastern, and Boston.com)
