PhD Student Jessica Barfield & Assistant Prof. Jiangen He Receive Award to Fund Diversity in Robot-Human Interaction Research
CCI doctoral student Jessica Barfield’s recent research with SIS Professor Dania Bilal looked at diversity and how it affects the way people interact with AI-powered voice digital assistants. Now, in collaboration with SIS Assistant Professor Jiangen He, she’s building on that work by applying similar principles to human-robot interaction. The two will be able to buy a robot for this research with funding they were recently awarded through a University of Tennessee, Knoxville Student/Faculty Research Award.
The aim of this project, “Diversity as a Factor in the Design of Human-Robot Interfaces,” is to see how people interact with a robot that mirrors their own gender, race, and/or ethnicity, and how they interact with a robot that does not mirror them. Half of the study participants will see a robot that looks and sounds like them, and the other half will see a robot that doesn’t appear to be anything like them.
“We are attracted to people who are similar to us, and racial mirroring is a theory that we tend to trust people who are of the same race as ourselves more than those of other races. We want to validate this kind of theory,” He said.
Barfield’s previous research with voice digital assistants confirmed that people like to imagine that assistants look like a version of themselves. The racial mirroring theory she used with Bilal for that project was inspired by He, which is one reason she is collaborating with him now. The other reason is that his expertise in designing interfaces will lend valuable insight to the research.
Barfield said research like this is important to conduct because it can influence how people design robot interfaces, especially as robots become a part of the societal landscape.
“Conscious effort and considerations need to be taken into account for the voice, the ethnicity, the appearance of these designs. They need to be accessible and trustworthy. Looking at it from an industry perspective, companies designing and selling robots, and people feeling they can trust these robots that they’re interacting with many times a day, is important,” she said.
There have already been issues in technology where the inherent racial, gender, or ethnic bias of designers influences the interfaces they create and can adversely affect people of other genders, races, or ethnic backgrounds. Barfield said she wants good data on how to prevent this type of discrimination before the design process even starts.
But robot research is expensive, so Barfield and He had to look for funding opportunities, and they were thrilled when they were awarded the money to buy the robot they wanted, Misty II. Barfield said it is a fairly advanced social robot that can be used in a variety of research scenarios. In fact, she pointed out that it could also be used by other researchers, making the purchase beneficial for anyone at UT who wants to study robot-human interaction.
Her research for this project will consist of scenarios in which study participants are recorded having a conversation with the robot and then complete a survey about various aspects of that interaction. They’ll rate factors such as trustworthiness, satisfaction, and likeability to gauge how positively or negatively they viewed the interaction.