Dr. Robin Read, Plymouth University - Robot Communication

In the world of science fiction, humans and robots converse freely.

Robin Read, a research fellow at Plymouth University in the UK, is studying the ways in which robots communicate.

Dr. Robin Read is a Research Fellow in the Centre for Robotics and Neural Systems at Plymouth University, UK, where he works on the FP7 ALIZ-E Project. His research centres on understanding how robots can utilise non-human-like modalities as a means of expression and communication during social Human-Robot Interaction. Specifically, he is interested in exploring how robot sounds, such as those used in the world of animation, may be used with real robots.

Hollywood has painted an inspiring image of what the robots of the future could be. Rather than being just machines, these robots are portrayed as intelligent and capable social peers that are able to interact, understand and even relate to us in very personal ways. They are social robots.

As robots become more commonplace, there is a growing trend towards building these social robots. The field of Human-Robot Interaction concerns itself with developing and studying these socially adept machines. Equipping them with a rich variety of social capabilities allows them to interact with us naturally and intuitively. These capabilities range from the use of natural language and body and facial gestures to less conventional channels, such as expression through colour and sound.

Professor Tony Belpaeme and I are exploring how robotic sounds, like those used by R2-D2, can be used by real robots as a means of expression and communication, and how people respond to this.

Our experiments reveal fascinating insights. For example, people do not care that the robot is not using a real language. Moreover, these sounds are readily perceived as expressions of a robot’s inner states and feelings. We have also found that these sounds do not hold any fixed, distinct meanings. Rather, the way they are used provides essential cues that guide their interpretation.

Such results show us that sounds made by robots are more than just noises. They are rich displays with meaningful content. As robots come in all shapes and sizes, we predict that in the future, many robots will likely be beeping and squeaking at us when we interact with them.