A Prod, a Stroke, or a Hug?

Soft robots with transparent artificial skin can detect human touch with internal cameras and differentiate between a prod, a stroke, and a hug. This is what New Scientist reports in its article “Robot that looks like a bin bag can understand what a hug is”. According to the magazine, the technology could lead to better non-verbal communication between humans and robots. What is behind this message? A scientific experiment that is indeed very interesting: “Guy Hoffman and his colleagues at Cornell University, New York, created a prototype robot with nylon skin stretched over a 1.2-metre tall cylindrical scaffold atop a platform on wheels. Inside the cylinder sits a commercial USB camera which is used to interpret different types of touch on the nylon.” (New Scientist, 29 January 2021)

In recent years, there have been several prototypes, studies, and surveys on hugging robots; the projects with PR2, Hugvie, and HUGGIE are worth mentioning. Cornell University’s research certainly marks another milestone in this context, and in a way it puts humans in the foreground.
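To make the idea behind the experiment more tangible: a camera watching the inside of the skin sees deformation where the nylon is pressed in, and the extent and movement of that deformation hints at the type of touch. The following toy sketch illustrates this principle only; the function name, thresholds, and classification rules are assumptions for illustration and have nothing to do with the actual pipeline used by Hoffman and his colleagues.

```python
import numpy as np

def classify_touch(frames, baseline, area_thresh=0.002, hug_area=0.30):
    """Toy classifier for a prod, a stroke, or a hug, based on skin
    deformation seen by an internal camera.

    frames   -- list of 2D grayscale arrays from the camera
    baseline -- a frame of the untouched skin
    All thresholds are illustrative assumptions, not measured values.
    """
    centroids, areas = [], []
    for f in frames:
        # pixels that differ strongly from the untouched skin = deformation
        diff = np.abs(f.astype(float) - baseline.astype(float)) > 30
        area = diff.mean()
        if area < area_thresh:
            continue  # no meaningful contact in this frame
        ys, xs = np.nonzero(diff)
        centroids.append((ys.mean(), xs.mean()))
        areas.append(area)
    if not areas:
        return "no touch"
    if max(areas) > hug_area:
        return "hug"  # large contact area suggests whole-body contact
    # a stroke travels across the skin, a prod stays in one place
    travel = np.linalg.norm(np.subtract(centroids[-1], centroids[0]))
    return "stroke" if travel > 10 else "prod"
```

For example, a small deformation patch that stays in place over several frames would be labelled a prod, the same patch moving across the skin a stroke, and a large-area deformation a hug.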