The ACI conference closed with "Paper Session 6: Investigating Human-Animal Relations". Sarah Webber (University of Melbourne) gave a talk on "Watching Animal-Computer Interaction: Effects on Perceptions of Animal Intellect". In the underlying experiment, people observed orangutans interacting with computer applications, and the study examined how this changed their judgments of the animals' intelligence and behavior. The following talk, by Alexandra Morgan (Northumbria University), was titled "Blind dogs need guides too: towards technological support for blind dog caregiving". She addressed the needs of blind dogs and surveyed the gadgets on the market to assist them. Her team developed an app called "My Blind Dogo" intended to help owners of blind dogs. The session ended with a talk on "A Face Recognition System for Bears: Protection for Animals and Humans in the Alps" by Oliver Bendel (University of Applied Sciences and Arts Northwestern Switzerland). He presented an integrated system of cameras, robots, and drones that he and Ali Yürekkirmaz had designed. ACI2022 took place from 5 to 8 December 2022 in Newcastle upon Tyne. It is the world's leading conference on animal-computer interaction. More information on the conference is available via www.aciconf.org/aci2022.
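The talk did not spell out the recognition pipeline here, so the following is only a minimal sketch in Python of what the identification step of such a system might look like, assuming camera images have already been reduced to unit-length face embeddings by a separately trained network. The embed() stub and all names are hypothetical:

```python
import numpy as np

def embed(face_crop: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a trained bear-face embedding network."""
    rng = np.random.default_rng(int(face_crop.sum()) % 2**32)
    v = rng.normal(size=128)
    return v / np.linalg.norm(v)  # unit-normalize for cosine similarity

def identify(query: np.ndarray, gallery: dict[str, np.ndarray],
             threshold: float = 0.8) -> str:
    """Match a query embedding against known bears by cosine similarity.

    Returns "unknown" if no gallery bear exceeds the threshold.
    """
    best_name, best_sim = "unknown", threshold
    for name, ref in gallery.items():
        sim = float(query @ ref)  # dot product of unit vectors
        if sim > best_sim:
            best_name, best_sim = name, sim
    return best_name
```

In an integrated system of the kind described, an "unknown" result could then, for example, trigger a notification to rangers or dispatch a drone, though the paper's concrete design is not reproduced here.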
Talking Eggs
After the keynote on the morning of December 8, 2022, ACI2022 continued with "Paper Session 4: Sensors & Signals, Part I: Origin Stories". David L. Roberts (North Carolina State University) presented on "Motion-Resilient ECG Signal Reconstruction from a Wearable IMU through Attention Mechanism and Contrastive Learning". The next talk, "TamagoPhone: A framework for augmenting artificial incubators to enable vocal interaction between bird parents and eggs", was given by Rebecca Kleinberger (Massachusetts Institute of Technology & Northeastern University). The starting point of her research was that some birds vocally communicate with their chicks before hatching. The last presentation before the lunch break, delivered online, was "Simultaneous Contact-Free Physiological Sensing of Human Heart Rate and Canine Breathing Rate for Animal Assisted Interactions: Experimental and Analytical Approaches" by Timothy Holder and Mushfiqur Rahman (North Carolina State University). More information on the conference is available via www.aciconf.org/aci2022.
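TamagoPhone's actual implementation was not detailed above. Purely as an illustration, the following Python sketch shows one direction of such an audio bridge, assuming the sounddevice library and a single full-duplex audio device standing in for the nest-side microphone and the incubator speaker:

```python
import sounddevice as sd

def relay(indata, outdata, frames, time, status):
    """Pass nest-side audio straight through to the incubator speaker."""
    if status:
        print(status)       # report any over- or underruns
    outdata[:] = indata     # unmodified pass-through; a real system
                            # might filter or band-limit the signal

# Full-duplex stream: microphone in, speaker out, mono at 44.1 kHz.
with sd.Stream(samplerate=44100, channels=1, callback=relay):
    sd.sleep(60_000)        # keep the bridge open for one minute
```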
About Parrots and Dogs
The ACI2022 conference continued on the afternoon of December 7, 2022 after the coffee break ("Paper Session 3: Learning From and With Each Other"). Cristóbal Sepulveda Álvarez (Universidad de Chile) gave a talk on "Measuring Digitally Comparative Abilities Between Discrete and Continuous Quantities through a Digital Enrichment Application". He showed a parrot choosing between different quantities on a touch screen. Dirk van der Linden (Northumbria University) presented on behalf of Jasmine Forester-Owen (Northumbria University). He spoke about "Noisy technology, anxious dogs: can technology support caregiving in the home?". In their prototype, they combine noise detection with body-language identification in dogs (a sketch of this combination follows below). Jérémy Barbay (Universidad de Chile) gave the last three presentations of the day: "Comparing Symbolic and Numerical Counting Times between Humans and Non-Humans Through a Digital Life Enrichment Application", "Popping Up Balloons for Science: a Research Proposal", and "A Loggable Aid to Speech (for Human and Non-Human Animals): A Research Proposal". More information on the conference is available via www.aciconf.org.
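The prototype by Forester-Owen and van der Linden is not quoted here in detail; the following Python sketch only illustrates the described combination logic, with an assumed loudness threshold and a placeholder posture classifier (both hypothetical):

```python
import numpy as np

NOISE_THRESHOLD_DBFS = -20.0  # assumed trigger level, not from the paper

def loudness_dbfs(frame: np.ndarray) -> float:
    """Sound level of an audio frame in dB relative to full scale."""
    rms = np.sqrt(np.mean(frame ** 2)) + 1e-12
    return 20.0 * np.log10(rms)

def posture_is_anxious(tail_height: float, ear_angle: float) -> bool:
    """Placeholder for a body-language model; a real system would use
    pose estimation rather than two hand-picked features."""
    return tail_height < 0.2 and ear_angle < 30.0

def should_alert(frame: np.ndarray, tail_height: float,
                 ear_angle: float) -> bool:
    """Notify the caregiver only when a loud noise coincides with
    anxious body language, avoiding false alarms from either cue alone."""
    return (loudness_dbfs(frame) > NOISE_THRESHOLD_DBFS
            and posture_is_anxious(tail_height, ear_angle))
```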
Talking with Animals
We use natural language, facial expressions, and gestures when communicating with our fellow humans. Some of our social robots also have these abilities, so we can converse with them in the usual way. Many highly evolved animals have a language with sounds and signals that carry specific meanings. Some of them, such as chimpanzees and gorillas, have facial-expression and gestural abilities comparable to ours.

Britt Selvitelle and Aza Raskin, founders of the Earth Species Project, want to use machine learning to enable communication between humans and animals. Languages, they believe, can be represented as geometric structures and translated by matching those structures to each other (a minimal sketch of this idea appears at the end of this section). They say they have started working on whale and dolphin communication; over time, the focus will broaden to include primates, corvids, and others. The two scientists consider it important to study not only natural language but also facial expressions, gestures, and other movements associated with meaning (they are well aware of this challenge). In addition, there are aspects of animal communication that are inaudible or invisible to humans and would also need to be considered. Britt Selvitelle and Aza Raskin believe that translation would open up the world of animals to us. But it could be the other way around: they may first have to open up the world of animals in order to decode their languages.

Should there be breakthroughs in this area, however, they would be an opportunity for animal welfare. For example, social robots, autonomous cars, wind turbines, and other machines could use animal languages alongside mechanical signals and human commands to instruct, warn, and scare away dogs, elks, pigs, and birds. Machine ethics has been developing animal-friendly machines for years. Among other things, the scientists use sensors together with decision trees; depending on the situation, braking and evasive maneuvers are initiated (a sketch of such a decision tree follows below). Maybe one day the autonomous car will be able to avoid an accident by calling out in deer dialect: Hello deer, go back to the forest!
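The Earth Species Project's actual methods are not described in detail above. As a minimal sketch of the underlying idea, the following Python code aligns two sets of embeddings, standing in for the "geometric structures" of two communication systems, with an orthogonal Procrustes mapping and then translates by nearest neighbor; the data is synthetic:

```python
import numpy as np

def procrustes_align(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Orthogonal matrix W minimizing ||A @ W - B||_F, given paired
    anchor embeddings A and B (corresponding rows match)."""
    U, _, Vt = np.linalg.svd(A.T @ B)
    return U @ Vt

def translate(x: np.ndarray, W: np.ndarray, target: np.ndarray) -> int:
    """Map a source embedding into the target space and return the
    index of its nearest neighbor there."""
    dists = np.linalg.norm(target - x @ W, axis=1)
    return int(np.argmin(dists))

# Synthetic demo: the "target language" is a rotated copy of the source.
rng = np.random.default_rng(0)
source = rng.normal(size=(100, 16))
rotation, _ = np.linalg.qr(rng.normal(size=(16, 16)))
target = source @ rotation

W = procrustes_align(source, target)
print(translate(source[7], W, target))  # recovers index 7
```

Real vocalization embeddings are of course far noisier than this rotated copy, which is precisely where the research challenge lies.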
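And to illustrate the last point: animal-friendly machines in machine ethics typically map sensor readings to maneuvers via simple decision trees. The following Python sketch is purely illustrative; the thresholds and categories are assumptions, not taken from any concrete project:

```python
def react(animal: str | None, distance_m: float, speed_kmh: float) -> str:
    """Toy decision tree mapping sensor readings to a maneuver."""
    if animal is None:                    # nothing detected: carry on
        return "continue"
    if distance_m < 10.0:                 # imminent collision
        return "emergency_brake"
    if animal in {"deer", "elk", "pig"}:  # large animals endanger passengers too
        return "brake" if speed_kmh > 30.0 else "evade"
    return "warn"                         # small animals: acoustic warning signal

print(react("deer", distance_m=25.0, speed_kmh=80.0))  # -> "brake"
```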