COVID-19 demonstrates that digitization and technologization can be helpful in crises and disasters. In China, service robots deliver medicine and food in hospitals and quarantine stations, and drones track down people who are not wearing face masks. Those who have to stay at home can continue to perform their tasks and receive further training via a computer workstation and e-learning applications. Globalisation is a problem in the spread of the virus, but also a solution in combating it: research into a drug against SARS-CoV-2 was immediately carried out worldwide. The use of robots and drones in China has been criticised for the loss of privacy. There was also criticism of the fact that the Communist Party and the media market the use of robots as a Chinese success story, although some of them originate from abroad. Well-known transport robots come, for example, from Starship Technologies and Savioke. Both companies are based in California.
Moral and Immoral Machines
Since 2012, Oliver Bendel has devised 13 artifacts of machine ethics. Nine of them have actually been implemented, including LADYBIRD, the animal-friendly vacuum cleaning robot, and LIEBOT, the chatbot that can systematically lie. Both of them have achieved a certain popularity. The information and machine ethicist is convinced that ethics does not necessarily have to produce the good. It should explore the good and the evil and, like any science, serve to gain knowledge. Accordingly, he builds both moral and immoral machines. But he keeps the immoral ones in his laboratory. In 2020, if the project is accepted, HUGGIE will see the light of day. The project idea is to create a social robot that contributes directly to a good life and to economic success by touching and hugging people, especially customers. HUGGIE should be able to warm itself up in certain places, and it should be possible to change the materials it is covered with. One research question will be: What are the possibilities besides warmth and softness? Are visual stimuli (also on displays), vibrations, sounds, voices, etc. important for a successful hug? All moral and immoral machines created between 2012 and 2020 are compiled in a new illustration, which is shown here for the first time.
Extended Deadline for RP2020
The organizers of the conference “RP2020: Culturally Sustainable Social Robotics” have announced that the deadline for submissions has been extended to May 1, 2020. The CfP raises questions such as: “How can we create cultural dynamics with or through social robots that will not impact our value landscape negatively? How can we develop social robotics applications that are culturally sustainable? If cultural sustainability is relative to a community, what can we expect in a global robot market? Could we design human-robot interactions in ways that will positively cultivate the values we, or people anywhere, care about?” (Website Robophilosophy Conference) In 2018, Hiroshi Ishiguro, Guy Standing, Catelijne Muller, Joanna Bryson, and Oliver Bendel were keynote speakers at the Robophilosophy conference. In 2020, Catrin Misselhorn, Selma Sabanovic, and Shannon Vallor will be presenting. More information via conferences.au.dk/robo-philosophy/.
Co-Robots as Care Robots
The paper “Co-Robots as Care Robots” by Oliver Bendel, Alina Gasser and Joel Siebenmann was accepted at the AAAI 2020 Spring Symposia. From the abstract: “Cooperation and collaboration robots, co-robots or cobots for short, are an integral part of factories. For example, they work closely with the fitters in the automotive sector, and everyone does what they do best. However, the novel robots are not only relevant in production and logistics, but also in the service sector, especially where proximity between them and the users is desired or unavoidable. For decades, individual solutions of a very different kind have been developed in care. Now experts are increasingly relying on co-robots and teaching them the special tasks that are involved in care or therapy. This article presents the advantages, but also the disadvantages of co-robots in care and support, and provides information with regard to human-robot interaction and communication. The article is based on a model that has already been tested in various nursing and retirement homes, namely Lio from F&P Robotics, and uses results from accompanying studies. The authors can show that co-robots are ideal for care and support in many ways. Of course, it is also important to consider a few points in order to guarantee functionality and acceptance.” The paper had been submitted to the symposium “Applied AI in Healthcare: Safety, Community, and the Environment”. Oliver Bendel will present the results at Stanford University between 23 and 25 March 2020.
Care Robots with Sexual Assistance Functions
The paper “Care Robots with Sexual Assistance Functions” by Oliver Bendel was accepted at the AAAI 2020 Spring Symposia. From the abstract: “Residents in retirement and nursing homes have sexual needs just like other people. However, the semi-public situation makes it difficult for them to satisfy these existential concerns. In addition, they may not be able to meet a suitable partner or find it difficult to have a relationship for mental or physical reasons. People who live or are cared for at home can also be affected by this problem. Perhaps they can host someone more easily and discreetly than the residents of a health facility, but some elderly and disabled people may be restricted in some ways. This article examines the opportunities and risks that arise with regard to care robots with sexual assistance functions. First of all, it deals with sexual well-being. Then it presents robotic systems ranging from sex robots to care robots. Finally, the focus is on care robots, with the author exploring technical and design issues. A brief ethical discussion completes the article. The result is that care robots with sexual assistance functions could be an enrichment of the everyday life of people in need of care, but that we also have to consider some technical, design and moral aspects.” The paper had been submitted to the symposium “Applied AI in Healthcare: Safety, Community, and the Environment”. Oliver Bendel will present the paper at Stanford University between 23 and 25 March 2020.
Towards a Human-like Chatbot
Google is currently working on Meena, a chatbot that is intended to conduct conversations on arbitrary topics and be used in many contexts. In their paper “Towards a Human-like Open-Domain Chatbot”, the developers present their end-to-end trained neural conversational model with 2.6 billion parameters. They show that Meena “can conduct conversations that are more sensible and specific than existing state-of-the-art chatbots”. “Such improvements are reflected through a new human evaluation metric that we propose for open-domain chatbots, called Sensibleness and Specificity Average (SSA), which captures basic, but important attributes for human conversation. Remarkably, we demonstrate that perplexity, an automatic metric that is readily available to any neural conversational models, highly correlates with SSA.” (Google AI Blog) The company draws a comparison with OpenAI’s GPT-2, a model used in “Talk to Transformer” and Harmony, among others, which has 1.5 billion parameters and was trained on the text content of 8 million web pages.
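To make the metrics mentioned above more tangible, here is a minimal sketch, not Google's evaluation code: it assumes that crowd workers label each model response as sensible and/or specific, and that the model's natural-log token probabilities are available; all function names and values are invented for illustration.

```python
# Illustrative sketch of the two metrics discussed above:
# SSA as the mean of the sensibleness and specificity label rates, and
# perplexity derived from the average per-token negative log-likelihood.
import math
from typing import List

def ssa(sensible: List[bool], specific: List[bool]) -> float:
    """Sensibleness and Specificity Average over per-response human labels."""
    sensibleness_rate = sum(sensible) / len(sensible)
    specificity_rate = sum(specific) / len(specific)
    return (sensibleness_rate + specificity_rate) / 2

def perplexity(token_log_probs: List[float]) -> float:
    """Perplexity from the natural-log probabilities a model assigns to tokens."""
    avg_nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(avg_nll)

# Hypothetical labels for three responses and dummy token log-probabilities.
print(ssa([True, True, False], [True, False, False]))  # 0.5
print(perplexity([-1.2, -0.7, -2.3, -0.9]))            # ~3.58
```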
HTML, SSML, AIML – and MOML?
On behalf of Prof. Dr. Oliver Bendel, Alessandro Spadola, a student at the School of Business FHNW, investigated in the context of machine ethics whether markup languages such as HTML, SSML and AIML can be used to transfer moral aspects to machines or websites, and whether there is room for a new language that could be called Morality Markup Language (MOML). He presented his results in January 2020. From the management summary: “However, the idea that owners should be able to transmit their own personal morality has been explored by Bendel, who has proposed an open way of transferring morality to machines using a markup language. This research paper analyses whether a new markup language could be used to imbue machines with their owners’ sense of morality. This work begins with an analysis how a markup language is structured, describes the current well-known markup languages and analyses their differences. In doing so, it reveals that the main difference between the well-known markup languages lies in the different goals they pursue which at the same time forms the subject, which is marked up. This thesis then examines the possibility of transferring personal morality with the current languages available and discusses whether there is a need for a further language for this purpose. As is shown, morality can only be transmitted with increased effort and the knowledge of human perception because it is only possible to transmit them by interacting with the senses of the people. The answer to the question of whether there is room for another markup language is ‘yes’, since none of the languages analysed offer a simple way to transmit morality, and simplicity is a key factor in markup languages. Markup languages all have clear goals, but none have the goal of transferring and displaying morality. The language that could assume this task is ‘Morality Markup’, and the present work describes how such a language might look.” (Management Summary) The promising work is to be continued in the course of the year by another student in a bachelor’s thesis.
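MOML does not exist yet, so the following sketch is purely hypothetical: it invents a few XML-style elements to suggest how owner-defined moral rules might be written down and read by a machine. All element names, attributes and rule contents are assumptions made for this illustration, not part of the student's work.

```python
# Hypothetical MOML-style markup and a reader for it (illustrative only).
import xml.etree.ElementTree as ET

MOML_SNIPPET = """<moml version="0.1">
  <rule id="r1" priority="high">
    <condition>animal_detected</condition>
    <action>stop</action>
  </rule>
  <rule id="r2" priority="low">
    <condition>user_requests_lie</condition>
    <action>refuse</action>
  </rule>
</moml>"""

def load_rules(snippet: str):
    """Parse the markup into (condition, action, priority) tuples."""
    root = ET.fromstring(snippet)
    return [
        (rule.findtext("condition"),
         rule.findtext("action"),
         rule.get("priority", "normal"))
        for rule in root.findall("rule")
    ]

for condition, action, priority in load_rules(MOML_SNIPPET):
    print(f"if {condition}: {action} (priority: {priority})")
```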