A start-up promises that lucid dreaming will soon be possible for everyone. This was reported by the German online magazine Golem on November 10, 2023. The company, Prophetic, was founded by Eric Wollberg (CEO) and Wesley Louis Berry III (CTO). In a lucid dream, the dreamers are aware that they are dreaming (Image: DALL-E 3). They can shape the dream according to their will and also exit it. Everyone has the ability to experience lucid dreams. One can learn to induce this form of dreaming, but one can also dream lucidly as a child and lose the ability again as an adult. The Halo headband, a non-invasive neural device, is designed to make lucid dreaming possible: “The combination of ultrasound and machine learning models (created using EEG & fMRI data) allows us to detect when dreamers are in REM to induce and stabilize lucid dreams.” (Website Prophetic) According to Golem, the neural device will be available starting in 2025.
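Prophetic has not published details of its models. Purely as an illustration of the general approach described in the quote, the following sketch classifies EEG epochs as REM or non-REM using band-power features and a simple classifier; the data is synthetic, and the epoch length, frequency bands, and choice of classifier are assumptions, not the company’s method.

```python
# Illustrative sketch only: REM vs. non-REM classification from EEG band-power
# features. Synthetic noise stands in for real recordings; Prophetic's actual
# models (trained on EEG and fMRI data) are not public.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

FS = 256          # sampling rate in Hz (assumed)
EPOCH_SEC = 30    # standard sleep-scoring epoch length
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch, fs=FS):
    """Return the mean spectral power per frequency band for one EEG epoch."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs * 4)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

# Synthetic stand-in: 200 epochs of noise, half labeled REM (1), half non-REM (0).
rng = np.random.default_rng(0)
epochs = rng.normal(size=(200, FS * EPOCH_SEC))
labels = np.repeat([0, 1], 100)

X = np.array([band_powers(e) for e in epochs])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))  # ~chance on random data
```

On real polysomnography recordings, features of this kind feed standard sleep-staging classifiers; a headband would additionally have to run such a model in real time in order to time its stimulation to REM phases.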
Be My AI
Be My AI is a GPT-4-based extension of the Be My Eyes app. Blind users take a photo of their surroundings or of an object and then receive detailed descriptions, which are spoken in a synthesized voice. They can also ask further questions about details and context (Image: DALL-E 3). Be My AI can be used in a variety of situations, including reading labels, translating text, setting up appliances, organizing clothing, and getting a sense of the beauty of a landscape. It also offers written responses in 29 languages, making it accessible to a wider audience. While the app has its advantages, it’s not a replacement for essential mobility aids such as white canes or guide dogs. Users are encouraged to provide feedback to help improve the app as it continues to evolve. The app will become even more powerful once it can analyze videos instead of photos. This will allow blind users to move through their surroundings and receive continuous descriptions and assessments of moving objects and changing situations. More information is available at www.bemyeyes.com/blog/announcing-be-my-ai.
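Be My Eyes has not documented the internals of Be My AI. The sketch below merely illustrates the interaction pattern described above, describing a photo and then answering a follow-up question, using OpenAI’s public Python SDK with a vision-capable GPT-4 model; the model name, prompts, and file names are assumptions.

```python
# Illustrative describe-then-ask loop with a vision-capable GPT-4 model via
# OpenAI's Python SDK. Be My AI's actual integration is not public.
import base64
from openai import OpenAI  # pip install openai

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def encode_image(path: str) -> str:
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")

image_b64 = encode_image("surroundings.jpg")  # hypothetical photo taken by the user
messages = [
    {"role": "user", "content": [
        {"type": "text", "text": "Describe this photo in detail for a blind user."},
        {"type": "image_url",
         "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
    ]},
]

first = client.chat.completions.create(model="gpt-4o", messages=messages)
print(first.choices[0].message.content)

# Follow-up question about a detail, keeping the conversation context.
messages.append({"role": "assistant", "content": first.choices[0].message.content})
messages.append({"role": "user", "content": "What does the label on the bottle say?"})
second = client.chat.completions.create(model="gpt-4o", messages=messages)
print(second.choices[0].message.content)
```

In the app itself, the returned text would additionally be handed to a screen reader or a synthesized voice.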
All that Groks is God
Elon Musk has named his new language model Grok. The word comes from the science fiction novel “Stranger in a Strange Land” (1961) by Robert A. Heinlein. This famous novel features two characters who grapple with the word. Valentine Michael Smith (aka Michael Smith or “Mike”, the “Man from Mars”) is the main character. He is a human who was born on Mars. Dr. “Stinky” Mahmoud is a semanticist. After Mike, he is the second person to speak the Martian language, though he does not “grok” it. In one passage, Mahmoud explains: “‘Grok’ means ‘identically equal.’ The human cliché ‘This hurts me worse than it does you’ has a Martian flavor. The Martians seem to know instinctively what we learned painfully from modern physics, that observer interacts with observed through the process of observation. ‘Grok’ means to understand so thoroughly that the observer becomes a part of the observed – to merge, blend, intermarry, lose identity in group experience. It means almost everything that we mean by religion, philosophy, and science – and it means as little to us as color means to a blind man.” Mike says a little later in the dialog: “God groks.” In another place, there is a similar statement: “… all that groks is God …”. In a way, this fits in with what is written on the website of Elon Musk’s AI start-up: “The goal of xAI is to understand the true nature of the universe.” The only question is whether this goal will remain science fiction or become reality.
On Beauty
On 17 October 2023, Oliver Bendel published a little book entitled “ON BEAUTY”, in which he posed 26 questions about beauty to GPT-4. The language model’s answers show the direction in which it has developed. They reveal much of the world knowledge it has accumulated. But they are also unassailable and quite general. To some questions that are not usually asked, it gives downright woke answers. Only questions about the measurability of beauty or the connection between beauty and evolution elicit some concessions from the chatbot and text generator. Questions and answers are illustrated with images generated by DALL-E 3. They show beautiful people, beautiful animals, beautiful things, and beautiful landscapes. Some are highly expressive art, others are kitsch. Like its predecessor “ARTIFACTS WITH HANDICAPS” (24 September 2023), this little book can be downloaded for free. Oliver Bendel has been writing experimental literature for 40 years, from concrete poetry and mobile phone novels to poems in the form of 2D and 3D codes and AI-generated texts. He has toured the Netherlands with his mobile phone novels and poems on behalf of two Goethe Institutes. The standard reference “Die Struktur der modernen Literatur” (Mario Andreotti) devotes two pages to his work (Photo: DALL-E 3).
ChatGPT can See, Hear, and Speak
OpenAI reported on September 25, 2023 in its blog: “We are beginning to roll out new voice and image capabilities in ChatGPT. They offer a new, more intuitive type of interface by allowing you to have a voice conversation or show ChatGPT what you’re talking about.” (OpenAI Blog, 25 September 2023) The company gives some examples of using ChatGPT in everyday life: “Snap a picture of a landmark while traveling and have a live conversation about what’s interesting about it. When you’re home, snap pictures of your fridge and pantry to figure out what’s for dinner (and ask follow up questions for a step by step recipe). After dinner, help your child with a math problem by taking a photo, circling the problem set, and having it share hints with both of you.” (OpenAI Blog, 25 September 2023) But the application can not only see, it can also hear and speak: “You can now use voice to engage in a back-and-forth conversation with your assistant. Speak with it on the go, request a bedtime story for your family, or settle a dinner table debate.” (OpenAI Blog, 25 September 2023) More information via openai.com/blog/chatgpt-can-now-see-hear-and-speak.
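The ChatGPT app uses its own built-in voice and vision pipeline. As a rough approximation of the hear-think-speak loop using OpenAI’s public API, the sketch below transcribes a recorded question, answers it with a chat model, and synthesizes the reply as audio; the file names, model choices, and voice are assumptions.

```python
# Rough approximation of a hear-think-speak loop with OpenAI's public API.
# The ChatGPT app's internal voice pipeline may differ.
from openai import OpenAI  # pip install openai

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# 1. Hear: transcribe a recorded question (hypothetical file name).
with open("question.wav", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(model="whisper-1", file=audio_file)

# 2. Think: answer the transcribed question with a chat model.
answer = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": transcript.text}],
)
text = answer.choices[0].message.content

# 3. Speak: synthesize the answer and write it to an MP3 file.
speech = client.audio.speech.create(model="tts-1", voice="alloy", input=text)
with open("answer.mp3", "wb") as f:
    f.write(speech.read())
print(text)
```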
AI-generated Short Stories
The technology philosopher and writer Oliver Bendel published the book “ARTIFACTS WITH HANDICAPS” on 24 September 2023. The information about the author reads: “Oliver Bendel featuring Ideogram and GPT-4”. In fact, the entire work was created with the help of generative AI. It consists of 11 images, each followed by a short story. The book deals with the imperfection of the depictions: in one image, a hand looks like that of a mummy; in another, a skateboard floats in the air above its wheels. But there are also one or two depictions that look perfect. In those cases, the story explains what is different about the person, their history, or their behavior. Ultimately, the book is about otherness and about the fact that otherness is something special. The book is freely available and can be distributed and used as desired, with credit given to the authors, i.e. the artist and the AI systems. Oliver Bendel has been writing experimental literature, including digital literature, for 40 years. From 2007, he was one of the best-known cell phone novelists in Europe. In 2010, he attracted attention with a volume of haiku – “handyhaiku” – in which the poems were printed in the form of QR codes. In 2020, the volume “Die Astronautin” was published, in which the poems are printed in the form of 3D codes. The standard work “Die Struktur der modernen Literatur” (“The Structure of Modern Literature”) by Mario Andreotti devotes two pages to the writer’s work.
Artificial Intelligence & Animals
The online event “Artificial Intelligence & Animals” will take place on 16 September 2023. “AI experts and attorneys will discuss the intersection of AI and animals in this UIA Animal Law Commission and GW Animal Law webinar” (Website Eventbrite). The speakers are Prof. Dr. Oliver Bendel (FHNW University of Applied Sciences and Arts Northwestern Switzerland), Yip Fai Tse (University Center for Human Values, Center for Information Technology Policy, Princeton University), and Sam Tucker (CEO VegCatalyst, AI-Powered Marketing, Melbourne). The panelists are Ian McDougall (Executive Vice President and General Counsel, LexisNexis London), Jamie McLaughlin (Animal Law Commission Vice President, UIA), and Joan Schaffner (Associate Professor of Law, George Washington University). Oliver Bendel “has been thinking on animal ethics since the 1980s and on information and machine ethics since the 1990s”. “Since 2012, he has been systematically researching machine ethics, combining it with animal ethics and animal welfare. With his changing teams, he develops animal-friendly robots and AI systems.” (Website Eventbrite)
AI-based Robots for the Disposal of Discarded Ammunition
The Robotics Innovation Center (RIC) at the German Research Centre for Artificial Intelligence (DFKI) in Bremen wants to clear the seabed of the North Sea and Baltic Sea of discarded ammunition. This was reported by the online magazine Golem on 14 June 2023. The researchers are using the autonomous underwater vehicle Cuttlefish, developed at DFKI, as a test platform. According to Golem, the robot has been equipped with two deep-sea-capable gripper systems. These are designed to enable flexible handling of objects under water, even difficult objects such as explosive devices. The AI-based control system allows the robot to change its buoyancy and centre of gravity during the dive. According to the online magazine, the AUV is equipped with numerous sensors such as cameras, sonars, laser scanners, and magnetometers. These are intended to enable it to approach an object without colliding with it. The system will certainly be effective – whether it is efficient remains to be seen.
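DFKI has not published the Cuttlefish control software. Purely to illustrate the collision-free approach behaviour mentioned above, the following sketch shows a simple proportional controller that slows a vehicle as the measured range to an object shrinks and stops at a safety standoff; the sensor interface and all parameters are assumptions, not the actual system.

```python
# Illustrative proportional approach controller: slow down as the measured
# range to the object shrinks and stop at a standoff distance. Not DFKI code;
# the range source (e.g. sonar) and all parameters are assumed.
STANDOFF_M = 1.5     # stop this far from the object (assumed)
MAX_SPEED = 0.5      # maximum forward speed in m/s (assumed)
GAIN = 0.3           # proportional gain (assumed)

def forward_speed(range_m: float) -> float:
    """Map a measured range to a forward speed command."""
    error = range_m - STANDOFF_M
    if error <= 0.0:
        return 0.0                       # inside the standoff: hold position
    return min(MAX_SPEED, GAIN * error)  # slow down smoothly on the way in

# Simulated approach from 10 m; a real AUV would read the range from its sensors.
range_m, dt = 10.0, 1.0
while range_m > STANDOFF_M + 0.01:
    v = forward_speed(range_m)
    range_m -= v * dt
    print(f"range {range_m:5.2f} m, commanded speed {v:4.2f} m/s")
```

A real controller would of course fuse several sensors, compensate for currents and buoyancy changes, and include further safety layers.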
Robot Hand Can Operate in the Dark
“Researchers at Columbia Engineering have demonstrated a highly dexterous robot hand, one that combines an advanced sense of touch with motor learning algorithms in order to achieve a high level of dexterity.” (Website Columbia Engineering, 28 April 2023) Columbia Engineering reported this on its website on April 28, 2023. The text goes on to say: “As a demonstration of skill, the team chose a difficult manipulation task: executing an arbitrarily large rotation of an unevenly shaped grasped object in hand while always maintaining the object in a stable, secure hold. This is a very difficult task because it requires constant repositioning of a subset of fingers, while the other fingers have to keep the object stable. Not only was the hand able to perform this task, but it also did it without any visual feedback whatsoever, based solely on touch sensing.” (Website Columbia Engineering, 28 April 2023) “While our demonstration was on a proof-of-concept task, meant to illustrate the capabilities of the hand, we believe that this level of dexterity will open up entirely new applications for robotic manipulation in the real world”, said Matei Ciocarlie, according to the website. He is an associate professor in the Departments of Mechanical Engineering and Computer Science and developed the hand together with his graduate student Gagan Khandate (Photo: Columbia University ROAM Lab).
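The Columbia controller itself is a policy learned with motor-learning methods. The skeleton below only illustrates the structure of such a touch-only control loop, tactile readings in and small joint adjustments out, with entirely hypothetical interfaces and a trivial placeholder in place of the learned policy.

```python
# Skeleton of a touch-only in-hand manipulation loop: tactile readings in,
# small joint adjustments out, no visual feedback. The interfaces and the
# placeholder policy are hypothetical; the Columbia hand uses a learned policy.
import numpy as np

N_JOINTS = 15   # assumed number of finger joints
N_TAXELS = 32   # assumed number of tactile sensing elements

rng = np.random.default_rng(0)

def read_tactile() -> np.ndarray:
    """Placeholder for reading contact signals from the fingertip sensors."""
    return rng.random(N_TAXELS)

def policy(tactile: np.ndarray) -> np.ndarray:
    """Placeholder policy: in the real system this mapping is learned.
    Here it just nudges all joints in proportion to the mean contact signal."""
    return 0.01 * tactile.mean() * np.ones(N_JOINTS)

joints = np.zeros(N_JOINTS)
for _ in range(100):                                # control loop iterations
    delta = policy(read_tactile())                  # touch in, adjustment out
    joints = np.clip(joints + delta, -1.0, 1.0)     # would be sent to the hand
```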
Self-driving Cars Stopped by Fog
“Five self-driving vehicles blocked traffic early Tuesday morning in the middle of a residential street in San Francisco’s Balboa Terrace neighborhood, apparently waylaid by fog that draped the southwestern corner of the city.” (San Francisco Chronicle, 11 April 2023) The San Francisco Chronicle reported this in an article published on April 11, 2023. The fact that fog is a problem for Waymo’s vehicles has been known to the company for some time. A blog post from 2021 states: “Fog is finicky – it comes in a range of densities, it can be patchy, and can affect a vehicle’s sensors differently.” (Blog Waymo, 15 November 2021) Against this background, it is surprising that vehicles are allowed to roll through the city unaccompanied, especially since Frisco – this name comes from sailors – is very often beset by fog. But fog is not the only challenge for the sensors of self-driving cars. A thesis commissioned and supervised by Prof. Dr. Oliver Bendel presented dozens of phenomena and methods that can mislead sensors of self-driving cars. The San Francisco Chronicle article “Waymo says dense S.F. fog brought 5 vehicles to a halt on Balboa Terrace street” can be accessed at www.sfchronicle.com/bayarea/article/san-francisco-waymo-stopped-in-street-17890821.php.