Talking robot guide dog uses AI to describe the world as it leads
A robot dog that talks back may sound like a novelty. In this case, it is meant to solve a practical problem: guide dogs can lead, but they cannot explain. That gap is what researchers at Binghamton University, State University of New York, set out to address with a robotic guide dog system that uses a large language model to hold spoken conversations with visually impaired users. The machine can suggest routes, explain trade-offs before a trip begins, and describe what is happening during the walk itself.

“For this work, we’re demonstrating an aspect of the robotic guide dog that is more advanced than biological guide dogs,” said Shiqi Zhang, an associate professor at the Thomas J. Watson College of Engineering and Applied Science’s School of Computing. “Real dogs can understand around 20 commands at best. But for robotic guide dogs, you can just put GPT-4 with voice commands. Then it has very strong language capabilities.”

The project builds on earlier work from Zhang’s team, which trained robotic guide dogs to respond to leash …
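The pattern Zhang describes, replacing a fixed command vocabulary with an LLM behind a voice interface, can be sketched in a few lines. Everything below is illustrative rather than the team's actual code: the function names are hypothetical, and stubs stand in for a real speech-to-text front end and a GPT-4-style model.

```python
def transcribe(audio: bytes) -> str:
    """Placeholder for a speech-to-text call (stub: treats audio as text)."""
    return audio.decode("utf-8")


def llm_reply(prompt: str) -> str:
    """Placeholder for a GPT-4-style chat completion.

    A real system would send the transcript plus robot state (location,
    planned route, obstacles) to the model and speak its answer aloud.
    """
    if "route" in prompt.lower():
        # Illustrative canned answer showing the "explain trade-offs" behavior.
        return "Two options: the shorter route has stairs; the longer one is step-free."
    return "I heard: " + prompt


def handle_command(audio: bytes) -> str:
    """One turn of the voice loop: transcribe the user, then ask the LLM."""
    return llm_reply(transcribe(audio))
```

The point of the sketch is the shape of the loop, not the stubs: because the model interprets free-form language, the user can ask open-ended questions ("Which route should we take?") instead of choosing from a small set of trained commands.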







