To improve guide dog robots, first listen to the blind.

What features does a robotic guide dog need? Ask the blind, say the authors of an award-winning paper. A study led by researchers at the University of Massachusetts Amherst, which identifies how to develop robotic guide dogs using insights from guide dog users and trainers, was presented at CHI 2024: Conference on Human Factors in Computing Systems (CHI), where it won a Best Paper Award.

Guide dogs enable remarkable autonomy and mobility for their handlers. However, only a fraction of visually impaired people have such a companion. Barriers include a shortage of trained dogs, cost ($40,000 for training alone), allergies, and other physical limitations that prevent caring for a dog.

Robots can step in where canines can't and fill a significant need, if designers can get the features right.

“We're not the first to develop a guide dog robot,” says Donghyun Kim, an assistant professor in the UMass Amherst Manning College of Information and Computer Science (CICS) and one of the corresponding authors of the award-winning paper. “There have been 40 years of studies, and none of these robots are actually used by end users. So we asked: how do handlers use their dogs, and what technology do they actually use?”

The research team conducted semi-structured interviews and observation sessions with 23 visually impaired guide dog handlers and five trainers. Through thematic analysis, they identified the current limitations of canine guide dogs, the characteristics handlers look for in an effective guide, and considerations for future robotic guide dogs.

One of the more important themes that emerged from these interviews was the delicate balance between robot autonomy and human control. “Originally, we thought we were developing an autonomous driving car,” says Kim. They envisioned that the user would tell the robot where they wanted to go and the robot would autonomously navigate with the user to that location.

That turned out not to be the case.

Interviews revealed that handlers do not use their dog as a global navigation system. Instead, the handler controls the overall path while the dog is responsible for local obstacle avoidance. However, even this is not a hard and fast rule. Dogs can also learn routes by habit and eventually lead a person to regular destinations without instructions from the handler.

“When the handler trusts the dog and gives the dog more autonomy, it's a bit more fragile,” says Kim. “We can't just make a robot that's completely passive, just following a handler, or completely autonomous, because then [the handler] feels unsafe.”
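To make that division of labor concrete, here is a minimal Python sketch of the shared-control idea the handlers describe: the human supplies the global direction, and the robot applies only small local corrections to avoid obstacles. This is an illustration under stated assumptions, not the authors' implementation; every class, function, and parameter name here is hypothetical.

```python
# Minimal sketch of handler-led shared control: the handler chooses the
# heading, the robot only nudges it locally to avoid obstacles.
# All names and numeric values are hypothetical illustrations.

from dataclasses import dataclass


@dataclass
class HandlerCue:
    """A directional command from the handler (e.g., via the harness handle)."""
    heading_deg: float  # desired heading relative to the robot, in degrees
    speed_mps: float    # desired walking speed, in meters per second


def avoid_obstacles(heading_deg: float,
                    obstacle_bearings_deg: list[float],
                    clearance_deg: float = 25.0) -> float:
    """Nudge the commanded heading away from nearby obstacles.

    The handler's intent is preserved; the robot makes only small local
    corrections, mirroring how a guide dog steers around obstructions
    without choosing the route.
    """
    corrected = heading_deg
    for bearing in obstacle_bearings_deg:
        offset = bearing - corrected
        if abs(offset) < clearance_deg:
            # Steer away from the obstacle by just enough to clear it.
            corrected -= (clearance_deg - abs(offset)) * (1.0 if offset >= 0 else -1.0)
    return corrected


def control_step(cue: HandlerCue,
                 obstacle_bearings_deg: list[float]) -> tuple[float, float]:
    """One control-loop tick: handler sets the goal, robot adjusts locally."""
    safe_heading = avoid_obstacles(cue.heading_deg, obstacle_bearings_deg)
    return safe_heading, cue.speed_mps


# Example: the handler asks to go straight ahead (0 deg) at walking pace,
# with an obstacle detected slightly to the right (+10 deg).
heading, speed = control_step(HandlerCue(heading_deg=0.0, speed_mps=1.2),
                              obstacle_bearings_deg=[10.0])
print(f"commanded heading: {heading:.1f} deg at {speed} m/s")
```

In this sketch the robot never overrides the handler's chosen route; it only bends the heading within a small clearance window, which is one way to keep the handler in control while still providing local obstacle avoidance.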

The researchers hope the paper will serve as a guide not only for Kim's lab, but also for other robot developers. “In this paper, we also provide guidance on how we should develop these robots to make them usable in the real world,” says Huchel Hwang, first author of the paper and a doctoral candidate in Kim's robotics lab.

For example, he says a two-hour battery life is an important feature for commuting, which itself can take an hour. “About 90% of people mentioned battery life,” he says. “This is an important part of designing the hardware, because current quadruped robots don't run for two hours.”

These are just a few of the findings in the paper. Others include: adding more camera orientations to help detect overhead obstructions; adding audio sensors for hazards approaching from occluded areas; understanding that ‘go straight’ on a footpath means follow the path (not travel in a perfectly straight line); and helping users board the right bus (and then find a seat).

The researchers say the paper is an excellent starting point, adding that there is more to unpack from their 2,000 minutes of audio and 240 minutes of video data.

Winning the Best Paper Award was a distinction that placed the work in the top 1% of all papers submitted to the conference.

“The most exciting aspect of winning this award is that the research community recognizes and values our direction,” says Kim. “Since we don't believe that guide dog robots will be available to the visually impaired within a year, nor that we will solve every problem, we hope this paper will influence a wide range of researchers in robotics and human-robot interaction, which will help our vision be realized sooner.”

Other researchers who contributed to this paper include:

Ivan Lee, associate professor at CICS and co-author of the paper with Donghyun Kim, an expert in adaptive technologies and human-centered design; Joydeep Biswas, associate professor at the University of Texas at Austin, who contributed his experience in creating artificial intelligence (AI) algorithms that allow robots to navigate unstructured environments; Hee Tae Jung, assistant professor at Indiana University, who brings expertise in human factors and qualitative research to collaborative studies with people with chronic conditions; and Nicholas Giudice, a professor at the University of Maine who is blind and provided valuable insight into the interpretation of the interviews.

Ultimately, Kim believes that robotics does the most good when scientists do not lose sight of the human element. “My Ph.D. and postdoctoral research was about how to improve these robots,” says Kim. “We tried to find [an application that is] practical and meaningful to humanity.”
