The robotic system feeds people with severe mobility restrictions.


Cornell researchers have developed a robotic feeding system that uses computer vision, machine learning and multimodal sensing to safely feed people with severe mobility limitations, including those with spinal cord injuries, cerebral palsy and multiple sclerosis.

“Feeding people with severe mobility limitations with a robot is difficult, as many cannot lean forward and require food to be placed directly inside their mouths,” said Tapomayukh “Tapo” Bhattacharjee, assistant professor of computer science in the Cornell Ann S. Bowers College of Computing and Information Science and the senior developer behind the system. “The challenge intensifies when feeding individuals with additional complex medical conditions.”

A paper on the system, “Feel the Bite: Robot-assisted Inside-Mouth Bite Transfer Using Robust Mouth Perception and Physical Interaction-Aware Control,” was presented at the Human-Robot Interaction Conference, held March 11-14 in Boulder, Colorado. It was recognized with a Best Paper Honorable Mention, and the research team's demo of the complete robotic feeding system received the Best Demo Award.

A leader in assistive robotics, Bhattacharjee and his EmPRISE Lab have spent years teaching machines the complex process by which we humans feed ourselves. It is a challenge that spans everything from identifying food items on a plate and picking them up to transferring them into the mouth of a care recipient.

“These last 5 centimeters, from the utensil to inside the mouth, are extremely challenging,” Bhattacharjee said.

Bhattacharjee said some care recipients may have very limited mouth openings, measuring less than 2 centimeters, while others experience involuntary muscle spasms that can occur unexpectedly, even when the utensil is inside their mouth. Additionally, some can only bite food at specific locations inside their mouth, which they indicate by pushing the utensil with their tongue.

“Current technology looks at a person's face only once and assumes they will remain still, which is often not the case and can be very limiting for care recipients,” said Rajat Kumar Jenamani, the paper's lead author and a doctoral student in the field of computer science.

To address these challenges, the researchers equipped their robot with two essential features: real-time mouth tracking that adjusts to users' movements, and a dynamic response mechanism that enables the robot to detect the nature of physical interactions as they happen and react appropriately. This enables the system to distinguish among sudden spasms, intentional bites and a user's attempts to manipulate the utensil inside their mouth, the researchers said.
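The distinction among spasms, bites and utensil manipulation could in principle be drawn from the force profile on the utensil. The following is a minimal illustrative sketch, not the published system's method; the thresholds, window durations and function names are all assumptions chosen for clarity.

```python
import statistics

# Hypothetical thresholds (illustrative only, not from the paper):
SPASM_PEAK_N = 8.0         # abrupt, high-magnitude transient
BITE_MIN_N = 1.5           # sustained moderate pressure
MANIPULATION_MIN_N = 0.5   # gentle push, e.g. from the tongue

def classify_interaction(forces_n, duration_s):
    """Label a force event from its magnitude profile and duration.

    forces_n   -- force samples (newtons) over the event window
    duration_s -- length of the event window (seconds)
    """
    peak = max(forces_n)
    mean = statistics.mean(forces_n)
    if peak >= SPASM_PEAK_N and duration_s < 0.3:
        return "spasm"          # abrupt spike: hold position, retract safely
    if mean >= BITE_MIN_N and duration_s >= 0.5:
        return "bite"           # steady pressure: release the food item
    if mean >= MANIPULATION_MIN_N:
        return "manipulation"   # light push: reposition utensil in mouth
    return "noise"              # below all thresholds: ignore

print(classify_interaction([0.2, 9.5, 0.3], 0.1))  # → spasm
print(classify_interaction([1.8, 2.0, 1.9], 1.2))  # → bite
```

A real controller would run this continuously over a sliding window of force-torque readings and combine it with the visual tracking described below, rather than classifying isolated events.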

The robotic system successfully fed 13 people with diverse medical conditions in a user study spanning three locations: the EmPRISE Lab on the Cornell Ithaca campus, a medical center in New York City, and a care recipient's home in Connecticut. Users of the robot found it safe and comfortable, the researchers said.

“This is the most extensive real-world evaluation of any autonomous robot-assisted feeding system with end users,” Bhattacharjee said.

The team's robot is a multi-jointed arm that holds a custom-built utensil at its end that can sense the forces being applied to it. The mouth-tracking method, trained on thousands of images featuring various participants' head poses and facial expressions, combines data from two cameras positioned above and below the utensil. This allows for precise detection of the mouth and overcomes any visual obstructions caused by the utensil itself, the researchers said. The physical interaction-aware response mechanism uses both visual and force sensing to perceive how users are interacting with the robot, Jenamani said.
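One simple way two camera views can overcome occlusion is to weight each camera's mouth estimate by its detection confidence, so that when the utensil blocks one view the other dominates. The sketch below illustrates that idea under stated assumptions; the coordinates, confidences and function name are hypothetical, and the actual system uses learned keypoint tracking rather than this simple fusion.

```python
# Illustrative confidence-weighted fusion of mouth-center estimates
# from two cameras (assumed format: ((x, y), confidence) per camera).

def fuse_mouth_estimates(above, below):
    """Return the confidence-weighted mean of two (x, y) estimates."""
    (p1, c1), (p2, c2) = above, below
    total = c1 + c2
    if total == 0:
        return None  # neither camera detects the mouth
    x = (p1[0] * c1 + p2[0] * c2) / total
    y = (p1[1] * c1 + p2[1] * c2) / total
    return (x, y)

# The utensil occludes the lower camera, so its confidence drops and
# the fused estimate stays close to the upper camera's reading.
print(fuse_mouth_estimates(((320.0, 240.0), 0.9), ((340.0, 250.0), 0.1)))
# → (322.0, 241.0)
```

In practice the tracker would fuse full sets of facial keypoints, not a single point, and feed the result to the arm controller at camera frame rate.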

“We're empowering individuals to control a 20-pound robot with just their tongue,” he said.

He cited the user study as the most gratifying aspect of the project, noting the robot's significant emotional impact on care recipients and their caregivers. During one session, the parents of a daughter with schizencephaly quadriplegia, a rare birth defect, watched her successfully feed herself using the system.

“It was a moment of genuine emotion; her father raised his cap in celebration, and her mother was almost in tears,” Jenamani said.

While more work is needed to explore the system's long-term use, the promising results highlight its potential to improve the level of independence and quality of life of care recipients, the researchers said.

“It's amazing,” Bhattacharjee said, “and very, very fulfilling.”

Co-authors of the paper are Daniel Stabile, M.S. '23; Ziang Liu, a doctoral student in the field of computer science; Abrar Anwar of the University of Southern California; and Katherine Dimitropoulou of Columbia University.

This research was primarily funded by the National Science Foundation.

