Researchers are leveraging AI to train (robotic) dogs to respond to their masters.


An international collaboration is rethinking how a mechanical man's best friend interacts with its owner, using a combination of AI and edge computing known as edge intelligence.

The project is sponsored by a one-year seed grant from the Institute for Future Technologies (IFT), a partnership between New Jersey Institute of Technology (NJIT) and Ben-Gurion University of the Negev (BGU).

Kasthuri Jayarajah, an assistant professor in NJIT's Ying Wu College of Computing, is researching how to design a socially assistive version of her Unitree Go2 robotic dog that adapts its behavior and the nature of its interactions based on who it engages with.

The main goal of the project is to bring the dog to life by adopting wearable sensing devices that can detect physical and emotional cues linked to one's personality traits, such as introversion, or temporary states, including pain and comfort.

The invention is expected to impact home and health care settings by combating loneliness among the elderly population and aiding in treatment and rehabilitation. Jayarajah's preliminary work, in which robotic dogs understand and respond to the gestures of their human partners, will be presented at the International Conference on Intelligent Robots and Systems (IROS) later this year.

Co-principal investigator Shelly Levy-Tzedek, associate professor in BGU's Department of Physical Therapy, is an experienced researcher and leader in rehabilitation robotics, focused on studying the effects of aging and disease on body control.

The researchers noted that wearable devices are increasingly accessible, and everyday models such as earphones can be repurposed to extract wearer states such as mental activity and microexpressions. The project aims to combine such multimodal wearable sensors with conventional robot sensors (e.g. visual and audio) to objectively and passively track user attributes.
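To make the idea concrete, here is a minimal sketch of how readings from a wearable (such as an earbud's motion sensor) might be fused with features from a robot's camera and microphone to estimate a simple user state. This is not the team's actual pipeline; all feature names, weights, and the comfort score itself are hypothetical placeholders.

```python
import numpy as np

def fuse_features(wearable, visual, audio):
    """Late fusion by simple concatenation of per-modality feature vectors."""
    return np.concatenate([wearable, visual, audio])

def estimate_comfort(fused, weights, bias=0.0):
    """Toy linear scorer mapping fused features to a comfort score in [0, 1]."""
    score = float(fused @ weights + bias)
    return 1.0 / (1.0 + np.exp(-score))  # squash with a sigmoid

# Example with made-up numbers: 3 wearable, 4 visual, 2 audio features.
wearable = np.array([0.2, -0.1, 0.05])     # e.g., head-motion statistics from an earbud
visual   = np.array([0.7, 0.1, 0.0, 0.3])  # e.g., pose/expression features from the camera
audio    = np.array([0.4, -0.2])           # e.g., speech energy and pitch

fused = fuse_features(wearable, visual, audio)
weights = np.random.default_rng(0).normal(size=fused.shape)  # untrained, for illustration only
print(f"Estimated comfort: {estimate_comfort(fused, weights):.2f}")
```

In a real system the scorer would be a trained model and the features would be computed on-device, but the fusion step itself can stay this lightweight.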

According to Jayarajah, while the concept of socially assistive robots is compelling, long-term deployment is a challenge due to cost and scale. “Robots like the Unitree Go2 are not yet ready for large AI tasks. Compared to large GPU clusters, they have limited processing power, little memory and limited battery life,” she said.

Early stages of the project will build on traditional sensor fusion and explore carefully designed deep learning-based architectures that allow commodity wearable sensors to extract user features and adapt the robot's motion commands accordingly.
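As a rough illustration of how extracted user features might adapt a robot's motion, the sketch below scales a quadruped's approach speed and standoff distance based on estimated comfort and introversion. The command structure and thresholds are generic placeholders for illustration, not the Unitree Go2 API or the project's actual policy.

```python
from dataclasses import dataclass

@dataclass
class MotionCommand:
    forward_speed: float      # m/s
    approach_distance: float  # meters to stop short of the person
    gait: str

def adapt_motion(comfort: float, introversion: float) -> MotionCommand:
    """Toy adaptation policy: move slower and keep more distance for less
    comfortable or more introverted users. All numbers are illustrative."""
    base_speed = 0.8
    speed = base_speed * max(0.2, comfort)   # slow down when comfort is low
    distance = 0.5 + 1.5 * introversion      # stand farther from introverted users
    gait = "walk" if comfort > 0.5 else "slow_walk"
    return MotionCommand(forward_speed=round(speed, 2),
                         approach_distance=round(distance, 2),
                         gait=gait)

# Example: a fairly comfortable but introverted user.
print(adapt_motion(comfort=0.7, introversion=0.8))
```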

