Researchers find better way to detect falls

  • July 22, 2024
  • Steve Rogerson

Researchers at Binghamton University in New York have found a better way to detect when older adults fall at home.

The research aims to cut reaction times with a human action recognition (HAR) algorithm that uses local computing power to analyse sensor data and detect abnormal movements, without transmitting data to an offsite processing centre.
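The paper's implementation is not reproduced in this article, but the edge-computing idea it describes can be sketched in a few lines: classify each window of sensor readings on the device itself, and send out only an alert, never the raw data. The threshold values and the spike-then-stillness heuristic below are illustrative assumptions, not the Resam algorithm.

```python
from typing import List

# Hypothetical on-device fall check, illustrating the edge-computing
# principle described in the article: analyse locally, transmit only alerts.
FALL_G_THRESHOLD = 2.5  # assumed impact-spike threshold, in g
STILL_G_BAND = 0.3      # assumed band around 1 g meaning "lying still"

def detect_fall(window: List[float]) -> bool:
    """Return True if an accelerometer-magnitude window (in g) shows an
    impact spike followed by near-stillness, a classic fall signature."""
    if len(window) < 2:
        return False
    spike_idx = max(range(len(window)), key=lambda i: window[i])
    if window[spike_idx] < FALL_G_THRESHOLD:
        return False
    after = window[spike_idx + 1:]
    # After the impact, readings should settle near 1 g (gravity only).
    return bool(after) and all(abs(a - 1.0) < STILL_G_BAND for a in after)

# A fall: spike to 3.2 g, then lying still near 1 g.
print(detect_fall([1.0, 1.1, 3.2, 1.05, 0.95, 1.0]))  # True
# Normal walking: no spike above the threshold.
print(detect_fall([1.0, 1.3, 1.2, 1.1, 1.0]))         # False
```

Because only the boolean result leaves the device, a caregiver's phone can be notified in about a second without any sensor stream crossing the network.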

Professor Yu Chen and PhD student Han Sun from the Thomas J Watson College of Engineering & Applied Science (www.binghamton.edu/watson) designed the rapid response elderly safety monitoring (Resam) system to leverage advancements in edge computing.

In a paper (ieeexplore.ieee.org/document/10547327) recently published in the IEEE Transactions on Neural Systems & Rehabilitation Engineering, they show Resam can run using a smartphone, smartwatch, laptop or desktop computer with 99% accuracy and a 1.22-second response time, ranking it among the most accurate methods available.

“When many people talk about high tech, they are discussing something cutting edge, like a fancier algorithm, a more powerful assistant to do jobs faster or having more entertainment available,” said Chen. “We observed a group of people – senior citizens – who need more help but normally do not have sufficient resources or the opportunity to tell high-tech developers what they need.”

Because the system uses devices already familiar to older people, rather than a full smart-home setup, Chen believes it gives them a better sense of control over their health. They don't need to learn new technology for the system to be effective.

Also, to protect people’s privacy, Resam reduces the monitored images to skeletons, which still allows analysis of key points such as arms, legs and torso to determine if someone has fallen or suffered a different accident that could lead to injury.
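The article doesn't publish Resam's model, but the privacy idea is that skeleton key points alone, with no image pixels, are enough to distinguish standing from fallen. A minimal illustrative sketch, assuming two hypothetical key points (`shoulder_centre`, `hip_centre`) and an assumed tilt threshold:

```python
import math
from typing import Dict, Tuple

# Hypothetical skeleton-only fall check: image coordinates, y grows downward.
Point = Tuple[float, float]

def torso_angle_deg(keypoints: Dict[str, Point]) -> float:
    """Angle of the shoulder-to-hip line away from vertical, in degrees."""
    sx, sy = keypoints["shoulder_centre"]
    hx, hy = keypoints["hip_centre"]
    # 0 degrees = upright torso; 90 degrees = torso lying horizontal.
    return math.degrees(math.atan2(abs(hx - sx), abs(hy - sy)))

def looks_fallen(keypoints: Dict[str, Point],
                 angle_threshold: float = 60.0) -> bool:
    """A torso tilted far from vertical suggests the person is on the floor."""
    return torso_angle_deg(keypoints) > angle_threshold

standing = {"shoulder_centre": (100, 50), "hip_centre": (102, 120)}
fallen = {"shoulder_centre": (60, 150), "hip_centre": (140, 158)}
print(looks_fallen(standing))  # False
print(looks_fallen(fallen))    # True
```

Nothing in this check requires a recognisable picture of the person, which is the point of reducing camera frames to skeletons before analysis.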

“The most dangerous place for falls is the bathroom, but nobody wants to set up a camera there,” Chen said. “People would hate it.”

He sees the Resam system as a cornerstone for a wider concept he calls Happy Home, which could include thermal or infra-red cameras and other sensors to remotely assess other aspects of a person's environment and well-being.

“Adding more sensors can make our system more powerful, because we are not only monitoring someone’s body movements,” he said. “We can monitor someone’s health with one more dimension, so we better predict if something’s going to happen before it happens.”

Another idea, which Chen is exploring with associate professor Shiqi Zhang from the Department of Computer Science, is for the system to include a robot dog or similar pet that would keep a closer watch as someone did their daily tasks. Last autumn, Zhang demonstrated how a robot dog might guide someone with visual impairment through tugs on a leash (www.binghamton.edu/news/story/4565/binghamton-computer-scientists-program-robotic-seeing-eye-dog-to-guide-the-visually-impaired).

“You could have a conversation with the robot,” Chen said. “For example, when you are heading to the bathroom, the dog may ask you, ‘Would you mind if I follow you?’ The dog can make a better decision to move closer to monitor your status instead of having only fixed sensors in the room.”