Visually impaired see world with AI-powered backpack
- March 30, 2021
- Steve Rogerson
Researchers at the University of Georgia have designed an AI-powered, voice-activated backpack using Intel technology that can help the visually impaired navigate and perceive the world around them.
The backpack helps detect common problems such as traffic signs, hanging obstacles, crosswalks, moving objects and changing elevations, all while running on a low-power, interactive device.
“Last year when I met up with a visually impaired friend, I was struck by the irony that while I have been teaching robots to see, there are many people who cannot see and need help,” said Jagadish Mahendran from the Institute for Artificial Intelligence at the University of Georgia. “This motivated me to build the visual assistance system with OpenCV’s Oak-D artificial intelligence kit with depth, powered by Intel.”
The World Health Organisation estimates that globally 285 million people are visually impaired. Meanwhile, visual assistance systems for navigation are fairly limited and range from GPS-based, voice-assisted smartphone apps to camera-enabled smart walking sticks. These lack the depth perception necessary to facilitate independent navigation.
“It’s incredible to see a developer take Intel’s AI technology for the edge and quickly build a solution to make their friend’s life easier,” said Hema Chamraj, a director at Intel. “The technology exists; we are only limited by the imagination of the developer community.”
The system is housed inside a small backpack containing a host computing unit, such as a laptop. A vest jacket conceals a camera, and a fanny pack is used to hold a pocket-size battery pack capable of providing approximately eight hours of use. A Luxonis Oak-D spatial AI camera can be affixed to either the vest or fanny pack, then connected to the computing unit in the backpack. Three tiny holes in the vest provide viewports for the Oak-D, which is attached to the inside of the vest.
“Our mission at Luxonis is to enable engineers to build things that matter while helping them to quickly harness the power of Intel AI technology,” said Brandon Gilles, CEO of Luxonis. “So it is incredibly satisfying to see something as valuable and remarkable as the AI-powered backpack built using Oak-D in such a short period of time.”
The Oak-D unit is an AI device that runs on an Intel Movidius VPU and the Intel distribution of the OpenVINO toolkit for on-chip edge AI inferencing. It is capable of running neural networks while providing accelerated computer vision functions and a real-time depth map from its stereo pair, as well as colour information from a single 4K camera.
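The depth map from the stereo pair is what lets the system judge how far away a detected obstacle actually is, rather than just knowing it is in frame. The article does not publish the project's code, but the core idea can be sketched in plain Python: given a detection's bounding box and a per-pixel depth map, take the median depth inside the box as a robust distance estimate. The function name and data layout below are illustrative assumptions, not the project's actual implementation.

```python
from statistics import median

def obstacle_distance_m(depth_map, box):
    """Estimate an obstacle's distance from a per-pixel depth map.

    depth_map: 2-D list of depths in metres (rows of pixels), as a
               stereo depth camera such as the Oak-D might provide.
    box:       detection bounding box (x_min, y_min, x_max, y_max)
               in pixel coordinates.

    The median depth inside the box is used rather than the mean,
    so stray background pixels do not skew the estimate.
    """
    x_min, y_min, x_max, y_max = box
    samples = [
        depth_map[y][x]
        for y in range(y_min, y_max)
        for x in range(x_min, x_max)
    ]
    return median(samples)

# Toy 4x4 depth map: a nearby object (about 1.2 m) in the top-left
# corner, background at roughly 8 m everywhere else.
depth = [
    [1.2, 1.3, 8.0, 8.0],
    [1.1, 1.2, 8.0, 8.0],
    [8.0, 8.0, 8.0, 8.0],
    [8.0, 8.0, 8.0, 8.0],
]
print(obstacle_distance_m(depth, (0, 0, 2, 2)))  # 1.2
```

In the real system this per-detection distance would come from the Oak-D's on-device depth output rather than a hand-built list, but the aggregation step is the same.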
A Bluetooth-enabled earphone lets the user interact with the system via voice queries and commands, and the system responds with verbal information. As the user moves through their environment, the system audibly conveys information about common obstacles including signs, tree branches and pedestrians. It also warns of upcoming crosswalks, kerbs, staircases and entryways.
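The article does not detail how the system decides what to speak, but the general pattern for such an interface, prioritising the nearest hazards and phrasing them as short spoken messages so the user is not flooded with audio, can be sketched as follows. The labels, thresholds and message format here are illustrative assumptions.

```python
def voice_alerts(detections, warn_distance_m=3.0, max_alerts=2):
    """Turn labelled detections into short spoken-style alerts.

    detections: list of (label, distance_m, bearing) tuples, where
                bearing is "left", "ahead" or "right".

    Only obstacles closer than warn_distance_m are announced, the
    nearest first, capped at max_alerts to keep speech brief.
    """
    nearby = [d for d in detections if d[1] <= warn_distance_m]
    nearby.sort(key=lambda d: d[1])  # nearest hazard first
    return [
        f"{label} {distance:.1f} metres {bearing}"
        for label, distance, bearing in nearby[:max_alerts]
    ]

alerts = voice_alerts([
    ("pedestrian", 2.1, "ahead"),
    ("tree branch", 1.4, "left"),
    ("stop sign", 6.0, "right"),  # too far away to announce
])
print(alerts)  # ['tree branch 1.4 metres left', 'pedestrian 2.1 metres ahead']
```

A text-to-speech engine on the host laptop would then read each string into the Bluetooth earphone.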