Arm puts neural processing into IoT

  • April 17, 2024
  • Steve Rogerson
Paul Williamson at last week’s Embedded World.

Arm used last week’s Embedded World show in Nuremberg to demonstrate its recently launched neural processing unit (NPU) and IoT reference design.

As edge AI scales, silicon innovators must navigate growing system and software complexity, an ever-increasing demand for AI performance, and pressure to accelerate their time to market. At the same time, software developers need more consistent, streamlined experiences and easy integration with emerging AI frameworks and libraries.

The NPU and IoT reference design were announced last week (www.iotm2mcouncil.org/iot-library/news/iot-newsdesk/arm-announces-new-iot-reference-design-platform) and Arm senior vice president for IoT Paul Williamson was at the show to explain their benefits.

“This brings support for machine-learning frameworks,” he said. “It significantly improves the efficiency and is the technology behind large language models.”

The Arm Ethos-U85 (newsroom.arm.com/blog/ethos-u85) provides the same consistent toolchain as others in the range so partners can leverage existing investments for a seamless developer experience. Importantly, it provides support for AI frameworks such as TensorFlow Lite and PyTorch.
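
In practice, framework support for NPUs in this class means deploying a quantised model. As a rough, hedged illustration rather than Arm’s exact Ethos-U85 workflow, the Python sketch below converts a trained TensorFlow model into a fully int8 TensorFlow Lite file of the kind Arm’s offline NPU tooling typically consumes; the saved-model path, input shape and calibration data are placeholders.

    # Hedged sketch: quantise a trained model to int8 TensorFlow Lite, the usual
    # starting point for Ethos-U class NPUs. The paths, shapes and calibration
    # data below are placeholders, not part of Arm's announcement.
    import numpy as np
    import tensorflow as tf

    def representative_data():
        # Yield calibration samples so the converter can choose quantisation
        # ranges; replace the random frames with real input data.
        for _ in range(100):
            yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

    converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_data
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.int8
    converter.inference_output_type = tf.int8

    with open("model_int8.tflite", "wb") as f:
        f.write(converter.convert())

On earlier Ethos-U parts, a file like this is then run through Arm’s offline Vela compiler before being linked into device firmware; a comparable step can be expected for the Ethos-U85.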

It supports transformer networks as well as convolutional neural networks (CNNs) for AI inference. Transformer networks can drive applications in vision and generative AI use cases for tasks such as understanding videos, filling in missing parts of images or analysing data from multiple cameras for image classification and object detection.

“The world is increasingly using AI for IoT edge applications such as anomaly detection,” said Williamson. “It has been used for voice and image analysis but now increasingly with video for, say, monitoring for defects in industrial applications. It can do this at high speed on a production line.”

As microprocessors are deployed into more high-performance IoT systems for use cases such as industrial machine vision, wearables and consumer robotics, Arm designed the Ethos-U85 to work with its Cortex-A CPUs, accelerating ML tasks and bringing power-efficient edge inference to a broader range of higher-performing devices.

“It can also be used for people detection in smart home and smart city applications,” said Williamson.

The Corstone-320 IoT reference design works with the Ethos-U85 to deliver the performance required for edge AI applications spanning voice, audio and vision, such as real-time image classification and object recognition, or voice assistants with natural-language translation on smart speakers. The platform includes software, tools and support, including Arm Virtual Hardware. This combination of hardware and software should accelerate product timelines by letting software development start before silicon is available, improving time to market for these increasingly complex edge AI devices.

“It has the highest-performance Cortex processor and a Mali image signal processor,” said Williamson. “The reference design helps people get to market quickly. It can be used in logistics to monitor stock in a warehouse, or in doorbells to recognise who is outside.”
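
To illustrate how software work can start before silicon, as the reference design is intended to enable, the hedged sketch below exercises the quantised model with the standard TensorFlow Lite interpreter on a development host; a real pre-silicon flow would target Arm Virtual Hardware or a Corstone fixed virtual platform instead, and the file name and dummy input here are placeholders.

    # Hedged sketch: run the quantised model with the stock TensorFlow Lite
    # interpreter on a host machine, as a stand-in for pre-silicon testing on
    # Arm Virtual Hardware. File name and input data are placeholders.
    import numpy as np
    import tensorflow as tf

    interpreter = tf.lite.Interpreter(model_path="model_int8.tflite")
    interpreter.allocate_tensors()

    input_detail = interpreter.get_input_details()[0]
    output_detail = interpreter.get_output_details()[0]

    # Feed one dummy int8 frame; a real test harness would replay captured
    # camera or microphone data.
    frame = np.random.randint(-128, 128, size=input_detail["shape"], dtype=np.int8)
    interpreter.set_tensor(input_detail["index"], frame)
    interpreter.invoke()

    scores = interpreter.get_tensor(output_detail["index"])
    print("Top class index:", int(np.argmax(scores)))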

As AI adoption grows, he said, everyone from start-up innovators to the world’s biggest microcontroller players was converging on Arm as the platform of choice to deliver AI from cloud to edge.

The Industrial Technology Research Institute, or Itri (www.itri.org), has established the Itri-Arm SystemReady Lab in Taipei, in partnership with Arm. This certification centre is the fourth of its kind globally, following ones in the USA, Europe and India. The lab combines Itri’s R&D strengths with the Arm SystemReady compliance programme to deliver certification services for the AIoT industry. This initiative is poised to drive Taiwan’s industrial integration of AI and IoT ecosystems.