Microsoft moves AI to the edge
- March 17, 2021
- William Payne
At its Ignite digital conference, Microsoft unveiled the public preview of Azure Percept, a platform of hardware and services that aims to simplify how customers use Azure AI technologies at the edge – including taking advantage of Azure cloud offerings such as device management, AI model development and analytics.
The Azure Percept platform includes a development kit with an intelligent camera, Azure Percept Vision. There is also a user-friendly environment called Azure Percept Studio that guides customers, regardless of their coding expertise or experience, through the entire AI lifecycle, including developing, training and deploying proof-of-concept ideas.
Azure Percept Vision and Azure Percept Audio, which ships separately from the development kit, connect to Azure services in the cloud and come with embedded hardware-accelerated AI modules that enable speech and vision AI at the edge, or during times when the device isn’t connected to the internet.
Roanne Sones, corporate vice president of Microsoft’s edge and platform group, said the goal of the new offering is to give customers a single, end-to-end system, from the hardware to the AI capabilities, that “just works” without requiring a lot of technical know-how.
In addition to the hardware, Microsoft is working with third-party silicon and equipment manufacturers to build an ecosystem of intelligent edge devices that are certified to run on the Azure Percept platform, Sones said.
“We’ve started with the two most common AI workloads, vision and voice, sight and sound, and we’ve given out that blueprint so that manufacturers can take the basics of what we’ve started,” she said. “But they can envision it in any kind of responsible form factor to cover a pattern of the world.”
Microsoft’s goal with the Azure Percept platform is to simplify the process of developing, training and deploying edge AI solutions, making it easier for more customers to take advantage of these kinds of offerings, according to Moe Tanabian, a Microsoft vice president and general manager of the Azure edge and devices group.
Most successful edge AI implementations today require engineers to design and build devices, plus data scientists to build and train AI models to run on those devices. Engineering and data science are typically distinct skill sets held by different groups of highly trained people.
“With Azure Percept, we broke that barrier,” Tanabian said. “For many use cases, we significantly lowered the technical bar needed to develop edge AI-based solutions, and citizen developers can build these without needing deep embedded engineering or data science skills.”
The hardware in the Azure Percept development kit uses the industry standard 80/20 T-slot framing architecture, which Microsoft says will make it easier for customers to pilot proof-of-concept ideas everywhere from retail stores to factory floors using existing industrial infrastructure, before scaling up to wider production with certified devices.
As customers work on their proof-of-concept ideas with the Azure Percept development kit, they will have access to Azure AI Cognitive Services and Azure Machine Learning models as well as AI models available from the open-source community that have been designed to run on the edge.
Azure Percept devices automatically connect to Azure IoT Hub, which helps enable reliable, secure communication between Internet of Things (IoT) devices and the cloud. Customers can also integrate Azure Percept-based solutions with Azure Machine Learning processes that combine data science and IT operations to help companies develop machine learning models faster.
In the months to come, Microsoft aims to expand the number of third-party certified Azure Percept devices, so anybody who builds and trains a proof-of-concept edge AI solution with the Azure Percept development kit will be able to deploy it with a certified device from the marketplace, according to Christa St. Pierre, a product manager in Microsoft’s Azure edge and platform group.
“Anybody who builds a prototype using one of our development kits, if they buy a certified device, they don’t have to do any additional work,” she said.
The Azure Percept team is currently working with a number of early customers to understand their concerns around responsible development and deployment of AI on edge devices, and the team will provide them with documentation and access to toolkits such as Fairlearn and InterpretML for their own responsible AI implementations.
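The article names Fairlearn, Microsoft's open-source toolkit for assessing the fairness of machine learning models. As a rough, self-contained illustration of the kind of check such toolkits automate, the sketch below computes a demographic-parity difference – the gap in positive-prediction rates between groups – by hand. The predictions and group labels are hypothetical, and Fairlearn's own API (e.g. `MetricFrame`) does this per-group bookkeeping for you.

```python
# Hypothetical predictions and group labels, for illustration only.
# Fairlearn automates this kind of per-group metric comparison.

def selection_rate(preds):
    """Fraction of positive (1) predictions in a list."""
    return sum(preds) / len(preds)

def demographic_parity_difference(preds, groups):
    """Gap between the highest and lowest per-group selection rates."""
    by_group = {}
    for p, g in zip(preds, groups):
        by_group.setdefault(g, []).append(p)
    rates = [selection_rate(members) for members in by_group.values()]
    return max(rates) - min(rates)

# Toy example: a model approves 3/4 of group "a" but only 1/4 of group "b".
preds  = [1, 1, 1, 0, 1, 0, 0, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
print(demographic_parity_difference(preds, groups))  # → 0.5
```

A value near 0 means the model selects members of each group at similar rates; a value near 1 signals a large disparity worth investigating before deployment.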
Ultimately, Sones said, Microsoft hopes to enable the development of an ecosystem of intelligent edge devices that can take advantage of Azure services, in the same way that the Windows operating system has helped enable the personal computer marketplace.
“We are a platform company at our core. If we’re going to truly get to a scale where the billions of devices that exist on the edge get connected to Azure, there is not going to be one hyperscale cloud that solves all that through their first-party devices portfolio,” she said. “That is why we’ve done it in an ecosystem-centric way.”