SoftBank-owned chip designer Arm unveiled a host of developments designed to boost the IoT sector by pushing AI into billions of small, power-constrained, embedded devices.

The company added new machine learning (ML) IP to its AI platform: a new Cortex processor and a neural processing unit (NPU). In a statement, Arm said the combination of its latest IP and a related set of tools would give hardware and software developers more options for innovation when delivering on-device ML, while lowering overall silicon and development costs.

“Enabling AI everywhere requires device makers and developers to deliver ML locally on billions, and ultimately trillions of devices,” Dipti Vachani, SVP and GM of Arm’s Automotive and IoT unit, explained.

To achieve this, the company added the Cortex-M55 to its processor line-up, its first based on the Armv8.1-M architecture, which Arm said “significantly” enhances DSP and ML performance and energy efficiency, delivering an uplift of up to five-fold for the former and up to 15-fold for the latter.
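
The DSP and ML gains stem largely from Helium, the M-Profile Vector Extension introduced with Armv8.1-M, which accelerates the multiply-accumulate loops at the heart of signal-processing and neural-network code. The plain-C sketch below is purely illustrative of that kind of kernel and is not Arm code; on a Cortex-M55, such a loop would typically be replaced by vectorised routines, for example those in Arm’s CMSIS-DSP library.

    #include <stdint.h>
    #include <stdio.h>

    /* Illustrative fixed-point (Q15) dot product: the multiply-accumulate
     * pattern that dominates DSP filters and neural-network layers.
     * On a Cortex-M55, Helium (MVE) can process several 16-bit lanes per
     * cycle, whereas a scalar Cortex-M handles one at a time. */
    static int64_t dot_q15(const int16_t *a, const int16_t *b, uint32_t n)
    {
        int64_t acc = 0;
        for (uint32_t i = 0; i < n; i++) {
            acc += (int32_t)a[i] * (int32_t)b[i];
        }
        return acc;
    }

    int main(void)
    {
        int16_t x[4] = { 16384, -8192, 4096, 2048 }; /* ~0.5, -0.25, 0.125, 0.0625 in Q15 */
        int16_t w[4] = { 8192, 8192, 8192, 8192 };   /* ~0.25 each */
        printf("accumulator = %lld\n", (long long)dot_q15(x, w, 4));
        return 0;
    }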

Also unveiled was the Ethos-U55, the company’s first microNPU, which Arm said can contribute to a 480-fold increase in ML performance over existing Cortex-M processors when paired with the M55. The neural unit employs “advanced compression techniques” to reduce power requirements and ML model sizes, enabling “execution of neural networks that previously only ran on larger systems”.
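
Arm’s statement does not detail the compression techniques, but a common way to shrink model sizes on constrained devices is 8-bit quantisation: storing weights as int8 values plus a shared scale and zero-point instead of 32-bit floats, roughly a four-fold reduction before any further compression. The sketch below is a generic, hypothetical illustration of that idea, not a description of the Ethos-U55’s internals.

    #include <stdint.h>
    #include <stdio.h>
    #include <math.h>

    /* Generic illustration of 8-bit affine quantisation: each float weight
     * maps to an int8 value via a shared scale and zero-point. */
    static int8_t quantize(float value, float scale, int8_t zero_point)
    {
        int32_t q = (int32_t)lroundf(value / scale) + zero_point;
        if (q > 127)  q = 127;   /* clamp to the int8 range */
        if (q < -128) q = -128;
        return (int8_t)q;
    }

    static float dequantize(int8_t q, float scale, int8_t zero_point)
    {
        return scale * (float)(q - zero_point);
    }

    int main(void)
    {
        const float weights[4] = { 0.42f, -1.30f, 0.07f, 2.55f };
        const float scale = 0.02f;      /* chosen so the weight range fits int8 */
        const int8_t zero_point = 0;

        for (int i = 0; i < 4; i++) {
            int8_t q = quantize(weights[i], scale, zero_point);
            printf("%+.2f -> %4d -> %+.2f\n",
                   weights[i], q, dequantize(q, scale, zero_point));
        }
        return 0;
    }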

Arm explained the convergence of IoT, AI and 5G will fuel greater use of on-device intelligence, with a push towards smaller, more cost-sensitive devices. It highlighted this would boost privacy and reliability by reducing reliance on the cloud or an internet connection. Another benefit, it stated, is lower costs and faster time-to-market for microcontroller makers “looking to efficiently enhance digital signal processing and ML capabilities on-device”.