- NPUs are gaining traction for IoT applications, driven by their efficiency in handling neural network tasks.
- AI chipset revenues are projected to reach US$7.3 billion by 2030 due to the rise of NPUs.
- Major chipset vendors like ST Microelectronics, Infineon, and NXP Semiconductors are introducing NPUs to their portfolios.
- NPUs have a strong presence in TinyML applications within Personal and Work Devices but are still emerging in other areas.
- MLOps toolchains are becoming crucial for both established companies and startups.
- Advances in software, such as Eta Compute’s collaboration with NXP, are lowering the barriers to TinyML adoption.
- NPUs and FPGAs are enhancing the range of applications for embedded devices, including object detection and NLP.
- The growth of AI in edge devices and IoT infrastructure is expected to drive significant market expansion.
Main AI News:
The rise of Neural Processing Units (NPUs) for Internet of Things (IoT) applications is set to drive substantial growth in AI chipset revenues, reaching an estimated US$7.3 billion by 2030. The shift is driven by NPUs’ efficient handling of neural network workloads, which is making them increasingly preferred over traditional microcontrollers (MCUs). ABI Research highlights that this transition will be pivotal as NPUs gain prominence in IoT applications, particularly at the edge, where deeper insights and enhanced intelligence are crucial.
Paul Schell, Industry Analyst at ABI Research, notes that NPUs have already made significant inroads in TinyML applications within Personal and Work Devices (PWDs). However, their presence remains limited outside these segments, with major players like ST Microelectronics, Infineon, and NXP Semiconductors only now introducing NPUs into their product lines. Schell adds that a closer examination of PWDs has informed ABI Research’s IoT modeling, which covers key verticals such as Smart Home and Manufacturing.
On the software front, a robust suite of MLOps tools is becoming essential for both established companies and emerging startups like Syntiant, GreenWaves, Aspinity, and Innatera. Investment in software, often made in parallel with hardware development, is paying off, as seen in Eta Compute’s collaboration with NXP to deploy the Aptos software platform. These advancements are also lowering the barriers to TinyML adoption by minimizing the need for specialized in-house data science expertise.
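To make the workflow these toolchains automate concrete, here is a minimal sketch of post-training int8 quantization using the standard TensorFlow Lite converter, the kind of step that prepares a model for an MCU-class NPU. The toy model, input shape, calibration data, and filename are illustrative assumptions; this is generic open-source tooling, not Eta Compute’s Aptos platform or any specific vendor flow.

```python
import numpy as np
import tensorflow as tf

# Toy network standing in for a trained TinyML classifier (e.g., a small sensor/audio model).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(49, 10, 1)),          # illustrative feature-map shape
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(4, activation="softmax"),     # 4 hypothetical output classes
])

def representative_data_gen():
    # Calibration samples for quantization; in practice these come from real training data.
    for _ in range(100):
        yield [np.random.rand(1, 49, 10, 1).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("tiny_classifier_int8.tflite", "wb") as f:   # hypothetical output filename
    f.write(tflite_model)
print(f"Quantized model size: {len(tflite_model)} bytes")
```

Commercial MLOps platforms typically wrap steps like this (data collection, training, quantization, and deployment) behind a managed interface, which is what reduces the need for in-house data science expertise.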
Incorporating high-performance NPUs and FPGAs into embedded devices is expanding the range of on-device applications, from object detection and classification to natural language processing for audio analytics. This trend, alongside the growing adoption of AI in larger edge devices such as PCs and gateways, is expected to enhance AI scalability, reduce networking costs, and diminish reliance on cloud computing. The continued evolution of IoT infrastructure, intelligent vehicles, and smart home devices is anticipated to drive significant growth in the TinyML market, fueled by these technological advancements.
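As a rough illustration of what "on-device" inference looks like once a quantized model is deployed, the sketch below runs an int8 TensorFlow Lite model with the lightweight tflite-runtime interpreter, as might be done for keyword spotting in audio analytics. The model filename, input shape, and the use of random data in place of real preprocessed audio features are assumptions for illustration only.

```python
import numpy as np
# tflite_runtime is the slimmed-down interpreter package for edge devices;
# the full TensorFlow package exposes the same API as tf.lite.Interpreter.
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="kws_int8.tflite")  # hypothetical keyword-spotting model
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Placeholder standing in for preprocessed audio features (e.g., an MFCC spectrogram).
features = np.random.rand(*input_details[0]["shape"][1:]).astype(np.float32)

# Quantize the float features to int8 using the model's input quantization parameters
# (assumes a fully int8-quantized model, so scale is non-zero).
scale, zero_point = input_details[0]["quantization"]
quantized = np.clip(np.round(features / scale + zero_point), -128, 127).astype(np.int8)

interpreter.set_tensor(input_details[0]["index"], quantized[np.newaxis, ...])
interpreter.invoke()
scores = interpreter.get_tensor(output_details[0]["index"])[0]
print("Predicted class index:", int(np.argmax(scores)))
```

On hardware with an NPU, the same model would typically be dispatched to the accelerator through a vendor delegate or compiler rather than run on the CPU, which is where the efficiency gains over plain MCU execution come from.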
Conclusion:
The shift towards NPUs in IoT applications is poised to significantly boost AI chipset revenues, with projections indicating a rise to US$7.3 billion by 2030. This trend underscores a major transformation in the market, driven by the efficiency of NPUs in neural network processing and the increasing need for sophisticated intelligence at the edge. As major vendors integrate NPUs into their offerings and investments in MLOps tools continue to grow, the adoption of TinyML is expected to expand, reducing reliance on cloud computing and enhancing the scalability of AI solutions. The evolving landscape of IoT infrastructure, coupled with advancements in embedded device capabilities, will likely drive substantial growth in the TinyML market.