Google has revealed new data center AI chips and an Arm-based central processor

  • Google unveils a new version of data center AI chips and an Arm-based central processor.
  • Tensor processing units (TPUs), available only through Google Cloud, offer an alternative to Nvidia’s AI chips.
  • Axion CPU promises superior performance to x86 and general-purpose Arm chips.
  • Google emphasizes ease of migration for existing workloads to Arm architecture.
  • Rival cloud operators like Amazon and Microsoft are also exploring Arm CPU development.
  • Axion chip delivers 30% better performance than general-purpose Arm chips and 50% better than current x86 chips.
  • TPU v5p chip delivers twice the raw performance of its predecessor and runs in liquid-cooled pods.
  • Google plans to expand Axion’s usage and availability to the public later this year.

Main AI News:

In a significant development on Tuesday, Google unveiled details of its latest generation of data center artificial intelligence chips, alongside the introduction of an Arm-based central processor.

Google’s tensor processing units (TPUs) represent a notable alternative to Nvidia’s advanced AI chips. However, the TPUs are not sold directly; they are accessible only through Google Cloud.

Google’s strategy includes offering Axion, an Arm-based central processing unit (CPU), through its cloud services. The company says Axion outperforms both x86 chips and the general-purpose Arm chips already available in the cloud.

Mark Lohmeyer, Vice President and General Manager of Compute and Machine Learning Infrastructure at Google Cloud, emphasized, “We’re simplifying the migration of customer workloads to Arm. Axion, rooted in open foundations, facilitates seamless adoption for Arm users without necessitating app re-architecture or rewriting.”
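
To make the “no rewrite” point concrete, here is a minimal, hypothetical sketch of provisioning an Arm-based VM through Google Cloud’s Python client (`google-cloud-compute`). The `axion-standard-4` machine type is a placeholder, since the article does not name Axion’s machine types, and the Arm64 Debian image is used only as a generic example.

```python
# Hypothetical sketch: provisioning an Arm-based VM on Google Cloud with the
# google-cloud-compute client. "axion-standard-4" is a placeholder machine
# type, not a published Axion SKU.
from google.cloud import compute_v1


def create_arm_vm(project: str, zone: str, name: str) -> None:
    instance = compute_v1.Instance(
        name=name,
        machine_type=f"zones/{zone}/machineTypes/axion-standard-4",  # placeholder
        disks=[
            compute_v1.AttachedDisk(
                boot=True,
                auto_delete=True,
                initialize_params=compute_v1.AttachedDiskInitializeParams(
                    # Generic Arm64 OS image; the workload itself is unchanged.
                    source_image="projects/debian-cloud/global/images/family/debian-12-arm64",
                ),
            )
        ],
        network_interfaces=[
            compute_v1.NetworkInterface(network="global/networks/default")
        ],
    )
    operation = compute_v1.InstancesClient().insert(
        project=project, zone=zone, instance_resource=instance
    )
    operation.result()  # block until the VM is created


if __name__ == "__main__":
    create_arm_vm("my-project", "us-central1-a", "arm-test-vm")
```

In this sketch the only architecture-specific choices are the machine type and the Arm64 OS image; the application code deployed onto the VM would not need to change, which is the migration story Lohmeyer describes.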

Major cloud competitors like Amazon.com and Microsoft have also ventured into developing Arm CPUs, aiming to differentiate their computing services. Google has previously engineered custom chips for YouTube, AI, and its smartphones, but Axion marks its first CPU.

Google developed earlier generations of its TPU chips in collaboration with Broadcom. However, the company has not disclosed any partner for Axion’s development, leaving Broadcom’s role in the new chips, including TPU v5p, open to speculation.

Google, a subsidiary of Alphabet, reported that the TPU v5p is designed for deployment in pods of 8,960 chips and delivers twice the raw performance of its predecessor. The pods rely on liquid cooling to sustain that performance.
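
For a sense of how such a pod appears to software, a JAX workload running on a Cloud TPU slice can simply enumerate the chips it has been allocated. This is a generic, hypothetical sketch rather than anything Google has published; the slice size a given job sees is whatever it requested, up to the full pod.

```python
# Minimal sketch: enumerating TPU chips from a JAX workload on a Cloud TPU
# slice. On a full pod this would report thousands of devices; on a small
# slice, only the chips allocated to the job. Runs on CPU too, for testing.
import jax


def describe_tpu_slice() -> None:
    devices = jax.devices()  # all accelerator devices visible to this job
    print(f"Global device count: {jax.device_count()}")
    print(f"Devices attached to this host: {jax.local_device_count()}")
    if devices:
        print(f"Platform: {devices[0].platform}, kind: {devices[0].device_kind}")


if __name__ == "__main__":
    describe_tpu_slice()
```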

According to Google, the Axion chip delivers 30% better performance than general-purpose Arm chips and 50% better performance than current-generation x86 chips from Intel and AMD.

Axion is already used internally for Google services such as YouTube ads, and Google aims to broaden its use and extend availability to the public “later this year.” TPU v5p, meanwhile, has been accessible through Google’s cloud services since Tuesday.

Conclusion:

Google’s introduction of a new generation of data center AI chips and an Arm-based central processor signals a significant shift in the market. With superior performance and ease of migration to Arm architecture, Google is poised to challenge competitors like Nvidia, Amazon, and Microsoft in the cloud computing arena. The advancements in AI chip technology and CPU design will likely prompt further innovation and competition, driving the evolution of data center infrastructure and services. Businesses operating in the cloud computing sector should closely monitor these developments to remain competitive and capitalize on emerging opportunities.

Source