Arista Introduces Etherlink AI Networking Platforms

  • Arista Networks introduces the Arista Etherlink AI platforms, engineered for high-performance networking in AI workloads.
  • The platforms support AI clusters of various sizes with efficient one- and two-tier network topologies.
  • Features include flow-level visibility and support for over 10,000 XPUs in a single-tier network.
  • The platforms are built on industry-leading Broadcom Tomahawk 5 and Jericho3-AI Ethernet silicon for performance and flexibility.
  • Arista EOS Smart AI Suite complements the platforms with advanced networking, security, and visibility features.
  • Availability: 7060X6 available now, 7800R4-AI and 7700R4 DES expected in 2H 2024.

Main AI News:

Arista Networks, a frontrunner in cloud and AI networking solutions, has unveiled its latest innovation: the Arista Etherlink™ AI platforms. These platforms are engineered to meet the escalating demands of AI workloads, covering both training and inference.

Powered by AI-optimized Arista EOS features, the Arista Etherlink AI portfolio is designed to support AI clusters ranging from thousands to hundreds of thousands of XPUs (AI accelerators). The platforms use highly efficient one- and two-tier network topologies that deliver superior application performance compared to conventional multi-tier networks, and they offer advanced monitoring capabilities, including flow-level visibility.

According to Alan Weckel, Founder and Technology Analyst for 650 Group, “The network is core to successful job completion outcomes in AI clusters.” He emphasizes that the Arista Etherlink AI platforms provide customers with a unified 800G end-to-end technology platform, spanning front-end, training, inference, and storage networks. Leveraging well-proven Ethernet tooling, security, and expertise, customers can effortlessly scale up for any AI application.

Arista’s Etherlink AI Platforms

  1. 7060X6 AI Leaf Switch Family: Built on Broadcom Tomahawk 5® silicon, this family delivers 51.2 Tbps of capacity and supports either 64 ports of 800G or 128 ports of 400G Ethernet.
  2. 7800R4 AI Spine: The 4th generation of Arista’s flagship 7800 modular systems, this platform is powered by the latest Broadcom Jericho3-AI processors and offers non-blocking throughput with a state-of-the-art virtual output queuing architecture. The 7800R4-AI supports up to 460 Tbps in a single chassis, equating to 576 ports of 800G or 1,152 ports of 400G Ethernet (the port arithmetic is checked in the sketch following this list).
  3. 7700R4 AI Distributed Etherlink Switch (DES): Engineered to support the largest AI clusters, this platform offers customers massively parallel distributed scheduling and congestion-free traffic spraying. It is the inaugural product in a new series of ultra-scalable, intelligent distributed systems designed to deliver unparalleled throughput for extensive AI clusters.
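
The headline capacities follow directly from the port counts; as a quick back-of-the-envelope check (assuming every port runs at full line rate and ignoring protocol overheads):

```python
# Quick check of the quoted switching capacities from port count x port speed,
# assuming every port runs at full line rate.
def capacity_tbps(ports: int, speed_gbps: int) -> float:
    return ports * speed_gbps / 1000.0  # Gb/s -> Tb/s

print(capacity_tbps(64, 800))    # 7060X6 leaf:       51.2 Tb/s
print(capacity_tbps(128, 400))   # 7060X6, 400G mode: 51.2 Tb/s
print(capacity_tbps(576, 800))   # 7800R4-AI chassis: 460.8 Tb/s (~460 Tb/s quoted)
```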

These platforms enable a single-tier network topology to support over 10,000 XPUs, while a two-tier network can accommodate more than 100,000 XPUs. Minimizing the number of network tiers is crucial for optimizing AI application performance, reducing optical transceiver count, lowering costs, and enhancing reliability. Moreover, all Etherlink switches support the emerging Ultra Ethernet Consortium (UEC) standards, poised to deliver additional performance benefits upon the availability of UEC NICs in the near future.
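
To make the tier-count claim concrete, here is an illustrative two-tier leaf/spine sizing using the port counts above. The 50/50 leaf port split, spine count, and per-XPU link speeds are assumptions chosen for this sketch, not a published Arista reference design:

```python
# Illustrative two-tier (leaf/spine) sizing using the port counts above.
# The 50/50 leaf port split and the per-XPU link speeds are assumptions
# for this sketch, not an Arista reference design.

LEAF_PORTS_400G  = 128    # 7060X6 operated as 128 x 400G
SPINE_PORTS_400G = 1152   # 7800R4-AI operated as 1,152 x 400G

downlinks_per_leaf = LEAF_PORTS_400G // 2   # 64 ports toward XPUs
uplinks_per_leaf   = LEAF_PORTS_400G // 2   # 64 ports toward spines (non-blocking)

spines     = uplinks_per_leaf               # one uplink from each leaf to each spine
max_leaves = SPINE_PORTS_400G               # each spine gives one port to every leaf

xpus_400g = max_leaves * downlinks_per_leaf   # 73,728 XPUs at 400G each
xpus_200g = xpus_400g * 2                     # 147,456 if each 400G downlink
                                              # breaks out to 2 x 200G XPU links

print(f"two-tier, 400G per XPU: {xpus_400g:,}")
print(f"two-tier, 200G per XPU: {xpus_200g:,}")
```

Under these assumptions, a non-blocking two-tier fabric reaches roughly 74,000 endpoints at 400G each and passes the 100,000-XPU mark once downlinks are broken out to lower-speed XPU links; the single-tier figure reflects the 7700R4 DES presenting the whole fabric as one logical switching layer.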

Arista EOS Smart AI Suite

Complementing these networking-for-AI platforms are the rich features of Arista EOS and CloudVision. The software suite adds AI-grade robustness and protection for high-value AI clusters and workloads. For instance, Arista EOS’s Smart AI Suite integrates with SmartNIC providers to deliver advanced RDMA-aware load balancing and QoS. In addition, Arista AI Analyzer, powered by Arista AVA™, automates configuration, improves visibility, and enables intelligent performance analysis of AI workloads.
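
The announcement does not publish programmatic interfaces for the Smart AI Suite or AI Analyzer, but EOS switches generally expose operational state over the JSON-RPC eAPI. As a minimal, generic illustration of pulling visibility data (the hostname, credentials, and choice of show command are placeholders, not Smart AI Suite APIs):

```python
# Minimal sketch: poll interface counters from an EOS switch over eAPI
# (JSON-RPC over HTTPS). Hostname, credentials, and the "show" command are
# placeholders; the AI-specific analytics described above are a separate
# EOS/CloudVision capability and are not modeled here.
import requests

EAPI_URL = "https://leaf1.example.com/command-api"   # hypothetical switch

payload = {
    "jsonrpc": "2.0",
    "method": "runCmds",
    "params": {"version": 1, "cmds": ["show interfaces counters"], "format": "json"},
    "id": "monitor-1",
}

resp = requests.post(EAPI_URL, json=payload,
                     auth=("admin", "admin"), verify=False)  # lab use only
resp.raise_for_status()

interfaces = resp.json()["result"][0].get("interfaces", {})
for name, stats in sorted(interfaces.items()):
    print(name, stats.get("inOctets"), stats.get("outOctets"))
```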

Hugh Holbrook, Chief Development Officer at Arista Networks, attributes the company’s competitive edge to its rich operating system and broad product portfolio tailored to AI networks of all sizes. He notes that the AI-optimized EOS features speed up deployment, reduce configuration issues, provide flow-level performance analysis, and improve AI job completion times for AI clusters of any scale.

Availability

The 7060X6 is currently available, while the 7800R4-AI and 7700R4 DES are undergoing customer testing and are slated for release in the second half of 2024.

Conclusion:

Arista’s unveiling of the Etherlink AI platforms marks a notable advance in AI networking. With support for AI clusters of widely varying sizes, efficient one- and two-tier topologies, and advanced visibility features, Arista is well positioned to capture a significant share of the AI networking market, offering customers strong performance, scalability, and reliability.

Source