Google claims the supercomputers it uses to train its artificial intelligence (AI) models are faster and more energy-efficient than comparable systems from chipmaker Nvidia.
Google researchers detailed how they assembled a supercomputer from more than 4,000 fourth-generation Tensor Processing Units (TPUs), linked by custom optical switches.
Training partitions the AI models across thousands of chips, which must work together for weeks or longer.
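The idea of segmenting a model across many chips can be illustrated with a minimal sketch. Here NumPy arrays stand in for per-chip memory: each simulated device holds one column shard of a layer's weight matrix, computes a partial output, and the shards are gathered back together. The device count and matrix sizes are illustrative assumptions, not Google's actual configuration.

```python
import numpy as np

# Simulated model parallelism: shard one layer's weights across devices.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))        # activations (batch, features)
W = rng.standard_normal((16, 32))       # full layer weight matrix

n_devices = 4                           # hypothetical chip count
shards = np.split(W, n_devices, axis=1) # each chip stores one weight shard

# Each chip multiplies the same activations by its own shard...
partials = [x @ w for w in shards]
# ...and the per-chip outputs are gathered (concatenated) across chips.
y_sharded = np.concatenate(partials, axis=1)

# The sharded computation matches the single-device result.
assert np.allclose(y_sharded, x @ W)
```

In a real TPU pod the gather step is a collective operation over the interconnect, which is why the switching fabric the researchers describe matters for training speed.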
Google's Norm Jouppi and David Patterson explained, "Circuit switching makes it easy to route around failed components. This flexibility even allows us to change the topology of the supercomputer interconnect to accelerate the performance of an ML (machine learning) model."
Google says its new supercomputer is up to 1.7 times faster and 1.9 times "greener" than a system based on Nvidia's A100 chip.
From The Tech Portal (India)
Abstracts Copyright © 2023 SmithBucklin, Washington, D.C., USA