Google Claims Its AI Supercomputer That Uses Tensor Chips Is Faster, More Power Efficient Than Nvidia
Alphabet's Google on Tuesday released new details about the supercomputers it uses to train its artificial intelligence models, saying the systems are both faster and more power-efficient than comparable systems from Nvidia.

Google has designed its own custom chip, called the Tensor Processing Unit, or TPU. It uses those chips for more than 90 percent of the company's work on artificial intelligence training, the process of feeding data through models to make them useful at tasks like responding to queries with human-like text or generating images. The Google TPU is now in its fourth generation.

Google on Tuesday published a scientific paper detailing how it has strung more than 4,000 of the chips together into a supercomputer, using its own custom-developed optical switches to help connect individual machines.

Improving these connections has become a key point of competition among companies that build AI supercomputers, because so-called large language models that power tech