COMMERCE TECH

Google Reveals a New AI Chip and a Supercomputer

Google recently revealed a new chip and a cloud-based machine-learning supercomputer.

At the company’s annual developer conference, CEO Sundar Pichai announced a new computer processor designed for deep learning. The move underscores how central AI has become to Google and signals the company’s intent to develop every relevant layer of the stack, hardware as well as software.

Called the Cloud Tensor Processing Unit (Cloud TPU), the chip is named after Google’s open-source TensorFlow machine-learning framework. Pichai also announced the creation of machine-learning supercomputers, or Cloud TPU pods, based on clusters of Cloud TPUs wired together with high-speed data connections. And he said Google was creating the TensorFlow Research Cloud, consisting of thousands of TPUs accessible over the Internet.

The new processor, the Cloud TPU, not only runs trained models at blistering speed, it also handles training far more efficiently. The calculations required to train a model are so complex that training can take days or weeks, even for a task as simple as recognizing a bag of French fries. To give some measure of the acceleration its TPUs achieve, Google has claimed that a workload requiring a full day of training on 32 of the best available GPUs can be done in an afternoon on one-eighth of a TPU pod.
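To make the training-versus-inference distinction concrete, here is a minimal sketch, assuming TensorFlow 2.x and its bundled Keras API, of training a deliberately small image classifier. The dataset, model size, and epoch count are illustrative stand-ins, not anything Google has described; real production models are orders of magnitude larger, which is what pushes training time to days or weeks.

```python
import tensorflow as tf

# Load a small benchmark dataset as a stand-in for a real training corpus.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A deliberately tiny model; production models have far more parameters.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training: the expensive phase that accelerators like the Cloud TPU target.
model.fit(x_train, y_train, epochs=5, batch_size=128)

# Inference: a comparatively cheap forward pass once the weights are learned.
predictions = model.predict(x_test[:8])
```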

“We are building what we think of as AI-first data centers,” Pichai said during his presentation. “Cloud TPUs are optimized for both training and inference. This lays the foundation for significant progress [in AI].” Pichai also announced a number of AI research initiatives during his speech.

Google announced on its blog that it will make 1,000 Cloud TPU systems available to artificial intelligence researchers willing to openly share details of their work.

Google itself uses TensorFlow to power search, speech recognition, translation, and image processing. Strategically, Google also wants to prevent another hardware company (read: NVIDIA) from becoming too dominant in the machine-learning space.

In an attempt to democratize machine learning, Google will allow researchers to design algorithms on other hardware before porting them over to the TensorFlow Research Cloud. Google now boasts that TensorFlow is the most widely used deep-learning framework in the world.
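As a rough illustration of that porting step, the sketch below assumes TensorFlow 2.x and a reachable Cloud TPU; the TPU address is a placeholder, and the model is the same toy classifier from the earlier sketch, wrapped in a distribution strategy so it runs on TPU cores instead of local CPUs or GPUs.

```python
import tensorflow as tf

# The TPU address is a placeholder; on Google Cloud it is normally supplied
# by the environment or looked up from the TPU's name.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(
    tpu="grpc://10.0.0.2:8470")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

# Building and compiling the model inside the strategy's scope is the only
# TPU-specific change; the model definition itself is unchanged.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

# A subsequent model.fit(...) call then runs across the TPU cores rather than
# on local CPUs or GPUs.
```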