
Tensor Processing Unit

Tensor Processing Unit (TPU) is an AI accelerator application-specific integrated circuit (ASIC) developed by Google for neural network machine learning, using Google's own TensorFlow software.[2] Google began using TPUs internally in 2015, and in 2018 made them available for third-party use, both as part of its cloud infrastructure and by offering a smaller version of the chip for sale.

This article is about the chip developed by Google. For the smartphone system-on-chip, see Google Tensor. For other devices that provide tensor processing for artificial intelligence, see AI accelerator.

Designer: Google
Introduced: 2015[1]

Comparison to CPUs and GPUs

Compared to a graphics processing unit, TPUs are designed for a high volume of low precision computation (e.g. as little as 8-bit precision)[3] with more input/output operations per joule, without hardware for rasterisation/texture mapping.[4] The TPU ASICs are mounted in a heatsink assembly, which can fit in a hard drive slot within a data center rack, according to Norman Jouppi.[5]
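
As an illustration of what 8-bit computation means in practice, the sketch below quantises a handful of floating point weights to signed 8-bit integers and back. The symmetric scaling scheme and the variable names are assumptions chosen for the example, not a description of the TPU's actual hardware pipeline.

import numpy as np

# Illustrative symmetric 8-bit quantisation (an assumption for this example,
# not Google's hardware scheme): map the largest magnitude to 127.
weights = np.array([-0.8, 0.1, 0.5, 1.2], dtype=np.float32)
scale = np.abs(weights).max() / 127.0
quantised = np.round(weights / scale).astype(np.int8)
dequantised = quantised.astype(np.float32) * scale

print(quantised)     # int8 values, e.g. [-85  11  53 127]
print(dequantised)   # approximate reconstruction of the original floats

The rounding error visible in the reconstructed values is the precision traded away for higher throughput and energy efficiency.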


Different types of processors are suited to different types of machine learning models. TPUs are well suited to convolutional neural networks (CNNs), while GPUs have benefits for some fully connected neural networks, and CPUs can have advantages for recurrent neural networks (RNNs).[6]

History

The tensor processing unit was announced in May 2016 at Google I/O, when the company said that the TPU had already been used inside its data centers for over a year.[5][4] The chip has been specifically designed for Google's TensorFlow framework, a symbolic math library which is used for machine learning applications such as neural networks.[7] However, as of 2017 Google still used CPUs and GPUs for other types of machine learning.[5] Other vendors' AI accelerator designs are also appearing, aimed at embedded and robotics markets.


Google's TPUs are proprietary. Some models are commercially available, and on February 12, 2018, The New York Times reported that Google "would allow other companies to buy access to those chips through its cloud-computing service."[8] Google has said that TPUs were used in the AlphaGo versus Lee Sedol series of man-machine Go games,[4] as well as in the AlphaZero system, which produced chess, shogi and Go playing programs from the game rules alone and went on to beat the leading programs in those games.[9] Google has also used TPUs for Google Street View text processing, and was able to find all the text in the Street View database in less than five days. In Google Photos, an individual TPU can process over 100 million photos a day.[5] TPUs are also used in RankBrain, which Google uses to provide search results.[10]


Google provides third parties access to TPUs through its Cloud TPU service as part of the Google Cloud Platform[11] and through its notebook-based services Kaggle and Colaboratory.[12][13]
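
For illustration, a minimal sketch of how a notebook user might attach TensorFlow 2.x to a Cloud TPU is shown below; the empty tpu="" argument is typical of Colab-style environments, and the exact setup is an assumption that varies by service and runtime version.

import tensorflow as tf

# Minimal sketch: connect a TensorFlow 2.x runtime to a Cloud TPU.
# The empty tpu="" argument is what Colab-style notebooks typically use;
# other environments may need an explicit TPU name or address.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# A TPUStrategy replicates computation across the TPU's cores.
strategy = tf.distribute.TPUStrategy(resolver)
print("TPU cores available:", strategy.num_replicas_in_sync)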

Lawsuit

In 2019, Singular Computing, founded in 2009 by Joseph Bates, a visiting professor at MIT,[53] filed suit against Google alleging patent infringement in TPU chips.[54] By 2020, Google had successfully lowered the number of claims the court would consider to just two: claim 53 of US 8407273, filed in 2012, and claim 7 of US 9218156, filed in 2013, both of which claim a dynamic range of 10⁻⁶ to 10⁶ for floating point numbers, which the standard float16 cannot cover (without resorting to subnormal numbers) as it has only five bits for the exponent. In a 2023 court filing, Singular Computing specifically pointed to Google's use of bfloat16, which exceeds the dynamic range of float16.[55] Singular claimed that non-standard floating point formats were non-obvious in 2009, but Google responded that the VFLOAT[56] format, with a configurable number of exponent bits, existed as prior art in 2002.[57] As of January 2024, subsequent lawsuits by Singular had brought the number of patents being litigated up to eight. Towards the end of the trial later that month, Google agreed to a settlement with undisclosed terms.[58][59]
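
The numerical point at issue can be checked directly: IEEE float16, with five exponent bits, cannot represent normal values across the claimed 10⁻⁶ to 10⁶ range, whereas bfloat16 reuses float32's eight exponent bits and can. The sketch below uses NumPy's float16 and float32 limits (float32 standing in for bfloat16's exponent range) to illustrate this.

import numpy as np

# float16 (IEEE half precision, 5 exponent bits)
f16 = np.finfo(np.float16)
print(f16.max)   # 65504.0   -- largest finite value, below 1e6
print(f16.tiny)  # ~6.1e-05  -- smallest normal value, above 1e-6

# bfloat16 keeps float32's 8 exponent bits, so its normal range matches
# float32 (roughly 1.2e-38 to 3.4e38) and covers 1e-6 .. 1e6 comfortably.
f32 = np.finfo(np.float32)
print(f32.max, f32.tiny)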

See also

Cognitive computer
AI accelerator
Structure tensor – a mathematical foundation for TPUs
Tensor Core – a similar architecture by Nvidia
TrueNorth – a similar device simulating spiking neurons instead of low-precision tensors
Vision processing unit – a similar device specialised for vision processing

External links

Cloud Tensor Processing Units (TPUs) (documentation from Google Cloud)
Photo of Google's TPU chip and board
Photo of Google's TPU v2 board
Photo of Google's TPU v3 board
Photo of Google's TPU v2 pod