Google has a speedy new AI chip it doesn’t really want to talk about

Google yesterday confirmed rumors that it has been working on a custom chip designed to speed up computing related to its artificial intelligence efforts.

The result, it said at its I/O developer conference, is a chip it calls a Tensor Processing Unit. It’s designed to work with TensorFlow, an open source software library for developing AI applications.

The TPU chips, Google says, are designed to be built into its existing computing infrastructure and are already in use boosting the performance of services like Street View and voice recognition. They also played a part in Google’s AlphaGo software that defeated the human champion at the board game Go.

Naturally, engineers and chip experts around the world have a lot of questions about this new chip. We asked Google for some answers but were told there are “more details coming later this year.”

1. Is the TPU pre-trained?

Artificial intelligence is closely linked to a science called machine learning, which is exactly what it sounds like: it takes millions of examples of data to train a computer to recognize patterns. “For example, if you want a computer to recognize pictures, you have to show it literally millions of pictures, and a human has to check the answers,” said Pat Moorhead, head of research firm Moor Insights & Strategy. Once the training is done, the AI system executes based on what it has “learned.” Examples of these “pre-trained” chips include IBM's TrueNorth and the Fathom, developed by the startup Movidius.
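The train-then-execute split Moorhead describes can be sketched in a few lines of code. This is a toy perceptron, purely illustrative and in no way Google's code: a "training" phase adjusts weights by checking answers against labeled examples, and an "inference" phase then runs with those weights frozen, which is all a pre-trained chip would need to do.

```python
def train(examples, epochs=20, lr=0.1):
    """Learn weights from labeled (inputs, label) pairs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred          # a "human-checked answer"
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    """Inference only: apply frozen weights, no further learning."""
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Training phase: show the model labeled examples (logical OR here).
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train(data)

# Execution phase: weights are fixed, like a chip that only runs inference.
results = [predict(w, b, x1, x2) for (x1, x2), _ in data]
```

The open question is whether the TPU handles only the second phase, or the far more memory- and compute-hungry first phase as well.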

2. Can the TPU be re-programmed if AI algorithms change?

Google referred to the chip as an ASIC (pronounced “A-sick”), which in the nomenclature of the chip industry stands for application-specific integrated circuit. ASICs are like the Kentucky Fried Chicken of chips: they are designed to do one thing, and only one thing, really well. That function is hard-coded directly into the circuitry of the chip, which means that if the requirements change, you have to redesign the chip itself and manufacture new ones, a process that can take months. Networking giant Cisco Systems uses ASICs in its routers, and ASICs can often be found in smartphones handling video and audio functions.

An ASIC is also a step up on the taxonomy of chips from an FPGA, or field-programmable gate array, which is essentially a chip that can be re-programmed to do specialized tasks. (Last year, Intel spent nearly $17 billion to buy an FPGA company, Altera.) Logically, the algorithms associated with AI applications will be subject to change over time at Google. Microsoft has been using FPGA chips to enhance the AI capabilities of its Bing search engine. So why not use an FPGA?
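A rough software analogy may help, with the caveat that real hardware is vastly more involved: an ASIC behaves like a function whose logic is fixed when it is written, while an FPGA builds logic out of lookup tables (LUTs) whose contents can be reloaded in the field. The sketch below is hypothetical and illustrative only.

```python
def asic_and(a, b):
    # Hard-wired: changing this behavior means writing a new function,
    # just as changing an ASIC means fabricating a new chip.
    return a & b

class Lut2:
    """A 2-input lookup table, the basic logic element of an FPGA."""
    def __init__(self, truth_table):
        # Outputs for inputs (0,0), (0,1), (1,0), (1,1), in that order.
        self.table = truth_table

    def reprogram(self, truth_table):
        # "Field-programmable": load new logic without new silicon.
        self.table = truth_table

    def __call__(self, a, b):
        return self.table[(a << 1) | b]

lut = Lut2([0, 0, 0, 1])     # configured as AND
and_out = lut(1, 1)          # matches the hard-wired ASIC
lut.reprogram([0, 1, 1, 0])  # reconfigured as XOR; no refabrication
xor_out = lut(1, 1)
```

The trade-off, broadly, is that the ASIC's fixed wiring is faster and more power-efficient, while the FPGA's flexibility absorbs algorithm changes, which is why the question of which Google needed is interesting.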

3. Will TPUs work only with TensorFlow?

TensorFlow is one of several AI software libraries. Will the chip work with only that one?

4. Could several TPUs be connected in a system to work together?

This is common for other chips. Could several TPUs work on especially complicated AI problems together, or even teamed up with other chips?

5. In the server rack, why is the TPU inserted into a hard drive slot?

Putting a chip near the hard drives, rather than closer to a server's main computing engine (typically an Intel Xeon processor), seems to place the TPU away from where the computing action is.

6. Where is the memory?

There’s probably a lot that’s obscured by the large metal heatsink, the TPU’s most prominent visual feature, which conducts heat away from the chip itself. Given the apparent size of the component Google has displayed, there doesn’t seem to be room for much memory, Moorhead said. “If you’re doing any training, you need a lot of memory,” which sends us back to question No. 1.

7. Where is the chip being built?

Google isn’t a chip company, and unless it’s been hiding one, it doesn’t have a chip factory — typically called a fab — where this chip could have been built. Google has the resources to design it, but it would have farmed out the manufacturing to a foundry company that builds chips under contract, probably Taiwan Semiconductor Manufacturing or GlobalFoundries. So which is it?

The Google rep remained mum.


May 19, 2016
