Google may be best known for its software, but when it comes to powering this new generation of AI computing, the company is equally invested in developing and deploying hardware.
The company’s custom TPU silicon provides the energy efficiency needed to deploy machine learning at cloud scale, and it offers notably higher performance for these specific tasks than more generalized CPU and GPU hardware. We’re seeing a similar trend in the mobile space, with SoC manufacturers increasingly turning to dedicated DSP hardware to run these mathematically intensive algorithms efficiently. Google could become a major hardware player in this market too.
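The mathematically intensive workload these chips accelerate is, at its core, dense matrix multiplication, which dominates neural-network inference and training. As an illustrative sketch only (a TPU executes this in a dedicated hardware matrix unit rather than in software loops), here is what that workload looks like in plain Python:

```python
def matmul(a, b):
    """Naive O(n^3) matrix multiply: c[i][j] = sum over k of a[i][k] * b[k][j].

    This triple loop is the operation that dedicated AI silicon
    (TPUs, mobile DSPs) is designed to perform massively in parallel.
    """
    rows, inner, cols = len(a), len(b), len(b[0])
    assert len(a[0]) == inner, "inner dimensions must match"
    c = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for k in range(inner):
            aik = a[i][k]
            for j in range(cols):
                c[i][j] += aik * b[k][j]
    return c

# A tiny neural-network layer computes activations = inputs x weights:
inputs = [[1, 2], [3, 4]]
weights = [[5, 6], [7, 8]]
print(matmul(inputs, weights))  # [[19, 22], [43, 50]]
```

On a CPU this work is serialized over a handful of cores; specialized hardware instead performs thousands of these multiply-accumulate steps per cycle, which is where the performance and efficiency gains come from.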
We’re still waiting to see what Google has in store for its first-generation smartphone AI hardware, the Pixel Visual Core. The chip will soon be switched on for faster HDR processing and will no doubt play a role in further AI features that the company rolls out to its Pixel 2 smartphones. For now, Google is leading the way with its Cloud TPU hardware and TensorFlow software support. It’s worth remembering that Intel, Microsoft, Facebook, Amazon, and others are all vying for a piece of this quickly emerging market too.
With machine learning and neural networks powering an increasing number of applications both in the cloud and on edge devices like smartphones, Google’s early hardware efforts have positioned the company to be a leader in this next-generation field of computing.
By Robert Triggs on 25 Nov, 2017