Google AI Chips Gang Up To Speed Up Training Process
Dhir Acharya
As announced at its I/O conference, Google now allows combining multiple TPU chips to boost performance.
If you feel that Google's data centers are limiting your AI ambitions, the tech giant - whose parent company is Alphabet Inc. - will now let you combine multiple TPU (tensor processing unit) chips to boost performance.
According to the company’s announcement at its I/O conference on Tuesday, the Google Cloud service now offers users TPUs linked together into “pods.” This speeds up the training phase of AI - the process in which AI systems learn to detect patterns in real-world data.
With faster AI training on these larger systems, customers can build more complex models and also update their models more frequently with new data.
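To give a sense of how a Cloud TPU is provisioned in practice, here is a minimal sketch using the `gcloud` CLI. The node name, zone, and accelerator type are illustrative assumptions, not details from the announcement; larger pod slices use bigger accelerator types, and availability depends on your project and quota.

```shell
# Hypothetical sketch: requesting a Cloud TPU node (names and zone are examples).
# Assumes the Cloud TPU API is enabled on your Google Cloud project.
gcloud compute tpus create my-training-tpu \
  --zone=us-central1-a \
  --accelerator-type=v3-8 \
  --version=1.13 \
  --network=default
```

A training job then points its framework (for example, TensorFlow) at the created TPU node by name; scaling to a pod slice is, in this sketch, a matter of requesting a larger `--accelerator-type`.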
As processor progress under Moore’s Law has slowed, many companies are shifting to building AI chips or integrating AI capabilities into their existing processors. These include giants Apple and Google; chip powerhouses such as Qualcomm, Intel, and Nvidia; startups such as Flex Logix and Wave Computing; and other players such as Tesla, the carmaker that built its own AI chip to power self-driving features in the Model S, Model X, and Model 3.
At its annual I/O conference this year, Google leaned heavily on AI and showed off many new uses for the technology. The move mirrors what Facebook and Apple have done: pushing AI processing off servers and onto devices such as home hubs and phones. This both relieves Google’s servers and better protects user privacy.
Small devices can handle AI tasks such as face and speech recognition, but training AI requires massive computing systems - the kind found at giant cloud players such as Google, Microsoft, and Amazon.