Amazon’s Inferentia Chip to Handle Alexa Voice Assistant Services

Amazon recently announced that it has shifted part of the computing behind its Alexa voice assistant onto its own custom-designed chips. The company had previously handled this processing with chips from Nvidia Corporation.

By moving the workload away from Nvidia, the company aims to make it both faster and cheaper. Amazon says the switch has delivered roughly 25% lower latency – a measure of response speed – at about 30% lower cost.

The majority of the processing will now run on Amazon's 'Inferentia' chip, which is purpose-built to accelerate large volumes of machine learning inference tasks such as converting text to speech and recognizing images.
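For context, developers typically target Inferentia through the AWS Neuron SDK. The sketch below is an illustrative example only – not Amazon's internal Alexa pipeline – of compiling an off-the-shelf PyTorch image-recognition model for Inferentia-backed Inf1 instances with torch-neuron; the model choice and input shape are assumptions for the example.

```python
# Illustrative sketch: compiling a PyTorch model for AWS Inferentia via the
# Neuron SDK. This is not Amazon's Alexa workload; the model and input shape
# are assumptions chosen for the example.
import torch
import torch_neuron  # registers the torch.neuron namespace (Neuron SDK for Inf1)
from torchvision import models

# Load a standard image-recognition model in inference mode.
model = models.resnet50(pretrained=True)
model.eval()

# Trace/compile the model for the Inferentia chip using a sample input.
example = torch.rand(1, 3, 224, 224)
model_neuron = torch.neuron.trace(model, example_inputs=[example])

# Save the compiled artifact; it can then be loaded and served on an Inf1 instance.
model_neuron.save("resnet50_neuron.pt")
```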

Source: Reuters
