Amazon.com is developing a chip designed for artificial intelligence to work on the Echo and other hardware powered by Amazon’s Alexa virtual assistant, says a person familiar with Amazon’s plans. The chip should allow Alexa-powered devices to respond more quickly to commands by handling more data processing on the device rather than in the cloud.
The effort makes Amazon the latest major tech company, after Google and Apple, to design its own AI chips, in hopes of differentiating its products from those of rivals. That strategy has major ramifications for chip companies like Intel and Nvidia, which now find themselves competing with companies that were previously their customers.
• Amazon working on AI chip for Alexa devices
• AWS may also be working on AI chip for data centers
• Chip efforts reduce reliance on outside suppliers
The moves by Amazon and Google hark back to tech’s earlier days, when companies like IBM and Sun Microsystems did everything themselves, from designing their own chips to writing their own software. For the past few years, Apple has also been pushing more aggressively into making its own chips to differentiate itself from the competition and lessen its reliance on outside partners.
Amazon has quietly developed chip-design capabilities over the past two years, through both acquisitions and hiring. It started in 2015 with the $350 million acquisition of Annapurna Labs, a secretive Israeli chipmaker. Amazon said nothing at the time about its plans for Annapurna, although Annapurna said in 2016 it was making a line of chips, called Alpine, for data storage gear, Wi-Fi routers, smart home devices and media streaming devices.
But Annapurna is now working on the AI chip for Alexa-powered devices, said the person familiar with Amazon’s plans.
Amazon declined to comment.
Amazon’s purchase of security camera maker Blink last December, for an undisclosed amount, also beefed up its chip design capabilities. Blink was founded in 2008 as Immedia Semiconductor, a builder of custom processors for low-power video compression. After finding it difficult to make money selling chips to others, the company started building its own cameras powered by those chips.
Including Blink, Amazon has 449 people with chip expertise already on the payroll in the United States, according to research firm Paysa, with another 377 job openings in the field. Amazon has hired from all over, particularly Intel, Microsoft, Apple and Qualcomm. These numbers don’t include Annapurna, which has about 120 employees mostly in Israel, according to LinkedIn. Annapurna is hiring more talent for its Haifa, Israel, operation.
Developing a chip that can run AI algorithms could make Alexa-powered devices more appealing to consumers. It would mean the devices could handle some of the AI processing instead of shipping everything to the cloud.
Currently, the Echo has a relatively simple chip to allow the device to pick up the wake word “Alexa.” After the wake word is detected, what a person asks Alexa is sent to Amazon’s servers in the cloud to be processed. That creates a slight delay in the response and also makes it possible for hackers to intercept the communication. Amazon says the data being sent to the cloud is encrypted, reducing the risk of it being hacked.
The device would still need to talk to the cloud for more complex tasks and accessing online services, such as playing music. But simple tasks, such as checking the time, could be handled on the device.
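The split the article describes, with the wake word and simple requests handled on the device and everything else sent to the cloud, can be sketched roughly as follows. This is a purely illustrative toy, not Amazon’s implementation; all names, intents and routing rules are hypothetical.

```python
import datetime

# Hypothetical sketch of hybrid on-device/cloud routing: a small
# on-device model answers simple intents locally, and everything
# else is forwarded to the cloud. Intents here are illustrative.

LOCAL_INTENTS = {"time", "timer", "volume"}  # simple tasks a device chip could handle


def classify_intent(utterance: str) -> str:
    """Toy keyword classifier; a real device would run an on-chip AI model."""
    text = utterance.lower()
    if "timer" in text:          # check "timer" before "time" to avoid a substring clash
        return "timer"
    if "time" in text:
        return "time"
    if "volume" in text:
        return "volume"
    return "cloud"               # complex requests need full cloud processing


def handle(utterance: str) -> str:
    intent = classify_intent(utterance)
    if intent in LOCAL_INTENTS:
        if intent == "time":
            return datetime.datetime.now().strftime("It's %H:%M")
        return f"Handled '{intent}' on the device"
    return "Forwarded to cloud"  # e.g. playing music or answering open questions


print(handle("what time is it"))
print(handle("play some jazz"))
```

The point of the design is latency: a local answer skips the network round trip entirely, which is why moving even a subset of requests on-device can make the assistant feel noticeably faster.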
“The experience can be so much better if you move a substantial part of the speech recognition to the device,” said Chris Rowen, a chip industry veteran and CEO of speech processing startup Babblabs. “It will also allow hardware and power costs to go down dramatically. I think it’s absolutely the natural thing to do.”
Data Center Chips
Amazon is also hiring chip engineers in its Amazon Web Services unit, and industry executives say there are indications it may be designing an AI chip for servers in AWS’ data centers. AWS has hired at least nine former engineers from defunct chip startup Tabula, according to LinkedIn profiles.
Founded in 2003, Tabula worked on a new design of a chip called a field-programmable gate array (or FPGA), which can be reprogrammed on the fly. It shut down in 2015. AWS currently uses FPGA chips made by Xilinx in the servers in its data centers.
One of the ex-Tabula engineers at AWS, Randy Huang, whose title is principal engineer, is leading an AI project there and looking to hire additional people, says a person familiar with Amazon’s plans. Mr. Huang was a senior engineer at Tabula who led its chip architecture team, said the person. Mr. Huang describes his expertise on LinkedIn as reformulating deep learning to take advantage of FPGAs’ strengths. He didn’t respond to a message sent through LinkedIn.
If Amazon does develop an AI chip for its data centers, it would be following in the footsteps of Google. In 2016, Google unveiled a specialized processor called the Tensor Processing Unit, which is designed to run its deep learning algorithms. Google said the chip was running a number of its services like Search, Street View, Photos and Translate.
Google had been working on this processor since 2013, as detailed in a corporate blog post: “The situation became urgent in 2013. That’s when we realized that the fast-growing computational demands of neural networks could require us to double the number of data centers we operate.”
Any move Amazon makes toward building its own data center chip poses a threat to Intel, which controls 98% of the market for the main chips inside servers, and Nvidia, which makes AI chips that work alongside Intel’s main chips in those servers.
“Players like Intel could be in a lot of trouble if this trend continues to play out,” said Geoff Tate, a long-time semiconductor entrepreneur who is currently the CEO of FPGA licensing startup Flex Logix Technologies. “At some point, data center players will build their own chips and compete with chip suppliers.”
There are also dozens of new startups emerging in the race for hardware that powers AI applications. They’re starting to attract plenty of investor interest, with a few players raising more than $100 million before even releasing a product. But the number of customers they can sell to is constrained, especially in areas like the data center. It’s unlikely the market can support more than a few of these chip startups. Still, these startups argue that they have the focus and specialization to build the best hardware.
"Just about every hyperscaler has at least one internal [AI chip] effort," said the CEO of a stealth AI chip startup. "The problem looks simple at first glance. In practice, achieving performance is much more complicated."
UPDATE: This story has been updated to include Amazon's comment about the risk of Alexa data being hacked.