• Competition in the AI chipset space is heating up; new players are looking to join the fray and they are raising impressive amounts of capital.
• New vendors face stiff competition from tech heavyweights such as Nvidia, hyper-scale cloud providers such as Google and Amazon, and well-funded Chinese organizations.
Just as the market for AI platforms is heating up, so is competition in the AI chipset space. And it isn’t only large, well-established competitors such as Nvidia, Google, and Huawei vying for market share. New players are looking to join the fray as well, and they are raising impressive amounts of capital. Untether, a Toronto-based chip start-up, announced in early November that it had raised $20 million in series A funding. The one-year-old company plans to release a chip designed for AI inference that uses a near-memory design: by shortening the distance data must travel, it can move data to processors at 2.5 petabits per second, improving overall processing efficiency. Across the pond, Graphcore, a UK-based organization, has raised a substantial $200 million to develop its Intelligence Processing Units (IPUs), parallel processors designed for machine learning.
Although new entrants have raised impressive amounts of capital in a highly lucrative market, the road ahead won’t be easy. They face stiff competition from tech heavyweights such as Nvidia, Intel, and Qualcomm, which have traditionally dominated the chip space. Furthermore, they will be competing with the AI processing solutions offered by hyper-scale cloud providers: Google is on its third generation of TPUs and has created TPU pods to further accelerate processing, and Amazon has announced that it has its own chip in the works, AWS Inferentia.
And as if the landscape weren’t daunting enough, new players face competition from well-funded Chinese organizations that are often supported by a government that views AI as a matter of national security and is building its own self-sufficient AI ecosystem. Large companies such as Baidu, Huawei, and Alibaba have recently launched AI-optimized chips, and several Chinese start-ups, such as Horizon Robotics and Cambricon, are looking to enter the market with lower-cost AI chips designed for edge and mobile deployments.
Given the competitive environment, is there even room for another AI chip manufacturer? Can a newcomer build enough momentum via technical differentiation or pricing incentives to pull customers away from current market leaders and gain a foothold in the space? And how will developments in quantum computing and quantum simulation affect demand? Much remains to be seen, but what is crystal clear is that market trends around AI point to a need for flexibility. Cloud-based processing has its advantages when it comes to cost and scale, but not all organizations will embrace this model: many companies will want to maintain their own AI processing infrastructure and will require affordable AI processing chipsets. Furthermore, organizations are increasingly evaluating AI at the edge, which will require processing near the site of data collection. Against this backdrop, demand for innovations that streamline and lower the cost of AI deployments is only going to gain momentum.