AI chipmakers forced toward a quick exit – Techtime

by time news

February 21, 2023

Venture capital funds are holding back investments, most companies are concentrated on a single aging technology, and customers expect the level of support they are used to getting from Nvidia. According to Omdia, the startups are beginning to look for a buyer – and fast.

Startup companies that develop artificial intelligence (AI) chips are entering an era of uncertainty and difficulty reaching the market. As a result, they will soon be forced to pursue a business alternative, the most common being an exit through a sale to one of the industry giants. So estimates the research firm Omdia in a new report on the chip industry. According to Omdia, between 2018 and 2022 venture capital funds worldwide invested about $6 billion in startups focused on artificial intelligence chips – but that era is over.

According to the report, “The transition from a market suffering from a shortage of components to a market suffering from a surplus of components, the shifts in monetary policy worldwide, and the economic crisis that began to develop in 2022 have changed the economic climate and made it harder to raise venture capital.” While these are problems every startup faces, the AI field carries an additional difficulty. Omdia senior analyst Alexander Harrowell explained: “Even the AI companies that enjoy the best financial backing are now required to provide software support at the high level that the market has become accustomed to receiving from Nvidia. This is a very large entry barrier that makes it difficult for companies to reach the market.”

The industry's capital is with the cloud companies

Accordingly, the research firm predicts that leading companies in the AI chip field will decide to exit this year, most likely by selling themselves to one of the cloud giants or one of the major chip manufacturers: “Apple has $23 billion in available capital and Amazon has $35 billion, while chip manufacturers such as Nvidia, Intel, and AMD each have about $10 billion available for investment. The large cloud companies (hyperscalers) have already shown that they are interested in integrating dedicated AI components, and that they can afford to purchase these capabilities.”

The technological trap of CGRA

It is interesting to note that about half of all the capital raised (out of the $6 billion total) has gone to just one technology: Coarse-Grained Reconfigurable Array (CGRA) acceleration components. These are accelerators that work alongside the central processing unit (CPU) and are built on large arrays of parallel processing elements, reminiscent of the ALU-like blocks in programmable FPGA devices.

In most cases they are designed with the aim of loading the entire artificial intelligence model onto the component. Today, however, there are doubts about the effectiveness of this strategy, especially in light of the continuous growth in the size of these models. Harrowell: “In 2018-2019 it made sense to try to load an entire model onto a single chip, because the chips worked at very high speed and this solved the models' input/output problems.” The dramatic growth in model size, however, creates scalability problems. The newer models are more sophisticated and complex, and therefore demand extensive programming support alongside their CPU components. “It may be that the future of artificial intelligence chips lies elsewhere.”
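The scalability problem Harrowell describes can be illustrated with a back-of-the-envelope calculation (this sketch is not from the article; the on-chip memory figure and the example model sizes are assumptions chosen only to show orders of magnitude): the weights of a model stored at FP16 need roughly two bytes per parameter, while on-chip memory on an accelerator is measured in hundreds of megabytes at most.

```python
# Illustrative sketch: why "fit the whole model on one chip" stops
# scaling as models grow. All figures are rough assumptions.

def model_footprint_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Memory needed to hold all weights (FP16 = 2 bytes per parameter)."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# Hypothetical on-chip SRAM budget for a large accelerator: 300 MB.
ON_CHIP_SRAM_GB = 0.3

# Rough parameter counts at three scales (billions of parameters).
for params_b in [0.11, 1.5, 175]:
    need = model_footprint_gb(params_b)
    verdict = "fits" if need <= ON_CHIP_SRAM_GB else "does not fit"
    print(f"{params_b:>6.2f}B params -> {need:6.1f} GB of weights ({verdict} on-chip)")
```

Under these assumptions, a circa-2018 model of about a hundred million parameters fits comfortably on-chip, while models in the billions or hundreds of billions of parameters exceed on-chip memory by one to three orders of magnitude – which is the entry barrier the strategy runs into.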
