The emergence of AI chips has begun to reshape the data centers run by cloud providers. Previously, artificial intelligence training and inference workloads were handled almost exclusively by large data centers. But with the latest advances in the semiconductor industry, AI chips can not only power data centers but also enable edge devices such as assisted-driving vehicles and portable MRI machines.
While the AI chip market is still in its early stage, many experts believe it will boom over the next five years. According to a leading analyst firm, the AI chip market will reach approximately $129 billion in 2025, three times its size in 2018. Of this, AI memory devices alone will account for $60 billion and processors for $68 billion, both up from their 2019 levels. These estimates cover semiconductors in systems with AI capabilities, including both the memory and the processing devices in those systems.
"Semiconductors represent the foundation of the AI supply chain, providing the essential processing and memory capabilities required for every AI application on earth. AI is already propelling massive demand growth for microchips", said Luca De Ambroggi, senior research director for AI at IHS Markit, who made the above forecasts.
According to De Ambroggi, the major players in this field include Intel and Nvidia. Intel stated last week that it recorded $3.8 billion in AI revenue globally in 2019 and expects that figure to reach $25 billion by 2024. Intel CEO Bob Swan told analysts that the market for artificial intelligence silicon is growing rapidly.
In preparing the forecast, De Ambroggi analyzed 50 different AI chip companies, a field he expects to narrow to about 10 in the coming years. Those companies include startups and non-traditional players such as Habana, Graphcore, Cambricon Technology, Cerebras, Kalray, NovuMind, ThinCI, Gyrfalcon Technology, Syntiant, GreenWaves, Horizon Robotics, and Wave Computing. Some of these companies make microprocessors (MPUs), microcontrollers (MCUs), graphics processing units (GPUs), digital signal processors (DSPs), and field-programmable gate arrays (FPGAs).
De Ambroggi further explained, "Old definitions of what makes an MPU, DSP, or MCU are beginning to blur in the AI era. With processor makers offering turnkey solutions using ASICs and SoCs, it makes less difference to system designers whether their algorithm is executed on a GPU, CPU, or DSP."
AI workloads typically demand enormous amounts of compute power and memory bandwidth. To meet that demand, according to De Ambroggi, the latest designs move memory closer to the computational core, enabling parallelism by giving each processing core its own dedicated memory cells. These pressures have pushed AI chipmakers toward different approaches: moving the early stages of data computation into the memory itself, a technique called processing in memory (PIM), or adopting new memory technologies that allow back-end silicon integration or non-volatile operation.
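The intuition behind PIM is that fetching an operand from off-chip memory can cost far more energy than the arithmetic performed on it, so computing where the data resides saves most of the budget. The Python sketch below is a minimal back-of-the-envelope model of that reasoning; every constant (the per-access and per-operation energy figures, and the fraction of traffic a PIM design still moves) is a hypothetical assumption for illustration, not a measurement of any real chip.

```python
# Back-of-the-envelope model of why chipmakers push compute toward memory.
# A conventional design shuttles every operand from DRAM to the compute
# core; a processing-in-memory (PIM) design consumes most operands where
# they reside. All constants below are illustrative assumptions.

ENERGY_DRAM_ACCESS_PJ = 640.0   # assumed energy to fetch one operand off-chip
ENERGY_MAC_PJ = 1.0             # assumed energy for one multiply-accumulate
PIM_TRAFFIC_FACTOR = 0.1        # assumed share of operands a PIM design still moves


def layer_energy_pj(num_macs: int, operands_moved: int, pim: bool = False) -> float:
    """Estimate energy (pJ) for one layer: arithmetic plus data movement."""
    movement = operands_moved * ENERGY_DRAM_ACCESS_PJ
    if pim:
        movement *= PIM_TRAFFIC_FACTOR  # most operands never leave the memory array
    return num_macs * ENERGY_MAC_PJ + movement


# A small fully connected layer: a 1024 x 1024 weight matrix applied to one vector.
macs = 1024 * 1024
moved = 1024 * 1024 + 2 * 1024  # weights plus the input and output vectors

print(f"conventional: {layer_energy_pj(macs, moved):.3e} pJ")
print(f"PIM-style:    {layer_energy_pj(macs, moved, pim=True):.3e} pJ")
```

Under these assumed numbers, data movement dominates the conventional design's energy by orders of magnitude, which is why both directions De Ambroggi describes target the memory system rather than the arithmetic itself.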
"As of today, there are enough business cases for taking both directions, all with their own advantages and challenges. AI is still in its infancy. Moreover, AI grows and develops at an extremely high speed in different directions in all industries", said De Ambroggi.