The Zhitong Finance App reports that IDC has published an article noting that, to date, most investor attention has focused on semiconductor companies in the AI value chain, and with good reason. In 2022, the IT infrastructure market for AI semiconductors was only $42.1 billion. IDC estimates that in 2023 this market grew 64% to $69.1 billion, and it expects growth to accelerate to 70% in 2024, bringing the market to $117.5 billion. By the end of 2027, IDC predicts the IT infrastructure market for AI semiconductors will reach $193.3 billion, a five-year compound annual growth rate (CAGR) of 35.7%.
Note: The chart above shows only the semiconductor-related market within IT infrastructure; it excludes chips in terminal devices such as mobile phones and PCs.
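The growth figures above can be cross-checked with quick arithmetic. A minimal sketch in Python, using only the dollar amounts cited in the text (small differences against the cited figures come from rounding):

```python
# Cross-check of IDC's figures for the AI-semiconductor IT-infrastructure
# market. Dollar amounts are in billions and come from the text above.
base_2022 = 42.1
est_2023 = base_2022 * 1.64       # 64% growth in 2023
est_2024 = est_2023 * 1.70        # 70% growth in 2024

print(f"2023: ${est_2023:.1f}B")  # ~ $69.0B, vs. the cited $69.1B
print(f"2024: ${est_2024:.1f}B")  # ~ $117.4B, vs. the cited $117.5B

# Five-year CAGR from 2022 ($42.1B) to 2027 ($193.3B)
cagr = (193.3 / 42.1) ** (1 / 5) - 1
print(f"CAGR: {cagr:.1%}")        # ~ 35.6%, consistent with the cited 35.7%
```

The compounded 2023 and 2024 estimates and the five-year CAGR all line up with the article's numbers to within rounding.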
Although AI training growth has so far occurred mainly in data centers, we expect the next wave to arrive as networks continue to be virtualized and AI is integrated into network-infrastructure workloads. After that, the edge will be the next major opportunity: by the end of the forecast period, the number of connected edge devices, already in the billions, will double, generating most of the data that drives AI inference.
AI in servers
Looking at the server market as a whole, there are currently two very different trends.
In 2023, continued macroeconomic pressure slowed enterprise investment, and global server shipments fell 19.4% to 12 million units. Over the same period, however, GPU servers grew rapidly as hyperscalers and large cloud service providers expanded their GPU deployments. Driven largely by AI demand, the average selling price (ASP) rose 37.1% to $11,376. The higher ASP more than offset the decline in unit sales, lifting the overall market 10.5% to $136 billion.
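The shipment, ASP, and market-size figures above are mutually consistent; a quick sanity check, with both inputs taken from the paragraph above:

```python
# Rough consistency check: 2023 server market ~ shipments x ASP
units_2023 = 12_000_000          # global server shipments (from the text)
asp_2023 = 11_376                # average selling price, in dollars
revenue = units_2023 * asp_2023

print(f"${revenue / 1e9:.1f}B")  # ~ $136.5B, in line with the cited $136B
```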
As for the AI server market, IDC estimates that AI servers accounted for about 23% of the total server market in 2023, and that this share will continue to rise. By 2027, we expect AI server market revenue to reach $49.1 billion. A key assumption is that GPU-accelerated servers will continue to gain revenue share among all accelerator types.
Application of AI in terminal devices
Although most AI workloads currently run in the cloud and in data centers, there are good reasons to run AI locally on devices such as PCs and smartphones, chiefly the following three:
Privacy and security: Businesses concerned about uploading sensitive data to the public cloud may choose to store the data locally to retain more control.
Cost savings: Running AI workloads locally over the life of the device can limit the need for expensive cloud subscriptions or pay-per-use resources, making it more cost-effective.
Performance and latency: Running AI on the device eliminates the latency of shuttling workloads between the device and the cloud.
AI PC Market
Although the market is still in its infancy, IDC expects AI PC shipments to rise sharply starting in 2024 and to become mainstream by 2027. According to preliminary forecasts, AI PC shipments in 2024 will be close to 54.2 million units, about 21% of the total PC market. By 2028, we expect this share to rise to nearly 60%, which would put AI PC shipments at 166.4 million units.
In China, AI PCs are expected to account for 55% of the overall PC market in 2024 and 85% in 2027. AI PCs will grow fastest in the SME market, exceeding 60% thanks to lightweight IT deployment and cloud demand; the consumer market will also expand quickly, driven by scenarios such as gaming, learning, and creation, reaching 55.4% in 2024; large enterprises, constrained by security and other factors, will deploy more slowly at first, but AI PCs are expected to reach 76.8% of the large-customer market by 2027.
Unlike overseas markets, competition among Chinese AI PC makers centers on each vendor's on-device intelligence capabilities, large-model platforms, and the surrounding ecosystem. This makes the integration of model compute, related software, and intelligent hardware that each manufacturer delivers to users essential. As on-device demand grows, local compute requirements will rise as well. Starting in the second half of 2024, NPU chips offering 40 TOPS or more will be released one after another. Beyond Intel, AMD, and Apple, Qualcomm will also release processors and related products with high NPU performance. More manufacturers may enter the chip competition, and at the same time market demand for NPU compute will grow further.
AI smartphone market
Today, the vast majority of smartphones on the market integrate some degree of AI functionality, especially for photography. IDC defines a "next-generation AI phone" as a device whose system-on-chip (SoC) includes a neural processing unit (NPU) delivering at least 30 trillion operations per second (TOPS), enabling generative AI models to run more efficiently on the device.
According to our preliminary estimates, 170 million AI phones will ship in 2024, about 15% of total global smartphone shipments and more than triple the 2023 figure. As more concrete application scenarios emerge, we expect the share of AI smartphones to keep rising.
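The forecast above implies both a rough global smartphone total and a 2023 baseline. A small sketch of the implied arithmetic; the inputs come from the text, while the derived values are back-of-the-envelope inferences, not IDC figures:

```python
# Implied totals behind the AI-phone forecast (inputs from the text).
ai_phones_2024 = 170e6           # forecast AI-phone shipments, 2024
share_2024 = 0.15                # ~15% of global smartphone shipments

implied_global = ai_phones_2024 / share_2024
print(f"{implied_global / 1e9:.2f}B")    # ~1.13B global smartphones implied

# "More than tripled from 2023" implies 2023 shipments below ~57M units.
upper_bound_2023 = ai_phones_2024 / 3
print(f"{upper_bound_2023 / 1e6:.0f}M")  # ~57M upper bound for 2023
```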
In the Chinese market, IDC predicts that with the rapid iteration of new chips and user scenarios, the share of next-generation AI phones will rise quickly after 2024, reaching 150 million units in 2027, when more than 50% of phones shipped are expected to be next-generation AI phones. That transition will take only four years; by comparison, traditional smartphones needed eight years to reach the same milestone. Looking back at the smartphone era, driven by waves of innovation such as WeChat, mobile payments, and the sharing economy, smartphone penetration took five years to reach 10% and another three years to reach 59%.
Looking to the future: use cases and monetization
As the industry continues to evolve, artificial intelligence will become ubiquitous across mediums and deployment environments. While most heavy training will continue to take place in the cloud, more inference will foreseeably move to devices and the edge. It is therefore particularly important for investors to track how, and how fast, AI spreads through the value chain.
Importantly, to sustain the growth momentum already seen in capital markets, two things must happen:
Use cases for artificial intelligence must deliver on their promises of higher productivity and efficiency. Hardware is getting faster and more powerful, but what matters most is which new solutions these capabilities unlock and which "killer apps" they enable.
The path to monetization must be clear and must play out at a pace investors find acceptable. Although investors have so far largely tolerated soaring capital expenditure, their scrutiny is likely to tighten as shareholders weigh return on investment against high revenue expectations.