Arm is preparing for a blockbuster initial public offering at a time when investors are very interested in both semiconductors and artificial intelligence.
But the company is a different proposition from Nvidia and is unlikely to see the benefits of the AI boom in the near term, analysts told CNBC.
Nvidia vs. Arm: A comparison
AI has been thrust into the spotlight, in large part thanks to OpenAI’s ChatGPT. The technology is known as generative AI because it can generate answers in response to user prompts.
Such an AI is based on a model which is trained on huge amounts of data. A vast amount of computing power is required to train these AI models.
Nvidia designs a type of semiconductor called a graphics processing unit, or GPU, which goes into data centers to train and run these AI models.
The soaring interest in generative AI has seen Nvidia’s earnings surge.
Arm, meanwhile, is a company that designs the blueprint or “architectures” of certain semiconductors. These architectures are the overall designs, including components and programming language instructions that other companies use to build chips. Arm mainly designs central processing units or CPUs.
Arm-based CPUs are in 99% of the world’s smartphones, including those from major players like Apple.
CPUs are also required in the data center, where they’re often, though not always, used in conjunction with GPUs to train AI models.
Arm makes most of its money from royalties and licensing its architecture. More than 50% of this revenue comes from smartphones and consumer electronics. So far, it is not seeing a big boost from AI.
“Growth in the near term for Arm is really not about AI, it’s about mobile, it’s about royalty increases,” Jamie Mills O’Brien, investment director at Abrdn, told CNBC’s “Street Signs Europe” on Monday.
“In the longer term, I think Arm is trying to focus investors’ minds on the potential … AI in the edge, AI in the data center, but at the moment that’s not a huge part of the company’s exposure.”
Arm’s future in AI
Arm’s AI future is unlikely to come from the huge numbers of chips required to train big data models.
Instead, it’s more likely to be a major player in AI on the “edge.” This phrase refers to AI processes carried out on a device, such as a smartphone, rather than in the cloud, like ChatGPT.
For this to happen, devices will require low-power but high-performance chips able to carry out the computing required for AI applications. Arm is designing the architecture for these chips.
“If you’re doing AI on a smartphone or car you’re not going to have that same level of compute power, so you need to optimize the model to run locally,” Peter Richardson, research director at Counterpoint Research, told CNBC.
“Those processors will almost certainly be Arm-based.”
Arm said in its IPO filing that its processors already run AI workloads “and every smartphone currently in the market efficiently runs AI inference applications, such as voice recognition and applying filters to digital images.”
However, Arm is unlikely to see the benefit from AI filter through to its revenue for at least three to five years, Richard Windsor, founder of Radio Free Mobile, told CNBC.
“What SoftBank has been required to do is to sell Arm as an AI company like Nvidia,” Windsor said.
“Now, in the long term, absolutely, I’m a big proponent of running AI on end devices. It makes an awful lot of economic sense for the provider of the service, and also much more in general in terms of the quality of the service, privacy, security and so on and so forth. But those revenues are not accruing to Arm right now.”