Equities
1 Oct 25
The fundamental driver of improving AI performance remains advances in semiconductor processing power. As semiconductors become more powerful, more advanced workloads become possible.
The increase in demand for accelerated computing processors for AI training began in early 2023 and remains strong. At its most recent annual user conference, Nvidia laid out its product roadmap until 2027 and said it expects performance to double every year. This is the fuel powering the mighty AI product growth engine.
We have entered the next phase of growth for the semiconductors that enable AI. As trained AI models are deployed to deliver AI services, inference infrastructure1 is being built that will require GPUs (graphics processing units) as well as customised ASICs (application-specific integrated circuits). It is believed that the infrastructure required for AI inference could ultimately be larger than that required for AI training as models are deployed at scale.
We are now seeing a significant acceleration in revenue growth at companies providing AI infrastructure services. Snowflake, for example, a provider of cloud data warehousing services, is seeing growth accelerate as companies need somewhere to store data for AI workloads. Oracle and Microsoft are also seeing strong demand for AI infrastructure services.
While there have been some concerns about an “overbuild of AI infrastructure”, Microsoft and Oracle have stated that the infrastructure investments they are making are backed by contracted business.
In its recent quarterly results, Oracle reported that remaining performance obligations – essentially a measure of future revenues – were a jaw-dropping US$455 billion, up 359% year-on-year and indicative of the multi-year deals it has signed with large customers2. This sort of pent-up demand was above and beyond what even the most ardent of bulls expected.
We continue to believe the opportunity for enterprise AI (where AI is integrated into organisations) remains massive in all aspects of business operations. To borrow a phrase often attributed to Bill Gates, this could be a classic example of “overestimating what can be done in a year and underestimating what can be done in 10 years”.
However, a report from the Massachusetts Institute of Technology’s Project NANDA, titled “The GenAI Divide: State of AI in Business 2025”, found that many AI projects have yet to prove truly transformational in an enterprise setting. This caused some concern that AI is falling short of its promise.
Yet we are observing mounting evidence that interest in AI remains high at a wide variety of companies, both to drive new revenues and to deliver cost efficiencies. Nonetheless, implementing a “holistic” AI system in an enterprise setting is complex: data across multiple systems must be linked together, and Large Language Models (LLMs) need to be “tuned” to each organisation. We believe this is coming, though, and will likely accelerate over the next few years.
We recently travelled to the San Francisco Bay Area and met with several leading AI companies, including OpenAI, Nvidia, Broadcom, Netflix, AppLovin, Reddit and Databricks, among many others. We also heard from established venture capital firms (VCs) investing in this space. Our overwhelming takeaway is that the pace of innovation and change in this sector is staggering. Among our main observations:
In the enterprise world, this rapid pace of change could pose a challenge for large organisations, as a lot of heavy blocking and tackling is required to get data “AI-ready” so that AI tools can access it properly. For many enterprises, data infrastructure and security are hugely complex issues and will require significant investment to address bottlenecks.
We categorise the AI opportunity set into Enablers – companies that supply the foundational technology, such as semiconductors; Providers – companies, primarily in enterprise software, that use AI to make their products easier to use; and Beneficiaries – companies that employ AI internally to improve their products and services.
The AI investment opportunity is also global, with potential investments spread across all sectors of the economy. Our strategy seeks to maintain a balance across these three categories in the portfolio, while also identifying opportunities across different countries and sectors.
We think the AI investment opportunity will evolve over time, moving from enablers towards beneficiaries as the technology becomes more pervasive throughout the real economy. Earlier this year, we took advantage of the tariff-related sell-off to increase our exposure to the enabler category. More recently, we increased our non-US exposure by adding investments in China and Latin America to the portfolio.
The generative AI era is rapidly evolving into agentic AI, in which AI systems can reason, plan and take actions to complete tasks on a user’s behalf. Beyond agentic AI, we believe robotic AI will soon follow.
We continue to believe that the AI innovation cycle is multi-decade. In our view, each iteration of AI increases the addressable market for the technology, enabled by increases in computing power, creating the potential for long-term growth.
The value of investments will fluctuate, which will cause prices to fall as well as rise and investors may not get back the original amount they invested. Past performance is not a guide to future performance. The views expressed in this document should not be taken as a recommendation, advice or forecast.