Exploring the evolving AI landscape

1 October 2025

This summer saw no shortage of news flow and events concerning artificial intelligence (AI). In this update, Jeffrey Lin, Head of Thematic Technology Equities, reflects on the evolving AI landscape, including technological advances and demand for infrastructure, and outlines why he believes AI innovation has the potential to deliver strong multi-decade revenue growth. 

Compounding computing

The fundamental driver for increasing AI performance remains advancement in processing power from semiconductors. As semiconductors become more powerful, more advanced workloads are possible.

The increase in demand for accelerated computing processors for AI training began in early 2023 and remains strong. At its most recent annual user conference, Nvidia laid out its product roadmap until 2027 and said it expects performance to double every year. This is the fuel powering the mighty AI product growth engine.

We have entered the next phase of growth for the semiconductors that enable AI. As trained AI models are deployed to deliver AI services, inference infrastructure1 is being built out, requiring GPUs (graphics processing units) as well as customised ASICs (application-specific integrated circuits). It is believed that the infrastructure for AI inference could be larger than the infrastructure required for AI training as AI models are deployed at scale.

“…the infrastructure for AI inference could be larger than the infrastructure required for AI training.”

We are now seeing a significant acceleration of revenue growth from companies providing AI infrastructure services. For example, Snowflake, a provider of data warehouse services in the cloud, is seeing growth accelerate as companies need a place to store data for AI purposes. Oracle and Microsoft are also seeing strong demand for AI infrastructure services.

Listen to the Oracle

While there have been some concerns about an “overbuild of AI infrastructure”, Microsoft and Oracle have stated that the infrastructure investments they are making are backed by contracted business.

In its recent quarterly results, Oracle reported that remaining performance obligations – essentially a measure of future revenues – were a jaw-dropping US$455 billion, up 359% year-on-year and indicative of the multi-year deals it has signed with large customers2. This sort of pent-up demand was above and beyond what even the most ardent of bulls expected.

We continue to believe the opportunity for enterprise AI (where AI is integrated into organisations) remains massive in all aspects of business operations. To borrow a phrase often attributed to Bill Gates, this could be a classic example of “overestimating what can be done in a year and underestimating what can be done in 10 years”.

“…this could be a classic example of ‘overestimating what can be done in a year and underestimating what can be done in 10 years’.”

However, a report from the Massachusetts Institute of Technology’s Project NANDA, titled “The GenAI Divide: State of AI in Business 2025”, revealed that many AI projects have yet to be truly transformational in an enterprise setting. This caused some concern that the promises of AI are falling short of expectations.

Yet we are observing mounting evidence that interest in AI remains high across a wide variety of companies seeking to drive both new revenues and cost efficiencies. Nonetheless, implementing a “holistic” AI system in an enterprise setting is complex: data across multiple systems must be linked together, and Large Language Models (LLMs) need to be “tuned” specifically for each organisation. We believe this is coming, though, and will likely accelerate over the next few years.

Views from the AI frontline

We recently travelled to the San Francisco Bay Area and met with a number of leading AI companies, including OpenAI, Nvidia, Broadcom, Netflix, AppLovin, Reddit and Databricks, among others. We also heard from established venture capital firms (VCs) investing in this space. Our overwhelming takeaway is that the pace of innovation and change in this sector is staggering. Our main observations include:

  • Winners in AI are growing exponentially. Leading VCs spoke of recent AI investments where their portfolio companies have grown revenues 10x in a year. OpenAI, the owner of ChatGPT, said its revenues are expected to more than triple this year to US$13 billion (from US$4 billion last year). On the flipside, companies that are not adopting AI fast enough risk seeing their businesses disintermediated.
  • As a result, speed is of the essence. Databricks, a cloud-based data analytics platform, alluded to this in relation to one of its acquisitions. The CEO acknowledged he could potentially have acquired the company more cheaply a year later, but he could not afford to wait. This is one of the key reasons we are seeing such a rapid pace of spending on building AI factories and a frenzy in hiring key AI talent.
  • Demand for compute is insatiable. OpenAI said it has nowhere near the amount of compute it needs to satisfy its customers' needs. In fact, OpenAI mentioned that it often has to make trade-offs with many of its products because of the lack of compute. This is consistent with our long-term belief that the addressable market for computing increases as the capability of computing increases.

“OpenAI said it has nowhere near the amount of compute it needs to satisfy its customers' needs.”

  • Multiple LLMs will continue to co-exist. There will not be a ‘one-stop-LLM-shop’. Some models such as ChatGPT will be good for certain applications, while others such as xAI’s Grok will excel in specific tasks such as coding. In fact, Microsoft and Oracle are supporting multiple LLMs on their platforms and enabling customers to use those that are best for their use cases.
  • Data is critical to AI success. Reddit, a news and discussion forum and a key holding in our strategy, is clearly seeing this, as customers such as Google and OpenAI are heavily dependent on its data: Reddit represents 20% of Google’s search results and 30% of OpenAI’s search results.

In the enterprise world, this could pose a challenge for large organisations, as it requires a lot of heavy blocking and tackling to get data “AI-ready” so that AI tools can access it properly. For many enterprises, data infrastructure and security are hugely complex issues and will require significant investment to address bottlenecks.

Evolving opportunities

We categorise the AI opportunity set into Enablers – companies that supply the foundational technology, such as semiconductors; Providers – companies, primarily in the enterprise software sector, that use AI to make their products easier to use; and Beneficiaries – companies that employ AI internally to improve their products and services.

The AI investment opportunity is also global, with potential investments spread across all sectors of the economy. Our strategy seeks a balance between these three categories in the portfolio, while also identifying opportunities across different countries and sectors.

We think the AI investment opportunity will evolve over time, moving from the enablers towards beneficiaries, as the technology becomes more pervasive throughout the real economy. Earlier this year, we took advantage of the tariff-related sell-off to increase our exposure to the enabler category. More recently, we increased our non-US exposure by adding investments in China and Latin America to the portfolio.

What next for AI?

The generative AI era is rapidly evolving into agentic AI, where AI systems can reason through multi-step tasks and act with less human direction. Beyond agentic AI, we believe robotic AI will soon follow.

We continue to believe that the AI innovation cycle is multi-decade. In our view, each iteration of AI increases the addressable market for the technology, enabled by increases in computing power, creating the potential for long-term growth.

 


The value of investments will fluctuate, which will cause prices to fall as well as rise and investors may not get back the original amount they invested. Past performance is not a guide to future performance. The views expressed in this document should not be taken as a recommendation, advice or forecast.