From Electrons to Photons: The Next Great Compute Transition

30 September 2025

Computing has evolved through significant shifts in the medium and mechanisms used to process information, with transformative impacts on industries and economies. Carl Vine, Co-Head of Asia Pacific Equities, argues that another such shift is happening now, driven by photons. He explores what it might mean for today’s dominant technology firms and who could win in the new photonics age.

For the best part of a century and a half, computing has progressed at a steady, incremental rate. Within that story, however, are a handful of moments when the change was more drastic. Those moments reordered the fortunes of companies, industries and even nations. Another such moment is now upon us, and the consequences will be far-reaching.

The history of compute has been defined by two fundamentals: the medium that carries state (information or data), and the mechanism that shapes it into logic. Most upheavals in computing have been mechanism shifts: new ways of manipulating the same medium. Gears and rods gave way to relays, which in turn gave way to valves. Valves to transistors. Transistors to integrated circuits and microprocessors. Each time, old leaders fell, new winners rose, and the economics of the industry were reset.

The medium of compute itself has changed only once: from motion – gears, rods, cams and punched cards – to electrons. That singular change gave birth to modern computing.

Between that shift and today, there was one great integration and scale revolution: the rise of integrated circuits and microprocessors. Not a new medium, not a new mechanism, but the packaging of billions of switches onto a single chip, with consequences just as disruptive.

Today, for the first time since the 1940s, both medium and mechanism are shifting together. The medium is shifting from electrons to photons, and the mechanisms will need to change accordingly. The building blocks of logic will shift from transistors to light modulators, interferometers (high-precision optical devices that split and recombine light) and detectors.

As before, the giants of the current era may well find themselves reordered, just as the calculator makers, valve giants, and minicomputer kings once were. Given that global semiconductor stocks currently command in excess of US$10 trillion in collective market capitalisation, the impact could be seismic.

“As before, the giants of the current era may well find themselves reordered.”

The first medium shift: From motion to electrons

For centuries, the medium was motion. Gears, rods, and cams powered devices from Pascal’s calculator1 to Babbage’s Analytical Engine2 to IBM’s punched-card tabulators. The mechanisms were linkages and levers.

By the 1940s, motion had run out of road. It was too slow, too fragile, and impossible to scale. Electrons replaced motion as the new medium. Relays, then valves, became the mechanisms that shaped them.

Winners: RCA, Bell Labs, Ferranti, and IBM, which managed a hard pivot. National labs also gained: Bletchley Park in the UK and the US Army Ballistic Research Laboratory absorbed and advanced the new medium.

Losers: Mechanical calculator companies like Brunsviga and Monroe, and with them much of Europe’s precision instrument industry.

Mechanism shift: Valves to semiconductors

Valves proved electrons could compute, but they were fragile, hot, and power-hungry. Sound familiar?

The world’s first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), built in the US during the Second World War, filled 1,800 square feet, weighed 30 tons, and consumed 150 kilowatts. Valves burned out constantly.

The transistor – a new mechanism for the same medium – solved those problems3. Cooler, smaller, and more reliable, it made computation practical and scalable. The transistor era enshrined scale as destiny, with Moore’s Law as its North Star4.

Winners: IBM cemented dominance with the System/360 family of mainframe computers in 1964. Digital Equipment Corporation (DEC), founded in 1957, created the minicomputer. Fairchild and Intel turned transistor know-how into integrated circuits. Electronics firms like Texas Instruments, Motorola, and Sony built new empires.

Losers: RCA, once a tube giant, and many British firms – Ferranti, Elliott, ICL – which were technologically advanced but commercially fragmented. Nations without semiconductor industries lost global influence.

Integration and scale: Integrated circuits and microprocessors

The medium was still electrons. The mechanism was still the transistor. But the way transistors were packaged changed everything. Integrated circuits allowed thousands, then millions, of switches on a single chip. Microprocessors went further, pulling an entire CPU (central processing unit) into one die. This was not a new mechanism, but an industrial revolution in scale and economics.

Winners: Intel’s 4004 (1971) unlocked the personal computer. Microsoft standardised the software layer. Apple defined the consumer experience. Compaq and Dell built the Wintel ecosystem. Software giants, like Oracle, Adobe and SAP, rode the wave as PCs proliferated.

Losers: Minicomputer firms like DEC, Wang, and Prime. Mainframe holdouts like Burroughs and Univac. Even IBM faltered in PCs. Xerox PARC invented the graphical user interface (GUI), mouse, and Ethernet but failed to capitalise. Japan’s Fifth Generation Computer Systems project became a national-scale casualty. Scale, not switching physics, reordered the industry.

The next great leap?

Moore’s Law has already slowed. GPUs (graphics processing units) are devouring megawatts. Data centres are hitting cooling and energy walls. Electrons are becoming the relays of our time – too hot and too power-hungry to scale. The solution is photons5.

“Electrons are becoming the relays of our time – too hot and too power-hungry to scale. The solution is photons.”

The photonic revolution started more than 30 years ago but it stopped at the door of the processor. In 1988, fibre-optic cables leapt across the Atlantic, and through the internet boom photons displaced electrons in global transmission. Copper had reached exhaustion; light kept scaling.

Yet compute itself stayed stubbornly electronic. Transistors were still shrinking, power was cheap, and the tools for logic in light simply didn’t exist. Photonics solved the ‘between’ problem – how to move information – but never the ‘inside’ problem of how to process it.

That is now changing. Inside hyperscale data centres, electrical interconnects are already being replaced with optical ones. Beyond interconnects, new photonic mechanisms are being forged: modulators to shape light, interferometers to process it, detectors to extract results.

For the first time since the 1940s, the medium and the mechanism are shifting together.

The analogue comeback?

To be clear, photons don’t merely offer the promise of faster digital machines. Photons re-open the door to analogue computing, a dream as old as Babbage and Vannevar Bush6.

Babbage’s gears were analogue in their way, embodying polynomials in brass. Bush’s Differential Analyzer in the 1930s was a room-sized analogue machine, solving equations by spinning wheels and shafts. Both were ingenious, but both were doomed by scale and accuracy.

The shift to transistors more than half a century ago required a bargain to be struck with physics: digital over analogue. Transistors gave us repeatability, precision, and the comforting certainty of bits.

The price was ever-growing complexity, power consumption, and layers of abstraction. This overhead was accepted because Moore’s Law kept paying the bill.

Photons change the equation. Waves superpose, interfere, diffract – and in doing so they compute. A lens performs a Fourier transform. An interferometer multiplies matrices. A phase mask is a program.
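To make the lens-as-computer claim concrete, here is a minimal NumPy sketch of the idea behind a 4f optical correlator, where two Fourier-transforming lenses with a mask between them perform a convolution. The signal, kernel and sizes below are illustrative inventions, not drawn from any real device.

```python
import numpy as np

# Sketch of a 4f correlator: lens 1 Fourier-transforms the input wavefront,
# a mask multiplies it pointwise in the Fourier plane, and lens 2 transforms
# back. Numerically, each "lens" is an FFT.
rng = np.random.default_rng(0)
n = 256
signal = rng.standard_normal(n)   # illustrative 1-D input field
kernel = rng.standard_normal(n)   # the "program" encoded in the mask

optical = np.fft.ifft(np.fft.fft(signal) * np.fft.fft(kernel)).real

# Electronic reference: the same circular convolution, summed term by term
direct = np.array([sum(signal[m] * kernel[(k - m) % n] for m in range(n))
                   for k in range(n)])

print(np.allclose(optical, direct))  # True: the "lens" really did the sums
```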

Photons therefore offer a new bargain: trade a sliver of precision for orders-of-magnitude gains in throughput and energy. Machine-learning inference is perfectly happy with that exchange. The heavy lifting (matrix multiplication, convolutions, Fast Fourier Transforms) maps naturally to light. The rest (memory, control, non-linearity) can stay electronic. Call it what it is: hybrid compute. Electrons keep the book; photons do the sums.
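As a toy illustration of that bargain, the sketch below compares an exact matrix-vector product with a version perturbed by roughly 1% per-output noise, standing in for detector noise and calibration drift. The weights, input and noise level are invented for illustration, not measurements of any photonic chip.

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.standard_normal((10, 64))   # hypothetical classifier weights
x = rng.standard_normal(64)         # hypothetical input activations

exact = W @ x                       # full-precision electronic result

# Analogue stand-in: assume each optical output lands within ~1% of the
# true value (detector noise, thermal drift, imperfect calibration)
analogue = exact * (1 + 0.01 * rng.standard_normal(10))

print(np.argmax(exact) == np.argmax(analogue))   # the inference verdict rarely flips
print(float(np.max(np.abs(analogue - exact))))   # worst-case logit error is small
```

For classification-style inference, only the ordering of the outputs matters, which is why a per-output error of this size is usually harmless.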

If the photon is the next medium, then the new mechanisms of logic are unapologetically analogue: modulators to shape the wavefront, interferometers to process it, detectors to read it out.

“If the photon is the next medium, then the new mechanisms of logic are unapologetically analogue.”

Whether they live in free space, on III/V semiconductor platforms7, or in silicon photonics is the open contest. But the direction is clear: back to analogue – this time at the speed of light.
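For a flavour of what an interferometric logic element looks like, the sketch below models a Mach-Zehnder interferometer as two ideal 50:50 beamsplitters around tunable phase shifters, the textbook idealisation rather than any vendor’s device; the phase settings are arbitrary.

```python
import numpy as np

def mzi(theta, phi):
    """Ideal Mach-Zehnder interferometer built from phase shifters and beamsplitters."""
    bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)   # 50:50 beamsplitter
    inner = np.diag([np.exp(1j * theta), 1])         # internal phase shifter
    outer = np.diag([np.exp(1j * phi), 1])           # input phase shifter
    return bs @ inner @ bs @ outer

U = mzi(theta=0.7, phi=1.3)                      # arbitrary illustrative settings
print(np.allclose(U.conj().T @ U, np.eye(2)))    # unitary: photons are conserved

# Tuning theta steers power between the two outputs: this tunable 2x2
# "weight" is the building block meshed into larger matrix multipliers.
amp_in = np.array([1.0, 0.0])                    # all light enters port 0
for theta in (0.0, np.pi / 2, np.pi):
    out = mzi(theta, 0.0) @ amp_in
    print(theta, np.abs(out) ** 2)               # output power split
```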

Future winners and losers?

Previous computing shifts have followed the same brutal logic: the old medium or mechanism hits a wall, a new one scales better, incumbents who cling to the old order fail, and winners are those already fluent in the new or bold enough to pivot hard.

RCA owned vacuum tubes but missed transistors. DEC dominated minicomputers but dismissed the microprocessor. Britain led in early computers but failed to industrialise, ceding the industry to the US. History is not gentle with those who hesitate.

How will the current landscape evolve from here? Companies with strong optical heritage are positioned to win if they can seize the moment and organise around it. Intel has spent two decades building a silicon photonics portfolio – could it re-emerge as a leader in a photonic age, after falling behind in silicon?

“Companies with strong optical heritage are positioned to win if they can seize the moment and organise around it.”

Japan has a deep reservoir of intellectual property (IP) and know-how: Sony’s dominance in optical sensors is an underappreciated advantage, in our view; Nikon’s IP sits at the intersection of optics and semiconductor compute (perhaps why EssilorLuxottica, the maker of Ray-Ban sunglasses and Meta’s augmented-reality lens partner, took an interest); and NTT’s IOWN (Innovative Optical and Wireless Network) roadmap is one of the most ambitious and co-ordinated bets we see on photonic compute.

Elsewhere, Corning, with its fibre franchise, and Cisco and Ciena, which built the optical internet, are also worth watching. Startups like Lightmatter, Ayar Labs, PsiQuantum, and Flux are fluent in photonics-first compute and may define the new primitives.

The analogue angle widens the field further. Companies adept at calibration, sensing, and control could find themselves unexpectedly central, since analogue compute must be tuned and stabilised. Defence primes, already steeped in lasers and sensing, could find that their analogue heritage positions them to compete in compute. Might an EDA toolmaker build the “Optical CUDA” to become a new kingmaker?8

And what of today’s GPU incumbents? Nvidia, AMD, and the band of beneficiaries riding their ecosystem are winning big now, but will they pivot as IBM did in the 1950s, or will they be eclipsed like DEC and Wang?

TSMC and Samsung, titans of electron-based fabs, face the same question if photonics scales outside silicon. And hyperscalers, such as Microsoft, Google, and Amazon, must decide whether to back photonics at the core of their AI infrastructure or risk locking into yesterday’s economics.

To be clear, it’s too early to count the powerhouse incumbents out. As a recent article in Nature reports, Microsoft is already exploring free-space optical/analogue compute architectures9. For its part, Nvidia is investing heavily in co-packaged optics. The playing field, at this stage, remains wide open. New entrants, however, have a window to shake things up.

None of this will happen overnight: photonic compute will take years to scale, and we are in the early innings. Nor is it likely that one medium will simply replace the other. Just as hard disk drives co-exist with solid-state memory (NAND), photonic compute is far more likely to live and work side by side with electronic compute, sharing workloads. The stakes, however, remain high.

Battle for optical capacity

At the national level, optical manufacturing capacity will become a battleground of industrial policy. The US is seeding startups through DARPA (the Defense Advanced Research Projects Agency) and CHIPS (Creating Helpful Incentives to Produce Semiconductors) Act channels; China is pouring resources into III/V foundries; the EU and Japan are scrambling to secure their own supply chains. Countries with traditions in optics and precision instrumentation – Japan, Germany and Switzerland – may find themselves advantaged if analogue compute takes root.

And what of the UK? It may have famously missed the transistor age, but it retains deep academic strength in photonics and quantum optics: universities such as Southampton, Cambridge, Oxford, and UCL are all world leaders. It has industrial niches in fibre, lasers, and metrology, and a defence sector with long optical experience. Britain cannot out-TSMC Taiwan in silicon, but if the next medium is light, it may yet play a role more like the one it played in the fibre era, when its scientists helped lay the foundations.

The Information Age was born with electrons. The Cognitive Age may only be possible with photons – and perhaps with the return of analogue.

1 Pascal’s calculator, or the Pascaline, was a mechanical calculator designed and built by Blaise Pascal in 1643. The machine could perform addition and subtraction.
2 English mathematician Charles Babbage developed the idea of the Analytical Engine in the 1830s. It was designed to perform mathematical calculations by reading punched cards but was never fully built.
3 Invented in 1947, transistors are tiny electronic components that act like a switch or amplifier for electrical signals. They can turn current on or off, increase the strength of signals and process information. Modern memory chips contain billions of transistors.
4 Moore’s Law is the observation by Gordon Moore, the co-founder of Intel, in 1965 that the number of transistors on an integrated circuit would double every two years.
5 Photons are the basic unit of electromagnetic energy and also the smallest particles of light. They exhibit wave-particle duality, meaning they have the characteristics of both waves and particles.
6 Vannevar Bush (1890-1974) was an American engineer, inventor, and science administrator who played a central role in shaping modern computing, wartime research, and US science policy.
7 III/V platforms are semiconductor platforms made up of elements from the third and fifth columns of the periodic table. They are considered more efficient than silicon-based semiconductors for devices that emit light.
8 Electronic Design Automation or EDA is the software used in designing and manufacturing semiconductors. CUDA (Compute Unified Device Architecture) is Nvidia’s platform and programming model that allows developers to use the power of GPUs.
9 Nature, ‘Analog optical computer for AI inference and combinatorial optimization’, (nature.com), September 2025.

By Carl Vine, Co-Head of Asia Pacific Equities

The value of investments will fluctuate, which will cause prices to fall as well as rise and investors may not get back the original amount they invested. Past performance is not a guide to future performance. The views expressed in this document should not be taken as a recommendation, advice or forecast, nor a recommendation to purchase or sell any particular security.