Winter isn’t coming – why AI is here to stay

There’s no denying that the rate of AI’s progress has waxed and waned over the years. The fallow period of the 70s, 80s and 90s is often described as the first ‘AI winter’ – characterised by a general loss of interest, brought on by the failure to realise early, inflated expectations. The lull was partly down to slowing investment in research from bodies such as the US Government but, more significantly, to the yawning gap between AI theory and the availability of computer systems capable of bringing those ideas to life.

Writing in 1973, the British mathematician Sir James Lighthill gave a downbeat assessment of the state of AI: “Various reasons including limitations of computer power have restricted the universe of discourse”.

The AI renaissance of the past decade has seen a greater alignment of those three factors – interest, investment and compute: research has been re-invigorated, not just by renewed government interest, but by the patronage of technology hyperscalers. Meanwhile, computer systems have roughly kept pace with the demands of leading-edge AI – although the continued exponential growth in model sizes is prompting the industry to look for more imaginative solutions than simply adding more aisles to the datacentre.

Graphcore was created to help maintain, and even accelerate, this exciting momentum. Our made-for-AI Intelligence Processing Unit (IPU) unlocks new models and techniques, while enabling much more efficient compute through approaches such as model sparsification.
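To give a flavour of what sparsification means in practice, here is a minimal magnitude-pruning sketch – a generic, simplified illustration, not Graphcore’s production method. The idea is to zero out the smallest-magnitude weights so that only a fraction of them need to be stored and computed; the `magnitude_prune` function name and parameters are illustrative.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude weights, keeping roughly (1 - sparsity) of them."""
    k = int(weights.size * sparsity)  # number of weights to drop
    if k == 0:
        return weights.copy()
    # Threshold = k-th smallest absolute value; everything at or below it is pruned.
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return np.where(np.abs(weights) > threshold, weights, 0.0)

# Example: prune 80% of a weight matrix, leaving a 20%-dense model.
rng = np.random.default_rng(0)
w = rng.standard_normal((10, 10))
sparse_w = magnitude_prune(w, sparsity=0.8)
density = np.count_nonzero(sparse_w) / sparse_w.size  # ≈ 0.2
```

Real sparse training is considerably more involved (structured sparsity, retraining, hardware-aware kernels), but the principle is the same: most of the arithmetic in a dense model can be skipped with little loss of quality.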

But there’s another important new variable that has come into play – one that promises to give the AI revolution real staying power: Adoption.

Every innovative technique that Graphcore’s researchers and in-house engineers work on is either developed in partnership with real end-users or quickly adopted by customers across the public and private sectors.

Our work with German foundation model developer Aleph Alpha yielded a sparsified chatbot model that required just 20% of the processing power of its dense counterpart. This work is at the very leading edge of AI, but from its inception Luminous Sparse was destined for commercial deployment.

When we developed the Graph Neural Network for predicting quantum properties of molecular graphs that won the Open Graph Benchmark Large Scale Challenge, we did so in partnership with researchers from Canada’s Mila AI lab and the Université de Montréal, as well as Valence Discovery – a commercial AI drug discovery company.

The path from the research lab to commercial deployment in AI has never been more direct, or faster. The result is a virtuous cycle of investment, innovation and adoption.

As applied AI finds a wider market, a fast-growing AI stack is emerging to enable its adoption – akin to the tools and platforms that the rest of our technological world is built on.

The growth in demand for AI compute in the cloud and AI-as-a-service is dramatic. Today it is possible to build an entire business on rented hardware, using models that are either entirely off-the-shelf or lightly modified. Even industries that traditionally demanded control of their own hardware – like financial services – are now looking to the cloud for their AI compute.

While Graphcore is known to many as the ‘artificial intelligence chip company’, our business today is built around the new AI stack. Most of our customers will never see an IPU processor, but instead are buying their compute from cloud partners such as Gcore. Others are firing up notebooks on-demand through Paperspace, perhaps using Hugging Face’s extensive library of language models.

For some, the hardware has become fully abstracted. AI-as-a-service companies are building low-code, no-code tools on Graphcore systems for users with no prior experience of machine intelligence. Our partner Pienso allows subject experts and department heads to run sophisticated AI models on text data, such as customer service conversations, with no need to rely on data science teams.

To borrow an analogy from the fashion world, AI now has its haute couture – the leading edge of research and innovation; its bespoke tailoring – an industry built on the adaptation of foundation and other pre-existing models; and its high street – where artificial intelligence meets the mass market.

This rich ecosystem, with its diverse audience of users, is one of the main reasons that the current AI revolution will endure. It has reached critical mass.

There will still be times when things seem to move faster or slower. Some breakthroughs will feel seismic; others, incremental. But it’s finally time to cast aside the seasonal metaphor: this time, the AI winter is not coming.