The breakthrough AI needs

Monday, October 14, 2024
Large language models have a keen appetite for electricity. The energy used to train OpenAI’s GPT-4 model could have powered 50 American homes for a century. And as models get bigger, costs rise rapidly. By one estimate, today’s biggest models cost $100m to train; the next generation could cost $1bn, and the following one $10bn. On top of this, asking a model to answer a query comes at a computational cost—anything from $2,400 to $223,000 to summarise the financial reports of the world’s 58,000 public companies. In time such “inference” costs, when added up, can exceed the cost of training. If so, it is hard to see how generative AI could ever become economically viable.

This is frightening for investors, many of whom have bet big on AI. They have flocked to Nvidia, which designs the chips most commonly used for AI models. Its market capitalisation has risen by $2.5trn over the past two years. Venture capitalists and others have ploughed nearly $95bn into AI startups since the start of 2023. OpenAI, the maker of ChatGPT, is reportedly seeking a valuation of $150bn, which would make it one of the biggest private tech firms in the world.

There is no need to panic. Plenty of other technologies have faced limits and gone on to prosper thanks to human ingenuity. The difficulty of getting people into space led to innovations that are now used on Earth, too. The oil-price shock in the 1970s encouraged energy efficiency and, in some countries, alternative means of generation, including nuclear. Three decades later, fracking made it possible to reach oil and gas reserves that had previously been uneconomical to extract. As a consequence, America now produces more oil than any other country.

Already, developments in AI are showing how constraints can stimulate creativity. As our Technology Quarterly this week sets out, companies are developing chips specifically for the operations needed to run large language models. This specialisation means they can run such models more efficiently than general-purpose processors such as Nvidia's. Alphabet, Amazon, Apple, Meta and Microsoft are all designing their own AI chips. More money has flowed into funding AI-chip startups in the first half of this year than in the previous three years combined.

Developers are also making changes to AI software. Bigger models that rely on the brute force of computational power are giving way to smaller, more specialised systems. OpenAI's newest model, o1, is designed to be better at reasoning rather than at generating text. Other modelmakers are employing less onerous calculations, so as to make more efficient use of chips. Through clever approaches, such as using a mixture of models, each suited to a different type of problem, researchers have drastically cut processing times. All this will change how the industry operates.
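To make the mixture-of-models idea concrete, here is a minimal, hypothetical sketch in Python of how a router might dispatch each query to the cheapest model suited to it. The model names, relative costs and keyword rules are invented for illustration; they do not describe any particular company's system.

# Minimal, illustrative sketch of routing queries among specialised models.
# All model names, costs and rules below are invented for this example.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Model:
    name: str
    relative_cost: float            # hypothetical cost per query
    run: Callable[[str], str]       # stand-in for a real inference call

def summariser(q: str) -> str:
    return "condensed: " + q[:40]

def code_helper(q: str) -> str:
    return "draft code for: " + q[:40]

def generalist(q: str) -> str:
    return "full answer to: " + q[:40]

MODELS = {
    "summarise": Model("small-summariser", 0.01, summariser),
    "code": Model("code-specialist", 0.05, code_helper),
    "general": Model("large-generalist", 1.00, generalist),
}

def route(query: str) -> str:
    # Cheap, well-defined tasks go to specialists; everything else to the big model.
    q = query.lower()
    if "summarise" in q or "summary" in q:
        model = MODELS["summarise"]
    elif "code" in q or "function" in q:
        model = MODELS["code"]
    else:
        model = MODELS["general"]
    print(f"routing to {model.name} (relative cost {model.relative_cost})")
    return model.run(query)

if __name__ == "__main__":
    route("Summarise this annual report in three bullet points")
    route("Write a Python function that parses a CSV file")

In real deployments the dispatch decision would more likely come from a learned classifier or a small model than from keyword matching, but the economics are the same: most queries never need to touch the most expensive model.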

Investors and governments have become used to the idea that, among tech companies, the incumbent has a natural advantage. For AI, that can no longer be taken for granted. Today Nvidia sells four-fifths of the world's AI chips. But other, more specialised rivals could well eat into its share. Already Google's AI processors are the third-most-used in data centres around the world.

OpenAI may have launched the pioneering large language model. But as resource constraints have begun to bite, other big modelmakers, such as Anthropic, Google and Meta, are catching up. Although a gap still exists between them and second-tier models, such as France's Mistral, it may close. If the trend towards smaller, more specialised models continues, the AI universe could contain a constellation of models, instead of just a few superstars.

This means that investors are in for a rocky ride. Their bets on today's leaders look less certain. Nvidia could lose ground to other chipmakers; OpenAI could be supplanted. The big tech firms are hoovering up talent, and many of them make the devices through which, they hope, consumers will reach their AI assistants. But competition among them is fierce. Few firms yet have a strategy for turning a profit from generative AI. Even if the industry does end up belonging to one winner, it is not clear who that will be.

Governments, too, will need to change their thinking. Their industrial policies tend to focus on handouts. But progress in AI is as much about having the right talent and a flourishing ecosystem as it is about amassing capital and computing power. Countries in Europe and the Middle East may find that the hard graft of cultivating ingenuity matters as much as buying in computer chips. America, by contrast, is blessed with chips, talent and enterprise. It has many of the world's best universities and, in San Francisco and Silicon Valley, an enviable and long-established cluster of talent.

Chipped away

Yet America’s attempt to restrain China is backfiring. Hoping to prevent a strategic rival from gaining the lead in a crucial technology, it has sought to restrict China’s access to cutting-edge chips. By doing so it has unintentionally stimulated the growth of a research system in China that excels at working round constraints.

When ingenuity counts for more than brute force, a better way to ensure America's lead would be to attract and keep top researchers from elsewhere, for example through easier visa rules. The AI era is still in its infancy, and much remains uncertain. But the breakthroughs AI needs will come from giving ideas and talent the space to flourish at home, not from trying to shut down rivals abroad.

© 2024, The Economist Newspaper Limited. All rights reserved. From The Economist, published under licence. The original content can be found on www.economist.com


