
Illustration: Carolina Moscoso for Bloomberg Markets

AI Wants More Data. More Chips. More Real Estate. More Power. More Water. More Everything

Businesses, investors and society brace for a demand shock from artificial intelligence.

It looks easy enough: Ask ChatGPT something, and it responds. But pull back the curtain, and you’ll find that every ChatGPT prompt and Microsoft Copilot task consumes vast resources. Millions of human beings engineering, correcting and training models. Enough terawatt-hours of electricity to power countries. Data center megacampuses around the world. Power line networks and internet cables. Water, land, metals and minerals. Artificial intelligence needs it all, and it will need more.

Researchers have estimated that a single ChatGPT query requires almost 10 times as much electricity to process as a traditional Google search. Your typical search engine crawls the web for content that’s filed away in a massive index. But the latest AI products rely on what are known as large language models, or LLMs, which are fed billions of words of text—from the collected works of William Shakespeare to the latest forecasts of the Federal Reserve. The models detect patterns and associations and develop billions and billions of so-called parameters that help them mimic human behavior. Using these models, ChatGPT and the like create new content—hence the term generative AI.
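To make the distinction concrete, here is a minimal, hypothetical sketch in Python. It is not how Google or ChatGPT actually work; the tiny corpus, the inverted index and the bigram "model" are toy stand-ins for a web-scale index and a model with billions of parameters. The point is the contrast: search retrieves text that already exists, while a language model learns word-to-word associations and uses them to produce new text.

```python
import random
from collections import defaultdict

# Toy corpus standing in for the billions of words an LLM is trained on.
CORPUS = [
    "the federal reserve raised interest rates",
    "the federal reserve held interest rates steady",
    "shakespeare wrote plays and sonnets",
]

# --- Search-engine style: an inverted index that points back to existing documents ---
index = defaultdict(set)
for doc_id, doc in enumerate(CORPUS):
    for word in doc.split():
        index[word].add(doc_id)

def search(query):
    """Retrieval, not generation: return stored documents containing every query word."""
    hits = [index[w] for w in query.split() if w in index]
    if not hits:
        return []
    ids = set.intersection(*hits)
    return [CORPUS[i] for i in sorted(ids)]

# --- Language-model style: learn patterns from the text, then generate new text ---
bigrams = defaultdict(list)
for doc in CORPUS:
    words = doc.split()
    for a, b in zip(words, words[1:]):
        bigrams[a].append(b)  # observed word pairs play the role of "parameters" here

def generate(start, length=6):
    """Sample a sentence word by word from the learned associations."""
    out = [start]
    for _ in range(length):
        options = bigrams.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))
    return " ".join(out)

print(search("federal reserve"))  # pulls back existing sentences
print(generate("the"))            # may stitch together a sequence no document contains verbatim
```

Even at this toy scale, generation does more work than lookup: the search returns an answer by consulting a fixed index, while the generator must make a fresh decision for every word it emits, which hints at why each generated response costs so much more compute than a search query.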