- cross-posted to:
- technology@lemmit.online
One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.
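A rough sanity check of those figures, as a sketch. The household consumption and conventional-search cost below are assumptions on my part (ballpark EIA and old Google estimates), not numbers from the article; only the 33,000 homes and the 4–5x multiplier come from the quote above.

```python
# Back-of-the-envelope check of the quoted figures.
# Assumed: an average US home uses roughly 10,500 kWh of electricity per year,
# and a conventional web search costs on the order of 0.3 Wh.

HOUSEHOLD_KWH_PER_YEAR = 10_500   # assumed average US home
HOMES = 33_000                    # figure quoted in the article
SEARCH_WH = 0.3                   # assumed conventional search cost
AI_MULTIPLIER = 4.5               # midpoint of the article's 4-5x claim

total_gwh_per_year = HOMES * HOUSEHOLD_KWH_PER_YEAR / 1e6
ai_query_wh = SEARCH_WH * AI_MULTIPLIER

# How many generative-AI queries per year would that household-equivalent budget cover?
queries_per_year = HOMES * HOUSEHOLD_KWH_PER_YEAR * 1000 / ai_query_wh

print(f"~{total_gwh_per_year:.0f} GWh/year for {HOMES:,} homes")
print(f"~{ai_query_wh:.2f} Wh per generative-AI query (assumed)")
print(f"~{queries_per_year:.2e} queries/year within that budget")
```

Under those assumptions that works out to roughly 350 GWh per year, or on the order of a couple hundred billion generative-AI queries annually.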
It’s consuming the energy equivalent of 33,000 homes, okay. Is it doing work equivalent to 33,000 people or more? Seems likely to me.
Exactly my thought too.
For so long human progress has been limited by population size.
A large part of the reason we leapt so far over the past century was the huge increase in population, which allowed for greater subspecialization.
But that population growth is unsustainable if not already well past the practical limit.
If we can get to a point where productivity grows at a rate unseen in human history, while decoupling that progress from the need for ever more humans, we might be able to outpace the collective debts we’ve racked up as a species, without an obsessive focus on procreation (like Musk’s) that burdens the next generation with our fuck-ups.
33,000 homes is way less than I’d have thought given how heavily it is being used. And the promise of future hardware revisions, like photonics, means we could see exponential decreases in energy consumption alongside increases in processing power.
It’s an intelligent competitor for resources, and it will never quit.