The Download: AI’s energy future


Three big things we still don’t know about AI’s energy burden

—James O’Donnell

Earlier this year, when my colleague Casey Crownhart and I spent six months researching the climate and energy burden of AI, we came to see one number in particular as our white whale: how much energy the leading AI models, like ChatGPT or Gemini, use up when generating a single response. 

We pestered Google, OpenAI, and Microsoft, but each company refused to provide its figure for our article. Then this summer, after we published, a strange thing started to happen: they finally began releasing the numbers we’d been calling for.

So with this newfound transparency, is our job complete? Did we finally harpoon our white whale? I reached out to some of our old sources, and some new ones, to find out. Read the full story.

MIT Technology Review Narrated: Google DeepMind has a new way to look inside an AI’s “mind”

We don’t know exactly how AI works, or why it works so well. That’s a problem: it could lead us to deploy an AI system in a highly sensitive field like medicine without understanding its critical flaws. But a team at Google DeepMind that studies something called mechanistic interpretability has been working on new ways to let us peer under the hood.
