• FatCrab@slrpnk.net
    6 hours ago

    Most of these figures are guesses along a spectrum of “educated,” since many models, like ChatGPT, are effectively opaque to everyone and we have no idea what the current iteration’s architecture actually looks like. But MIT did a very solid study not too long ago that looked at the energy cost of various queries across various architectures. Surprisingly, text queries for very large GPT models had a higher energy cost than image generation using a normal number of iterations on Stable Diffusion models. Anyhow, you’re looking at per-query energy usage ranging from roughly 15 seconds of microwaving at full power to riding a bike a few blocks. When tallied over the immense number of queries being serviced, it does add up.
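    To put the low end of that analogy in rough numbers, here’s a back-of-envelope sketch. The wattage and query volume are illustrative assumptions, not figures from the MIT study:

    ```python
    # Back-of-envelope estimate: "15 seconds of microwaving" per query.
    # All numbers below are assumed for illustration, not measured values.
    MICROWAVE_WATTS = 1000           # assumed typical microwave power draw
    SECONDS_PER_QUERY = 15           # low end of the per-query analogy

    joules_per_query = MICROWAVE_WATTS * SECONDS_PER_QUERY   # 15,000 J
    wh_per_query = joules_per_query / 3600                   # ~4.17 Wh

    queries_per_day = 1_000_000_000  # hypothetical aggregate query volume
    daily_mwh = wh_per_query * queries_per_day / 1_000_000   # Wh -> MWh

    print(f"{wh_per_query:.2f} Wh per query")
    print(f"{daily_mwh:,.0f} MWh per day at that scale")
    ```

    A few watt-hours per query sounds trivial, but at a billion queries a day it lands in the thousands of megawatt-hours, which is the “it does add up” point.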

    That all said, I think energy consumption is a silly thing to attack AI over. Modernize, modularize, and decentralize the grids, convert to non-GHG sources, and it doesn’t matter. There are other concerns with AI that are far more pressing, like deskilling effects and the inability to control mis- and disinformation.