• V0ldek@awful.systems

    Where’s that person who was arguing with me last time that AI doesn’t actually use that much energy and that the corps missing their climate goals wasn’t AI-related?

      • V0ldek@awful.systems

        That’s the fucking problem: it’s impossible to tell, since MSFT won’t say directly, and only the people who run the datacenters could.

        The only relatively reliable numbers I was able to find were in this research paper by Luccioni and Strubell from the 2024 ACM Conference on Fairness, Accountability, and Transparency (FAccT). Now, that’s an obscure conference (not even ranked by CORE), but Dr. Luccioni appears to be right on the money about the dangers of AI (https://www.sashaluccioni.com/).

        • skillissuer@discuss.tchncs.de

          they do tell the total tho: https://www.latitudemedia.com/news/microsoft-reveals-the-energy-impact-of-artificial-intelligence

          this works out to 2.7GW in 2023, on average. that’s comparable to peak daily consumption in croatia (today). if that 30%-ish figure is accurate, then something closer to 700MW is ai-only, which is more like a smaller country such as macedonia. rough math below:
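
          (quick sanity check in python; the ~24 TWh/year total is an assumption read off the linked article, and the 30% share is the ballpark from upthread, so treat both as rough inputs)

          ```python
          HOURS_PER_YEAR = 365 * 24  # 8760

          # assumption: microsoft's 2023 electricity total is roughly 24 TWh
          total_twh_2023 = 24.0

          # TWh -> GWh, then divide by hours in a year for the average draw
          avg_gw = total_twh_2023 * 1000 / HOURS_PER_YEAR
          print(f"average draw: {avg_gw:.2f} GW")  # ~2.74 GW

          ai_share = 0.30  # the "30%-ish" figure
          print(f"ai-only: {avg_gw * ai_share * 1000:.0f} MW")  # ~820 MW, same ballpark as 700
          ```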

          which only highlights how bizarre is their 5GW proposition. hey let’s outbuild ms 2x, like, now

          • skillissuer@discuss.tchncs.de

            that sounds like it’s much less than crypto at its peak, and even the 2023 estimate differs by over an order of magnitude (14.5GW avg). there’s also google and fb and whoever else (aws?)
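
            (same conversion the other way; treating the 14.5GW figure as an annual average, and using the ~0.7GW ai-only ballpark from upthread)

            ```python
            HOURS_PER_YEAR = 365 * 24

            crypto_avg_gw = 14.5  # the 2023 crypto estimate above
            print(f"crypto: ~{crypto_avg_gw * HOURS_PER_YEAR / 1000:.0f} TWh/year")  # ~127 TWh

            ai_only_gw = 0.7  # ms ai-only ballpark from upthread
            print(f"ratio: ~{crypto_avg_gw / ai_only_gw:.0f}x")  # ~21x, over an order of magnitude
            ```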

        • skillissuer@discuss.tchncs.de

          i started to look up satellite photos and openinframap to figure out the maximum capacity of their substations, but the powerlines feeding them are probably massively oversized, and the substations are probably oversized too for redundancy and high availability. there might be some way to guess it from that, but some of these lines will be underground, and if they’re doing load-following to match their renewables (which might be cheaper for them) then everything is oversized a bit more on top of that.
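
          (a back-of-envelope sketch of that estimation; the substation MVA would come from openinframap, but the power factor, redundancy, oversize and load-following factors are pure guesses)

          ```python
          def guess_dc_load_mw(substation_mva, power_factor=0.95,
                               redundancy=2.0, oversize=1.3, load_following=1.2):
              """Guess actual datacenter draw from visible substation capacity.

              substation_mva: sum of transformer ratings read off openinframap
              redundancy:     N+1 / 2N setups mean nameplate >> actual load
              oversize:       lines and transformers sized for future growth
              load_following: extra headroom if they chase renewable output
              """
              apparent_mw = substation_mva * power_factor
              return apparent_mw / (redundancy * oversize * load_following)

          # e.g. a campus fed by two 300 MVA substations:
          print(f"~{guess_dc_load_mw(600):.0f} MW actual")  # ~183 MW with these guesses
          ```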

          • V0ldek@awful.systems

            Well, the main problem is that a datacenter runs much more than just AI. You’d need to somehow subtract “normal” cloud usage to isolate just the promptfondling.

            • skillissuer@discuss.tchncs.de

              ez. remember that announcement when ms said their energy use went up 36%? that’s ai, and it includes both training and use

              this can still be fudged with more efficient office heating, shutdowns of the least efficient dcs and so on, but only to a limited degree
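
              (taking the 36% at face value and attributing the whole increase to ai, which as said can be fudged, gives an implied share that lines up with the 30%-ish figure upthread)

              ```python
              growth = 0.36  # ms's reported energy increase
              ai_share = growth / (1 + growth)  # new load as a fraction of the new total
              print(f"~{ai_share:.0%} of current use")  # ~26%, i.e. "30%-ish"
              ```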