I hadn’t heard about thermodynamic computing before, but it sounds pretty interesting. As IEEE Spectrum explains, “the components of a thermodynamic chip begin in a semi-random state. A program is fed into the components, and once equilibrium is reached between these parts, the equilibrium is read out as the solution. This computation style only works with applications that involve a non-deterministic result … various AI tasks, such as AI image generation and other training tasks, thrive on this hardware.” It sounds almost like quantum computing to my layperson ears. [edit: fixed link]
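The “start semi-random, settle to equilibrium, read out the answer” idea can be sketched in software with a toy Gibbs sampler over a tiny Ising-style energy model. Everything below (the couplings, the temperature `beta`, the update rule) is illustrative guesswork about the general technique, not the actual chip design:

```python
import math
import random

def gibbs_sample(couplings, n_spins, steps=5000, beta=3.0, seed=0):
    """Start from a semi-random spin state and let repeated local updates
    settle it toward low-energy (high-probability) configurations."""
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n_spins)]  # semi-random start
    for _ in range(steps):
        i = rng.randrange(n_spins)
        # local field on spin i from its couplings to every other spin
        field = sum(couplings.get((min(i, j), max(i, j)), 0.0) * spins[j]
                    for j in range(n_spins) if j != i)
        # conditional probability of spin i being +1 at inverse temperature beta
        p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
        spins[i] = 1 if rng.random() < p_up else -1
    return spins

# "Program" = two strongly ferromagnetic pairs; at equilibrium each
# pair ends up aligned, and that alignment is the read-out "solution".
couplings = {(0, 1): 1.0, (2, 3): 1.0}
state = gibbs_sample(couplings, n_spins=4)
print(state)
```

The read-out is probabilistic, which is exactly why this style of computing only suits problems where a sampled, non-deterministic answer is acceptable.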

  • deegeese@sopuli.xyz · 3 days ago

    Sloppier compute architecture needed to drive down costs on sloppier method of computing.

    • funkykong@lemmynsfw.com · 3 days ago

      Slop has nothing to do with it. Some problems just aren’t deterministic and this sort of chip could be a massive performance and efficiency boost for them. They’re potentially useful for all sorts of real world simulations and detection problems.

    • finitebanjo@lemmy.world · 3 days ago

      If it makes AI cheaper then great because AI is a massive fucking waste of power, but other than that I am grossed out by this tech and want none of it.

    • VeryInterestingTable@jlai.lu · 2 days ago

      When the results don’t matter, use “Does Not Matter” chips! They work 3% of the time, 100% of the time. BUT they consume way less power! Great for any random statistics; if the result doesn’t match what you want, just press again! Buy “Does Not Matter” chips now!

    • artifex@piefed.social (OP) · 3 days ago

      This is literally it. (Granted, I’m sure there are other use cases, but you know they’re chasing those AI dollars.)

    • Aceticon@lemmy.dbzer0.com · 2 days ago

      It makes some sense to handle self-discovered real numbers of infinite precision using analog methods, though I’m curious how they handle noise, since in the real world, unlike the mathematical world, all storage, transmission and calculation carry some error.

      That said, in my experience from a project I did at Uni with neural networks back in their early days, they’ll even compensate for implementation bugs (we managed roughly an 85% digit-recognition rate with a buggy implementation), so maybe that kind of thing is quite robust in the face of analog error.
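That robustness is easy to show in a toy setting (a hypothetical example, not the Uni project above): a simple perceptron trained on well-separated data keeps nearly all of its accuracy after its learned weights are corrupted with simulated “analog” noise:

```python
import random

rng = random.Random(42)

def make_data(n):
    """Two well-separated 2-D clusters around (+2,+2) and (-2,-2)."""
    data = []
    for _ in range(n):
        label = rng.choice([-1, 1])
        x = (label * 2 + rng.gauss(0, 0.5), label * 2 + rng.gauss(0, 0.5))
        data.append((x, label))
    return data

def train_perceptron(data, epochs=20):
    """Classic perceptron updates: nudge weights on each misclassified point."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            if y * (w[0] * x1 + w[1] * x2 + b) <= 0:  # misclassified
                w[0] += y * x1
                w[1] += y * x2
                b += y
    return w, b

def accuracy(w, b, data):
    correct = sum(1 for (x1, x2), y in data
                  if y * (w[0] * x1 + w[1] * x2 + b) > 0)
    return correct / len(data)

data = make_data(200)
w, b = train_perceptron(data)
clean_acc = accuracy(w, b, data)

# Simulate analog error: perturb every learned parameter by ~5% noise.
noisy_w = [wi + rng.gauss(0, 0.05 * abs(wi) + 0.01) for wi in w]
noisy_b = b + rng.gauss(0, 0.05 * abs(b) + 0.01)
noisy_acc = accuracy(noisy_w, noisy_b, data)
print(clean_acc, noisy_acc)
```

Because the data points sit far from the decision boundary, a small perturbation of the weights barely moves any point across it, which is the same intuition as the buggy-but-working digit recognizer.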