• steltek@lemm.ee · 1 year ago

    What a super weird question. “Cloud computing” is distributed computing. Distributed computing is practically all we have left: Bitcoin/crypto, Kubernetes, BitTorrent, and endless AWS/cloud infra patterns. Then we have our happy little Fediverse here.

    I feel the author was really asking “is at-home distributed computing dying?” In which case, yes, because mobile took over and you really can’t do background compute on phones. Certainly not the way SETI@Home worked.
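    For a rough picture, here’s a sketch of the volunteer-computing loop that SETI@Home-style clients ran (the endpoint and function names are invented for illustration): fetch a work unit, crunch it during idle time, upload the result. It’s exactly this kind of long-running background loop that mobile OSes throttle or kill.

    ```python
    import time
    import urllib.request

    WORK_SERVER = "https://example.org/work"  # hypothetical work server

    def crunch(work_unit: bytes) -> bytes:
        # Stand-in for the hours of signal analysis a real client performed.
        return work_unit[::-1]

    while True:
        # Download a work unit, compute locally, and report the result back.
        unit = urllib.request.urlopen(f"{WORK_SERVER}/next").read()
        req = urllib.request.Request(f"{WORK_SERVER}/result",
                                     data=crunch(unit), method="POST")
        urllib.request.urlopen(req)
        time.sleep(60)  # desktops can idle-poll like this; mobile OSes kill it
    ```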

    • worfamerryman@beehaw.org · 1 year ago
      Just to add, I really hate how mobile operating systems work. They prevent phones from being used more seriously.

      I hope “DeX”-like abilities become the norm. Clearly, phones can handle most tasks, and that could eliminate the need for a separate computer, or at the least let the phone be down-cycled into a computer for a kid or grandparent.

      I have so many old phones lying around that would do fine in some sort of desktop mode for spreadsheets and browsing.

      • GoodPointSir@lemmy.ca · 1 year ago

        I hate how much Apple has purposefully handicapped the iPad. It has such capable hardware, but the software doesn’t even come close to taking advantage of it.

        But God forbid the iPad actually becomes a viable daily use computer (What’s a computer?)

        • worfamerryman@beehaw.org · 1 year ago

          Absolutely! If I could just run macOS on an iPad when it has a keyboard and mouse attached, I would buy an iPad today. There is literally no reason it can’t do this other than Apple not wanting it to.

          I would be happy to make it my sole computing device. I wouldn’t even mind if it switched back to iPadOS once undocked. But as it stands, I literally cannot do my work on an iPad.

    • GoodPointSir@lemmy.ca · 1 year ago

      Aside from personal websites and maybe some Lemmy instances, I can’t think of a single application that’s NOT using distributed computing. Hell, Lemmy as a concept is still distributed computing even if individual instances aren’t necessarily.

      • douglasg14b@beehaw.org · edited · 1 year ago

        Lemmy is… Not distributed computing.

        If each instance is a separate application that must scale on its own, then no distributed computing is occurring.

        There is one database, and you can have the instance itself behind a load balancer.

        Lemmy is not a distributed program; you can’t scale it linearly by adding more nodes. It’s severely limited by its access patterns to a single database and is not capable of being distributed in its current state. You can put more web servers behind a load balancer, but that’s not really “distributed computing”, that’s just “distributing a workload”, which has limitations that defeat it being truly distributed.
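        To illustrate the pattern (the driver choice and hostnames here are assumptions for illustration, not Lemmy’s actual code): any number of stateless web workers can sit behind the load balancer, but they all funnel into the same single Postgres, so the database remains the scaling ceiling no matter how many web nodes you add.

        ```python
        import psycopg2  # assumed driver; host/db names are invented

        def handle_request(post_id: int):
            # Every web worker, however many the load balancer fronts, opens
            # connections to the one shared database. Adding more web nodes
            # adds zero database capacity.
            conn = psycopg2.connect("host=db.internal dbname=lemmy user=lemmy")
            try:
                with conn, conn.cursor() as cur:
                    cur.execute("SELECT title FROM post WHERE id = %s", (post_id,))
                    return cur.fetchone()
            finally:
                conn.close()
        ```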

        Actual distributed applications are incredibly difficult to create at scale, and many faux-distributed applications get made instead (Lemmy being n-tier on a per-instance basis).

        Think of Kafka. Kafka is an actual distributed application.
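        As a rough sketch of the contrast (kafka-python client, made-up broker and topic names): a Kafka topic’s partitions are spread across brokers, keyed messages hash to partitions, and adding brokers adds real capacity rather than just fronting one shared database.

        ```python
        from kafka import KafkaProducer  # assumed client library

        # Hypothetical brokers; a real cluster spreads each topic's
        # partitions (and their replicas) across these nodes.
        producer = KafkaProducer(
            bootstrap_servers=["broker1:9092", "broker2:9092", "broker3:9092"],
        )

        # Messages with the same key hash to the same partition, so the
        # workload scales out across however many brokers own partitions.
        for user_id in ("42", "43", "44"):
            producer.send("events", key=user_id.encode(), value=b"page_view")

        producer.flush()
        ```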