• 0 Posts
  • 13 Comments
Joined 1 year ago
Cake day: June 11th, 2023

  • Game engines have their own tools and languages, which can be very different from those of non-game software, and, needless to say, they require a completely different skill set from your average software without a GUI.

    Most of the time, they cannot easily interoperate with the languages people use for other things. When you are building a game, you will be using the engine’s tools and language from the very start, but porting existing software to work inside a game engine is unrealistic, and building normal software inside a game engine would be completely absurd in most cases, both for performance reasons and for developer convenience.

    In theory you might be able to bundle the original program unchanged and just have the GUI talk to it, but at that point it is already a completely separate project from the original software - a project that the original developer likely has no interest in, assuming the original program already fulfills their own needs.

    In other words: while it is possible to use Godot and the like to create a GUI, in most cases you would have to either do some extremely complex things to run the original program inside the engine or (re)write the entire program from scratch inside it, and odds are the engine will not have direct equivalents of the third-party tools the program relies on.
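The "bundle the program unchanged and have the GUI talk to it" approach usually amounts to spawning the original binary as a child process and exchanging text over its standard streams. A minimal, hypothetical sketch (using `echo` as a stand-in for the real program's executable):

```python
import subprocess

def run_backend(args: list[str]) -> str:
    """Run the original, unmodified program as a subprocess and return its
    output; the GUI layer only builds the argument list and renders the result.
    `echo` is a placeholder -- a real wrapper would point at the actual binary."""
    result = subprocess.run(
        ["echo", *args],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout.strip()
```

Even this thin wrapper is its own project: it has to track the wrapped program's CLI across versions, which is exactly the maintenance burden the original developer has no reason to take on.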




  • Some things you did not mention that caught my eye, please correct me if I misunderstood how they work:

    • No servers, just P2P. Every user doubles as a server to some degree, akin to seeding
    • Their one and only method to prevent bots is Proof of Work

    …I personally can only really see those as cons:

    • waste storage space and bandwidth on other users’ encrypted messages that have nothing to do with you
    • waste computing power every time you want to send a message to anyone
      (and I refuse to dismiss those as “negligible”, wasting any of that means wasting energy after all)

    Not to mention abuse related issues that come with the “100% Censorship resistance”, from scams and social engineering to abusive texts to illegal content to displeasing images.

    I can see an argument for some sorts of communities, but I would never consider that “a good alternative to Twitter/Facebook” in general.
    If anything, their explicit, by-design lack of moderation may make it even worse for vulnerable/sensitive groups.

    Quoting their FAQ before anyone asks for the source:

    (Security > Privacy and Data Security > Where is my data stored?)

    Your data is relayed and stored in your friends’ devices and other random devices available in the network. All data is protected by strong cryptography algorithms and can be accessed only with the owner’s secret seed.
    (Security > Underlying Technology > On what technology is WireMin built in?)
    WireMin users jointly created an open computing platform for messaging and data storage that serves each other within the network for personal communication. WireMin protects the public resource from being abused or attacked by requiring proof-of-work, or PoW, for every message sent and each bit of data stored. A tiny piece of PoW needs to be completed by computing SHA256 hundreds of thousands of times before you can send a message. Such computing tasks can be done in less than a tenth of a second which is a negligible workload for a user device sending messages at human speed. While this introduces a significant effort for an attack to send overwhelming amounts of messages or data, the actual PoW difficulty requirement of a specific message or bit of data is proportional to its size and the duration for which it is to be stored.



  • It is not “in the whole fediverse”, it is out of approximately 325,000 posts analyzed over a two day period.
    And that is just for known images that matched the hash.

    Quoting the entire paragraph:

    Out of approximately 325,000 posts analyzed over a two day period, we detected
    112 instances of known CSAM, as well as 554 instances of content identified as
    sexually explicit with highest confidence by Google SafeSearch in posts that also
    matched hashtags or keywords commonly used by child exploitation communities.
    We also found 713 uses of the top 20 CSAM-related hashtags on the Fediverse
    on posts containing media, as well as 1,217 posts containing no media (the text
    content of which primarily related to off-site CSAM trading or grooming of minors).
    From post metadata, we observed the presence of emerging content categories
    including Computer-Generated CSAM (CG-CSAM) as well as Self-Generated CSAM
    (SG-CSAM).




  • Their arguments still hold up pretty well as far as I can tell. If anything has “improved” since then, you could argue that what the biggest platforms decided to use (Mastodon, Lemmy) became the de facto dialect, but you still have to explicitly account for how specific projects do things if you want to implement ActivityPub, which can be pretty demotivating for developers and doesn’t make the user experience any better.

    …and nothing prevents new apps such as Threads from using ActivityPub differently, being incompatible with existing apps and further dividing the space.