  • There is no justification. The “Ends” in E2EE mean the initial sender and the intended recipient; the “transport” should have zero insight into the content. Encrypting a message to the servers is standard even for “non-private” messaging services; it’s usually done with SSL/TLS (part of HTTPS).

    Let’s compare it to traditional mail. If you send something, the postal company can always just open your mail and read it. With computers, we have black magic (E2EE) that mathematically prevents the postal company from doing that. In this analogy, Facebook (owner of WhatsApp) is the company that provides you with the pen and paper (the app), and is also the postal company (their servers). They promise that the black magic on the paper prevents them from reading what you wrote, but then they clearly read the content of your letter to send you a summary of the conversation.

    Mid-message quick edit: They could’ve also done something to the pen (other parts of the app) to have it tell them what you wrote. This would mean the black magic (E2EE) is applied, but is completely useless. (End of edit)

    If the process for making the pen and paper (the app) were publicly known (open source), you could make your own and be sure the black magic (E2EE) is applied properly. That way you can be certain the postal company (servers) can’t read your letter; only the recipient can.

    If the postal company gives you the pen and paper without telling you how to make it, it’s nearly impossible to tell if the black magic was applied properly.
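
    To make the “do something to the pen” scenario concrete, here is a hypothetical sketch (made-up names, not actual WhatsApp code) of a client that applies the encryption correctly and still leaks the plaintext through a side channel:

    ```python
    # Hypothetical "tampered pen": the encryption is real, but the client also
    # ships the plaintext to the provider, making the E2EE pointless.
    from nacl.secret import SecretBox
    from nacl.utils import random as random_bytes

    def send_to_server(payload: bytes) -> None:
        # Stand-in for the network call to the provider's servers.
        print("server received:", payload[:24], "...")

    key = random_bytes(SecretBox.KEY_SIZE)  # known only to the two "ends"
    box = SecretBox(key)

    def send_message(plaintext: bytes) -> None:
        send_to_server(box.encrypt(plaintext))  # the E2EE part, done properly
        send_to_server(plaintext)               # the tampering: plaintext leaks anyway

    send_message(b"meet me at noon")
    ```

    With a closed-source client, that second call is exactly the kind of thing you cannot rule out.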


  • The concept of “End to End Encryption” (E2EE) is that one end encrypts the data, it passes through the transport encrypted, and only the intended recipient can decrypt and read it.

    In the case of WhatsApp, this should mean:

    • Your phone (WhatsApp app) encrypts a message
    • Your phone sends the encrypted (“unreadable”) message to Facebook
    • Facebook sends the message to the intended receiver
    • The receiver decrypts the message
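
    For anyone who wants to see that flow in code, here is a minimal sketch using PyNaCl’s public-key Box (purely illustrative - WhatsApp says it uses the Signal protocol, and all names below are made up):

    ```python
    # Sketch of the four steps above: only the two ends ever hold private keys,
    # so the relay in the middle sees nothing but ciphertext.
    from nacl.public import PrivateKey, Box

    sender_key = PrivateKey.generate()    # stays on the sender's phone
    receiver_key = PrivateKey.generate()  # stays on the receiver's phone

    # 1. The sender encrypts with their private key and the receiver's public key.
    ciphertext = Box(sender_key, receiver_key.public_key).encrypt(b"hello, this is private")

    # 2./3. The server only ever handles `ciphertext`; without one of the private
    # keys it cannot recover the plaintext, let alone summarize a conversation.

    # 4. The receiver decrypts with their private key and the sender's public key.
    plaintext = Box(receiver_key, sender_key.public_key).decrypt(ciphertext)
    assert plaintext == b"hello, this is private"
    ```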

    The whole “Meta AI summaries” thing has to run on their servers. Large language models small enough to fit on a phone don’t produce sensible output yet, and your phone’s battery would drain very quickly. Since each message is (supposed to be) encrypted with different keys, neither a human nor a computer can make sense of the encrypted data without the keys to decrypt it. For their servers to provide a “summary of your chats”, they have to be able to read the content of the messages. That proves the “end to end encryption” in WhatsApp is either false, or made entirely useless by the app also sending all messages to them outside of E2EE.

    The only proof that would invalidate this is evidence of the LLM running locally on the device. Even then, the way some of WhatsApp’s services work (like notifications and WhatsApp Web) casts serious doubt on the “E2EE” claim.

    It is absolutely essential that any communications platform claiming “E2EE” proves it by making the client-side code (the stuff running on your device) fully open source. A proprietary app like WhatsApp is, by definition, harder to fully understand, and thus harder to verify the E2EE claim for.






  • XMPP is significantly less decentralized, which lets implementations “cut corners” compared to the Matrix protocol and scale significantly better. (“Cut corners” in heavy quotes - XMPP isn’t really cutting corners; true decentralization just requires more work to achieve what looks like the same result.)

    An XMPP or IRC channel with a few thousand users is no problem, whereas Matrix can struggle with that. On the other hand, any one Matrix homeserver going down does not affect users who aren’t on that homeserver, whereas an XMPP channel is centralized enough that its host server going down can take the whole channel with it.

    Meanwhile, IRC is a 90s protocol that makes little sense in the modern world of mostly mobile devices.

    XMPP also doesn’t change much; the last proper addition to the protocol (from what I can tell, on the website) was on 2024-08-30: https://xmpp.org/extensions/xep-0004.html






  • Scrapers can send these challenges off to dedicated GPU farms or even FPGAs, which are an order of magnitude faster and more efficient.

    Let’s assume, for the sake of argument, that an AI scraper company actually attempted this. They don’t, but let’s assume it anyway.

    The next Anubis release could then include (for example) SHA-256 instead of SHA-1. That would be a simple and basically transparent update for admins and end users, while the AI company that invested in offloading the PoW to more efficient hardware now has to spend significantly more resources changing its implementation than it took the Anubis devs and users to change theirs.
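
    As a rough illustration of why that swap is cheap on the defending side, here is a minimal proof-of-work sketch (hypothetical, not Anubis’s actual code) where the hash algorithm is just a parameter:

    ```python
    # The client burns CPU in solve(); the server only runs the cheap verify().
    # Changing "sha1" to "sha256" is a one-line change for the defender, but it
    # invalidates any hardware built specifically to grind the old hash.
    import hashlib
    from itertools import count

    def solve(challenge: bytes, difficulty: int, algo: str = "sha256") -> int:
        """Find a nonce so that hash(challenge + nonce) starts with `difficulty` zero hex digits."""
        for nonce in count():
            digest = hashlib.new(algo, challenge + str(nonce).encode()).hexdigest()
            if digest.startswith("0" * difficulty):
                return nonce

    def verify(challenge: bytes, nonce: int, difficulty: int, algo: str = "sha256") -> bool:
        digest = hashlib.new(algo, challenge + str(nonce).encode()).hexdigest()
        return digest.startswith("0" * difficulty)

    nonce = solve(b"example-challenge", difficulty=4)
    assert verify(b"example-challenge", nonce, difficulty=4)
    ```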

    Yes, it technically remains a game of “cat and mouse”, but one heavily stacked against the cat. One step for Anubis is 2000 steps for a company reimplementing its client in more efficient hardware, and most Anubis changes can even be made without impacting end users at all. That’s a game AI companies aren’t willing to play, because they’ve basically already lost: it doesn’t really matter how “efficient” their implementation is if a small Anubis update can render it unusable.


  • Someone making an argument like that clearly does not understand the situation. Just 4 years ago, a robots.txt was enough to keep most bots away, and hosting a personal git server on the web required very few resources. With AI companies actively profiting off stealing everything, a robots.txt doesn’t mean anything anymore, and even a relatively small git web host now takes an insane amount of resources. I’d know - I host a Forgejo instance. Caching doesn’t help, because the diff between two random commits is likely unique (a repository with 1,000 commits already has roughly 500,000 possible commit pairs, so scrapers keep requesting pages nobody has cached). Rate limiting doesn’t help either: they use different IP ranges and user agents, and it would heavily impact actual users “because the site is busy”.

    A proof-of-work solution like Anubis is the best we have currently: the least possible impact on end users, while keeping most (if not all) AI scrapers off the site.








  • SteamOS is immutable, and has its own updater because of it. While SteamOS is related to Arch Linux, it is far from the same distro.

    Regular Arch Linux can install local packages; the process is described on the Arch wiki. SteamOS has no built-in mechanism to update without an internet connection, and installing packages manually is not recommended (due to the immutable nature of the system).

    In order to update SteamOS from a local file, you would need to figure out how SteamOS updates work, and somehow trick the Steam Deck into accepting your local images. This is far outside the scope of anything related to Arch Linux, and not very well documented. The better option is to update your Steam Deck by connecting it to the internet.

    Also of note, the latest version of SteamOS (as far as I’m aware) does not have any significant changes when it comes to controlling the fan speed.