

And in return, they drive traffic away from the sites that collect the information in the first place, causing the sources to lose revenue.
Maybe kinda, but it’s also a third party whose certificates are almost, if not entirely, universally trusted. Self-signed certs cause software to complain unless you also distribute a root certificate to be trusted on every machine that might use one of your self-signed certs.
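To illustrate the two halves of that hassle (a minimal sketch; the filenames and the CN `myserver.local` are made up, and the trust-store path shown is the Debian/Ubuntu convention):

```shell
# Generate a self-signed cert non-interactively (key.pem / cert.pem are placeholder names)
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout key.pem -out cert.pem -subj "/CN=myserver.local"

# Clients will complain about cert.pem until it's added to their trust store,
# e.g. on Debian/Ubuntu (commented out here because it needs root):
#   sudo cp cert.pem /usr/local/share/ca-certificates/myserver.crt
#   sudo update-ca-certificates
```

That second, commented-out step is exactly the "spread a root certificate to every machine" problem; a cert from an already-trusted CA skips it entirely.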
In a sense, you are hosting the content. You’re retaining a copy (so long as the window is open) and constantly attempting to spread it. It’s literally built on bittorrent protocols, if I remember correctly, and it’s already very well established that you can be held responsible for seeding copyright-infringing material, so I see no reason at all they’d give you a pass for CP instead. You may not intend to, but remember, my example was someone who looks of age but is not.
Using something like PeerTube is potentially even worse because let’s say you unknowingly open a video where someone in it looks of age but technically isn’t. You as a user help propagate that content while you have it open. You’re not just downloading illegal content at that point. You’re actively sharing it to new people.
Nobody’s throwing a tantrum. They’re just saying they can’t reasonably serve their purpose if they lose 32-bit support. A project so heavily based on other projects is subject to upstream whims, and they probably don’t have the manpower to do anything about it.
I don’t think they’re saying they know better. Seems more like they’re tired of pouring hundreds of hours of free labor specifically into accessibility only to hear people bitch about how they’re not doing enough when the people bitching probably don’t even genuinely care beyond using it as a way to bash GNOME.
To which your response is to take the opportunity to talk shit about GNOME and disregard his meaning, which kinda illustrates his point.
Having learned Nix recently and still not being great at it, I’d say writing your personal config is relatively easy. The website has a search feature for the options available by default, so it’s pretty straightforward. Just search for relevant keywords and set the options you like.
If you want to package software for nixpkgs, define custom options, or do anything else that’s going to require custom Nix, it’s… better than you make it sound, but not great. I only read one guide, and it wasn’t great, but it covered the basics well enough. From there, I managed to figure out what I’ve needed so far just from the official documentation for the Nix language. It’s not everything it could be, but it’s not too bad.
If you wanna really get into the thick of it and extensively write Nix for some detailed purpose, you might run into some more problems. I still don’t think it’d be as bad as you make it sound, but you probably won’t be thrilled, either.
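For a sense of what that easy, option-setting level looks like, here’s a minimal sketch of a personal `configuration.nix` (the hostname and user are placeholders I made up; the option names are real ones you can find through the site’s options search):

```nix
{ config, pkgs, ... }:
{
  networking.hostName = "mymachine";        # placeholder hostname
  services.openssh.enable = true;           # turns on the sshd service

  environment.systemPackages = with pkgs; [ git firefox ];

  users.users.alice = {                     # placeholder user
    isNormalUser = true;
    extraGroups = [ "wheel" ];              # allows sudo
  };
}
```

Everything at this level is just searching for an option and assigning it a value; the pain people describe mostly starts when you leave this declarative surface and write real Nix expressions.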
Nothing really happens because Linux doesn’t have the market share to demand better, and I never said that wasn’t the case. My point is that they shouldn’t have put themselves in this position, not that we have any power to make them follow through.
Then they should have kept it internal until they were ready to commit. People spent money with them as a result of that commitment, and it may not have been a large part of their customer base, but it is exactly the people they courted with the public statement. They wanted to make the announcement to reap the PR benefits, so now they need to follow through and deliver.
Then maybe they shouldn’t have publicly said they were planning to build this bridge ten years ago.
There’s work being done on it, but from my understanding, it’s slow going…
I use a relatively low spec KVM VPS on another continent. Remember, kids, if all your backups are in one location, you don’t have backups. You have copies.
Man, I wish self hosted email was a reasonable thing to do. But it’s a pain to set up the server and the domain stuff, and once you do, if anyone ever spammed off that IP, you’re probably screwed anyway because good luck getting off the blacklists.
So it seems like if you’re using Office on the desktop, it’s not SaaS, but they do offer it in a browser, so would that count? Technically, if it’s in JavaScript or something like that, the computing is handled locally, but it still feels close enough to count.
These apps aren’t SaaS, but their alternatives are in at least some cases. LibreOffice competes with Microsoft Office, for example, and Microsoft wants people to pay a subscription for it, although I think you can still buy it outright. Pretty sure I’ve heard similar for Adobe products. Not super familiar with all the options, so can’t say if it’s true for all of them.
Running a server is very doable. There are packages that deploy and configure almost everything for you, removing a ton of headaches.
Getting your email recognized as not spam by the major providers is pretty much impossible. You need all sorts of stuff to help verify integrity, including special DNS records and public identity keys, but even if you do everything right, your mail can very easily get black-holed before it even reaches a user’s inbox because of stupid shit like someone having abused your rented server’s IP years ago, and you can’t seem to get it off everyone’s lists.
Email as a decentralized tool has effectively been ruined by spam and anti-spam measures. You’re effectively forced to use a provider because it’s near impossible to make your outgoing mail work as an individual. I think some of those anti-spam measures are anticompetitive, but I do think some are just desperate attempts to reduce the massive flow of spam.
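For reference, the special DNS records in question are usually SPF, DKIM, and DMARC TXT records; a rough sketch for a hypothetical domain (the selector name, policy choices, and key are all made up for illustration):

```
example.com.                  IN TXT "v=spf1 mx -all"
mail._domainkey.example.com.  IN TXT "v=DKIM1; k=rsa; p=<base64 public key>"
_dmarc.example.com.           IN TXT "v=DMARC1; p=quarantine; rua=mailto:postmaster@example.com"
```

SPF says which hosts may send for the domain, DKIM publishes the public key that verifies your signed mail, and DMARC tells receivers what to do when the first two fail. And the point stands: you can publish all three perfectly and still get black-holed over IP reputation.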
That shit drives me nuts. They wanna be trusted with my life savings, but they can’t be bothered to implement modern security features until they’re already being phased out. I don’t know what will replace modern 2FA schemes, but I guarantee banks will adopt the current ones about three years after the replacements become standard.
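What makes the slow adoption extra absurd is that the current standard, TOTP (RFC 6238), fits in a few lines. A sketch of the algorithm itself, not anyone’s production implementation:

```python
import hmac, hashlib, struct

def totp(secret: bytes, timestamp: int, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 TOTP: HOTP (RFC 4226) applied to the elapsed 30-second step count."""
    counter = struct.pack(">Q", timestamp // step)          # 8-byte big-endian step count
    mac = hmac.new(secret, counter, hashlib.sha1).digest()  # classic variant uses SHA-1
    offset = mac[-1] & 0x0F                                 # dynamic truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret "12345678901234567890",
# time = 59 seconds, 8 digits -> prints 94287082
print(totp(b"12345678901234567890", 59, digits=8))
```

The secret is shared once (the QR code you scan), and after that both sides can compute the same short-lived code offline, which is why it beats SMS codes that banks still cling to.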
Also, they’re charging you a poor tax for not having enough money, whether that’s a minimum balance or just accidentally spending a nickel more than you had on hand.
It saves them money in the same sense it saves every other information source money: it reduces traffic. But just like other sites can’t serve ads without traffic, Wikipedia can’t prove its worth and ask for donations without traffic. Eventually, people will start asking themselves why they need to support Wikipedia when Google’s AI tells them everything they need to know, unaware that Google’s AI can only do so because it scrapes Wikipedia without paying for it.