• 0 Posts
  • 30 Comments
Joined 2 years ago
Cake day: June 19th, 2023






  • I have been using the same web browser, in terms of ideology, codebase and heritage, since the release of NCSA Mosaic.

    That was 32 years ago. And holy f**ck, that dates me.

    Sure, I dabbled around with others. There was the original Opera, back when Netscape cratered and the only other real option was IE. Opera’s tab behaviours made me install Tab Mix Plus for FF, and I still find it to be the second-most critical extension FF has, right after uBlock Origin.

    And lately I took a shine to Vivaldi, but I have been weaning myself off of it since realizing that the Manifest V2 shutdown was unavoidable for it as well.

    And the only reason why I even have Chromium is as a sandbox for any Google services I access, and as a “naked” web browser for those websites that implement malware and spyware in the name of “website security”. That, of course, also means the majority of websites “protected” by Cloudflare’s incredibly hostile anti-user practices.

    And of course, I also run forks, such as LibreWolf and others, each with the appropriate anti-malware and anti-spyware add-ons. It can be useful to have multiple web browsers up at once.

    But my main will always be Firefox.




  • Apple is the least bad + most functional option out there.

    Nothing else will go further in the “least bad” direction unless you are willing to completely sacrifice functionality and usability.

    Apple at least walks the walk of protecting user privacy because they aren’t dependent on non-hardware, non-app-store revenue (as in, selling users’ data). Google is absolutely dependent on revenue from selling user data because its hardware and app-store revenue is almost insignificant in comparison.


  • Another tool is yWriter.

    This isn’t a tool for everyone, because it takes a research-first approach.

    What I mean is that it’s a little clunky: background/research data is meant to go into it first, and then you are supposed to lean on that content to write your book second.

    So for a non-fiction book, you would add all the data, facts, and references; for a fiction book, you would put in all of the important characters, plot points, and things the characters interact with.

    This is so you always have a body of reference material to work from, and don’t introduce inconsistencies.

    Some people might find this software useful because assembling and fleshing out the underlying data is loads of fun and/or how they prep. Others might need it just to keep track of everything that goes into their book, since holding details like character quirks in their head isn’t easy.

    YMMV.



  • And I self-host precisely because of the money I save using surplussed hardware. I have a symmetrical 1 Gbps SOHO fibre connection from my ISP, so I can host whatever the hell I want; I just need to stand it up. And a beefy older system with oodles of RAM is perfect for spinning up VMs of various platforms for various tasks. This saves me craploads of money over even a single VM on cloud platforms like Vultr. Plus, even if I were to support a “heavy” service sufficiently in demand to warrant its own iron, it would still cost me less than a year’s worth of cloud hosting to obtain a decent platform for that service to run on all by its lonesome.

    My only cloud costs end up being those services which are distributed for redundancy and geographical distance, such as DNS and caching CDNs.


  • the key is to simply seed all of your content for as long as you have it in your collection.

    Tell that to TheGeeks. If you aren’t actively uploading - not just sitting there sharing, but actively sending data to anyone else - you’ll eventually be warned, then banned.

    Back when I was trying to use their site, they had only one system: a strict 1.0 ratio on a time limit. If you couldn’t reach and maintain a 1.0+ ratio within a very limited window, it didn’t matter what you grabbed or how long you shared back out: you got banned. At the time there was no way to earn ratio other than sharing back out - no freeleech, nothing. Which meant that if you wanted any content more than 2-3 HOURS old, you were looking at a ratio shortfall, because there was no way to make up the ratio you lost by downloading it. There were simply too few peers behind you to offset the masses of seeders ahead of you already satisfying them (the toy numbers after this comment illustrate the shortfall).

    It was absolutely brutal, which is why I now refuse to deal with any site that has that rule (1.0+ ratio with a time limit), even if it has other ways (freeleech, etc.) to mitigate it. Like, f**k those sites. I’ve been seeding some torrents for close to 15 years; I have no problem letting shit remain resident in my client. So it’s going to have to remain sites like MyAnonamouse for me.
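
    A toy illustration of that shortfall, with entirely invented numbers:

    ```python
    # All figures invented, purely to illustrate the ratio shortfall above.
    downloaded_gb = 10.0  # a torrent grabbed a few hours after upload

    # By then the initial swarm has mostly drained: only a trickle of new
    # leechers ever arrives, and dozens of established seeders compete to
    # serve each one, so your realistic share of the upload is tiny.
    expected_upload_gb = 1.5

    ratio = expected_upload_gb / downloaded_gb
    print(f"ratio after this grab: {ratio:.2f}")  # 0.15, far below 1.0

    # With no freeleech and a hard deadline, the missing 8.5 GB of upload
    # cannot be earned back in time: warning first, then the ban.
    ```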


  • If you are talking about sites that have a strict, non-negotiable seeding-ratio requirement, it is impossible. Your only real long-term option is to write a script that grabs everything that gets uploaded, on a 30-second cadence, and then aggressively super-seeds that content back out - regardless of what it is. This script runs 24/7, doing about 2,880 hits on the website a day for new content (a sketch of such a poller follows this comment). Still, even with the script it will be difficult to push your overall ratio past about 1.5-2.0, and you may still get banned for individual torrents that never exceed 1.0 because no one is much interested in them.

    I have tried to use sites that have strict ratio minimums, and long-term success is impossible without an edge like the script I mentioned. It’s why I now work with sites - like MyAnonamouse - that instead have minimum seeding times for everything you grab, regardless of whether anyone else needs it. They tend to be far less stressful and user-hostile.
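
    A minimal sketch of such a poller, assuming a hypothetical tracker RSS feed and a local qBittorrent instance with its Web API enabled; the feed URL and credentials are placeholders:

    ```python
    # Hypothetical sketch: poll a tracker feed every 30 s and hand new
    # torrents to qBittorrent. Feed URL and credentials are placeholders.
    import time
    import xml.etree.ElementTree as ET

    import requests

    FEED_URL = "https://tracker.example.org/rss?passkey=PLACEHOLDER"
    QBIT_URL = "http://localhost:8080"

    session = requests.Session()
    # qBittorrent's Web API uses cookie-based auth
    session.post(f"{QBIT_URL}/api/v2/auth/login",
                 data={"username": "admin", "password": "PLACEHOLDER"})

    seen: set[str] = set()

    while True:
        try:
            feed = session.get(FEED_URL, timeout=10)
            # RSS 2.0: each <item> carries a <link> to a .torrent file
            for item in ET.fromstring(feed.content).iter("item"):
                link = item.findtext("link")
                if link and link not in seen:
                    seen.add(link)
                    # Queue the new torrent for download and seeding;
                    # /api/v2/torrents/setSuperSeeding can later flip
                    # completed torrents into super-seed mode.
                    session.post(f"{QBIT_URL}/api/v2/torrents/add",
                                 data={"urls": link})
        except requests.RequestException:
            pass  # transient network error: skip this cycle
        time.sleep(30)  # 30-second cadence = ~2,880 polls per day
    ```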



  • In that way it’s become adversarial.

    Back in the 2000s, I was able to say that while a basic install took only about a half hour to set up, usability tweaks and a full fleshing-out of functionality took another 4-8 hours, depending on what the user was going to use the machine for.

    I just did a Win11 24H2 install. It took nearly 24 working hours before I considered it even minimally functional for my needs. Cycling through Win10Privacy two or three times was particularly frustrating. Registry work alone took me a good 8-10 hours of trying changes one step at a time and then rebooting to see how they worked (the sketch after this comment shows the kind of tweak involved).

    At this point, the only reason why I am still running a Windows rig is for those half-dozen programs that don’t have appropriate non-Windows variants. It’s why I’m also running a Mac Mini and an openSUSE tower through the same 4-port, 6-head KVM.
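
    A minimal sketch of the kind of tweak involved, using one well-known policy value (AllowTelemetry); the specific key shown is an illustrative assumption, not necessarily one of the exact tweaks referenced above:

    ```python
    # Illustrative only: one well-known Windows privacy tweak from Python.
    # Must run as Administrator; a reboot is usually needed before the
    # effect is visible, which is what makes this process so slow.
    import winreg

    KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\DataCollection"

    # Create (or open) the policy key under HKEY_LOCAL_MACHINE
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        # 0 = "Security" level, the minimum diagnostic-data setting
        winreg.SetValueEx(key, "AllowTelemetry", 0, winreg.REG_DWORD, 0)

    print("Set AllowTelemetry=0; reboot to verify the change stuck.")
    ```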


  • Corporate cuts should always start with the greatest fat that does the least work - the ones at the top.

    Because if the company has found itself in a place where headcount needs to be reduced, these are the people who led it there and deserve all of the blame for hurting the company to that degree. Plus, you should always start cutting where you get the lowest volume of productive work for the greatest money spent, and that is always at the top.




  • We don’t have anyone actively working on Windows support, and there are considerable changes required to make it work well outside a Unix-like environment.

    We would like to do Windows eventually, but it’s not a priority at the moment.

    This is how you make “critical mass” adoption that much more difficult.

    As much as I love Linux, if you are creating a program to be used by everyone and anyone, you build adoption momentum and public-consciousness penetration by focusing on the largest platform first. And at 72% market share, that would be Windows.

    I hope this initiative works. I really do. But intentionally ignoring three-quarters of the market is tantamount to breaking at least one leg before the starting gate even opens. This browser is likely to be relegated to being a highly niche and special-interest-only browser with minuscule adoption numbers, which means it will be virtually ignored by web developers and web policy makers.


  • His router is tri-band, though, meaning it has two 5 GHz transceivers.

    Unfortunately, for many models - like the Linksys WRT 3200ACM - that second 5 GHz radio (technically the third transceiver, if you include the 2.4 GHz one) doesn’t function at all without the manufacturer’s firmware. It’s a dead stick with any third-party firmware, because it is 100% software-enabled.

    I have found this to hold true whether the firmware is DD-WRT or OpenWrt, and across several different manufacturers, including Asus and D-Link.