

I worked on a terrain render of the entire planet. We were filling three 2 TB drives a day for a month. So this would have been handy.
It’s a side effect of the “no algorithm” philosophy. Without anything to suggest what you might like, generating large amounts of new content lets you dominate the feed. News is easy to generate on a daily basis.
The thoughtful channel that puts out a video once every two weeks gets on the feed once every two weeks, while low-effort garbage like “man carrying things” gets on the feed every day.
They need to at least let users block channels from their feed.
ls -hal
4,375,000,000 of the 3 1/2" disks. Sierra would be proud.
But her emails! /s
Thanks, but it’s not so much that it’s difficult as that I’ve learned I dislike a rolling release. Feels too much like being at work in a production environment.
I think NixOS is going to give me the stability where I want it and the cutting edge where I need it. Being able to roll back changes to the OS sounds great. In theory, anyway; I’ll see how it goes in practice.
Good news is I should be able to get it like I want on a flash drive and then just port the config to my SSD when I’m ready to nuke Arch.
If you can, I’d stick with Mint. I updated my hardware recently and need kernel 6.14 or newer. I’ve not been happy with Arch and miss Mint.
I’m thinking of giving NixOS a try as it also supports 6.14.
I’m going to agree with a lot of the other posters and say Qt with Qt Creator. It’s a tested and well thought out toolkit, and its signals and slots event system is straightforward and easy to learn.
Whatever route you take, learn Model View Controller (MVC). It gets you in the mindset of keeping your data model separate from the things that use the data and the things that change the data.
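If it helps, here’s roughly what the smallest possible signal/slot connection looks like. Just a sketch assuming Qt 6 with the Widgets module, not any particular project setup:

```cpp
#include <QApplication>
#include <QPushButton>

int main(int argc, char *argv[]) {
    QApplication app(argc, argv);

    QPushButton button("Quit");
    // The button emits clicked() when pressed; connect wires that
    // signal to the application's quit() slot.
    QObject::connect(&button, &QPushButton::clicked,
                     &app, &QCoreApplication::quit);

    button.show();
    return app.exec();
}
```

The same connect call works between any QObjects, which is part of what makes it easy to keep your model, your views, and the code that changes things in separate classes.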
Agreed. I wasn’t trying to say they are always better, just explaining the difference.
I almost exclusively use Linux and it handles this well. Shared libraries (.so files) are stored with a version number plus a symlink to the latest. So libmath.so.3 and libmath.so.4, with libmath.so being a symlink to libmath.so.4. That way, if needed, I can point one program at libmath.so.3 and keep everything else on the latest version.
So the basic purpose of a library is to allow code that does some useful thing to be easily used in multiple programs. Say, math functions beyond what is in the language itself, or creating network connections.
When you build a program with multiple source files there are several steps. First, each file is compiled into an object file. This is machine code, but wherever you have calls into other files the compiler just inserts a note that basically says “connect this call to this part of another file.” For example: connect this call to the SquareRoot function in the math library.
After that has been done to every file needed, the linker steps in. It grabs all the object files, combines them into one big file, then looks for all the notes that say “connect this call to that function” and replaces them with actual calls to the address where it put that function.
That is static linking. All the code ends up in one big executable. Simple, but it has two big problems. The first is size: doing it this way means every program that takes the square root of something has a copy of the entire math library, and that adds up. Second, if there is a bug in the math library, every program needs to be rebuilt for the fix to apply.
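To make that concrete, here’s a minimal sketch. The file names, the SquareRoot function, and the Newton’s method body are all made up for illustration:

```cpp
// math.cpp -- one file of our hypothetical math library.
double SquareRoot(double x) {
    // Newton's method, just so the example is self-contained.
    double guess = x > 1 ? x / 2 : 1;
    for (int i = 0; i < 20; ++i) guess = (guess + x / guess) / 2;
    return guess;
}
```

```cpp
// main.cpp -- compiled separately. The compiler only sees the declaration,
// so main.o just carries a note: "connect this call to SquareRoot".
#include <cstdio>

double SquareRoot(double x);  // declaration only, no code here

int main() {
    std::printf("%f\n", SquareRoot(2.0));  // the linker fills in this address
    return 0;
}
```

Building with something like g++ -c math.cpp main.cpp produces the object files, and g++ math.o main.o -o demo is the static link step that resolves the notes into one binary.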
Enter dynamic linking. With that, the linker replaces the note to connect to the SquareRoot function in the math library with a request for the operating system to make the connection.
Then, when the program is run, the OS gets the list of libraries the program needs, finds them, copies them into the memory reserved for that program, and connects them. These are .so files on Linux and .dll files on Windows.
Now the OS only needs one copy of math.so, and if there is a bug in the library, an update to math.so can fix all the programs that use it.
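You can even ask the OS to do that connection step by hand, which makes it visible. Here’s a sketch using the POSIX dlopen interface; libmath.so and SquareRoot are hypothetical names, and normally the dynamic loader does all of this automatically when the program starts:

```cpp
#include <dlfcn.h>   // dlopen/dlsym/dlclose, the POSIX dynamic loader API
#include <cstdio>

int main() {
    // Ask the OS to find the library and map it into this program's memory.
    void *lib = dlopen("libmath.so", RTLD_NOW);
    if (!lib) { std::fprintf(stderr, "%s\n", dlerror()); return 1; }

    // Look up the address the loader gave SquareRoot and call through it.
    // (Assumes the library exported the symbol with C linkage.)
    auto square_root =
        reinterpret_cast<double (*)(double)>(dlsym(lib, "SquareRoot"));
    if (square_root) std::printf("%f\n", square_root(2.0));

    dlclose(lib);
    return 0;
}
```

Build with something like g++ demo.cpp -ldl.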
For GPL vs LGPL this is an important distinction, because the main difference between them is how they treat libraries. (There are other differences, and this is not legal advice.)
So if math.so is GPL and your code links to it, statically or dynamically, you have to provide a copy of the source code for your entire program with any executable and license it to them under the GPL.
With LGPL it’s different. If math.so is statically linked it acts similar to the GPL. If it’s dynamically linked, you only have to provide the source to build math.so and license it under the LGPL. So you don’t have to give away all your source code, but you do have to provide any changes you made to the math library. If you added a cubeRoot function to the math library, you would need to provide that.
The constant changes were what made me decide it was a bad game. One of the reasons I bought it was the different FTL types. Between that and the constant planet management changes I couldn’t enjoy the game. Add the stupid war score system and I’ve learned to stay away from Paradox.
Even as a technical user it’s just nice to have a decent mod manager.
I can download and extract the mods, but checking them all to see if they have updates or what dependencies they have is a pain.
Then later when I want to change mods, what does modClothingTex do? Is it required for a mod I like, or is it something I tested and didn’t like? A manager can make that so much better.
It runs fine on Linux Mint Debian under Proton for me.
Fun trivia: it’s called a second because it’s the second division of the hour, the minute being the first. You can keep going with thirds and fourths for sub-second time.
Clean up the terrible underlying code like how user accounts are stored.
Rip out the closed source garbage like Pocket.
Restore user choice, like allowing any extension the user wants regardless of Mozilla approval.
At work, a mix of Red Hat, Fedora, CentOS, and RedHawk. At home, the Mint Debian spin. It just works and games run great. I don’t have time to deal with the Red Hat crap if I’m not getting paid.
Honestly it’s better, but still a mess of design choices. For an open source graphics editor, check out Krita.
For NTSC VHS players it wasn’t a component in the VCR that was made for copy protection. They would add garbled color burst signals to the tape, which would desync the automatic color burst sync system in the VCR.
CRT TVs didn’t need that sync system, but some fancy TVs would also have the same problem with Macrovision.
The color burst system was actually a pretty cool invention from the time broadcasters started to add color. They needed to stay compatible with existing black and white TVs.
The solution was to leave the black and white image being sent unchanged and add the color information on a higher frequency that color TVs would combine with it.
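Roughly, and glossing over the exact phase conventions, the composite signal is the black and white luma plus color riding on a high-frequency subcarrier (about 3.58 MHz for NTSC):

$$\mathrm{composite}(t) = Y(t) + I(t)\cos(2\pi f_{sc} t) + Q(t)\sin(2\pi f_{sc} t)$$

Black and white sets just ignore the subcarrier, while color sets use the burst, a short reference snippet of the carrier sent each scan line, to lock phase and recover the two color components.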
This was easy for a CRT, as the electron beam would sweep across the screen, changing intensity as it traced out the black and white picture.
To display color, each spot on the screen became a triad of red, green, and blue phosphor dots, and the set used three electron beams aimed through a shadow mask so each beam could only light up its own color’s dots.
Some of the adjustment knobs on old TVs were, in part, you manually tuning the beam convergence so the beams hit those dots just right.
VCRs didn’t usually have these adjustments, so they needed an automatic system to keep the color synced in the recording.
Or toss a flash bang in the crib.
Part of the reason it’s so fast is they have the passenger manifest already, so they start the search by checking against the hundreds of people who just arrived instead of the much larger overall database.