I’ve got them both turned off lol
Just the fact that it needs to be said…
What can you say to that?
I loved my 7T, it was the best phone I’d ever had, especially when it was still on OxygenOS 10. Every major OS update it got since then felt like a downgrade though, unfortunately. Now I have an 8T, which is fine and I’ll probably keep it until it dies, but I don’t think I’m going to get another OnePlus. Any Androids that have that old OnePlus feel? I remember I used to like Motos too, are they any good nowadays?
It’s not an either/or thing, the tank in the picture is literally sitting under a tree
I’ve never built or trained an LLM, but I did get to play around with some smaller neural networks. My neural networks professor described this phenomenon as “overfitting”: when a model is trained too long on a dataset that’s too small for it, it will sort of cheat by “memorizing” arbitrary details of the training data (flaws in the images like compression artifacts, minor correlations that only coincidentally appear in the training set, etc.) to improve its evaluation performance on the training dataset.
The problem is that because the model is now getting hung up on random details of the training dataset instead of the broad strokes that actually define the subject matter, its results on the validation and testing datasets will suffer. My understanding is that the effect is more pronounced when a model is oversized for its data, because a smaller model wouldn’t have the “resolution” to overanalyze like that in the first place.
Here’s an example I found from my old homework of a model that started to become overfit after the 5th epoch or so:
By the end of the training session, accuracy on the training dataset was pushing the high 90s, but it never broke 80% on validation. Training vs. validation loss diverged even more sharply. It’s apparent that whatever the model learned to get that 90-something percent in training isn’t broadly applicable to the subject matter, so this model is considered “overfit” to its training data.
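You can see the same effect in miniature without training a network at all, just by fitting polynomials. This is a rough sketch with made-up numbers (a noisy sine as the “true” pattern, numpy for the fitting), not anything from my actual homework: the oversized model nails the training points by memorizing the noise, then does worse on held-out points.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_sine(x):
    # The underlying "true" pattern plus measurement noise
    return np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, size=x.shape)

# Tiny training set: only 10 samples
x_train = np.linspace(0.0, 1.0, 10)
y_train = noisy_sine(x_train)

# Held-out validation set drawn from the same pattern
x_val = np.linspace(0.05, 0.95, 10)
y_val = noisy_sine(x_val)

def mse(coeffs, x, y):
    # Mean squared error of a polynomial fit on a dataset
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# "Right-sized" model: a degree-3 polynomial can only capture broad strokes
small = np.polyfit(x_train, y_train, 3)

# Oversized model: a degree-9 polynomial has as many parameters as we have
# training points, so it can "memorize" the noise almost exactly
big = np.polyfit(x_train, y_train, 9)

for name, coeffs in [("degree 3", small), ("degree 9", big)]:
    print(f"{name}: train MSE {mse(coeffs, x_train, y_train):.4f}, "
          f"val MSE {mse(coeffs, x_val, y_val):.4f}")
```

The degree-9 fit gets a near-zero training error, but what it “learned” is mostly noise, so its validation error blows up relative to its training error, which is the same train/validation divergence you see in the loss curves.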
Haha sometimes you just need to rip off the bandaid
It can be both, FOSS is just more precise. And just like that, I’ve used up all of my semantic pedantry for the day
They’re being diluted though! It was so much worse last year
Let me check my email
YoU mEAn YouR gMaiL.COm??
They can fork it, if nobody wants to work with them anymore that’s their problem
Probably the worst way to learn git is “as you go” in an actual project. Unfortunately that’s a common way to start; it’s definitely how I started. If I had to learn it again or teach it to somebody else, I’d make a super simple application, like a “Hello World” webpage, and learn and gain confidence with that.
Hell of a comment section under this post. I’m actually getting second-hand cringe
Well thank yoouu for recognizing an excellent example of promoting polite discussion
it doesn’t matter which one you choose
That’s not really true though: every instance has its own rules and its own federation policies, not to mention there are other instances that don’t federate with it.
I’m already on Lemmy, so it’s not like I haven’t gone through this before, yet I still haven’t made a Pixelfed account despite being interested, because I don’t want to just go for the biggest instance and I have no idea how to vet the other ones.
Most of my immediate family just does sms group chats. I don’t keep up with the extended family, seeing them every other holiday is enough
I mean it kinda has. It wasn’t an insta-kill but users have dropped dramatically and it’s still dropping
I can tell