The more you can filter and label at the source, the less you have to work out in VL.
I use Alloy (which is kinda heavy) to extract and prepare only the data I want, and it works great so far.
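To make that concrete, here's a minimal sketch of what "label at the source" can look like in an Alloy config. The file path, label names, and the VictoriaLogs endpoint/port are assumptions for illustration, not details from the thread:

```alloy
// Tail the app's log files (path is a placeholder)
loki.source.file "app" {
  targets    = [{__path__ = "/var/log/app/*.log", job = "app"}]
  forward_to = [loki.process.label.receiver]
}

// Parse JSON lines and promote "level" to a label before shipping
loki.process "label" {
  stage.json {
    expressions = {level = "level"}
  }
  stage.labels {
    values = {level = ""}
  }
  forward_to = [loki.write.vl.receiver]
}

// Push to VictoriaLogs via its Loki-compatible endpoint
// (host/port assumed; 9428 is the VL default)
loki.write "vl" {
  endpoint {
    url = "http://victorialogs:9428/insert/loki/api/v1/push"
  }
}
```

The point being: once `level` is a real label at ingest time, you can query on it in VL instead of regexing raw lines after the fact.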
Not everything in black and white makes sense.


It’s a cheap Chinese-made one; it prints when I turn it on, and that’s like every two months for the last 3 years. So I guess it’s good. I also only use it to print from Linux or my phone.


I got a cheap Xerox laser printer, the toner is cheap and lasts forever.


I agree with you on some points here. The problem is that these crawlers are hostile to the point of DDoSing sites.
So the problem is not that someone archives your publicly accessible data, the problem is that doing so either breaks your site or makes you pay for the excess traffic.
I think the web is now broken beyond repair. The commercialisation killed it and the tech monopolies are all that’s left.
So I think small, invite-only, fully encrypted enclaves are all that is left, until someone comes up with a “new Internet” that can resist the “Techbros”, but for now I don’t see that.
Also I don’t see the Fediverse as a solution, it’s just under the radar for now, but if it gets bigger it will be coöpted and sunk.


The whole mobile ecosystem is a giant hardware backdoor on every phone. I think it’s too late now to change anything on that level.


Sorry but this whole thing is just snake-oil.
You can verify and sign your whole trust chain down to the last shared library and it doesn’t matter when you don’t know what the binary blobs on your TPM / CPU / BIOS / NIC are doing.
The only guarantee of a secure system is openness, and all of that signing won’t help you there.


I don’t trust Microsoft, so why should I start trusting IBM/Canonical or Poettering now?
If the possibility is there they will happily lock you out of your own hardware.


Interesting, I wonder if you could do the same now with the Internet. We know that DSL works over a wet string.
What are you using to ship the logs to VL?
If you want to exclude “normal” logs you should start excluding them before they reach VL, so the only logs you have are the interesting ones.
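If you're shipping with Alloy, one way to do that is a drop stage in the processing pipeline, so the "normal" lines never hit VL at all. A minimal sketch; the regex and component names are made-up examples, tune them to your own noise:

```alloy
loki.process "filter" {
  // Drop routine health-check hits and debug chatter at the shipper,
  // so only the interesting lines are forwarded to VictoriaLogs
  stage.drop {
    expression = ".*(GET /healthz|level=debug).*"
  }
  forward_to = [loki.write.vl.receiver]
}
```

Anything that doesn't match the expression passes through untouched, so you can start with a narrow pattern and widen it as you find more noise.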