Arch really isn’t very difficult to use these days for someone with a few years of Debian or similar experience (just don’t try to use it as your first Linux distro).
Installing it is straightforward (albeit in a Linux rather than, say, Windows sense of "installing"), and you can access preferences via the settings app.


I have studied machine learning to a useful level. Based on that, it currently looks to me that:
An LLM fine-tuned for specific purposes, utilising RAG and high-quality prompt engineering, can make redundant north of 50% of white-collar office roles over a small number of years of further fine-tuning after the initial deployment.
But without these techniques, LLMs just aren’t accurate enough to be deployed at large scale. These techniques usually require a locally hosted LLM for one reason or another.
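For concreteness, here is a minimal Python sketch of the retrieval-plus-prompting half of such a pipeline. Everything in it is made up for illustration: the document set, the retrieve/build_prompt helpers, and the placeholder call_local_llm are assumptions, and TF-IDF stands in for a real embedding model. A production RAG system would use a proper vector store and a fine-tuned, locally hosted model.

```python
# Minimal sketch of the retrieval half of a RAG pipeline, assuming a locally
# hosted LLM behind a hypothetical call_local_llm() function.
# TF-IDF is used here as a toy stand-in for a real embedding model.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy internal knowledge base (invented for illustration).
documents = [
    "Invoices over 10,000 GBP require sign-off from a finance director.",
    "Holiday requests must be submitted at least two weeks in advance.",
    "Expense claims need an itemised receipt attached.",
]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by TF-IDF cosine similarity to the query, return top k."""
    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform(docs + [query])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).flatten()
    top = scores.argsort()[::-1][:k]
    return [docs[i] for i in top]

def build_prompt(query: str, context: list[str]) -> str:
    """Assemble a grounded prompt so the model answers only from retrieved text."""
    joined = "\n".join(f"- {c}" for c in context)
    return (
        "Answer using only the context below. If the answer is not there, say so.\n"
        f"Context:\n{joined}\n\nQuestion: {query}\nAnswer:"
    )

query = "Who needs to approve a 15,000 GBP invoice?"
prompt = build_prompt(query, retrieve(query, documents))
# call_local_llm(prompt) would go here -- the local model and its API are
# deployment-specific and not specified in this post.
print(prompt)
```

The point of the grounding instruction in build_prompt is the accuracy issue mentioned above: without retrieved context to constrain it, a general-purpose LLM is far more likely to answer confidently and wrongly.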
The only organisations for which it might make sense to use these multi-billion-dollar data centres for the above purposes (to get the LLMs accurate enough to be useful) are governments, or huge companies that can pay billions to the likes of Microsoft (or OpenAI, or Anthropic, or whoever is left standing) to do all of the above at huge scale.
I might be wrong now, but even if I’m not, this is such a rapidly changing technology that in three months this could be an out-of-date opinion.