

Honestly, I hate to admit it, but it's thanks to LLMs that I've been able to fully switch to Arch. The irony is that I actually end up reading more and understanding more about my system.
Recently I had an issue where my Moza R5 racing setup just stopped functioning. It turns out the drivers for it, which I had installed separately, were merged into kernel 6.18. I wasn't aware of that at all, and it's actually the AI that made me aware of it. Essentially I had to uninstall the out-of-tree drivers, purge their configuration, and reboot (roughly the steps below). It was a simple fix, but I was doom-pasting commands from the AI into the terminal and it was just going in circles; eventually I had to figure it out myself.
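For anyone hitting the same thing, the cleanup was roughly this. The package name and config path here are just placeholders for whatever out-of-tree driver you actually installed, so check your own setup before pasting anything:

    sudo pacman -Rns moza-ff-dkms          # remove the old out-of-tree driver package (placeholder name)
    sudo rm /etc/modprobe.d/moza.conf      # drop any leftover modprobe/udev config you added for it
    sudo mkinitcpio -P                     # rebuild the initramfs in case the old module was baked in
    reboot

After that the in-kernel driver in 6.18 should pick the base up on its own.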
I think you still learn a lot from the AI about Linux in general. I used Gemini Pro for a month (I'm now running glm-4.7 locally), and it was up to date in terms of news and info.
Even though troubleshooting errors on Linux with an AI can often end up going in circles, I would not have found out without it that the drivers for my niche sim racing rig had been merged into the kernel. I probably would have ended up doing a fresh install or a distro hop (most likely the latter, since I always have issues with Arch).
If you're going to use an LLM and you have a powerful enough rig, please try to run it locally. I did some AI-assisted work with Gemini (Google), and I find it genuinely creepy that whenever I ask it a question now, it always tries to steer back to my work, even on unrelated topics, in different chats, and after I told it to stop or said I would switch AI providers (I did end up switching). It bugged me for two months, then I switched to local.
I suspect it's trying to pull me back to my work so it can analyze my data to serve Google's own interests, but that would just be speculation, right? RIGHT?








You would be more than welcome