The best example of this is Linux.
Ouch… so, you might want to learn more about technology before commenting in a Technology community…
Why does a modern operating system require you to use a terminal?
Because a terminal is one of the most powerful modes of interaction ever invented. It can serve as a relatively low-tech UI, but it is also simple enough to be used as a machine interface. It is lightweight and keeps working even when richer protocols and interfaces are broken by infrastructure issues, because it is just plain text. And since that text is also meant to be read by a human, it makes a great interface for logging: you don’t have to guess at which obscure standard (if any) to use to talk to it, and compliance with the relevant standards is baked into nearly every programming language ever written.
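To make that concrete, here’s a minimal sketch of a classic text filter (the filename and the “error” pattern are just illustrative): the same few lines serve a human typing at a terminal and a machine piping in another process’s output, with no protocol negotiation at all.

```python
#!/usr/bin/env python3
# filter_errors.py -- illustrative Unix-style text filter.
# Reads lines on stdin, writes matching lines to stdout. Because
# everything is plain text, the same program works interactively
# for a human and as a machine interface inside a pipeline.
import sys

for line in sys.stdin:
    if "error" in line.lower():
        sys.stdout.write(line)
```

Run it by hand and type lines at it, or drop it into a pipeline like `dmesg | python3 filter_errors.py`; nothing about the program changes between the two.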
Try building a system like Kubernetes on graphical UIs… I dare you.
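Case in point, a hedged sketch of what the text interface buys you (this assumes kubectl is installed and pointed at a cluster; the field names are the ones in kubectl’s standard pod JSON):

```python
#!/usr/bin/env python3
# Sketch: driving Kubernetes through its plain-text interface.
import json
import subprocess

# kubectl is the same tool a human uses interactively; asking it
# for JSON turns it into a machine interface, no separate client.
raw = subprocess.run(
    ["kubectl", "get", "pods", "--all-namespaces", "-o", "json"],
    capture_output=True, text=True, check=True,
).stdout

for pod in json.loads(raw)["items"]:
    meta, status = pod["metadata"], pod["status"]
    print(f"{meta['namespace']}/{meta['name']}: {status['phase']}")
```

Now try scripting the equivalent against a GUI.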
It’s THE example of ancient software being pushed onto naive techies
What industry are you working in?! AWS is nearly all Linux. Google Cloud is nearly all Linux. Android is Linux. Hell, even Microsoft finally relented and is now strongly supporting their Windows Subsystem for Linux (WSL) because it’s necessary for supporting modern cloud applications.
that would rather have an insecure open source project than a safe, walled garden like Microsoft Windows 11.
Okay, this has to be a troll… right? This is a troll? Please tell me you can’t be serious.

Which we’ve been told is right around the corner for decades. The issue is that QC doesn’t scale up: as you add qubits, you get vastly more noise than signal. Current work in QC is almost entirely aimed at reducing that noise, but even at only 70 qubits, the current state of the art can’t eliminate enough of it for QC to be useful in most applications.
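To put rough, purely illustrative numbers on that (the fidelity and depth below are assumptions, not measurements from any real device): a circuit’s success probability decays exponentially with its gate count, so even excellent gates drown in noise at scale.

```python
#!/usr/bin/env python3
# Back-of-the-envelope sketch of why quantum noise compounds.
gate_fidelity = 0.999   # assumed probability a single gate works (optimistic)
num_qubits = 70
depth = 1000            # assumed circuit depth (layers of gates)

num_gates = num_qubits * depth
success = gate_fidelity ** num_gates
print(f"{num_gates} gates at {gate_fidelity} fidelity -> "
      f"overall success probability ~ {success:.3e}")
# ~4e-31: the signal is utterly swamped, which is why error
# correction and mitigation dominate current QC research.
```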
The only places it’s currently bearing any fruit are those where all of the extra work to reduce noise, and the delays that work incurs, are irrelevant because there is no classical approach at all. But even then, the costs are enormous and the benefits are minuscule.