Really? Like what? I mean, Linux is the backbone of the internet. I think of it as extremely predictable and rock-solid. Describing it as an OS where "so many things regularly go wrong" seems totally foreign to me.
On my last try, I installed Ubuntu on my desktop and checked the 'Auto Login on Start' box during setup, which broke the UI and required a CLI fix. It wasn't hard (I'm a dev, so it was easy to troubleshoot and fix), but it was more than zero effort, and a person who isn't a dev might not have been able to do it.
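(For the curious, the kind of CLI fix involved is roughly the following. This is a sketch, assuming Ubuntu's default GDM3 greeter and that the auto-login setting itself was the culprit; it is not necessarily the exact issue or the exact steps.)

```shell
# From a text console (Ctrl+Alt+F3), disable auto-login in GDM3's config.
# On Ubuntu the setting lives under [daemon] in /etc/gdm3/custom.conf.
sudo sed -i 's/^AutomaticLoginEnable *= *[Tt]rue/AutomaticLoginEnable=false/' /etc/gdm3/custom.conf

# Restart the display manager to get the login screen back.
sudo systemctl restart gdm3
```

Trivial for a dev who knows to reach for a TTY; invisible and unrecoverable for someone who doesn't.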
Prior to that, I had an issue with Bluetooth drivers that, IIRC, required hunting down a custom driver or some C source file online. I don't really remember the specifics, but it was another case of "I'm a dev and this isn't really difficult, but it's more than zero effort, and my mom couldn't do it."
Prior to that, I installed Ubuntu on my laptop for college and the display drivers were an absolute mess. The screen brightness flickered from 10% to 80% over and over, regardless of what keys I pressed or what the brightness was set to. I never found a fix and ultimately reformatted and went back to Windows.
> I mean linux is the backbone of the internet.
Linux containers are great; I use them all the time. But Linux as a desktop environment, where I use an array of UI applications to develop software, make and review video files, make games, etc., I have never once had a good experience with. On the most recent try, alongside the auto-login issue, I was also totally unable to get Unity to compile and run my company's game. It was yet another thing I probably could have fixed, but the value-add versus the effort of constantly having to manually fix each individual piece of software I intended to use just didn't seem worth it to me. And what's probably worse (to me) is that the general response I found online was "those issues aren't that big of a deal," which totally misses my point. Death by a thousand paper cuts is a problem, even if each individual paper cut isn't that bad.
Again, I'm happy to be told that my experience isn't representative and that I just have terrible luck, but if we're asking what _I_ think, that experience is what makes me think it's not yet ready for legitimate non-developer use.
"Death by a thousand paper cuts" drew a visceral response from me; it describes my experience perfectly.
I'm a developer and power user who wants to do various things beyond just browsing.
Things regularly required troubleshooting and fiddling, and for one thing that's fine, but after the 5th serious, time-consuming issue I get cross, and around the 10th I scrap the migration attempt and go back to Windows. I've done this every three or four years for the last couple of decades.
I'm due for another go around 2022, and fingers crossed it will work then, but I doubt it!
Linux application servers tend to be worth it, though, and part of that is that the use cases are usually much more limited and on the well-worn path.
As a developer, really? What kinds of issues, for example? I feel like Linux is the default platform for software development, and Windows is a third-class experience by comparison.