
Honest question: What's the deal with this stuff? I'm not interested in the reasoning that it's done just because they can do it. I mean... Why not just ... use Linux? I suspect it's a comfort zone thing, and most people really don't want to move out of their comfort zone (and programmers especially have a hard time admitting this). I understand people developing for the platform, but I don't understand people developing for Linux on Windows. I hear some people talk about Visual Studio being a great IDE, but I know tons of people who don't use VS. They use ST3 on Windows to develop for Linux, and they're happy about this stuff, and I just don't understand why they wouldn't just migrate to Linux in the first place.



For me, the Adobe suite is one thing Windows has over Linux. That and easier hardware support. I'm currently running Ubuntu on my desktop and about 1 in every 3 times I suspend the machine, it needs to do a full reboot. Also every time I do a software update, I need to reconfigure my video card.

Currently, I have to switch over to my MacBook Pro to do any work that requires Adobe software, and then back to my Ubuntu machine for development. So if this Windows 10 thing works out, I might consider switching over (although the last time I used Windows I wasn't a fan of juggling three different shells: PowerShell, cmd.exe, and Cygwin--four if you count Git Bash).

Otherwise, I might do what I should've done in the first place and ultimately get a new Mac that supports 4k properly--the one I have is from the generation right before 60Hz 4k support, and working at 30Hz is surprisingly annoying.


> What's the deal with this stuff? I'm not interested in the reasoning that it's done just because they can do it. I mean... Why not just ... use Linux?

For a desktop/laptop, particularly in corporate/enterprise environments (but also often for solo devs where it's also a personal PC), there are lots of reasons you might want Windows outside of the actual dev-specific parts of your work. This is an alternative to second-computer / dual-boot / VM-based solutions for having your Windows and Linuxing it too.


I absolutely need full Excel & Word compatibility for collaborating on documents with others, so Linux is not an option. Neither LibreOffice nor MS Office Online provides adequate compatibility: editing a complex document made by someone else and sending it back to them will not preserve its formatting.

This means either MacOS with the associated hardware lock-in or Windows with all the associated problems in installing/compiling dev/research tools that just work on Linux or Mac.


> MacOS with the associated hardware lock-in

Macs will run both Windows and (with some EFI hacking) Linux. It's true that you need a Mac to (legally) run macOS, but that's lock-out, not lock-in. Essentially it is a $1000 license to the OS and perpetual updates.


I meant that if my standard workflows and software are on MacOS, then I'm locked in to a single hardware provider and their offering.

For some cases this is a limitation. Their laptops are quite nice and the iMacs are okay, but if I want, say, a powerful desktop workstation with a bunch of Nvidia GPUs for CUDA, then the Mac Pro doesn't really cut it and I'll have to run Linux on it; and if I need an extra secondary/tertiary computer/laptop that doesn't really need to be good, then I have to choose between it running a different UI than the main computers or paying a rather hefty premium because of the lock-in.



Unfortunately Windows has much better hardware support (e.g. Skylake laptops, multiple monitors via DisplayPort MST).


Does it, though? I use four monitors with Linux Mint at work while my laptop is docked. The 'Optimus' dual-GPU thing works out of the box. But I have a Windows 10 gaming rig at home for Star Citizen, and when I installed Windows, it didn't recognize the onboard NIC or the AMD 9790 GPU with default drivers. And as it just so happens, late last night, this same Windows 10 box mysteriously lost network functionality, and I had to reinstall network protocol drivers (!). I hear "hardware support" a lot, but I haven't had problems with hardware support on Linux for a few years at least, and every time I try to use Windows, I have hardware problems. The Windows users on our dev team seem to have problems regularly. You're right that it used to be a thing. But I just don't buy it anymore.


> Does it, though?

Unless you live in a very special bubble, yes.


My desktop, my brother's laptop, and a printer work well with Linux, perhaps because they aren't high-end.

On the other hand, my mother's friend had a mysterious driver problem (screen glitching) with his Nvidia card after upgrading to Windows 10.


>> perhaps because they aren't high-end

Or because they are not new. I had a bad experience after I bought, in January 2016, an Intel NUC with a Skylake processor and Iris 530 graphics.

I had a few months of struggle, with issues like:

- problems with installation (the installer wouldn't boot without some cryptic kernel parameters -- an illustrative example is sketched below)

- lack of a graphics driver

- random crashes (like Google Maps causing the whole system to hang, requiring a hard reset)

- the processor not running at full speed

- the system seeing only one logical core instead of 4 (2 cores x HT)

- the "shutdown" system button causing a reboot instead of powering off

Most of those were fixed only after Ubuntu 16.04 came out at the end of April. Some issues, however, persist.
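To give a flavour of what "cryptic kernel parameters" means here -- illustrative only, since the exact flags depend on the machine and kernel version -- the workarounds commonly passed around for early Skylake graphics looked roughly like this, set in /etc/default/grub:

    # illustrative sketch, not a recipe: flags vary by machine and kernel
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset i915.preliminary_hw_support=1"

followed by "sudo update-grub" and a reboot.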

So, my impression is that Linux is a good choice only if your hardware is quite old (like, say, two years, or at least one processor / graphics card generation behind).

For people like me, who want the latest and greatest hardware, Linux is not an option.


Except when it forgets that you have said hardware installed.


> I mean... Why not just ... use Linux?

Linux as a subsystem of Windows gives me a much better hardware-compatibility and software story than the other way around.


For certain things, I agree. For me, it is easier to dual-boot with Win 8.1 as my primary. I tried running pure Linux on my laptop, but that cut me off from distributing any game I might make for Windows.

I have struggled with MSYS2 several times on Windows, and have broken the installation. Once it hits critical mass, some dependency or setting ruins my whole experience.

But then again, I've been on the wrong side of these discussions before: I was running Minix on my Amiga 500, and I was rooting for Minix over Linux back in the day. I also ran MkLinux on my PowerPC Mac, and was working on my own OS as a variant of Minix.

If you are a web dev and don't do .NET F#/C#, then sure, Linux all the way. However, I am curious to see how .NET Core plays out.


If that's what you have to tell yourself, I guess.


Windows-only tools, corporate policies that mandate Windows. Tons of reasons.


Sometimes you're targeting Linux, but your main target is Windows. Other times, you're just more comfortable primarily using software that's available on Windows and only need a few tools from Linux every once in a while. MS has always provided collections of very powerful developer tools, and this seems like something to add to that collection.


Consider people who are developing cross-platform apps that target Windows _and_ Linux. With this thing, you can develop on Windows, immediately build for both (and so e.g. verify that you're not using any VC++-isms in your code), and test both right away. In fact, VS can even debug via gdb these days.
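A rough sketch of what that round trip can look like (assuming a hypothetical single-file main.cpp, with MSVC installed on the Windows side and g++ installed inside the bash environment):

    REM build and run with MSVC, then with g++ under the Linux subsystem
    cl /EHsc main.cpp /Fe:app.exe && app.exe
    bash -c "g++ -std=c++11 main.cpp -o app && ./app"

If it compiles and behaves the same both ways, you've caught most of the compiler-specific surprises before the code ever hits a real Linux box.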


If it's at home, maybe for games? It sounds like if you're going from a Mac to Windows 10, you get a better Unix (closer to Linux) and more games. I might actually take them up on that for a home desktop box, if I can figure out what hardware to buy.


> people talk about Visual Studio being a great IDE

I think this is just something people regurgitate often and believe without evidence. I've used the IDE and it is terrible--probably one of the worst. Last I checked it couldn't open multiple windows for certain file types. Its interface looks like a web IDE. It hangs while it scans your project to provide IntelliSense, and IntelliSense itself is crazy annoying: who wants stuff popping up while you type, or stuff getting underlined before you've even finished? I actually have a big list of annoyances somewhere around.

I really can't agree more--just use Linux, or some variant of UNIX, or use macOS if you don't want to fuss with buggy/missing drivers and piecing together your own computer from parts as if it were some kind of difficult feat or accomplishment worthy of nerd respect.


It used to be good. Then WPF happened.


Yes, I also think VC6 was the best IDE. But the WPF-based shell arrived with VS2010, so most people who describe VS as the best IDE ever are actually referring to a WPF version.



