> "I only use Windows as a game launcher (and I’m still on Windows 10), I don’t understand why anyone would need it anymore except games."
Thanks to Valve/Steam and Heroic (an Epic Games Store client), plus a few other handy tools including Valve's Proton (a WINE fork), I haven't needed Windows for games in ages. I can live without the small handful of Linux-hostile, Windows-only games that still won't run due to draconian DRM/anti-cheat; the vast majority of my Steam library, both Windows games and the many native Linux games I have, is "click 'Play' and it just works". Same for more than half of the games I have on Epic (all from their freebie giveaways). For everything else I want to do on my PC, I've found native Linux software that more than handles the task.
Would love to know more, maybe some pointers to tools/utils for getting these more difficult games going. I haven’t had the time to go further than using Steam and Lutris, which covers most of what I play, but would be good to have more options.
#1: Apple locks you into their ecosystem, and feels even creepier than MS.
#2: The Linux distros I've used are designed so that you shouldn't have to use the CLI and sudo for normal tasks, but in practice you end up having to anyway. It's annoying, and sooner or later I end up with a system that's "totaled": easier to do a clean install than to get it working again after the wrong copy-pasted CLI command broke something important.
e.g. I used to install QuickLook plugins on every new macOS device as our company bought them. Then one day Apple refused to run any 32-bit library, despite the fact that 32-bit code still runs natively on x86-64 chips (and this was when everything still ran on x86-64, btw).
Designers still ask me: "Hey, how do I see the dimensions of this picture when pressing spacebar to preview it?" And I have to say: "Sorry, Apple said you can't do that anymore".
> Because maintaining 32 bit libraries when most of the world is 64 bit doesn’t make sense.
64-bit code is the extension, not the base case.
32-bit (x86) code just straight up runs on an x86-64 (64-bit) processor; the CPU executes it natively in compatibility mode.
Dropping it is an artificial limitation that just doesn't make sense, and forcing everything to 64-bit can even be worse for performance.
> This is not the same as apple actively forcing you into their ecosystem.
No?
Apple prevents 32-bit libraries from running. That stopped almost all the common cross-platform compatibility layers from working on macOS.
In a single action, Apple broke the libraries that Steam games on macOS relied on. Games now had to be recompiled, and developers were pushed toward distributing through the Mac App Store, giving Apple its 30% cut.
> This is apple not supporting old technology.
If it's old then Apple shouldn't support it? By that logic, we shouldn't be using ints in our code; we should only use 64-bit quaternions.
There is a difference between old and outdated. 32-bit code is old. 32-bit code is not outdated.
I didn't read past the first sentence, because you clearly don't know that an OS must ship both 32-bit and 64-bit libraries to run both kinds of binaries. Apple chose to leave 32-bit behind so it could stop maintaining those libraries.
In every company I've worked for (Germany, Poland, Sweden, Switzerland), it was all Windows laptops; for the Linux component, people used either a VM, WSL, or straight-up remote servers.
I just don't want to work under such incapable management / IT department, that forces developers to use Windows. I have worked with Linux for 17 years and the last time I was forced to have a Windows VM was 14 years ago.
Yes, it limits available jobs and probably doesn't pay any top salaries. But better than selling myself to people I don't respect.
Although I probably have my price, I too otherwise refuse to work for companies that would force me to use Windows. I've been lucky enough to use Macs since 2008, and some of those companies had Linux users. Most top tech companies will happily give you a choice between a Mac and a Windows machine. For me it has generally meant I don't work for banks, the government, or very large organizations that aren't recent tech companies.
That being said, at a certain scale or type of company, centralized management, software support, and security risks mean "allowing" random people to run their own OS becomes difficult and risky. Lots of large older companies probably have proprietary software, too (thankfully more of it is becoming web-based).
Yes, in theory Linux is more secure, and anybody who would want to use it is probably capable of taking care of themselves, but at large companies it's probably corporate lawyers and CISOs ruining the fun. And Linux can have its own risks, and dedicating a team to supporting it (from a security perspective) isn't economical for what would be a very small user base. Ye olde big bank can't, and arguably shouldn't, allow it without an otherwise good reason.
The good reason is getting your job done, which is what the computer is for. The tools many developers use today (e.g. containers) are Linux-only, and their target environment (i.e. the servers where their code will run) is Linux. The typical workaround is a VM. This results in terrible filesystem performance on both Mac and Windows, and on Windows I've also had to deal with things like time, DNS, and routing breaking inside the VM. So now I need to know how to be both a Linux and a Windows sysadmin (i.e. not what you want to be paying me for) just to fix it, instead of using e.g. Fedora, which I've never had any issue with; it just works.
Another user commented about how it's unreasonable to ask them to run CAD software in a VM, but this is exactly what (some? many?) companies ask programmers to do. It's especially goofy if it's a software company where that development is their core business.
I admire this spirit, probably because I am still unable to put it into practice -- those are typically the moments when I dream of running my own business. IT at my current employer is capable enough (though there are rumblings of moving everything but hardware offshore again; I'm starting to think it all moves in waves), but as for our executives, well, let's just say that over the past few weeks I've been questioning what I'm doing here.
Figma's valuation is a fraction of Adobe's. Saying most designers use Figma is as distorted as saying most designers use macOS, even if both things may be true in a specific niche (e.g. tech companies in the US).
In my experience of the rise of macOS among developers, the biggest driving force was mobile app development and mobile web development because Apple refused to provide first party developer tools or emulators usable on other operating systems and made it intentionally difficult to run macOS on non-Apple hardware or in a VM. Previously macOS was largely associated with designers, not developers. This move also started blurring the lines more with designers making the move to development and (to a smaller degree) vice versa.
No, developers flocked to macOS because it runs Unix and has a complete, nice-looking UI with sensible design patterns, designed by actual designers instead of engineers.
Development across the board is better on a mac for some / many people.
I think that's a fair take, it really does depend on which platform(s) you're developing for too.
And you're probably correct that Adobe has a larger share overall, but I would highly doubt that Illustrator or XD are being used more than Figma for UI design these days. Even less so, Photoshop, since it's a raster tool. Adobe was about ready to phase out XD when they were getting ready to purchase Figma.
Figma is free starting off, and their basic plans are extremely affordable compared to anything Adobe offers.
I've used a Surface Pro 7 with Linux. It works as a PC, but it's not the move. Notes:
- Needed a custom kernel to get the pen (or was it touch?) working.
- No good note-taking applications (e.g. a OneNote competitor).
- Noticeable latency on the pen.
- The first distro I attempted to install didn't work (Manjaro?).
I've been using a Linux desktop workstation for like a decade. Never got on the laptop revolution, don't really understand it, seems like you're just setting yourself up for some savage RSI.
Apple Silicon means no discrete graphics card, no CUDA, and no DirectX. It seems fine for machine learning, but it falls short for other intensive applications.
Apple Silicon has the power-envelope advantage; Intel and AMD chips remain faster for those who can deal with the power consumption (underclock when longer battery life is really needed, and either stay plugged in or carry a backup battery when going full throttle).
I use Blender for everything I (as a total non-professional novice) need 3D-graphics-wise, but as I understand it, there are a few CAD options available on Linux, and several more that run fine in WINE, for those who need more professional or proprietary options.
On Linux there's FreeCAD, SolveSpace, Dune3d, and OpenSCAD, in (subjective) descending order of capability. FreeCAD still suffers from the "topological naming problem", though there's a fork that mitigates it, and merging that work upstream is an ongoing process.
I purchased Alibre Atom3D because it was the only affordable non-subscription package I could find, but it's quite disappointing, and it won't run in WINE anyway due to the licensing rootkit it uses. So I end up using one of the above Linux-capable tools, depending on the needs of the model. Each has its strengths and weaknesses, and none is satisfying overall. I reach for OnShape for any design that is going to be open source, because it is a pleasure to use (even though it is web-based), but I don't make enough money from CAD to justify a $1,500 annual subscription.
FreeCAD is OK; I use it for simple 3D-printing designs, but you have to be very careful about how you build the model, as it likes to ruin dependencies if you change something in an earlier sketch.
The 1.0 dev builds seem more user-friendly.
Like some other good-bones free software, the UI is terrible. I challenge anyone who has prior CAD experience, but no FreeCAD experience, to install the software and design a simple bracket in under an hour.
I tried Fusion under Wine and it didn't really work. I haven't tried NX or SolidWorks in the last few years.
I know that a few years back, before WSL was even a thing, someone at Microsoft "leaked" that some departments were using macOS almost exclusively. I would be surprised if the decision-makers who force all that crap into Edge actually dogfood the product. WSL, Azure, and Windows Terminal are a different story, though, which might explain why they hold up and aren't overloaded with bullshit.