Home truths about macOS (eclecticlight.co)
190 points by ingve on Nov 6, 2022 | 174 comments


> What other actively exploited bugs are Big Sur and Monterey now vulnerable to?

Actually, Apple sometimes patches actively exploited vulnerabilities in "unsupported" older versions of both macOS and iOS. The vulnerabilities that don't get quickly patched are the ones that aren't actively exploited.

In any case, security researchers talk about known vulnerabilities that have or haven't been patched, but as the author says, "New versions of macOS are full of bugs". This includes security vulnerabilities! Every new major OS version has brand new, unknown (to Apple) vulnerabilities that did not exist in the previous version. So just because the latest version is "fully patched", that doesn't mean it's "safer". I've seen it happen many times in the past where a new security vulnerability in a new OS isn't discovered by the public until many months later.

As far as stability and safety is concerned, I personally consider macOS N-1 to have the best tradeoffs.

Moreover, the most dangerous software on your computer is not the operating system per se, it's the web browser. The browser is where you're most likely to encounter "maliciously crafted code" (i.e., JavaScript). Apple supports Safari for macOS N-2, and Safari on macOS N-2 is more similar to Safari on macOS N than macOS N-2 itself is to macOS N. In other words, Safari is likely to be fully patched even if macOS isn't.
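(If you want to sanity-check that on your own machine, you can compare the two versions from the Terminal; a rough sketch, and the output format may vary by release:)

    # macOS version
    sw_vers -productVersion

    # Safari version, read from the app bundle's Info.plist
    defaults read /Applications/Safari.app/Contents/Info.plist CFBundleShortVersionString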

You can also use another browser such as Firefox or Google Chrome, which ironically support older macOS versions than Apple supports, and the latest versions of those apps are identical and fully patched on every macOS version they support.


> "New versions of macOS are full of bugs"

This is why I never liked Apple's annual upgrade schedule. Ostensibly it keeps users ahead of security issues, but the annual cadence isn't working and is causing more issues than it is worth. Leopard to Mountain Lion were the glory days of OS X stability, with Snow Leopard, Mountain Lion, and High Sierra standing out as focusing on bug fixes rather than introducing new features (and new bugs).

I wish Apple had development and stable branches for macOS, such that stable was plain and had very limited features, stayed compatible with decade-old hw, and continued to receive security updates, and upgrading to development was effectively a large patch for stable that included all the new features.

> Moreover, the most dangerous software on your computer is not the operating system per se, it's the web browser.

Agreed.

> You can also use another browser such as Firefox or Google Chrome, which ironically support older macOS versions than Apple supports, and the latest versions of those apps are identical and fully patched on every macOS version they support.

If one must have browser connectivity on a Mac, then do this, even while the OS is still supported and fully patched. I wouldn't do mail or web surfing on any unsupported OS.


> Leopard to Mountain Lion were the glory days of OS X stability, with Snow Leopard, Mountain Lion, and High Sierra standing out as focusing on bug fixes rather than introducing new features (and new bugs).

This is a common misconception. Major OS updates always introduce more bugs than they fix. Bertrand Serlet put up a "No New Features" keynote slide for Snow Leopard, but that was highly misleading, because Snow Leopard had many big under the hood changes, and the early versions of Snow Leopard had some nasty bugs. You can go back and look at the old Mac OS X release notes to see the descriptions of the bug fixes. What was different about Snow Leopard than today is that it received a full 2 years of nothing but bug fixes. The Snow Leopard people remember fondly was 10.6.8 v1.1 not 10.6.0, which was much buggier than its predecessor 10.5.8.

The current yearly major OS update schedule is death for quality, because there's never enough time to fix the bugs. The engineers always have to turn around and start working on the next big thing, never finish fixing the current big thing. There's no solution except more time between major updates.

The other difference between now and the past is that now Apple is immediately pushing everyone to update, whereas new major Mac OS X versions used to be paid rather than free, so there was a lot more lag, you had to go to the store and purchase new discs in order to upgrade, and a lot more users were sticking with the older versions.


> This is a common misconception.

This is vague.

> Major OS updates always introduce more bugs than they fix.

This was my major point.

> Bertrand Serlet put up a "No New Features" keynote slide for Snow Leopard, but that was highly misleading, because Snow Leopard had many big under the hood changes,

But no new features. The major goals of Snow Leopard were to improve performance and reduce its memory footprint (by removing a lot of PPC code). What was nice about it, and what my point was, is that Apple had 2 years for bug fixes.

> and the early versions of Snow Leopard had some nasty bugs. You can go back and look at the old Mac OS X release notes to see the descriptions of the bug fixes. What was different about Snow Leopard than today is that it received a full 2 years of nothing but bug fixes.

Yes, thank you, this is the entire point of my criticism of annual upgrades in that a year is not enough time to fix bugs before introducing instability and new bugs in a major release.

> The Snow Leopard people remember fondly was 10.6.8 v1.1 not 10.6.0, which was much buggier than its predecessor 10.5.8.

Thank you for the straw man, but, in fact, this was my entire argument against major annual releases. I'm not sure who expects point zero software to be bug free. I would actually prefer that major releases stop entirely unless there are fundamental changes, such as a new platform.

> The current yearly major OS update schedule is death for quality, because there's never enough time to fix the bugs. The engineers always have to turn around and start working on the next big thing, never finish fixing the current big thing. There's no solution except more time between major updates.

>< This is like Scooby Doo. Thank you for making me into Velma, Fred.

> The other difference between now and the past is that now Apple is immediately pushing everyone to update, whereas new major Mac OS X versions used to be paid rather than free, so there was a lot more lag, you had to go to the store and purchase new discs in order to upgrade, and a lot more users were sticking with the older versions.

This is a fair point, except for the part about Apple pushing upgrades. Unrestrained users push themselves, with no regard for whether what they're using will continue to work. Though software updates can be configured by the user to install automatically, and security updates are pushed, Apple isn't pushing upgrades, and I don't think the OS can be configured to upgrade automatically, only update.


> > Major OS updates always introduce more bugs than they fix.

> This was my major point.

I don't think your words said what you thought you might have meant: "with Snow Leopard, Mountain Lion, and High Sierra standing out as focusing on bug fixes rather than introducing new features (and new bugs)"

Major updates never focus on bug fixes rather than introducing new features (and new bugs). Otherwise they wouldn't need major updates at all, they'd just keep releasing minor bug fix updates. There's nothing really special about 10.6.0 (Snow Leopard), 10.8.0 (Mountain Lion), or 10.13 (High Sierra) compared to other major releases. These were all much buggier than the previous stable release.

> Yes, thank you, this is the entire point of my criticism of annual upgrades in that a year is not enough time to fix bugs before introducing instability and new bugs in a major release.

Except the annual release cycle actually started with Mountain Lion, which was 1 year after Lion, and High Sierra was also 1 year after Sierra, but you put them in "the glory days of OS X stability".

> Apple isn't pushing upgrades, and I don't think the OS can be configured to upgrade automatically, only update.

I wasn't talking about automatic upgrades, I was talking about how macOS annoyingly advertises new major updates, for example in Software Update, and even permanently badging your System Preferences Dock icon until you upgrade. This didn't occur in the days when major updates were on physical discs.


> I don't think your words said what you thought you might have meant

> Except the annual release cycle actually started with Mountain Lion, which was 1 year after Lion, and High Sierra was also 1 year after Sierra, but you put them in "the glory days of OS X stability".

They mean what I said, but I'm certain now that you read it differently, so let's see if this helps:

>>>> Leopard to Mountain Lion were the glory days of OS X stability,

COMMA (NEXT STATEMENT AND NEW THOUGHT)-->

>>>> with Snow Leopard, Mountain Lion, and High Sierra standing out as focusing on bug fixes rather than introducing new features (and new bugs).

> I was talking about how macOS annoyingly advertises new major updates, for example in Software Update, and even permanently badging your System Preferences Dock icon until you upgrade.

I don't think your words said what you meant, and/or I made a poor assumption based on your use of "pushing" in the context of "updates" or upgrades. Apple did start to push out security patches for supported OS versions, but you meant notices, or nags, about available OS upgrades. These can be avoided by disabling push notifications via the GUI[1] or, more permanently, by using the Terminal and launchd to disable push notifications:

     sudo launchctl unload -w /System/Library/LaunchAgents/com.apple.notificationcenterui.plist
or by completely disabling Software Update

      sudo launchctl unload -wF /System/Library/LaunchDaemons/com.apple.softwareupdate*
or less permanently disable automatic software updates with the defaults command (duplicating the gui method)

     sudo defaults write /Library/Preferences/com.apple.SoftwareUpdate AutomaticDownload -boolean FALSE
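(To check that the change took, or to undo it later, something like the following should work; it just reads back or rewrites the same preference key:)

    defaults read /Library/Preferences/com.apple.SoftwareUpdate AutomaticDownload
    sudo defaults write /Library/Preferences/com.apple.SoftwareUpdate AutomaticDownload -boolean TRUE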

Outdated, but easily modernized for current macOS, and still my favorite list of things to disable.[2] Here is a nice source of defaults commands,[3] though the script is personalized for the author; I've only copied out what I wanted. Here's a similar script to pick through (blindly running scripts is at your own extreme risk).[4]

[1] https://www.businessinsider.com/guides/tech/how-to-turn-off-...

[2] http://tech.masterofsql.com/os-x/unload-disable-unwanted-age...

[3] https://gist.github.com/mvanbaak/e98b7b622ea2c8ab626d51cb88e...

[4] https://github.com/mathiasbynens/dotfiles/blob/main/.macos


> But no new features. The major goals of Snow Leopard were to improve performance and reduce its memory footprint (by removing a lot of PPC code). What was nice about it, and what my point was, is that Apple had 2 years for bug fixes.

But there were major features; they just weren't user facing. Snow Leopard introduced Grand Central Dispatch and improvements to Objective-C, and things like that are major changes, more significant than simply changing the way the UI looks.

High Sierra introduced APFS, a major, major change. Sierra was probably a better release as far as having no new features; they fixed the USB stack they broke in El Capitan.

Mountain Lion introduced iCloud related stuff like notifications I think.

The releases that you believe are stable because of no new features actually have major new features, but the guys writing the low-level code are more talented than the guys writing the UI code; that's why those releases are more stable.


> I wish Apple had development and stable branches for macOS, such that stable was plain and had very limited features, stayed compatible with decade-old hw

The reason they don't have that is the same reason we don't have this for phones. No profit-driven venture on the planet would go through that trouble just to hurt their own sales.

Which is why free (as in speech) alternatives are so important.


It wouldn't hurt their sales. Users running old hardware aren't doing it to save money but instead because they can't afford new hardware, which isn't the same thing. Apple isn't getting those sales anyway. Mac sales are driven by new users and users that can afford to upgrade hardware every two years. These consumers aren't going anywhere, and if Apple remained committed to a minimal amount of support for older hardware, it would only make their brand more attractive and increase sales.


>It wouldn't hurt their sales. Users running old hardware aren't doing it to save money but instead because they can't afford new hardware, which isn't the same thing.

Many of those users that "can't afford new hardware" are still forced to update after some point due to the lack of updates ("cannot afford" and "absolutely cannot afford" being two different things, and most are in the former category, they can e.g. put it on their credit card, or save some more, or sacrifice something else they'd be spending on, even if they're tight).

Plus, there are many more who can afford it but don't really care much for having a new machine; they just do it because the old ones don't run their software or get updates anymore.

Both of those categories would update less frequently if they had regular updates and support for their old machines.


Especially as they're looking to expand profits from services. Including older hardware will only increase audience capture and the number of people using and paying for various offerings from Apple.


Maybe I’m getting old but the UX changes in newer versions of macOS turn me off. I’m still running Catalina but that has to change as it’s EOL this year.

I use third party software to bring back some old behaviors but it’s an uphill battle, especially with things like the notification center.


Apple has added no UI changes since Leopard that I prefer or even care about. I'd be extremely happy if they froze UI changes / features and focused entirely on stability, compatibility, and security instead.


All bugs aren't created equal. Attackers are vastly more likely to consume existing malware than create it, and malware creators have an easier time writing code to attack known issues that have already been fixed than finding unknown ones. Today's fix is a blueprint for attacking people running yesterday's software.

Ergo, you are more likely to be attacked via exploits that are both known and fixed, because you are running old unpatched software, than via new bugs in recently released software.

In addition, recently released browsers are great, but your browser also relies on security features from your OS. I don't agree that it's sufficient to run the most recently released browser on your old OS. Your up-to-date OS, complete with new bugs, is by far the best choice.


My iPhone 6 got an iOS update last month. I was pleasantly surprised they still updated an old iOS.


Do you have any information about how old versions of Mac OS receive security updates? As far as I'm aware, I don't really see any updates come through in the regular update section of the settings. It might've happened if there was some ridiculous vulnerability... But I otherwise don't think I've seen any general security updates show up (even after newsworthy events).
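(One way to answer this empirically is to check the update history from the Terminal; a rough sketch, and the exact output varies by macOS version:)

    # Lists updates Software Update has already installed, with dates and versions
    softwareupdate --history

    # Lists anything currently being offered to this machine
    softwareupdate --list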

Also, I keep asking this and keep getting zero or bad answers. When people say web browsers, and specifically JavaScript, are to blame for almost all security issues, what specifically are you referring to?

In almost every case I've looked into, what happens is somebody downloads some sort of file that is actually an .exe. Like TaylorSwiftGreatestHits.mp3.exe for example. It's almost always on a very old Windows version that has literally never done any updates.

Sometimes some JavaScript initiates the download of this file. When this then makes the news and is finally reported, the entire fault is explained as a "JavaScript" or "Internet Browser" fully automated bug that requires zero user input.

In reality, the user downloads the .exe and is prompted by Windows security to not run this file. They say run it anyway. Windows says the file was still not run for some security reasons. So the user disables the entire security system. Then the file says it must be run with administrator rights. They then run it once again with admin rights. Finally, X bug happens or is initiated.

This is constantly the ONLY actual event I come across where there is some crazy vulnerability. I have literally never seen something that's not this situation.

Back to my original question, if this is NOT actually the case, and there is verifiable information from any reasonably competent computer person, let me know so I can finally learn more.


Though, as other researchers have noted, they tend to give partial fixes for chains on older versions. That is, they might backport a fix for the zero click bug but they may not kill other less pressing issues like privilege escalation/sandbox escapes. I've seen this with a few of my own chains too.

Personally, my security bet has just been to stay at the absolute bleeding edge and install betas on day one. You'll get security patches first and, for the most part, many exploits are brittle enough that they tend not to work perfectly on newer, unseen versions of the OS and so you do have a slight advantage there.


> As far as stability and safety is concerned, I personally consider macOS N-1 to have the best tradeoffs.

This is what I settled on years ago. The only exception was using Monterey on the new Apple Silicon macs. I upgraded to that earlier than normal.


I also like the N-1 strategy for things that are released often. I also like going with the most recent release for things that are released every 3+ years though


This was my personal favourite statement in the article.

"Deciding whether and when to upgrade macOS is one of the more difficult choices we face. If your Mac isn’t capable of running the current release of macOS, or you’re dependent on key hardware or software which is incompatible, then the decision is made for you."

If you head over to the hackintosh side of things you will notice that very rarely does a hackintosh not upgrade to the next version.

I don't want to get "conspiracy theory" here but I do wonder how often the next release is blocked by Apple, not by hardware?

There are hacks to allow the new OS to run on real Apple hardware despite being blocked and people who have done this seem to be running the new OS just fine?

Releasing an OS with serious bugs, then limiting someone from upgrading to a newer one which fixes those issues is just bad.


I have a MacBook Pro 2015 and the latest OS it supports officially is Monterey. The new Ventura (2022) is not supported on it. I consider my device to have had software support for at least seven years so far, continuing until they stop releasing security fixes for Monterey. I believe this to be sufficient for my needs and the frequency with which I upgrade the laptop.

P.S. I don’t think that it’s ok for Apple to stop supporting older devices. I am just pointing out my personal view concerning my needs.


I wish Apple would do 'best effort' support on older Macs. Meaning, it'd let you install after you click through two pages of warnings, and it might work, or it might not.


Apple ships all drivers with the OS. So, if Apple drops hardware support, it removes drivers and other support for the unsupported hardware. For example, for Ventura they dropped support for Nvidia GPUs, Metal is now built to require AVX2, and much more. "Best effort" support is not trivial in such cases. Yes, I know about OCLP and use it myself, but it's a hack.
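(Aside: on an Intel Mac you can check whether the CPU advertises AVX2 with a sysctl query; a rough sketch, and the key is the usual Intel one, which won't exist on Apple silicon:)

    # Prints the leaf-7 CPU feature flags; AVX2 appears in the list if supported
    sysctl machdep.cpu.leaf7_features | grep -o AVX2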


This is a much more sensible approach vs. the "installer lock" Apple took.


But it isn’t the way Apple does things. They obviously don’t want Apple computers in the wild running a version of an Apple OS that might or might not work.


Yes... artificially limiting them from upgrading, by checking the SMBIOS in the installer and preventing an upgrade, is MUCH BETTER for the users.
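(For reference, the model identifier this kind of check keys off can be read like so; a rough illustration only, since the installer's actual compatibility logic is more involved:)

    # Model identifier, e.g. MacBookPro11,1
    sysctl hw.model

    # Same information via System Profiler
    system_profiler SPHardwareDataType | grep "Model Identifier"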

And as posted here several times, what about those who use OpenCore Legacy Patcher and upgrade despite the block? What "doesn't work" for them?

So to your point, Apple prefers machines in the wild running an old and out-of-date OS because...?


My 2015 MBP runs Mojave (~2020) and will until it dies. Catalina was just too much for me.

Besides, OS X looks much better than the newer iOS-based design, imho.


I tell people that I haven't looked forward to new features in the 'California' era like I did in the 'Cat' era. Now, I worry about what Apple will break or take away.

If Microsoft wasn't so intent on spying and advertising to me, I might consider them. I get the feeling I will switch to a BSD eventually.


I’ve been leaning the same way. I’m curious why you also lean towards BSD instead of Linux?

It’s a gut feeling for me, after watching the Linux ecosystem churn a lot and seeing BSDs with well-documented core systems and the “core/ports” distinction making clear what is and isn’t really supported.


I prefer BSD to Linux for a few reasons. Firstly, BSD is not encumbered by GPL. Also, any BSD is a complete operating system while Linux requires distros subject to the whims of their creators, which seem to focus on form over function. Any long term stability is almost accidental, and there is no standard edition of Linux. Lastly and I think most importantly, the centralized BSD ports system is more trustworthy than an untracked number of repositories and sources. Actually, these complaints about Linux aren't about Linux, they're mostly about GNU. Linux is just a kernel, and it's fine, really. GNU is the problem.


> Firstly, BSD is not encumbered by GPL. Also, any BSD is a complete operating system while Linux requires distros subject to the whims of their creators, which seem to focus on form over function

This "BSD is a complete operating system" that you speak of... Do you mean FreeBSD, OpenBSD, NetBSD, DragonFly BSD, Darwin?

Thankfully BSD is not subject to the whims of any distro?


Each of {Free,Open,Net,DragonFly}BSD is its own operating system, complete with its own source tree, system call interface, and ports tree. This is not the same as the distribution model Linux has - the BSDs have diverged substantially from one another.


Has anything workable been made from Darwin in the past decade or has Apple mostly left that community high and dry?


Agreed. Also all BSDs have different characteristics.

That Linux itself can't move to GPLv3 shows that it is not an innovation-supporting license.


Coming from macOS, there’s a certain cohesiveness that one comes to expect that is often absent with Linux, with conventions being mere suggestions and things generally being all over the place. The BSDs aren’t perfect but certainly much better in this regard.

It’s unrealistic but a Linux distro that forked as many packages as possible to make them follow conventions and be proper desktop citizens would be interesting.


I'm much more familiar with OpenBSD and FreeBSD on the server side, and I don't feel real comfortable with Linux. The two biggest vendors, Red Hat and Ubuntu, don't really give me the warm fuzzies. Maybe something like what System76 backs.


What about plain Debian? :) If you’re comfortable with BSD then Debian should be a cakewalk for you.


The funny thing is I tried to set up an Arch Linux box. I can set up OpenBSD without thinking, with all the partitions and such, but that Arch Linux install was just a bad time. I'll try Debian when I get back to work. We are replacing our Samba server, and the BSDs aren't really current given some changes the Samba team made.


There is always OpenDarwin/PureDarwin?

https://en.wikipedia.org/wiki/Darwin_(operating_system)


I have tried all three and am most comfortable with NetBSD.


Well, don't skip the DragonFly folks. I've had the four main ones up and running at different times.


Check out the KDE edition of Fedora. Fedora is a rock solid OS and KDE is a rock solid desktop environment. Reminds me of what I always wanted Windows to be.

You can go with their default edition (GNOME) if you want something that looks like a MacOS clone right out of the box, but the actual user experience is so different (and IMO so much worse) that I can't recommend it for anyone interested in switching.


Funnily enough I've been considering going the opposite way. I'm on Windows, Android and iPadOS right now, but this diversity of operating systems means nothing integrates well, so it's not a great experience.

Buying into the Google or Microsoft ecosystems would improve things on this front, but this seems not great w.r.t. privacy, which is why I'm thinking of going all in on Apple.


There is no excuse for the most profitable company in the world to drop support for working hardware. It's entirely within their capability to support even PowerPC Macs. And that's besides the fact that Apple has an incredibly small portfolio of hardware to support; so much so that they could trivially test every update exhaustively on every possible piece of hardware.


This is sort of my line of thinking. When you effectively have a very small product line and are incredibly profitable, it does seem a bit greedy to artificially limit updates.

It would be different if there were no way to bypass the upgrade lock, but given that OpenCore Legacy Patcher works, clearly the hardware is able to run the new OS.


Besides the impracticality of running new OS’s on a 2005 era Mac, what software engineer at Apple would be willing to take on that role? Also, even if the hardware were supported, what third party software company would still support PPC Macs?


> what software engineer at Apple would be willing to take on that role?

One being paid to do so? People still work with COBOL. If you have trouble finding engineers to do it, offer a higher salary.

> Also, even if the hardware were supported, what third party software company would still support PPC Macs?

Some software is still supported for Windows XP, and that's considerably older than the last PPC Macs. It would certainly be a whole lot easier to support if Apple still did the same.

In any case my point is that they absolutely have the resources to do this. I don't see how that's at all controversial given how much money they have.


> One being paid to do so? People still work with COBOL. If you have trouble finding engineers to do it, offer a higher salary.

Higher salary than who? Do you really think old COBOL programmers (FWIW I’m 48) are making more than the average new college grad working at any major tech company?

And besides, if you know how raises and promotions work at tech companies, you would know they are based on “impact” and “scope”. Saying “I spent all last year supporting macOS on a PowerPC 6100 from 1994” doesn’t speak well to either.

Unless you work at Google, where it’s based on creating a new messaging app that will be killed in a year.

Is the average COBOL programmer even making close to the same $170K+ that I’ve seen returning interns make at major tech companies (including the one I mentored)?

> Some software is still supported for Windows XP, and that's considerably older than the last PPC Macs.

How much mainstream software is supported by XP? Microsoft Office? Adobe software? Modern browsers? Modern games?

> In any case my point is that they absolutely have the resources to do this. I don't see how that's at all controversial given how much money they have

Yes and they also had the money to buy Twitter. But what moron would do something that stupid?


There are still Linux distros supporting 32-bit PPC Macs.


And how many people are actually getting paid to do so and how many companies are selling software for them?


I was just rereading the PPC history for Apple, as they're doing it again for ARM, and it seems quite reasonable to be honest. They pulled something off I think would be difficult for many others and have shown they're not married to a single architecture.

Still, RIP my favorite machine, the last PPC Mac tower.


> They pulled something off I think would be difficult for many others and have shown they're not married to a single architecture.

You mean like how MS Windows ran on the DEC Alpha as well as X86? Or how Linux runs on X86 and ARM CPU's?

Running an OS on multiple CPU types goes back a very long time.


> Running an OS on multiple CPU types goes back a very long time.

Yet moving the entire world onto this new architecture and getting all developers on board to start creating software for it is nearly unheard of outside of Apple.


That’s the strategy that Microsoft can’t/won’t pursue: be willing to abandon legacy vendors/customers by forcing a migration.


I’m a fan of how Apple does things for their ecosystem. I think only this strategy could have let them make the transition to ARM as smooth as it’s been.

That said, I also have deep respect for Microsoft for being so deeply committed to backwards compatibility. You can find developer stories about how they’ve gone out of their way to hard code support for legacy apps (even space pinball) in major Windows updates. That sounds really gross from a code perspective, but from a business/product perspective, it buys trust/loyalty from enterprise customers. We see the opposite issue with Google, who has established a reputation for unreliable long term support. Despite their reputation for strong engineering, their new product lines suffer from distrust. Potential customers ask themselves why pay the onboarding cost for a Google product when it’s unclear if it’ll be around in two to three years.


"Planned obsolescence" has its advantages?

This is the third time Apple has done this, so one would hope they have learned from their mistakes and made the transition seamless?

Windows machines on x86 are just fine. Amazing when you don't have a monopoly on both the OS and hardware. MS runs on Intel, AMD, ARM...

MacOS runs on whatever hardware Apple sells.

you can see how an attempt by Microsoft to move off Intel would fail, right?


> You mean like how MS Windows ran on the DEC Alpha as well as X86? Or how Linux runs on X86 and ARM CPU's?

It’s not the same thing.

A macOS hard drive could boot either an Intel or PPC Mac and all of the apps ran on both types of machines.

Apps that were “fat binaries” ran natively on both processor architectures.

Same thing today with the apps and the operating system running natively on Intel and ARM machines.


Nothing you wrote is accurate.

The apps ran on both types of machines thanks to rosetta and the performance hit this takes: https://en.wikipedia.org/wiki/Rosetta_(software)

It is doubtful that a PPC system can boot an intel MacOS HD. One is RISC the other CISC.

The reason you can "natively" run intel on ARM machines is Rosetta2

"Until recently, Mac developers only had to worry about creating Intel-based apps. However, that started to change when the first Apple silicon Macs arrived. For Intel-based apps to run on these computers, Apple introduced Rosetta 2."

Please read up before posting.


> It is doubtful that a PPC system can boot an intel MacOS HD. One is RISC the other CISC.

That's the point of fat binaries: they contain .text sections for several architectures. Even the installation DVD was a single one, for both platforms.

Some more modern example of the same concept:

    Terminal: Mach-O universal binary with 2 architectures: [x86_64:Mach-O 64-bit executable x86_64] [arm64e]
    Terminal (for architecture x86_64): Mach-O 64-bit executable x86_64
    Terminal (for architecture arm64e): Mach-O 64-bit executable arm64e
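(If you want to reproduce this yourself, here's a rough sketch with the standard toolchain, assuming the Xcode command line tools and a trivial hello.c:)

    # Ask clang to build both slices at once by passing multiple -arch flags
    clang -arch x86_64 -arch arm64 -o hello hello.c

    # Inspect the result; both tools report the architectures in the fat binary
    lipo -archs hello    # -> x86_64 arm64
    file hello           # -> Mach-O universal binary with 2 architectures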


> Nothing you wrote is accurate.

You must be new here. ;-)

> The apps ran on both types of machines thanks to rosetta and the performance hit this takes: https://en.wikipedia.org/wiki/Rosetta_(software) It is doubtful that a PPC system can boot an intel MacOS HD. One is RISC the other CISC.

I know about this because I used to do it all the time when I worked in tech support. Regardless of which Mac a client might show up with, I could boot it with the same external hard drive back in the mid-90s.

Rosetta allowed unmodified PPC apps to run on Intel Macs by translating them on the fly.

However, developers had the option of making a few changes and recompiling their apps to produce a version that would include native code for PPC and Intel machines; macOS would handle running the correct binary. These were known as "fat binaries".

Fast forward to 2020 and Apple is applying the same strategy. Rosetta2 translates Intel apps on the fly to run on ARM. And developers can recompile their apps so that they run natively on Intel and ARM machines.

Even though I'm typing this on an Intel Mac, I could boot an ARM-based Mac with its hard drive if necessary.

Here's the output from the file command on the ls command:

    /bin/ls: Mach-O universal binary with 2 architectures: [x86_64:Mach-O 64-bit executable x86_64] [arm64e:Mach-O 64-bit executable arm64e]

    /bin/ls (for architecture x86_64):      Mach-O 64-bit executable x86_64
    /bin/ls (for architecture arm64e):      Mach-O 64-bit executable arm64e
Yes, even ls is a fat binary that runs natively on Intel and ARM-based Macs.

> Please read up before posting.

Perhaps you should take your own advice?


lol maybe you should do some reading before posting. Mach-O supports multi-architecture binaries. Rosetta is solely for things that haven’t been made “Universal”

https://en.m.wikipedia.org/wiki/Universal_binary


Great... now explain why Rosetta exists if it is not needed because "Mach-O supports..."


Well, you can't run 2002 OS X software on a modern Mac anymore, but you can still run 2002 Windows software on a modern PC (and probably 2002 Linux software too).

Actually, you can't even run 2010 iPhone OS software on a modern iPhone anymore (but you can run 2010 Android software on a modern Android phone).


That's not true in any practical sense.

Sure, you can run the XP calculator app on Windows 11, but so what? Most of the programs that people want backwards compatibility for, enterprise apps, won't run correctly. Yes, that's mostly because of how they're coded, but even if the source code still exists, I doubt anyone is putting effort into updating them at this point; they're just running XP or 2000 in a VM.


We can indeed run newer macOS on older Macs with the OpenCore Legacy Patcher - https://dortania.github.io/OpenCore-Legacy-Patcher/ . But the bugs aren't the only concern. There are also the revised terms of service and privacy policies that we have to consider with each new macOS, as Apple plans to become a service company at our expense. It is also clear that Apple has been taking away developer options and rearchitecting macOS to be more and more like iOS / iPadOS, to have greater control over what developers can do on the system. (There's now even talk of Apple working on a hybrid macOS/iPadOS merge so that they have one common OS for both platforms.)


> It’s safer to delay upgrading

This is my policy for most of my software unless there's some kind of critical exploit. Software updates have been the cause of many breakages for me, but I have not (yet) been affected by running out-of-date software (to my knowledge).


I avoid updating my software like the plague. Actually, the reason I switched to Firefox is because Chrome's updater snuck past my firewall. That is literally the only reason why.

I am considering switching back after seeing how terrible Firefox is. For example, it has a memory leak that causes a crash after 3-4 days of uptime.


I wonder if just maybe an update might… fix a bug?


I've been looking for evidence that the bug's been fixed, but even upon bringing up the issue with Mozilla engineers, all they have to say is "turn overcommit back on, we don't support your configuration".

Excuse me, Mozilla? Firefox should not be collecting 20-30 gigabytes of committed memory in the first place. They must be high or something.


A recent Firefox update completely wiped out all the configuration and plugins that I had installed. I wouldn’t go back to Chrome over this, but it wasn’t a minor issue for me, and it definitely made me more reluctant about future updates.


I have ~weeks of uptime on Firefox MacOS with two separate laptops, and no memory issues whatsoever. So likely just a bug with something on your version/system (edit: check your plugins too)


macOS has overcommit and memory compression, so you wouldn't notice even if your Firefox did have a leak. macOS is cool that way, but Windows isn't.
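(If you're curious how hard the compressor is working, you can peek from the Terminal; a rough sketch, and the exact output wording varies by macOS version:)

    # vm_stat includes "Pages stored in compressor" / "Pages occupied by compressor"
    vm_stat | grep -i compressor

    # memory_pressure prints a summary, including the system-wide free percentage
    memory_pressure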

FWIW, I would still be using macOS... if I had any choice.


The only software I look forward to upgrading is Emacs. Everything else fills me with dread about what will break or what functionality will be lost this time.


Even as a full-time Mac user, both at work and privately, I have a hard time remembering which release, say, Sierra was and which one Mojave was, which one came first, and whether any release was in between.


I only really remember Snow Leopard heh.


Here's a great reference of Mac version history: https://robservatory.com/a-full-history-of-macos-os-x-releas...


Isn't that... umm, good? Windows completely overhauls everything in every major release and makes it difficult to navigate.


But at least we all know 3 came before 7, 11 followed 10 etc. Vista was 8. Dunno what happened to 9 but I'll never remember the order of 'name'. Same with android....kit kat was version what now?


Android has always gone in alphabetical order, so you can work it out in a few seconds.

Strangely I think it was easier to keep track of the order of OS X codenames when they were all big cats. It was more memorable.


Apple naming is too Californian centric.


If you don't know what happened to 9, I'm kind of delighted to inform you that the version number was skipped because of all the legacy applications out there that matched against a string value of "Windows 9?" to determine if the OS was Windows 95 or Windows 98, instead of checking the actual version number that gets returned when you type ver in the command line.


Vista came before 7, so I guess it was 6? Then there was XP before that, which I suppose was 5?


Windows 2000 was NT 5.0, XP (32-bit) was NT 5.1, Windows 2003 NT 5.2.

Vista was the first of the NT 6 series. However Windows 7, 8/8.1 and the related server versions were also NT 6.x


Android is just alphabetical order until they dropped names entirely for numbers, so at least you know which versions are newer.


No, it was Vista, 7, then 8 - which would make Vista 6.


Hyperbole much?


I was very confused by this article until I realized it was written in a myth/fact structure.

The headings are not claims the author is making or that the article supports.

The headings are common misperceptions, which each subsequent section refutes.

If the point of the article is, as it seems, that people should go ahead and update, then yes; Apple expects people to run only the latest version, and people should go ahead and update. If that’s not possible, they risk suffering issues not addressed until later releases, which is true of course of virtually all software.


This is simply how Apple does things. They provide software support for ~10 years and then they drop it.

Is it wrong? I’m not entirely sure. You can call that greedy, but take a few things into consideration. Firstly, Apple provides highly specialized software that runs extremely well and efficiently on their own hardware. Continuously providing support for old hardware while simultaneously maintaining the same level of performance is simply not feasible. I challenge any software developer to achieve a similar goal.

We are not talking about Linux here, where if you’re lucky, things work. Anyone that’s ever used macOS even once knows that things really work.

Now, I assume this window of time between a piece of hardware’s release and the end of its software support will increase over time; new hardware released by Apple nowadays is incredibly performant, which would allow longer software support.

Frankly, my opinion is that 10 years of support is more than enough time for anyone to consider replacing their hardware.

Recently, in the last XNU kernel release (corresponding to macOS 13/iOS 16) the 32-bit part of the kernel has been entirely removed, meaning that XNU won’t support any 32-bit device anymore. This is really exciting to me, as I see the technology moving forward, without getting stuck on prehistoric hardware support.


> We are not talking about Linux here, where if you’re lucky, things work.

What percentage of the Internet is even possible because of this OS that, if you're to be believed, barely works and then only by luck?

I get the Apple cultiness, but we don't need it here.


They’re probably talking about most desktop Linux distributions, which are less stable than what servers are running both because servers run distros with packages that are older and more polished and because servers don’t usually have hardware with notoriously troublesome drivers (e.g. Broadcom, Nvidia, some Realtek stuff, etc).

Of course desktop users can tailor their machines to use with Linux and also run something like Debian but it’s more likely they’re using whatever computer they happened to have and running Ubuntu (or one of its many derivatives), Fedora, or Arch which are indeed more likely on average to break. I know because I’ve seen it happen several times in my own usage.


I went from Mac to Linux. Pop!_OS was what the machine came with. It’s very stable; though the UI took a little getting used to, it’s not as big a change as I expected.

The upside is bioinformatics tools run natively, and it saves me from doing all my work on the cluster (just finished a multi-day run that was “quick” using all 16 of my CPU cores).


> which are indeed more likely on average to break

Maybe; it's not happened to me in 15 or so years, but people have different experiences. That said, "more likely on average to break" is a far, FAR cry from "it works, if you're lucky".


I run Linux on my 2015 iMac. Just upgraded it to Ubuntu 12.10, works great and loving the latest Gnome.


I guess you mean 22.10...


Na, they installed Quantal Quetzal to get some experimental TLS1.2 support after they could no longer connect to anything with TLS1.0


I did :-)


Linux on a server and Linux on your laptop are very different experiences.


Fair point, but I've run *buntu on various laptops (old and new) and desktop(s) my son and I constructed from stuff chosen from pcpartpicker.com not 3 months ago.

"It's just worked". Every time.


If comments on HN are anything to go by, linux on the desktop is perfect and easily usable by any office worker.

I'm sure if you walk up to Susan in accounting, handed her an arch install usb, she'll have tmux and neovim up and configured by lunch!

In all seriousness though. I'm not sure a lot of HN commenters (probably myself included) have any actual perspective on how a normal person wants to interact with their computers, or what business need from desktop endpoints.

The fact that anyone thinks that editing source is a good way to configure a window manager (dwm) is mind-boggling.


My ~6 year old MacBook Pro (13" late 2016 model (with touchbar), purchased new in 2017) is no longer "supported" (no Ventura). As far as I can tell, Linux does not support WiFi or audio on it, so it's pretty much a piece of trash as far as being a laptop goes once apps stop supporting macOS Monterey.


How many of these apps will drop support for Monterey within the coming year(s)? How many of your apps will actually stop working when there are no updates?

I think that if a laptop lives for 6-8 years, that's quite good. If you don't think so, maybe Windows is a better fit for you because you can purchase five (cheap) laptops for the price of one Mac. If you "need" your Mac for work, I think replacing it every 6-8 years shouldn't be an issue.


Why buy 5 Windows laptops for the price of one Mac when one PC laptop can be supported for 2-3 times as long? You can run Windows 10 on a ThinkPad T60 from 2006. Windows 10 will get support until 2025. That's 19 years support from one machine. Use a Linux distro and get even more.

Why spend more for a Mac if you get less useful life and are constantly having to repurchase software from OS upgrades breaking stuff?


I think you're missing my point a bit. Sure, if you want even longer support, don't purchase a laptop. Cheapest lifelong option...

My point was that the argument "apps won't run anymore" may be true, but there's no evidence for that. There are a lot of people running 10 year old Macs and the software does not just break.


As a lifelong Intel/MS user in my early 40s, I've been staring down an M1 Mac Studio and thinking hard about this.

I had a 2006 Dell Inspiron Laptop that I got for less than $1000 from the Dell Outlet, that came with Vista XP, and through various promotions had been able to upgrade the OS all the way to Windows 10, legitimately - and I think I might have only paid $60 for the Win 7 upgrade. About six year ago, I donated the laptop (the internal wifi card was spotty, and with 4gb of RAM, it wasn't much for multitasking).

Similarly, I've got a Dell Desktop with an i7 processor that's 11 years old, which through the magic of SSDs and the occasional video card upgrade, continues to do a solid job with the creative work I do in Photoshop, multitrack audio, Sketchup, and Twinmotion.

Ten years ago, I started using a Macbook Pro at work because internally, we switched to Ruby on Rails for the ecommerce platform we built. I was asked if I wanted to keep working in Windows, but basically everyone building in RoR was on a Macbook, so why swim against the current? I'll spare a this vs that comparison, and leave it to say that (muscle memory for keyboard shortcuts aside) I eventually was okay with working on a Mac. It also gave me a lot more exposure to Apple culture, for better and for worse. A lot of it was insufferable idolatry and ideological pontification - but I started to understand, if not necessarily agree with, the product lifecycle in Apple. I had a 2013 Macbook Pro, it was still working fine, but I traded it last year towards my first iPad (now that they had USB-C and more of a creative focus than a consumption focus, I was willing to take the plunge)

It's a lot like leasing cars. I'm someone who tries to drive a car into the ground, proverbially - I do regular maintenance though, so it's more like after 10-15 years, the safety improvements on modern vehicles outweigh the cost advantage of driving something I paid off a decade ago.

Back to Apple - once they got serious about recycling, the pattern became: you spend a bunch of money up front, and then you just keep trading your hardware in for the latest version. I would hold onto my iPhone for three product cycles because I spent a lot of money on it and I was going to eke out every last cent. But with Apple, you have a large initial outlay, and you can leverage the trade-in as almost a kind of hardware subscription. They've got a solid backup/recovery process that makes this really, really simple. And I kinda get it, the way I kinda get why some people lease cars instead of buy. Get a Mac with AppleCare and upgrade it every year or two, and that premium you pay essentially means you don't have to think too hard about breaking your computer, and you're always within a year or two of the latest, greatest hardware.

It still costs more, no doubt about that. But like with a leased car, you kinda don't have to worry about maintenance. With cars and computers, I do a lot of work myself, but I'm getting to a point in life where I just don't care about doing that anymore, and I have better financial resources, and I'm thinking real hard about going with size, power, and performance of a Mac Studio, knowing that it means reshaping my relationship with computers. The fact that they've been reducing or eliminating proprietary connectors in recent years has played a very large factor in swaying me in this direction too, I'd add, while knowing that it's still very much Apple culture to charge you $30 for $0.35 worth of cable.


> A lot of it was insufferable idolatry and ideological pontification - but I started to understand, if not necessarily agree with, the product lifecycle in Apple.

I have an iPhone and iPad as well as a hackintosh.

I will never for the life of me understand the Apple ideology. It is almost like some of these people's whole identity is tied to a company they neither own nor work for?

Sure, Apple does make some great stuff, but they also make crap and have a history of bad choices (dock, FireWire, and Lightning come to mind).

What I have a serious issue with is the artificial limits on OS updates. Clearly, if OpenCore Legacy Patcher gets the OS onto the laptop, Apple intentionally blocked it?

Many who have used OpenCore are happy with the experience, so why exactly did macOS refuse to install?


It's funny you mention Firewire. Up until last year, I was using a Firewire audio device with my Win10 machine - it was released in 2007, and the last driver update was in 2012. Even though official support had dropped a decade ago, it still worked. Because of the architecture of Firewire vs USB, if you wanted a lot of high speed I/O, Firewire was the way to go - as long as you weren't running a Mac. While researching the Mac Studio and its M1 processor, I read some rather pathetic stories of using a series of dongles - Firewire -> USB -> Thunderbolt (or some such chain), and some hackery to get old drivers installed.

I decided I'm in a comfortable enough place now that I can upgrade to a modern interface... without getting too into the weeds, I replaced an outboard mixer and two Firewire interfaces with a smaller, higher quality analog hybrid device that runs USB-C. And funnily enough, I've nearly recouped the cost of the purchase by selling my old gear.

Hanging on to old equipment just does not fit the Apple model. Trade in early, trade in often, or get stuck on old architecture, and get phased out of the upgrade path. I've got a 2015 Macbook Pro I should have traded in the moment I heard about the M1 processor. It was the vintage to have, and now I'd be lucky if I got $400 for it.

Anyway... there is a large portion of the human population that seeks out and worships idols. It's like any rational part of their brain decides that with THIS person, they can relax things like curiosity or critique. If I keep typing about the topic, I'm going to start saying unpleasant things that will no doubt upset people, so I'll just leave it at that.


Recently my old MBP (2013 model) stopped working, with a mysterious kernel process taking up max CPU to grind everything to a halt on every boot. Fortunately I had its files backed up to a disk in Linux-compatible format (ExFAT32?), so I finally decided to wipe it clean and install Pop!_OS. https://pop.system76.com/

It gave new life to the laptop - I'd recommend trying it, you might be able to get more mileage out of the machine.


I used to be a huge Ubuntu fan but decided to move back to a hackintosh ~2 years ago.

I still use ubuntu often so i went with Ubuntu Budgie https://ubuntubudgie.org

It has that whole "MacOS" dock look and feel right out of the box.

Good that you found an alternative OS and kept your old mac running. So many people seem content to spend big money on a Mac laptop, have it reach EOL and become landfill as they run out and buy a new one.


Ubuntu Budgie looks nice, thanks for mentioning it - I'll try it sometime.

An old Macbook with an Ubuntu variant is a nice combination of sturdy hardware, light(er)-weight operating system, and well-designed user interface. In a way, it feels more like the spirit of "Macintosh". And it would make for a good educational/toy computer for children and young people. If I see an opportunity, like family or friends who have old laptops or computers, I'm going to offer to set it up for them.


You have another couple years until apps stop supporting Monterey. That said, I think it was a poor decision by Apple to drop the 2016 version but keep the 2017.


I know it's crazy… maybe Boot Camp?


>"Anyone that’s ever used macOS even once knows that things really work."

I am not a regular Mac user, but I was once asked by a client to port my CI/CD-related script so it could work for developers who use Macs. The script involved dealing with Docker, among other things. After dicking around for a while I discovered a deeply nested folder created by Docker that was not supposed to exist after successful script completion. I don't remember all the details, but this folder was a thorn. Short of sacrificing a virgin, no matter what I tried I was not able to remove it. Invoking the search power of Google did not help either - plenty of recipes and suggestions, and none worked. Finally, after giving this Mac to system gurus and them dicking with it for a day, I was told that they had failed, it was no longer worth their time, and they would reinstall the OS.


Macs have a bunch of weird quirks where simple things simply do not work. I got a new MBP for work recently, secondary to my main Windows gaming machine with a 34" ultrawide 1440p monitor.

And I've basically installed chrome, vscode, and a whole bunch of little packages to make basic hardware operate normally. There's an app to allow the track pad and mouse wheel scroll directions to be set differently. There's another to allow the mouse side buttons to work. It's a basic mouse. These are settings and compatibility issues that do not need to exist.

But the most annoying is that the monitor is a bit blurry on the M1. I use vscode for a few mins before I realise there's a fuzziness to it. And apparently this is a known issue, because macs only scale properly for 4k resolutions. So there's another app to make it think its 4k or something. Helps a bit but not that well, and it's a trial. It's maddening. How the hell is anyone supposed to know that Apple has just decided that they won't be compatible with an entire category of bloody monitors, of all things.


Macs only look good at a 110 or 220 pixel density.

34" ultra wide 1440p has a pixel density of about 110 ppi, and that should work just fine. You might need to use a different cable, though. You should look your monitor up on rtings.com and search for notes on Mac compatibility.
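(Quick sanity check on that number, assuming the typical 3440x1440 panel at 34 inches diagonal:)

    diagonal in pixels = sqrt(3440^2 + 1440^2) ≈ 3729
    ppi ≈ 3729 / 34 ≈ 110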

I have a MacBook Pro 14 that I use daily with a 34" ultra-wide 1440p gaming monitor, and I've never had a problem. I had a MacBook Pro 13 before the 14 that also worked without issue.


The problem is that macOS killed off subpixel anti-aliasing, which is important for making text look good on 110 ppi displays. So 220+ ppi displays are the only real option here.
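(One partial workaround people reach for - hedged, since this is only a greyscale smoothing-strength knob and does not bring back true subpixel rendering; log out and back in for it to take effect:)

    # AppleFontSmoothing takes values 0-3; higher values render heavier glyphs
    defaults -currentHost write -g AppleFontSmoothing -int 2

    # To revert to the default behaviour
    defaults -currentHost delete -g AppleFontSmoothing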


Yeah, using non-Apple monitors with a mac is a pain. Try "Retina Display Manager" - https://github.com/avibrazil/RDM - as it allows you to switch to the sharper HiDPI resolutions supported on your monitor.


Already upgraded the cable for max refresh rate while gaming.

I might not have noticed an issue with the way it looks if I wasn't running it alongside a Windows machine that is very crisp. As I said, it takes a few minutes of vscode before something starts to feel a bit off. It's subtle, but it's there.


I've used macOS professionally for years and it absolutely does not Just Work. I greatly prefer Linux, which also doesn't Just Work, but at least exposes enough knobs that you can fix what doesn't work. Unlike on Mac where you have to disable SIP just to get started fixing it.


I can't disagree.

On the other hand, thinking about the world as it is and not in the abstract, I'd have to say that nothing Apple has done since 2015 has given me any reason to upgrade my hardware.

So I have like 6-8 computers that are all more than 7 years old by now. I'm not looking for new hardware.

And looking at the software they've released in those years, I'm not looking forward to new software.

Methinks I ate myself, thanks Apple. You got me spoiled.

And you can't compete.


> This is simply how Apple does things. They provide software support for ~10 years and then they drop it.

Monterey was released in 2021 and dropped support for my 2014 MacBook Pro, so that's only 7 years.

It's fair to say that you can continue to run macOS N-1 and N-2, which would add years of life, but the author of the linked article is trying to argue that you're not safe unless you're running the latest major version.


But then you get a few more years where your software isn't 100% current but still gets a good chunk of the security updates and generally stays compatible with current software, which adds up to that full 10 years.

Your MacBook Pro is compatible with the OpenCore Legacy Patcher: https://dortania.github.io/OpenCore-Legacy-Patcher/MODELS.ht...


Well, that's 7 years of major updates with all the security fixes. Moreover, Apple continues to release minor versions for devices that are no longer supported, in order to fix major flaws that are actively exploited in the wild (see iOS 12.4.2 through 12.5.6: that's a lot of minor releases issued over the years to address zero-days exploited in the wild, and the same thing happens with macOS).


Got to be honest, I've rarely had any issues with macOS for a couple of years now. The only notable one I remember was that date selection in Reminders stopped working on 12.6, and that was fixed in 12.6.1. The workaround was to key the date in manually.

You do have to keep within the 2-3 year update cycle, though. Quite frankly, it pays for itself in less eye-poking than Windows.


Yeah, I can't recall any real bugs at all; the main problems for me are various accessibility flaws and UI issues that Apple will probably never choose to fix anyway. I take issue with your "2-3 year update cycle", though: my 8-year-old MacBook Pro is still fine as a daily driver, and I know plenty of others with Macs around the same age.


> I take issue with your "2-3 year update cycle"

I think the OP meant the software update cycle not hardware.


Both actually. I replace hardware the moment the AppleCare runs out. I also tend to track near or at the latest versions of macOS.

I factor in the cost on a monthly cost basis.


I also had to replace my Mac when its AppleCare ran out, but only because it started having random issues with its keyboard and USB ports. It stinks of a logic board issue, and it's not worth paying to replace when it'll just break again.

Still unhappy about that because nothing will ever live up to my mid-2015 MBP.


This is why I get rid of it the moment AppleCare is gone. I want someone else to handle the risk of it turning into a brick.


This is true. It's easier to migrate your data from a functional machine that you can still use as a reference.

Personally my Mac still serves as a remote desktop server for when I need to do Mac things, but it's really infrequent because there is no remote desktop solution that properly supports the Retina display.


The most notable bug I've experienced recently is that highlighting a PDF in Preview would jump you back to the previous highlight. It was so painful for so long, but 13.0 seems to have fixed it.

https://www.reddit.com/r/MacOS/comments/kmqsfr/preview_jumps...


Yeah, I’ve been riding the edge of the macOS upgrade wave for several years at this point and it’s never brought me any trouble. I wonder if some people’s usage patterns are more likely to stir up trouble than others.


Yeah I tend to do day one upgrades and can’t recall any real trouble. I sometimes think I must have some sort of lucky charm the way people go on about bugs online.


> One factor that makes recent memory an unreliable indicator is the series of changes to macOS to prepare for Apple silicon Macs.

Slightly off topic, but the changes that started around High Sierra kind of make sense now. Back then it seemed like Apple engineers were dropping backward compatibility just for their own convenience...


Apple needs to stop pushing annual upgrades.

Please spend 2 years polishing the core system. No features. We need to rebase onto a more stable and reliable platform.


Basically we need a Snow Leopard. That was by far the most rock-solid release they've done in years. It included a few new features but was at its core a massive performance and bug-fix release. The fact that it also ran on some very old Macs at the time helped too.

This being said, day to day I can't remember the last time I experienced a bug on any Mac I've used. It's generally very stable, but it does need a maintenance release to refine a few areas.


Hm, I guess it depends on use cases and machine configurations. I encounter many, many bugs. Off the top of my head:

- On reboot, sometimes the Menu Bar fails to load entirely, and I have to keep rebooting until it appears.

- WindowServer process memory leaks that require a reboot to fix.

- Finder steals focus from other apps. For instance, when I cmd-tab to quick switch between Mail and Notes, Finder will insert itself into the quick switch stack and replace the last focused app.

- Finder drag and drop randomly stops working.

- Spotlight fails to find files in normal directories, so I have to drop into Terminal to find them (see the fallback sketch after this list).

- Terminal sometimes loses input focus for no discernible reason.

- Calendar notifications either don’t show up at all, or dismiss themselves automatically, or the snooze action dismisses them and they never return.

- Safari page content crashes when I resize the browser window. No discernible pattern for what causes this.

Death by one thousand paper cuts. The list just keeps going.
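On the Spotlight item above: when the index comes up empty, a dumb filesystem walk is a reliable fallback; something like this sketch, where the start directory and pattern are just placeholders:

    from pathlib import Path

    start = Path.home() / "Documents"   # placeholder: wherever Spotlight is drawing a blank
    pattern = "*.key"                   # placeholder: the files you're actually after

    for p in start.rglob(pattern):      # plain recursive glob, no Spotlight index involved
        print(p)

The longer-term fix is usually telling Spotlight to rebuild its index (sudo mdutil -E /), though the reindex takes a while.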


Systemwide OCR is the last feature I remember being excited about in a decade, off the top of my head.

But it's something I could use third-party software for, and it comes with a few annoyances; sometimes I just want to drag an image, not select text.

Recently, I feel macOS releases take away more than they give, whether through quality-of-life regressions ("allow Terminal.app to access your hard drive"), poor UX and aesthetic decisions, or downright feature removal.

Ventura, for instance, removes Preview’s ability to render PostScript files, and suddenly the man2pdf bash alias I’ve been using for 20 years is broken.
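For anyone with the same alias: assuming man -t still emits PostScript on your install and Ghostscript is available (e.g. via Homebrew), you can keep the workflow by doing the PS-to-PDF conversion yourself instead of relying on Preview. A rough sketch:

    import os
    import subprocess
    import sys
    import tempfile

    page = sys.argv[1] if len(sys.argv) > 1 else "grep"          # man page to convert
    ps_path = os.path.join(tempfile.gettempdir(), page + ".ps")
    pdf_path = os.path.join(tempfile.gettempdir(), page + ".pdf")

    # man -t renders the page as PostScript.
    with open(ps_path, "wb") as f:
        subprocess.run(["man", "-t", page], stdout=f, check=True)

    # Preview no longer converts PS, so use Ghostscript's ps2pdf instead.
    subprocess.run(["ps2pdf", ps_path, pdf_path], check=True)
    subprocess.run(["open", pdf_path], check=True)               # hand the PDF to the default viewer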


My low-end test Mac is an old (2012) 11" Air, running Catalina (it can't run anything newer).

It had been getting updates until recently. I don't know for sure whether they're done.



Thanks!

I'll check it out.


Looks like you can go to Monterey at the moment and Ventura is in development.

You shouldn't have any issues.


My dad switched to Ubuntu Mate when his updates stopped coming in. He's happier than ever with the "look inside, see all the details" approach of Linux, compared to the "details are behind extra hoops because we don't trust our users^W^W^Wwant you to have to think about these scary, scary things."


same boat.

i have an air from mid-2013 and my last upgrade was Big Sur. i may have gotten the upgrade by mistake. since it happened, i am no longer getting any patches. and Monterey, of course, does not install. the upgrade app still nags me every now and then that i need to upgrade (to what?). but i do have a 2022 M2 on the side, for comparison. as far as i can tell, our problem is not the operating systems.

my biggest problem is with our browsers! or rather, our modern resource hog web applications.

i only use my personal macbook airs for the web. and i do use safari, chrome, edge, and vivaldi daily. each dedicated to a single purpose. so, on my M2 air, these beasts load most sites i visit fine. my mid-2013 air, sometimes, struggles opening gmail or youtube. i also never open any fancy webgl application on it.

oh by the way, edge, whatever you are doing, you are the best when it comes to youtube ;) and that's my only task for you.


See my post below. Often this is an artificial limit, and as the other commenter posted, OpenCore Legacy Patcher can bypass this "lock" and allow you to upgrade beyond Catalina.


Sometimes you can install it, but certain things are disabled (Metal was one).


Long-time Mac user here -- by that I mean, I use my Macs for a long time. Long, long after Apple supports them. My Mac Pro 4,1 is a daily driver. I replaced my 2011 MacBook Pro with a ThinkPad only this year. I have strong feelings on this, but I'll say only two things:

1) Apple breaks more with each OS update than they offer in terms of features. I own Photoshop CS4 for personal use. It works great. At work, they pay for a subscription to the latest Adobe software, and it's crashy, screen-flickery garbage. CS4 just works -- but that means I have to stay on Mojave, and that means I'm sticking with my Mac Pro. Same for the version of Final Cut I have. Same for my back catalog of fun games... and I dual-boot Snow Leopard for some of the really old ones.

2) All complaints about Windows here are valid, but I have apps from the 3.1 era that run on modern hardware on Windows 11. Compared to Microsoft's track record, Apple shouldn't even be considered a stable OS platform provider. They make nice hardware; they're increasingly making a dumbed-down OS whose only redeeming characteristic is that it can run all the common browsers and has a nice shell built in.


I recently upgraded from a Late 2013 MacBook Pro to the base model 14-inch MacBook Pro M1. It cost me £1,729.00. I'll probably upgrade more frequently now (I'm thinking every three years), but even if I didn't, here is the cost breakdown.

1729 / 7 years / 365 days = £0.68 / day

Personally, for what you get, I think this is an incredibly reasonable subscription. I would love to know what percentage of the people in this thread moaning about official support being dropped after seven years happily spend £2.72 (four times as much) on a coffee every day.


Plus, MacBooks hold amazing resale value versus HP, Lenovo, etc.

I'm not a fanboy (I hate Apple's anti-repair push), but in honor of the truth: after the switch from Intel to the ARM-based M1, Apple offers one of the best value/performance balances of any laptop on the market.


It works as long as you’re willing to stay current on Apple hardware. If you’re not… then macOS probably isn’t for you…


To be fair, I've always upgraded my hardware before the hardware has lost OS support.

If I really wanted to I guess I could run linux on these and use them for something, but I never do (I have one server in my house, it's enough). I upgraded the hardware for a reason.


I’ve yet to have my main laptop fall out of updates before I’ve updated it (just jumped from 2016 to the M1).

Secondary devices for particular purposes have fallen out of updates now and then.


"If your Mac isn’t capable of running the current release of macOS, or you’re dependent on key hardware or software which is incompatible, then the decision is made for you."

Since I need a Mac for building iOS apps, unfortunately I can't rely on old Mac systems (say, more than 8 years old). Apple regularly increases the minimum OS version required to run Xcode; since v13.3, you need to have Monterey installed.

https://developer.apple.com/support/xcode/


Hard truths about software: it all has bugs. There's no such thing as an OS with zero bugs. All major versions of software become abandoned eventually. Then you either upgrade to a newer version of the software or you live with whatever bugs you have.

> Those residual bugs are often severe: El Capitan continued to cause many Macs to grind to a halt through its entire cycle, the cause not being addressed until Sierra. Sierra in turn was abandoned with a serious bug in its backup scheduling system that caused automatic backups to fail completely after a few days, and that was only fixed in High Sierra. More recently, Monterey initially suffered from three serious memory leaks, of which two were fixed early, but the third has only just been addressed in Ventura.

Citation needed. Citation needed. Citation needed. I had multiple Macs with all these versions of macOS and never noticed any issues like this. My system never "ground to a halt," nor did I have failed backups, nor did I suffer from a "serious memory leak."

> APFS is but one example of a critical sub-system that loses all support once Apple releases a new version of macOS.

Is this an unknown or unexpected phenomenon? Does Microsoft make updates to NTFS on Windows 2000? What's wrong with APFS that needs to be fixed? "Bugs," I guess.

> Many of us had suspected that, during those two security-only maintenance years, older macOS didn’t get all the fixes it could have.

Right, because it's an old version that's out of support. What's the problem?

> It’s safer to delay upgrading

Stop. No. Don't. Stop telling people not to update their systems. This reminds me of everyone who recommended that people disable patches for Windows XP due to the mild inconvenience of the whole thing and then wondered why they got infected with every virus under the sun.

> It’s a similar argument to that concerning vaccination of younger people against Covid or other generally non-fatal conditions. Most people under the age of 70 who are in reasonably good health should be safe from serious or prolonged illness or death.

Oh, the author is anti-vax. Great.


(Apologies in advance for embracing the off-topic tangent.)

What you quoted from the article: "Most people under the age of 70 who are in reasonably good health should be safe"

For someone anti-vax, his own choice of words is odd. Most? Reasonably? Should? Is this supposed to be a compelling formulation of their argument? It's astounding to me that anti-vax arguments are so often laced with ambiguity and "principles", i.e. emotions that cosplay as rationality.

Whereas a properly rational and evidence-driven argument is just so straightforward. Your chances of avoiding infection are slim and ever-diminishing. The disease has a non-zero risk of complications/hospital resource dependency/death for people of all ages. The vaccines have a statistically significant impact on outcomes and an utterly spectacular safety track record. End of argument.


I miss being able to change application icons. I can't stand these square white icons with 4px padding.


There is a complicating factor of third-party software bugs and compatibility when running under the latest macOS release. It usually takes 6-12 months before those have been fully ironed out.


This article talks about regular bugs only being fixed in the current version of the OS. It talks about security bugs only being fixed up to the N-2 release, and even then on a sliding scale of importance. It then talks about security bugs going unfixed in the older, unsupported N-3 and earlier releases, which may get only a rare security update.

My conclusion is that Apple wants to sell you a new Mac instead of fixing bugs for you.


I am increasingly annoyed by it -- especially since Apple hardware, commendably, can kind of last forever these days. But at some point, while the hardware may still be completely usable, it will stop running the latest OS. Which, with OSes only really supported for (depending on how you look at it) 1 to 3 years, isn't great. There haven't really been any features I wanted in new OS versions for years; I just have to upgrade to stay on the maintenance treadmill.


See my other post here: OpenCore Legacy Patcher (https://dortania.github.io/OpenCore-Legacy-Patcher/) allows you to bypass Apple's artificial "lock" and install newer releases on perfectly working hardware.

This is why I own an iPad and iPhone but will not buy an Apple laptop. I don't need an expensive laptop suddenly ruled "end of life" and unable to upgrade unless I "patch" it first.


…But I really like my 11” Airbook from 2012


Oh, so the unreadable Time Machine backup on my APFS Encrypted drive is not just me then.


I ran into the same problem.

What I found most irritating was that even reformatting the drive (with any file system and on different machines) would not make the drive work again. After plugging it back into my Mac, the file system would corrupt again.


TLDR: Apple is not handling bugs and security issues well across its OS releases. That does not mean you should stop updating your software.


This sort of agrees with my own opinion: your use of Macs and iPhones, and your stay in the ecosystem, is only secure as long as you keep paying up.

More specifically, you're good for a year after you do a major upgrade. After that you may or may not see critical issues addressed, and after two years all bets are off.

I'm not sure how long Apple supports their overly expensive things these days, but I would not be surprised if at some point they declare only 3 years of support from introduction. It would mean more money, and they know people will pay up.


The author glosses over the fact that you can upgrade to any new macOS version for free for around 7 years after your Mac was sold as new.

And even then there are ways to prolong the support window with projects like the OpenCore Legacy Patcher https://dortania.github.io/OpenCore-Legacy-Patcher/

So you buy a brand-new Mac, you get to upgrade to the latest version for those years, and after that, even if you don't use OpenCore Legacy Patcher, you're still getting at least some of the security updates that go into the latest version of macOS.

I realize that the author paints the picture of that later reduced level of support as being a bad thing, but they don't actually go as far as supporting their arguments with any sort of evidence beyond "there are bugs!"

Out of the box Apple is giving you ~10 years of at least partial updates and you can prolong it further using third party solutions like the OpenCore Legacy Patcher.

For example, you can run Monterey on a Mac mini bought in 2009, and to find a model that can't manage even that, you have to go back to Macs that are physically incapable of running modern macOS because they have incompatible 32-bit processors.


The iPhone 5s from 2013 just got a security update in July of this year.

But if you care about how long the vendor supports your phone, you really should get an Android phone. I’m sure you will be delighted about how long they support their phones.


The iPhone 5s got a security update to Safari. Android phones get updates to Chrome and the WebView component for many years past EOL as well.


> I’m sure you will be delighted about how long they support their phones.

Not generally. OnePlus only provides 3 major updates, for example. That's 3-4 years of use.

You can root it and flash LineageOS or another custom ROM, but then you lose Netflix and the like, because Android uses stupid trusted-execution crap for DRM.


We really need a sarcasm bit for tcp/ip ;)


That's nice. The article and most comments are about macOS. A snarky Android comment is misplaced.


From the parent comment

> Sort of agrees with my own opinion, that your use of Macs and iPhones


I read that too. The parent went off topic for one word; your entire comment is off topic. Do you have any thoughts on macOS you would like to share, since that is the topic of the post?


Well, last time I checked, Macs and iPhones are now based on the same processor family and share many of the same frameworks. And every time Apple has done a processor change, it has dropped the old machines faster.

ARM Macs are a step change from x86. Given a choice between being on the Mac platform, where Apple has moved fast to introduce better technology, and being stuck on hot, loud, relatively slow, battery-hogging x86 PCs, I'll take the former.


I don't know. I was a fan of what they were doing circa 2009-2014.

Big fan.

Haven't bought any computers since 2012 tho. And it looks to me like I am stuck with what I have, as what PC makers peddle is not meant for me.

If I were forced, say for business reasons, to buy a bunch (or even just one) for the coming years, then in 2022 it would be Apple. Because they have memory, compute, and the bus on the same chip, and you can only beat that with a cluster. They even have a product that is a miniaturized computing cluster for a quarter of the price of one.

Apple is the greatest shit shoveler in PC-making right now. I'd go with Apple.



