> In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).
Their primary business goal is to sell hardware. Yes, they've diversified into services and into being a shopping mall for everything, but at its core it is about selling luxury hardware.
The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
Somewhat true, but things are changing. While there are plenty of "luxury" Apple devices, like the Vision Pro or fully decked-out MacBooks used for web browsing, we no longer live in a world where tech products are just lifestyle gadgets. People spend hours a day on their phones and often run their lives and businesses through them. Even at $1000+ every 2-3 years, that's simply not much given how central a role the phone plays in your life. This is especially true for younger generations, who often don't have laptops or desktops at home, and increasingly in poorer-but-not-poor countries (e.g. Eastern Europe). So the iPhone (their best-selling product) is far, far more a commodity utility than typical luxury consumption like watches, purses, or sports cars.
Even in the higher-end products like the MacBooks, you see a lot of professionals (engineers included) who choose them for the price-to-performance value and who don't give a shit about luxury. Especially since the M1 launched, when performance and battery life took a giant leap.
Engineers use MacBook Pros because they're the best-built laptops, with the best screens, arguably the best OS, and most importantly - they're not the ones paying for them.
> Engineers use MacBook Pros because they're the best-built laptops, with the best screens, arguably the best OS, and most importantly - they're not the ones paying for them.
I am the one paying for my MacBook Pro, because my company is a self-funded business. I run my entire business on this machine and I love it. I always buy the fastest CPU possible, although I don't max out the RAM and SSD.
Amusingly enough, I talked to someone recently about compilation speeds, and that person asked me why I don't compile my software (Clojure and ClojureScript) on "powerful cloud servers". Well, according to Geekbench, which always correlates very well with my compilation speeds, there are very few CPUs out there that can beat my M3 Max, and those aren't easily rentable as bare-metal cloud servers. Any virtual server will be slower.
So please, don't repeat the "MacBooks are for spoiled people who don't have to pay for them" trope. There are people for whom this is simply the best machine for the job at hand.
Incidentally, I checked my financials: a 16" MBP with M3 and 64 GB RAM, amortized over 18 months (very short!), comes out to around $150/month. That is not expensive at all for the main development machine you run your business on!
For a fair comparison, what about comparing against the cheapest "power cloud server"?
I mean, Hetzner has a reputation for renting bare-metal servers at the cheapest price on the market. Try the AX102, which has very close performance to an M3 Max (CPU only): https://www.hetzner.com/dedicated-rootserver/matrix-ax/
The OP's solution has a lot of advantages, like owning the device and including a GPU, but at least we do have cloud servers with comparable costs available.
I tried a lot to use remote servers for development when I had an Intel MacBook and I found the experience to always be so frustrating that I upgraded to the M series. Have the tools gotten any better or is vscode remote containers still the standard?
I did use them several years ago, for Clojure and ClojureScript development. Docker and docker-compose were my main tools, with syncthing helping synchronize source code in real time, Emacs as the editor. Everything worked quite well, but was never as easy and smooth as just running everything locally.
vscode remote containers are still the standard, but I find them very usable nowadays. My setup is an M2 MBP that I use to remote into a Windows WSL setup at home, a Linux desktop at work, and various servers. Nesting remote SSH + remote Docker works seamlessly; that was previously a major headache.
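For anyone curious how the nesting works in practice: the remote chain is mostly plain SSH configuration, which VS Code's Remote-SSH extension reads directly. A minimal sketch (all host names and users here are made up):

```
# ~/.ssh/config -- hypothetical hosts
Host home-wsl
    HostName home.example.com    # Windows box running the OpenSSH server
    User me

Host work-desktop
    HostName desktop.internal.example.com
    User me
    ProxyJump bastion.example.com    # hop through a jump host transparently
```

With entries like these, picking the host from VS Code's Remote Explorer should land you in a remote session, and the Dev Containers extension can then attach to Docker on that same host.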
In your case it makes sense to get the most performant machine you can get even if it means you're paying a ton more for marginal gains. This is not usually true for the general public.
As a Clojure/ClojureScript developer myself, I just wonder what you do that makes compilation such an important part of your workflow while at the same time not needing as much RAM as possible. Yes, the MacBook Pro isn't bad at all for Clojure(Script) development. I was pretty angry that the Lenovo ThinkPad T14 Gen 3 has a full channel of soldered RAM and just a single slot for expansion, since I really use a lot of RAM and would prefer a full dual-channel 64 GB rather than a hybrid 48 GB, with 32 GB dual-channel and 16 GB single-channel. (Yes, it does actually work.)
Most builds that I do are done asynchronously using GitHub Actions or similar. Yes, it does take some time but the build+deploy isn't that time sensitive.
In addition to the hardware, the macOS software is so much better, with flawless speed, productivity, and multitasking with gestures. Try doing desktop switching on Windows. On a side note, I would gladly use the cloud if internet speeds and latency came down to a negligible level - we developers are an impatient lot.
"Engineers" - ironically, the term used in the software industry for people who never standardize anything, who solve the same problems other "engineers" have already solved over and over again (how many libraries do you need for arrays and vectors and GUIs and buttons and text boxes and binary trees and sorting, yada yada?) while making the same mistakes and learning the hard way each time, and who vehemently argue about software being "art" - might like OSX, but even that is debatable. Meanwhile, actual Engineers (the ones with the license), the people who need CAD and design tools for building bridges and running manufacturing plants, stay far away from OSX.
I did EE in college but we mostly just used Windows because the shitty semi-proprietary SPICE simulator we had to use, and stuff like that, only supported Windows. The company that makes your embedded processor might only support Windows (and begrudgingly at that).
I think engineers using software should not be seen as an endorsement. They seem to have an incredible tolerance for bad UI.
You seem to be suggesting that a chunk of the hundreds of millions of people who use a UI that you don't like, secretly hate it or are forced to tolerate it. Not a position I'd personally want to argue or defend, so I'll leave it at that.
What an oddly aggressive and hostile response to such a banal observation. Yes, millions of people use software they hate, all the time, that’s wildly uncontroversial.
Making up what? Go drop by your nearby shop.
My hair stylist constantly complains about the management software they use and the quality of the payment integration.
At work I constantly hear complaints about shitty, slow IDEs.
At the optician's, the guy was complaining about the inventory system.
People hate software that they're forced to use. Professionals are better at tolerating crapware, because there's usually a sunk-cost fallacy involved.
This is not a reasonable way to infer the sentiment of hundreds of millions of people in different countries, different business, different situations, etc, etc.
Disguising it as an "observation" is even more ridiculous.
Indeed I’m not ready to defend it, it is just an anecdote. I expected the experience of using crappy professional software to be so universal that I wouldn’t have to.
>They seem to have an incredible tolerance for bad UI.
Irrelevant.
Firstly, it's a tool, not a social media platform designed to sell ads and farm clicks. It needs to be utilitarian and that's it, like a power drill or a pickup truck; it doesn't need to look pretty, since it's not targeting consumers but solving a niche set of engineering problems.
Secondly, the engineers are not the ones paying for that software, so their individual tolerance is irrelevant: their company pays for the tools, and tolerating those tools is part of the job description and the pay.
Unless you run your own business, you're not gonna turn down lucrative employment because on-site they provide BOSCH tools and GM trucks while you personally prefer the UX of Makita and Toyota. If those tools' UX slows down the process and makes the project take longer, it's not my problem; my job is to clock in at 9 and clock out at 5, that's it. It's the company's problem to provide the best possible tools for the job, if they can.
It was meant figuratively. Obviously everyone has different working hours/patterns depending on the job market, skill set, and personal situation.
But since you asked, Google is famous for low workloads. Or Microsoft. Or any other old and large slow moving company with lots of money, like IBM, Intel, SAP, ASML, Airbus, DHL, Siemens, manufacturing, aerospace, big pharma, transportation, etc. No bootstrapped "agile" start-ups and scale-ups, or failing companies that need to compete in a race to the bottom.
If you look at creative pros such as photographers and Hollywood 'film' editors, VFX artists, etc., you will see a lot of Windows and Linux, as people are more concerned about getting absolute power at a fair price and don't care if the machine is big, ugly, etc.
Oh, I'm sure there are lots of creatives who use OSX, so I don't mean to suggest nobody uses it; I'll admit it was a bit in jest, to poke fun at the stereotype. I'm definitely old-school, but to me it's a bit cringe to hear "Oh, I'm an engineer..." or "As an engineer..." from people who sit at a coffee shop writing emails or doing the most basic s/w dev work. I truly think Silicon Valley people would benefit from talking to the technical people who are building bridges and manufacturing plants and cars and hardware and chips and all the stuff on r/engineeringporn that everyone takes for granted. I transitioned from s/w to hardcore manufacturing 15 years ago, and it was eye-opening, and very humbling.
I’d assume a lot of this is because you can’t get the software on MacOS. Not a choice. Who is choosing to use Windows 10/11 where you get tabloid news in the OS by default? Or choosing to hide the button to create local user accounts?
Who is choosing to use macOS, where non-Apple monitors and other 3rd party hardware just stops working after minor updates and then starts working again after another update, without any official statement from Apple that there was a problem and a fix?
I do. Because for all issues it has, it is still much better than whatever Windows has to offer.
> where non-Apple monitors and other 3rd party hardware just stops working after minor updates and then starts working again after another update, without any official statement from Apple that there was a problem and a fix?
At least my WiFi doesn't turn off indefinitely during sleep until I power cycle the whole laptop because of a shitty driver.
So what, Windows does the same. Printers [1], WiFi [2], VPN [3], Bluetooth devices [4], audio [5] - and that's just stuff I found via auto-completing "windows update breaks" on Google in under 5 minutes.
The only problem is that Apple is even worse at communicating issues than Microsoft is.
The big difference is that Microsoft - at least usually - confirms and owns the issues.
With Apple, it's usually just crickets... nothing in the release notes, no official statements, nothing. It's just trial and error for the users to see if a particular update fixed the issue.
So the same software exists on multiple platforms, there are no legacy or hardware compatibility considerations, interoperability considerations, no budget considerations, and the users have a choice in what they use?
I.e., the same functionality exists with no drawbacks, and money is no object.
More choice in hardware. More flexibility in hardware. UI preferences. You can't get a Mac 2 in 1 or a Mac foldable or a Mac gaming notebook or a Mac that weighs less than a kilogram. You can't get a Mac with an OLED screen or a numpad. Some people just prefer the Windows UI too. I usually use Linux but between MacOS and Windows, I prefer the latter.
We use the sales metrics and signals available to us.
I don't know what to say except: resign yourself to the fact that the world is fundamentally unfair, and you won't ever get to run the A/B experiment that you want. So yes, Windows it is!
You seem to have some romanticized notion of engineers and seem deeply offended by someone calling themselves an engineer. Why do you even care if someone sits at a coffee shop writing emails and calls themselves an engineer? Do you think it somehow dilutes the prestige of the word "engineer"? Makes it less elite, or what?
"Deeply offended" - my default response to imposters is laughter. Call yourself Lord, King, President, Doctor, Lawyer, whatever - it doesn't matter to me. I'd suggest you lighten up.
Not that the degree means much; I learnt 90% of what I know on the job. It certainly helped get my foot in the door, through the university brand and alumni network.
You can call yourself anything you want Doctor, Lawyer, Engineer. I have the freedom to think my own thoughts too.
I always likened "engineers"[1] to "people who are proficient in calculus"; and "computers"[1] to "people who are proficient at calculations".
There was a brief sidestep from the late 1980s to the early 2010s (~2012) when the term "software engineer" came into vogue and ran completely orthogonal to "proficiency in calculus". I mean, literally 99% of software engineers never learned calculus!
But it's nice to see that ever since ~2015 or so (and perhaps even going forward) proficiency in calculus is rising to the fore. We call those "software engineers" "ML Engineers" nowadays, ehh fine by me. And all those "computers" are not people anymore -- looks like carefully arranged sand (silicon) in metal took over.
I wonder if it's just a matter of time before the carefully-arranged-sand-in-metal form factor will take over the "engineer" role too. One of those Tesla/Figure robots becomes "proficient at calculus" and "proficient at calculations" better than "people".
It looks like ever since humankind learned calculus there was an enormous benefit to applying it in the engineering of rockets, aeroplanes, bridges, houses, and eventually "the careful arrangement of sand (silicon)". Literally every one of those jobs required learning calculus at school and applying calculus at work.
Why point out calculus as opposed to just math?
Might be just my Eastern Europe background where it was all just "Math" and both equations (that's Algebra I guess) and simpler functions/analysis (Calculus?) are taught in elementary school around age 14 or 15.
Maybe I'm missing/forgetting something - I think I used Calculus more during electrical engineering than for computer/software engineering.
At my Central European university we learned "Real Analysis", which was far more concerned with theorems and proofs than with "calculating" something - if anything, actually calculating derivatives or integrals was a warmup problem before the meat of the subject.
Calculus, because all of engineering depends critically on the modeling of real world phenomena using ordinary or partial differential equations.
I don’t mean to disregard other branches of math — of course they’re useful — but calculus stands out in specific _applicability_ to engineering.
Literally every single branch of engineering. All of them, from petrochemical engineering to biotech. They all use calculus as a fundamental building block of study.
Discovering new drugs using PK/PD modeling is driven by modeling the drug-pathogen relationship as cycles using Lotka models.
I'm not saying engineers don't need to learn stats or arithmetic. IMO those are more fundamental to _all_ fields - janitors, physicians, any field really. But calculus is fundamental to engineering alone.
Perhaps, a begrudging exception I can make is its applications in Finance.
But every other field where people build rockets, cars, airplanes, drugs, or ai robots, you’d need proficiency in calculus just as much as you’d need proficiency in writing or proficiency in arithmetic.
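As a toy illustration of the Lotka-style modeling mentioned above (the coefficients and starting populations here are arbitrary, invented for illustration, not from any real PK/PD study), the classic predator-prey system is just two coupled ODEs that a few lines of Euler integration can step through:

```python
# Toy Lotka-Volterra predator-prey model: two coupled ODEs of the
# kind that PK/PD and engineering modeling is built on.
#   dx/dt = a*x - b*x*y   (prey grows, gets eaten)
#   dy/dt = d*x*y - c*y   (predator eats prey, dies off)
def lotka_volterra(x0, y0, a, b, c, d, dt=0.001, steps=100_000):
    x, y = x0, y0
    for _ in range(steps):
        dx = (a * x - b * x * y) * dt
        dy = (d * x * y - c * y) * dt
        x, y = x + dx, y + dy
    return x, y

# Arbitrary coefficients, chosen only to show the cyclic behavior.
x, y = lotka_volterra(10.0, 5.0, a=1.0, b=0.1, c=1.5, d=0.075)
assert x > 0 and y > 0  # populations oscillate; neither goes extinct
```

A forward-Euler step this crude slowly inflates the cycles, which is exactly why real engineering work needs the calculus to pick and analyze better integrators.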
True, we learnt calculus before college in my home country, but it was just the basic stuff. I learnt a lot more of it, including partial derivatives, in the first year of engineering college.
>I think I used Calculus more during electrical engineering than for computer/software engineering.
I think that was OPs point - most engineering disciplines teach it.
Yeah computer science went through this weird offshoot for 30-40 years where calculus was simply taught because of tradition.
It was not really necessary through all of the app-developer eras. In fact, that's so much the case that many software engineers graduating from 2000-2015 or so work as software engineers without a BS. Rather, they could drop the physics and calculus grind and opt for a BA in computer science. They then went on to become proficient software engineers in the industry.
It’s only after the recent advances of AI around 2012/2015 did a proficiency in calculus become crucial to software engineering again.
I mean, there's a whole rabbit hole of knowledge on why ML frameworks deal with calculating vector-Jacobian or Jacobian-vector products. Appreciating that, and their relation to the gradient, is necessary to design and debug frameworks like PyTorch or MLX.
Sure, I will concede that a sans-calculus training (a BA in Computer Science) can still be sufficient for working as an ML engineer in data analytics, API/services/framework design, infrastructure, systems engineering, and perhaps even inference engineering. But I bet all those people will need to be proficient in calculus the more they have to deal with debugging models.
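To make the chain-rule point concrete, here is a deliberately tiny hand-rolled sketch (the function and variable names are invented for illustration). Reverse-mode differentiation pulls a seed gradient backwards through each operation - in the scalar case this is exactly what a vector-Jacobian product degenerates to, and frameworks like PyTorch automate the same bookkeeping over whole computation graphs:

```python
import math

# Differentiate f(x) = sin(x)^2 by hand with the chain rule run
# backwards -- a scalar toy version of the vector-Jacobian products
# that ML frameworks compute.
def f_and_grad(x):
    # forward pass, saving intermediates
    s = math.sin(x)        # s = sin(x)
    y = s * s              # y = s^2
    # backward pass: seed with dy/dy = 1, pull back through each op
    dy = 1.0
    ds = dy * 2.0 * s      # d(s^2)/ds = 2s
    dx = ds * math.cos(x)  # d(sin x)/dx = cos x
    return y, dx

y, g = f_and_grad(1.0)
# analytically, f'(x) = 2*sin(x)*cos(x) = sin(2x)
assert abs(g - math.sin(2.0)) < 1e-12
```

Debugging a model usually means reasoning about exactly this backward pass, which is where the calculus proficiency bites.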
That 99% guess seems high considering calculus is generally a required subject when studying computer science (or software engineering) at most universities I know of.
You’re right it’s a total guess. It’s based on my experience in the field.
My strong “opinion” here comes from an observation that while calculus may have been a required subject of study in awarding engineering degrees, the reality is, people didn’t really study it. They just brushed through a couple of pages and wrote a few tests/exams.
In America there's a plethora of expert software engineers who opt for a bachelor's degree in computer science that is a BA, not a BS.
I think that's a completely reasonable thing to do if you don't want to grind out the physics and calculus courses. They are super hard, after all. And let's face it: all of the _useful to humanity_ work in software hasn't required expertise in physics or calculus, at least until now.
With AI going forward it’s hard to say. If more of the jobs shift over to model building then yes perhaps a back to basics approach of calculus proficiency could be required.
Most software engineering just doesn't require calculus, though it does benefit from the understanding of functions and limit behavior that higher math provides. But if you look at a lot of meme dev jobs, they've transitioned heavily away from the crypto craze of the past 5 years towards "prompt engineering" and the like, exploiting LLMs the same way the "Uber for X" meme of 2012-2017 exploited surface-level knowledge of JS or API integration work. Fundamentally, the tech ecosystem desires low-skill employees, and LLMs are a new frontier in doing a lot with a little in terms of deep technical knowledge.
Hmm, that is an interesting take. Calculus does seem like the uniting factor.
I've come to appreciate the fact that domain knowledge has a more dominant role in solving a problem than technical/programming knowledge. I often wonder how s/w could align with other engineering practices in terms of approach design in a standardized way so we can just churn out code w/o an excessive reliance on quality assurance. I'm really hoping visual programming is going to be the savior here. It might allow SMEs and Domain experts to utilize a visual interface to implement their ideas.
It's interesting how Python dominated C/C++ in the case of the NumPy community. One would have assumed C/C++ to be a more natural fit for performance-oriented code. But the domain knowledge overpowered technical knowledge, and eventually people started asking funny questions like
There was some old commercial with the tagline "performance is nothing without control". If you can't put the technology to work on your problems, then the technology, no matter how incredible, is worthless to you.
This checks out. I'm a software developer who took math all through high school and my first three years of college. I barely scraped through my calculus exams, but I excelled at combinatorics, probability, matrix math, etc. (as long as it didn't veer into calculus for some reason).
I guess I just enjoy things more when I can count them.
For this kind of engineering, I think calculus is not the main proficiency enhancer you claim it to be. Linear algebra, combinatorics, probability, and number theory are more relevant.
Calculus was important during the world wars because it meant we could lob shells at the enemy army better, and that was an important concern during that period.
Nowadays, calculus is just a stepping stone to more relevant mathematics.
Today's ML frameworks grapple with "Jacobian-vector products" and "vector-Jacobian products" as a consequence of understanding the interplay between gradients and derivatives, and the application of the chain rule. All three of those concepts are fundamentally understood by being proficient in calculus.
While I'm being the hype-man for calculus, I don't mean to say proficiency in linear algebra or statistics is in any way "less necessary" or "less useful" or "less challenging" or "less..." anything.
I’m merely stating that, historically, calculus has been the unique branch of study for engineering. Statistics has always found value in many fields — business, finance, government policy etc.
Sure, linear algebra is one of those unique fields too - I kinda like to think of it as "algebra" in general, and perhaps its utility has flowed in tandem with calculus. Idk, I haven't thought super hard about it.
From what I've heard (not an OSX user) Windows is the best operating system for multiple screens; OSX and Linux glitch way more. Most anyone doing 3D sculpture or graphics/art on a professional level will eventually move to working with 2-3 screens, and since there are no exclusively Mac design programs, OSX will be suboptimal.
There's little things too, like some people using gaming peripherals (multi-button MMO mice and left hand controllers, etc.) for editing, which might not be compatible with OSX.
And also, if you're mucking around with two 32 inch 4k monitors and a 16 inch Wacom it might start to feel a little ridiculous trying to save space with a Mac Pro.
Besides Windows having more drivers for USB adapters than Linux*, which is a reflection of the market, I find Linux has far fewer glitches when using multiple screens.
Once it works, Linux is more reliable than Windows. And virtual desktops have always worked better on Linux than on Windows. So I disagree with you on that front.
* In my case, this means I had to get an Anker HDMI adapter, instead of any random brand.
I'd say a lot of engineers (bridges, circuit boards, injection mouldings) are kept far away from OSX (and Linux). Honestly, I'd just love an operating system that doesn't decide it's going to restart itself periodically!
Yes. I'm pretty sure my wife's 2014 MacBook Air has gone 6 months without a restart. My Windows 11 workstation, however, has never managed a week. I power down daily now to avoid disappointment.
IETF RFCs will soon number over 10K; Java, Win32, and the Linux kernel syscall API are famous for backward compatibility
not to mention the absurd success of standard libraries of Python, Rust, PHP and certain "standard" projects like Django, React, and ExpressJS
> (how many libraries do you need for arrays and vectors and guis and buttons and text boxes and binary trees and sorting, yada yada?)
considering the design space is enormous and the tradeoffs are not trivial ... it's good to have libraries that fundamentally solve the similar thing but in different context-dependent ways
arguably we are using too many libraries and not enough problem-specific in-situ DSLs (see the results of Alan Kay's STEPS research project at VPRI - https://news.ycombinator.com/item?id=32966987 )
I'd argue almost all NEW library development is about politics and platform ownership. Every large company wants to be the dependency that other projects tie into. And if you don't want to hitch your wagon to google or facebook or whoever, you roll your own.
Many if not most computational problems are fundamentally about data and data transformation under constraints - Throughput, Memory, Latency, etc, etc. And for the situations where the tradeoffs are non-trivial, solving this problem is purely about domain knowledge regarding the nature of the data (video codec data, real-time sensor data, financial data, etc) not about programming expertise.
The various ways to architect the overall high-level design in terms of client/server, P2P, distributed vs. local, or threading model are, IME, not what I would call crazy complicated. There are standard ways of implementing variations of the overall design, but sadly, because of an overall roll-your-own mindset, most devs are reluctant to adopt someone else's design. Part of that is that we don't have a framework of knowledge that lets us build a library of these designs in our heads, from which we could just pick the one that's right for our use case.
I don't agree with your characterization of the design space as 'enormous'. I'd say most programmers just need to know a handful of design types, because they're not working on high-performance, low-latency, multi-million-endpoint scalable projects where, as you say, things can get non-trivial.
I'll give a shot at an analogy (I'm hoping the nitpickers are out to lunch). The design space for door knobs is enormous because of the various hand shapes, disability constraints, door sizes, applications, security implications, etc. And yet we've standardized on a few door knob types for most homes, which you can go out and buy and install yourself. The special cases - bank vaults, prisons, and other domains - solve it their own way.
I challenge you to take the people who build bridges and have them build full-scale software.
I'm not saying whether software is engineering or not.
It is a fact, in terms of cost, that building software and building bridges are, most of the time, very different activities with very different goals and cost-benefit ratios.
All those things count when making decisions about the level of standardization.
As for standards... there are also lots of them, and they're widely used, from networking protocols to data-transfer formats, with well-known strengths and limitations.
In my 30+ year career I can confidently say that software engineers look towards standardisation by default, as it makes their lives easier.
It feels to me that you're bitter or have had more than one bad experience. Perhaps you keep working with, or coming across, bad engineers, as your generalising is inaccurate.
Maybe we need a new moniker "webgineer". The average HN/FAANG web programmer does appear to vastly overestimate the value of their contributions to the world.
When I started doing this "Internet stuff" we were called "webmasters", and the job would actually include what today we call:
- DevOps
- Server/Linux sysadmin
- DB admin
- Full stack (backend and frontend) engineer
1999 indeed! I haven't heard that term since around 1999 when I was hired as a "web engineer" and derisively referred to myself as a "webgineer". I almost asked if I could change my title to "sciencematician".
People who cobble together new printers or kettles overestimate the value of their contributions to the world too. The delineation isn't between JS devs and JPL or ASML engineers.
You can shit all you want on so-called "engineers", but they are the ones who make the CAD software you're talking about that "real engineers" use. So get off your high horse.
You're kidding yourself if you don't think that mechanical, structural or any other engineers don't do the same thing. They do.
I worked for one of the UK's leading architecture/construction firms writing software, and I'm also an amateur mechanic.
You'd be amazed at how many gasket types, nuts, bolts, fasteners, unfasteners, glues, concretes, bonding agents and so on exist... all invented for edge preferences, and most of which could be used interchangeably.
Also standards? Hah. They're an absolute shitshow in any engineering effort.
And they can typically setup their dev environment without a VM, while also getting commercial app support if they need it.
Windows requires a VM, like WSL, for a lot of people, and Linux lacks commercial support. macOS strikes a good balance in the middle that makes it a pretty compelling choice.
I was thinking more about software like the Adobe suite, Microsoft Office, or other closed-source software that hasn't been released on Linux. Electron has made things a bit better, but there are still a lot of big gaps for the enterprise, unless the company is specifically choosing software to maintain Linux support for end users.
Sure, Wine exists, but it’s not something I’d want to rely on for a business when there are alternatives like macOS which will offer native support.
Most people don't need the Adobe suite, and the web version of M$ Office is more than OK for occasional use. Most other enterprise software is web apps too nowadays, so it's much less relevant what OS your machine is running than it was ten years ago...
Excel is essential, and in most businesses that I worked with, most of the accounting and business side runs on it. I switched to Windows from Linux just because of Excel, when WSL came out. If Linux had Excel and Photoshop, it would be a no-brainer to choose it, but that will never happen.
Apple fanboys like to talk about how cool and long-lasting a MacBook Air is, but a $500 Chromebook will do just as well while covering pretty much 90% of the use cases. Sure, the top-end power is much lower, but considering the base RAM/storage combo Apple gives, that is not very relevant. If you start loading it up, the pricing lands in an entirely different category, and in my opinion the MacBook Air becomes seriously irrelevant when compared to serious computing devices in the same price range...
There's still a huge market for people who want higher end hardware and to run workloads locally, or put a higher price on privacy. For people who want to keep their data close to their chest, and particularly now with the AI bloom, being able to perform all tasks on device is more valuable than ever.
A Chromebook "does the job" but it's closer to a thin client than a workstation. A lot of the job is done remotely and you may not want that.
Yes, but for those people if you consider the price of a fully loaded MacBook Pro it is a rather small win considering all the limitations.
If the only things you care about are battery life (and only if you plan to use it lightly on the go, because even high-end Apple Silicon sucks a decent amount of power at full tilt) and privacy, I guess they are decent enough.
This is my argument: the base models are at the same time overkill and too limited considering the price and the high-end models are way too expensive for what they bring to the table.
Apple has a big relevancy problem because of how they put a stupid pricing ladder on everything, but that is just my opinion, I guess.
As long as they keep making a shit ton of cash it doesn't matter, I suppose.
But if the relevant people stop buying Macs because they make no sense, it will become apparent why it matters sooner or later...
Not at all, a Chromebook lets you run Linux apps. I can run full-blown IDEs locally without problems. And yes, that is with 8 GB of RAM; ChromeOS has superb memory management.
Well, Google developed and deployed MGLRU on Chromebooks long before upstreaming it. Plus they use some magic to check the MGLRU working-set size inside the VMs and balance everything.
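For the curious, whether a kernel ships MGLRU is visible from userspace via sysfs. A minimal sketch, assuming a Linux kernel recent enough to include it (mainlined in 6.1; ChromeOS carried it earlier as a downstream patch):

```shell
# Report whether the running kernel exposes the multi-gen LRU.
mglru_status() {
    lru_gen=/sys/kernel/mm/lru_gen/enabled
    if [ -r "$lru_gen" ]; then
        # The file holds a bitmask; non-zero means MGLRU is active.
        echo "MGLRU enabled mask: $(cat "$lru_gen")"
    else
        echo "MGLRU not available on this kernel"
    fi
}

mglru_status
```

On a ChromeOS device (or any 6.1+ kernel with it enabled) the first branch fires; on older kernels the sysfs node simply doesn't exist.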
Are you seriously arguing about mini-LED displays, which are only found in the expensive MacBook Pro, when I'm talking about a cheap $500 Chromebook?
There is at least a 4x price difference between those machines; it is ridiculous to even pretend they are somehow comparable.
And if we are talking about expensive high-end hardware, mini-LED is worse than the OLED found in those machines anyway, so it's not as if that would be a strong argument.
My argument isn't about Chromebooks vs any MacBook.
My argument is against a base MacBook Air that is too expensive for the relatively limited utility it adds over something like a cheaper Chromebook.
Sure, the MacBook Air is better built and will do some things better, but those things are not very relevant for someone who would be satisfied by an entry-level MacBook Air, because while an MBA has some very nice attributes, in the end everything is limited by its RAM/storage (and to a lesser degree, ports).
For a concrete example, in my country the cheapest MacBook Air you can get is the old M1 design at 1k€, then there is the M2 refresh at 1.2k€ and the M3 variant at 1.3k€.
But you can get an Asus Chromebook Plus for 600€ that has either the same amount of RAM and storage, or more RAM (16 GB), or more storage (512 GB), depending on the variant you end up with.
The display is worse (100 nits dimmer, lower resolution) but slightly bigger (14"), and that may matter more to many people. It has an older Intel i5 (there are some AMD options with better efficiency), but that hardly matters for the vast majority of people who just want a laptop for the basics (basically the target audience of a MacBook Air). Its battery life would be a bit worse than an MBA's, but not in a way that matters for the vast majority of customers.
One major advantage it has over an MBA is a better port selection: an HDMI port, a USB-A port and an SD card reader on top of the two Thunderbolt/USB-C ports the MBA has, allowing a dongle-free life without having to buy anything else. That can be far more relevant for many people than better build quality (which I would argue doesn't even bring better longevity, since with Apple you are hostage to the software support anyway).
You see, I am not against MacBooks; in fact, I would advise purchasing a MacBook Pro for some specific use cases.
But the reality is that entry-level Apple hardware is rather compromised for its price, and for someone who would be satisfied by that choice, I'm arguing there is another option (worse on paper, better in some ways) at half the price (40% off minimum).
If you start loading up a MacBook Air, you end up in MacBook Pro price territory, and at that point it doesn't make a lot of sense not to spend the 100-200€ more to get the much better machine.
I know from experience that entry-level Apple hardware is a terrible deal, both because I made the mistake myself and because I had to fix the issues for other people who made those choices. I have a cousin who reminds me every time how much he fucking hates his entry-level iMac (an old one with a compromised Fusion Drive and minimum RAM) even though it was rather expensive compared to most computers.
My answer is always the same: you spent a lot, but not enough, because with Apple you don't deserve a good experience unless you go above and beyond in your spending.
In my opinion it is way more disingenuous to advocate for entry-level Apple hardware to people who would be very much satisfied with products costing half as much. The value is just not there; Apple went way too far into luxury territory, locking down everything and creating a pricing ladder engineered to confuse and upsell customers to extract as much money as possible.
For someone who really needs a computer to do more intense work, provided they can work around the tradeoffs of Apple Silicon/macOS and they are willing to spend a large amount of cash, Apple has some great hardware for sure.
For everyone else the value is extremely questionable. Especially since they are going full steam ahead into subscription services, and the software can be lacking in ways that require purchasing even more stuff, the total cost of ownership doesn't make sense anymore.
For example, their iPhone SE is absolutely terrible: at 500€ you pay for six-year-old technology, with a small screen relative to its footprint, terrible battery life, etc. A 500€ mid-range Android is so much better in so many ways that it is just stupid at this point...
As for OLED, I don't think burn-in is a significant concern anymore, and if it is I would argue that you are using it too much like a desktop. In my country you could buy 2 decent OLED laptops for the price of an entry-level MacBook Pro anyway so it doesn't matter as much (and replacing displays of hardware manufacturers other than Apple is much easier and cheaper, so there is that).
I think the MacBook Pros are very good for some niche applications, but at a viable minimum price of 2.23k€ (16 GB RAM / 512 GB storage) there are a lot of good options, so it really requires a careful analysis of the actual use case. If you do things related to 3D or engineering it is probably not worth it...
You usually don't need either for software development though, and if you do, the free or online alternatives are often good enough for the rare occasions you need them. If you are a software developer and you have to spend significant time using Office, it means you are either developing extensions for Office or your company management is somewhat lacking and you are forced to handle things you should not (like bureaucracy, for instance).
Where I’m at my email is in Outlook. Having to use the web version sounds annoying. I also end up getting a lot of information in spreadsheets. Having to move all that to the online version to open also sounds annoying. The online version is also more limited, which could lead to issues.
I could see a front end dev needing Photoshop for some things, if they don’t have a design team to give them assets.
There are also security software the company says laptops must have which isn’t available for Linux. They only buy and deploy this stuff with Windows and macOS in mind.
A couple weeks ago on HN I saw someone looking for a program to make a demo of their app (I think). The comments were filled with people recommending an app on macOS that was apparently far and away the best option, and many were disappointed by the lack of availability elsewhere. I find there are a lot of situations like this, where I might be able to get the job done on another OS, but the software I actually want to use is on macOS. Obviously this one is a matter of taste to some degree.
It’s not as big an issue as it was 20 years ago, but it’s still an issue in many environments.
I would love to buy Apple hardware, but not from Apple. I mean: an M2 13-inch notebook with the ability to swap/extend memory and storage, a regular US keyboard layout and a proper desktop Linux (Debian, Alpine, Mint, Pop!_OS, Fedora Cinnamon) or Windows. macOS and the Apple ecosystem just get in your way when you're trying to maintain a multi-platform C++/Java/Rust code base.
WSL for normal stuff. My co-worker is on Windows and had to set up WSL to get a linter working with VS Code. It took him a week to get it working the first time, and it breaks periodically, so he needs to do it all over again every few months.
I'm developing on Windows for Windows, Linux, Android, and the web, including C, Go, Java, T-SQL and MSSQL management. I do not necessarily need WSL except for C. SSH is built directly into Windows and is fully scriptable in PowerShell.
WSL is also nice for Bash scripting, but it's not necessary.
It is a check box in the "Add Features" panel. There is nothing to install or setup. Certainly not for linting, unless, again, you're using a Linux tool chain.
But if you are, just check the box. No setup beyond VS Code, bashrc, vimrc, and your tool chain. Same as you would do on Mac.
If anything, all the Mac specific quirks make setting up the Linux tool chains much harder. At least on WSL the entire directory structure matches Linux out of the box. The tool chains just work.
While some of the documentation is in its infancy, the workflow and versatility of cross platform development on Windows, I think, is unmatched.
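One small caveat with that workflow: a cross-platform script occasionally still needs to know it is on WSL rather than native Linux (for instance to shell out to clip.exe or explorer.exe). A common sketch, relying on the fact that both WSL1 and WSL2 kernels report "microsoft" in their version banner:

```shell
# Detect whether this shell is running under WSL rather than bare-metal
# Linux by inspecting the kernel banner in /proc/version.
is_wsl() {
    grep -qi microsoft /proc/version 2>/dev/null
}

if is_wsl; then
    echo "running under WSL"
else
    echo "native Linux (or not Linux at all)"
fi
```

This is a heuristic, not an official API, but it has held up across WSL versions.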
This. I have to onboard a lot of students to our analysis toolchain (nuclear physics, ROOT-based, C++). 10 years ago I prayed that the student had a Mac, because it was so easy. Now I pray they have Windows, because of WSL. The toolchain is all compiled from source. Pretty much every major version, and often also minor versions, of macOS break the compilation of ROOT. I had several release upgrades of Ubuntu that only required a recompile, if that, and it always worked.
Unless he is doing Linux development in the first place, that sounds very weird. You most certainly don't need to set up WSL to lint Python or say JS in VSCode on Windows.
That sounds wild; you can run bash and Unix utils on Windows with minimal fuss without WSL. Unless that linter truly needed Linux (and I mean, VS Code extensions are TypeScript...), that sounds like overkill.
> Engineers use MacBook pros because it’s the best built laptop, the best screen, arguably the best OS and most importantly - they’re not the ones paying for them.
I know engineers at a FAANG who picked MacBook Pros in spite of the specs, purely because of the bling/price tag. Then they spent their whole time using them as remote terminals for Linux servers, and they still complained about the things being extremely short on RAM and disk.
One of them even tried to convince their managers to give the Vision Pro a try, even though there were zero use cases for it.
Granted, they drive multiple monitors well with a single USB-C plug, at least with specific combinations of monitors and hubs.
It's high time that the "Apple sells high end gear" shtick is put to rest. Even their macOS treadmill is becoming tiring.
The build quality of Apple laptops is still pretty unmatched in every price category.
Yes, there are 2k+ laptops from Dell/Lenovo that match and exceed a similarly priced MacBook in pure power, but usually lack battery life and/or build quality.
Apple devices also work together quite seamlessly.
iPads, for example, work great as a wireless second screen with the MBPs. I'd immediately buy a 14-inch iPad just for that, since it is so useful when you're not at your standard desk.
Also, copy-paste between devices, or headphones, just works...
In case Apple ever came up with the idea of using an iPad as an external compute unit, that would be amazing... just double your RAM, compute and screen with it in such a lightweight form factor. It should be possible if they wanted to.
Is there now a low-latency solution for a wireless second monitor on Windows? I was only aware of some software where latency is quite bad, or one company that provided a wireless HDMI/DisplayPort dongle...
Another nice thing about headphones within the Apple ecosystem is that the AirPods automatically switch to where your attention is. Meaning, e.g., if I'm watching something on the laptop and pick up an iPhone call (no matter whether via the phone or any app), the AirPods automatically switch.
My 15-inch MacBook, which fried its display twice (it didn't go to sleep properly, was put in a backpack, and overheated; there is no way to tell that sleep didn't kick in), and which then had the broken display cable problem (widespread, and Apple wanted $900 for a new display...), would disagree.
For comparison: the 4K touch display on my XPS 15 that didn't survive a Diet Coke bath was <$300, including labor for a guy to show up in my office and repair it while I watched...
> The build quality of Apple laptops is still pretty unmatched in every price category.
I owned a MacBook Pro with the dreaded butterfly keyboard. It was shit.
How many USB ports does the new MacBook Air have? The old ones had two. And shipped with 8 GB of RAM? These are shit-tier specs.
The 2020 MacBook Pros had a nice thing: USB-C charging, and you could charge them from either side. Current models went back to MagSafe, only on one side. The number of USB ports is still very low.
But they are shiny. I guess that counts as quality.
I guess we can agree to disagree, but I find the 2020 rev MacBook Pros have a good number of USB-C ports (2 on the left, 1 on the right -- all can do PD), a MagSafe charger, headphone jack, HDMI port and SD card slot. How many USB-C ports do you need? Sometimes I wish there was Ethernet, but I get why it's not there.
I agree, the butterfly keyboard was shitty but I absolutely love the keyboard on the 2020 rev. It's still not as great as my mechanical desktop keyboard, but for a laptop keyboard it's serious chef's kiss. Also, I have yet to find a trackpad that is anywhere as good as the Macbook. Precision trackpads are still way way worse.
Finally, the thing that always brings me back to MBPs (vs. Surface Books or Razers) is battery life. I typically get a good 10+ hours on my MBP. Battery life on my old Razer Blade and Surface Books was absolutely comically horrible.
I'm absolutely not an Apple person. Privately own zero Apple hardware.
However there are two awesome things about my work MBP I would really want from my ThinkPad:
The MagSafe charger - too many close calls!
And the track pad.
I can't work properly without an external mouse on my ThinkPad. But on the MBP everything just has the right size, location, proportions and handling on the track pad. I had a mouse for the MBP too but I stopped using it!
I don’t think it’s at all unreasonable for an engineer using a device for 8+ hours every day to pay an additional, say, 0.5% of their income (assuming very conservatively $100,000 income after tax, $1,000 extra for a MacBook, 2 year product lifespan) for the best built laptop, best screen, and best OS.
I do networking stuff, and macOS is on par with Windows: I can't live on it for longer than a week without running into bugs or very questionable behavior. Same as Windows.
What stuff is weird? I have so far had very good experiences with Apple (although not iOS yet). Almost everything I do on my Linux workstation works on Mac too. Windows though is beyond horrible and different in every way.
> I do networking stuff
Me too, but probably very different stuff. I'm doing p2p stuff over TCP and am affected mostly by socket options, buffer sizes, TCP options, etc.
I like apple hardware, but their OS is fucking atrocious. In the year 2024 it still doesn't have a native volume mixer, or any kind of sensible window management shortcuts. Half the things on it have to be fixed with paid software. Complete joke of an OS, if it were up to me I'd stick a linux distro on top of the hardware and be happy
The OS is not a joke since it can do some stuff better than either Windows or Linux can but I completely agree that there are some serious limitations or omissions that should have been fixed.
I think they don't because they have an incentive not to: they get a cut of all the software you have to purchase on the App Store to make up for it.
It might not look like a lot, but if a large portion of Mac users needs to buy a 5-10 bucks app to fix the window-management problems, it becomes serious money at a 15-30% cut on millions of purchases...
And this is precisely the problem with Apple today. They are not honest enough to fix or improve the stuff they sell at a very high price, both because they sell it anyway and because they put in place many incentives for themselves to not do so.
There is the running meme of the iPad calculator, but macOS could also use some care: the Calculator and Grapher haven't received serious attention in decades. At the price they sell their stuff, that would seem like a given, but considering they'll make money on the apps you buy to improve the situation, they'll never do it.
After using the Apple App Stores for so many years, I wish I hadn't; the convenience really isn't worth the cost down the road...
Not worth it at all. I rarely use battery power, so I'd rather have an intel or AMD chip with more cores and a higher clock speed at the expense of the battery. Oh, and an OS that can actually manage its windows, and customize keyboard settings, and not require an account to use the app store
Then why are you using a MacBook in the first place? There are plenty of Ryzen 7000 and Intel Core Ultra laptops with similar performance out there. The key benefit of a MacBook is the battery life and sane sleep behavior when portable.
Apple's hardware these days is exceptional, but the software is left wanting in comparison. macOS feels like it's been taking two steps back for every step forward for a decade now. I run macOS, Linux w/ i3, and Windows every day, and outside of aesthetics and Apple integration, macOS feels increasingly like the least coherent of the 3.
The same is true of the iPad, which is a miraculous piece of hardware constrained by an impotent operating system.
This statement is completely wrong. There are millions of engineers in the world and most of them live in countries like China, India and Russia. Very few of them use MacBooks.
The vast majority of software engineers in big companies (which employ a lot more people than big tech and startups combined) who use Java and C# also have predominantly Windows laptops (as their employers can manage Windows laptops a lot more easily, have agreements with vendors like Dell to buy them at a discount, have software like AV that doesn't support macOS, etc.).
On top of that MacBooks don't have the best screens and are not the best built. Many Windows laptops have OLED screens or 4K IPS screens. There are premium Windows laptops made out of magnesium and carbon fiber.
I'm an American, so maybe the situation is different elsewhere.
Every company I've worked for during the last 12 years gives out MacBook Pros. And I've been developing using Scala / Java for the last 20 years.
Employers manage Macs just fine, this isn't 1999. There have been studies showing that Macs have lower IT maintenance costs compared to Windows.
I admit that I haven't dealt with Windows devices in a long time, maybe there are some good ones available now, but I find your statements to be beyond belief. Apple Silicon Macs have blown the doors off the competition, outperforming all but top-end Intel laptops while using a fraction of the power (and I never even hear the fans come on).
I think relatively few corporations are offering Macs to people. It's all bog-standard POS Dells, with locked-down Windows images that often do not even allow you to change the screensaver settings or the background image, in the name of "security." I'd love to be wrong about that.
all two jobs I've worked, both as a backend dev using Go in data-storage companies, have offered Macs.
The first one, a small, badly run startup, only offered Macs. This gig, a larger company, offers Mac, Linux and Windows. I started with Linux and then switched to Mac because I was tired of stuff breaking.
Arch works fairly well on Apple Silicon now, though Fedora is easier/recommended.
Limited emulation due to the 16 KB pages, and no Thunderbolt display out.
Arguably the best OS? For what? For browsing the web, video editing, etc.? Maybe. For development? Jesus, macOS doesn't even have native container support. All the devs I know with macOS then either get a second Linux laptop, or spend a lot of their time SSHd into a Linux server.
For dev (at least backend and devops), macOS is not that great.
I don't know what you are talking about, I'm a back end engineer, and every company I've worked for during the last 12 years gives out MacBook pros to all devs. Even the game company that used C# and Mono gave out MacBooks (and dual booted them, which of course you can't do any more; I never bothered with Windows since our servers were written in Scala).
Not all teams run tons of containers on personal computers. All our servers are running on AWS. I rarely ssh into anything.
I like the fact that OS X is based on UNIX, and not some half-assed bullshit bolted onto Windows. I still have bad memories of trying to use Cygwin 15 years ago. Apparently WSL is an improvement, but I don't care.
Mac runs all the software I need, and it has real UNIX shells.
Yeah, it's funny: for all the hoopla I've heard over the glory of macOS having a REAL UNIX TERMINAL, WSL works better in practice simply because it's running an actual Linux VM, and thus the support is better.
Still, I just don't think it's that burdensome to get containers running on macOS; it's just annoying that it happens to work worse than on Windows or Linux. Ignoring the hardware, the only real advantage of macOS development is when you're targeting Apple products with what you're developing.
"best OS" is so subjective here. I'll concede that the MacBook hardware is objectively better than any laptop I've owned. But it's a huge leap to say Mac OS is objectively better than Linux IMO.
I have one and hate it with a passion. A MacBook Air bought new in the past 3 years should be able to use Teams (alone) without keeling over. Takes over a minute to launch Outlook.
My 15 year old Sony laptop can do better.
Even if Microsoft on Mac is an unmitigated dumpster fire, this is ridiculous.
I avoid using it whenever possible. If people email me, it’d better not be urgent.
Indeed, I have a 15-year-old desktop computer that is still running great on Linux. I upgraded the RAM to the maximum supported by the motherboard, which is 8 GB, and it has gone through three hard drives in its life, but otherwise it is pretty much the same. As a basic web browsing computer, and for light games, it is fantastic.
It also performs pretty well for the particular brand of web development I do, which basically boils down to running VS Code, a browser, and a lot of ssh.
It's fascinating to me how people are still attached to the hardware upgrade cycle as an idea that matters, and yet for a huge chunk of people and scenarios, basically an SSD, 8gb of RAM and an Intel i5 from a decade ago could have been the end of computing history with no real loss to productivity.
I honestly look at people who use Apple or Windows with a bit of pity, because those ecosystems would just give me more stuff to worry about.
Is it an Apple Silicon or Intel machine? Intel Macs are crazy slow - especially since the most recent few versions of macOS. And especially since developers everywhere have upgraded to an M1 or better.
You could certainly still buy new Intel MacBooks from Apple 3 years ago. Plenty of people did - particularly given a lot of software was still running through Rosetta at the time.
The M1 Air was only released in November 2020. With a bit of slop in the numbers, it's very possible the parent poster bought an Intel Mac just before the M1 launched.
Yeah it's such a shame how much the performance has been affected by recent macOS. I kept my 2019 Mac Book Pro on Catalina for years because everyone else was complaining... finally upgraded directly to Sonoma and the difference in speed was night and day!
Sounds a bit like my Intel MBP, in particular after they (the company I work for) installed all the lovely bloatware/tracking crap IT thinks we need to be subjected to. Most of the day the machine runs with the fans blasting away.
Still doesn't take a minute to launch Outlook, but I understand your pain.
I keep hoping it will die, because it would be replaced with an M-series MBP and they are way, way, WAY faster than even the best Intel MBP.
I will pile on on MS Teams. I am on a Mac and periodically have to fight it because it went offline on me for some reason and I am no longer getting messages. Slightly less annoying is when my iPhone goes to sleep and Teams on my iPhone then sets my status to "Away", even though I am actively typing on Teams on my computer.
And while my particular problems might be partially because I am on MacOS, I observe Windows-using colleagues have just as many problems joining meetings (either total refusal, no audio, or sharing issues). So I think using Teams as a measure of any computer is probably not warranted.
I actually rejected a job offer when I heard I would be given a MacBook Pro.
Apple, being the most closed company these days, should be avoided as much as you can, not to mention its macOS is useless for Linux developers like me; anything else is better.
Its keyboard is dumb to me (that stupid Command/Ctrl key difference), and not even being able to mouse-select and paste is enough for me to avoid macOS at all costs.
> I actually rejected a job offer when I heard I would be given a MacBook Pro.
For what it's worth, I've had a good success rate at politely asking to be given an equivalent laptop I can put linux on, or provide my own device. I've never had to outright reject an offer due to being required to use a Mac. At worst I get "you'll be responsible for making our dev environment work on your setup".
I've had 50/50. These days I'm fairly okay with just taking the Macbook Pro. I did have one instance where I got one my first week and used my Dell XPS with Linux the entire 10 months I was at the place. I returned the Macbook basically unused.
Only one time did I interview with a place where I asked if I'd be given a choice what hardware/OS I could use. The response was "We use Windows". My response was, "no we do not. Either I will not be using Windows with you, or I will not be using Windows NOT with you". I didn't get an offer. I was cool with it.
> its keyboard is dumb to me(that stupid command/ctrl key difference)
Literally best keyboard shortcuts out of all major OSes. I don't know what weird crab hands you need to have to comfortably use shortcuts on Windows/Linux.
CMD maps PERFECTLY on my thumb.
Anything that runs Linux, even WSL2, is fine; no macOS is the key. And yes, it costs the employer about half as much as the expensive Apple devices, which cannot even be upgraded; their hardware is as closed as their software.
Employers typically also care about costs like “how hard is it to provision the devices” and “how long is the useful life of this” or “can I repurpose an old machine for someone else”.
Provisioning is a place where Windows laptops win hands down, though.
Pretty much everything that goes wrong with provisioning involves going extra weird on hardware (usually for a cheap supplier) and/or pushing weird third-party "security" crapware.
> "I don't even know what you mean by mouse-select and paste."
Presumably they mean linux-style text select & paste, which is done by selecting text and then clicking the middle mouse button to paste it (no explicit "copy" command).
macOS doesn't have built-in support for this, but there are some third-party scripts/apps to enable it.
On Windows these days, you get WSL, which is actual Linux, kernel and all. There are still some differences with a standalone Linux system, but they are far smaller than macOS, in which not only the kernel is completely different, but the userspace also has many rather prominent differences that you will very quickly run afoul of (like different command line switches for the same commands).
Then there's Docker. Running amd64 containers on Apple silicon is slow for obvious reasons. Running arm64 containers is fast, but the actual environment you will be deploying to is almost certainly amd64, so if you're using that locally for dev & test purposes, you can get some surprises in prod. Windows, of course, will happily run amd64 natively.
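One way to keep that dev/prod mismatch from being silent is to always resolve the platform explicitly instead of relying on Docker's host default. A sketch, where the assumed `DEPLOY_PLATFORM` and the example image are mine, not from the thread:

```shell
# Map the local machine's architecture to a Docker platform string, so a
# build script can warn (or opt into emulation) when it differs from the
# deploy target.
DEPLOY_PLATFORM=linux/amd64   # assumption: the prod fleet is amd64

case "$(uname -m)" in
    x86_64)        HOST_PLATFORM=linux/amd64 ;;
    arm64|aarch64) HOST_PLATFORM=linux/arm64 ;;
    *)             HOST_PLATFORM=unknown ;;
esac

echo "host: $HOST_PLATFORM, deploy target: $DEPLOY_PLATFORM"
if [ "$HOST_PLATFORM" != "$DEPLOY_PLATFORM" ]; then
    # Forcing the target platform works, but runs under emulation
    # (QEMU, or Rosetta on Apple Silicon), e.g.:
    #   docker run --platform "$DEPLOY_PLATFORM" debian:bookworm uname -m
    echo "warning: local builds will not match the deploy architecture"
fi
```

On an Apple Silicon machine this prints the warning; passing `--platform` then at least makes the slowdown a visible, deliberate choice.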
> the actual environment you will be deploying to is almost certainly amd64
That’s up to your team of course, but Graviton is generally cheaper than x86 instances nowadays, and AFAIK the same is true on Google and the other clouds.
Arm is an ISA, not a family of processors. You may expect Apple chips and Graviton to be wildly different, and to perform completely differently in the same scenario. In fact, most Arm CPUs also have specific extensions that are not found in other manufacturers' chips. So yes, while both recognize a base set of instructions, that's about it; expect everything else to be different.
I know, amd64 is also technically an ISA, but there you have two major manufacturers with very similar and predictable performance characteristics. And even then, sometimes something on AMD behaves quite differently from Intel.
For most devs, doing CRUD stuff or writing high-level scripting languages, this isn't really a problem. For some devs, working on time-sensitive problems or with strict baseline performance requirements, this is important. For devs developing device drivers, emulation can only get you so far.
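Those per-implementation extensions are easy to see from userspace on Linux; a quick sketch (note that even the /proc/cpuinfo field name differs between the two worlds):

```shell
# Print the ISA extensions the CPU advertises. The field is called "flags"
# on x86 and "Features" on Arm, and the lists themselves differ between
# vendors -- which is exactly why "it's all arm64" tells you little.
cpu_features() {
    awk -F: '/^(flags|Features)/ { sub(/^[ \t]+/, "", $2); print $2; exit }' /proc/cpuinfo
}

echo "advertised extensions: $(cpu_features)"
```

Comparing this output between, say, a Graviton instance and an Asahi-booted Apple machine makes the divergence concrete.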
No, I said you won’t always be deploying on amd64. Because arm64 is now the cheapest option and generally faster than the sandy bridge vcpu unit that amd64 instances are indexed against (and really, constrained to, intentionally, by AWS).
I never said anything about graviton not being arm64.
It's not about price, it's about compatibility. Just because software compiles for a different ISA doesn't mean it behaves the same way. But if that isn't obvious to you, good for you.
M* has caused nothing but trouble for most Mac-using engineers I know (read: most engineers I know) who upgraded. Now not only are they building software for a different OS, they're building for a different architecture! They do all of their important compute in Docker, wasting CPU cycles and memory on the VM. All for what: a nice case? A nice UI (that pesters you to try Safari)?
It looks like Apple's silicon and software is really good for those doing audio/video. Why people like it for dev is mostly a mystery to me. Though I know a few people who don't really like it but are just intimidated by Linux or just can't handle the small UX differences.
I'm an engineer who has both an Apple Silicon laptop (MBP, M2) and a Linux laptop (Arch, ThinkPad X1 Yoga). I choose the Mac every day of the week, and it's not even close. I'm sure it's not great for specific engineering disciplines, but for me (web, Rails, SRE) it really can't be beat.
The UX differences are absolutely massive. Even after daily-driving that ThinkPad for months, GNOME always felt not quite finished. Maybe KDE is better, but it didn't have Wayland support when I was setting that machine up, which made it a non-starter.
The real killer, though, is battery life. I can work literally all day unplugged on the MBP and finish up with 40-50% remaining. When I'm traveling these days, I don't even bring a power cable with me during the day. The ThinkPad, despite my best efforts with powertop, the most aggressive frequency scaling I could get, and a bunch of other little tricks, lasts 2 hours.
There are niceties about Linux too. Package management is better and the Docker experience is _way_ better. Overall though, I'd take the Apple Silicon MacBook 10 times out of 10.
Battery life followed by heat and fan noise have been my sticking points with non-mac laptops.
My first gen ThinkPad Nano X1 would be an excellent laptop, if it weren’t for the terrible battery life even in power save mode (which as an aside, slows it down a lot) and its need to spin up a fan to do something as trivial as driving a rather pedestrian 2560x1440 60hz display.
It feels almost like priorities are totally upside down for x86 laptop manufacturers. I totally understand and appreciate that there are performance oriented laptops that aren’t supposed to be good with battery life, but there’s no good reason for there being so few ultraportable and midrange x86 laptops that have good battery life and won’t fry your lap or sound like a jet taking off when pushed a little. It’s an endless sea of mediocrity.
This echoes my experience with anything that needs power management. Not just that the battery life is worse, but that it degrades quickly; in two years it's barely usable. I've seen this with non-Apple phones and laptops. The iPhone, on the other hand, is so good these days you don't need to upgrade until EOL at ~6 years (and even if you do, the battery is no more expensive than any other proprietary battery). My last MacBook, from 2011, failed a couple of years ago only because of the Radeon GPU inside, which had a known hardware defect.
> There are niceties about Linux too.
Yes! If you haven’t tried in years, the Linux desktop experience is awesome (at least close enough) for me – a dev who CAN configure stuff if I need to but find it excruciatingly menial if it isn't related to my core work. It’s really an improvement from a decade ago.
I'd like to offer a counterpoint: I have an oldish T480s which runs Linux Mint, with several LXD containers for Traefik, Go, Python, Postgres and SQL Server (so not even Dockerized, but full system containers running these services), and I can go the whole morning (~4-5 hours).
I think the culprit is more likely the power-hungry Intel CPU in your Yoga?
Going on a slight tangent: I've tried but do not like the Mac keyboards, they feel very shallow to me, which is why I'm still using my old T480s. The newer ThinkPad laptop keyboards all seem to be going that way too (getting thinner), much to my dismay. Perhaps a P14s is my next purchase, despite its bulk.
Anybody with a framework 13 want to comment on their keyboard?
I really like the keyboards on my frameworks. I have both the 13 and the new 16, and they are pretty good. Not as good as the old T4*0s I'm afraid, but certainly usable.
Interesting. I do similar (lots of Rails) but have pretty much the opposite experience (other than battery life - Mac definitely wins there). Though I use i3/Sway more than Gnome. The performance of running our huge monolith locally is much better for Linux users than Mac users where I work.
I used a Mac for a while back in 2015 but it never really stood out to me UX-wise, even compared to GNOME. All I really need to do is open a few windows and then switch between them. In i3 or Sway, opening and switching between windows is very fast and I never have to drag stuff around.
This is going to change once Arm on Linux becomes a thing with Qualcomm's new jazz. I am mostly tethered to a dock with multiple screens. I have been driving Ubuntu now for over 4 years full time for work.
In my experience as a backend services Go developer (and a bit of Scala) the switch to ARM has been mostly seamless. There was a little config at the beginning to pull multi-arch Docker images (x64 and arm), but that was a one-time configuration. Otherwise I'm still targeting Linux/x64 with Go builds, and Scala runs on the JVM so it's supported everywhere anyway; they both worked out of the box.
My builds are faster, laptop stays cooler, and battery lasts longer. I love it.
If I was building desktop apps I assume it would be a less pleasant experience like you mention.
Interestingly enough, the trend I am seeing is all the MacBook engineers moving back to native development environments. Basically, no longer using docker. And just as expected, developers are getting bad with docker and are finding it harder to use. They are getting more and more reliant on devops help or to lean on the team member who is on Linux to handle all of that stuff. We were on a really great path for a while there in development where we were getting closer to the ideal of having development more closely resemble production, and to have developers understand the operations tools. Now we're cruising firmly in the opposite direction because of this Apple switch to arm. Mainly it wouldn't bother me so much if people would recognize that they are rationalizing because they like the computers, but they don't. They just try to defend logically a decision they made emotionally. I do it too, every human does, but a little recognition would be nice.
It's not even a problem with MacBooks as such. They are still excellent consumer devices (non-casual gaming aside). It's this weird positioning of them as the ultimate dev laptop that causes so many problems, IMO.
Because machines are tools meant to perform tasks, and part of that is being interoperable with other tools and de facto standards in the relevant field. For dev work today, the MacBook falls short on that front.
Remember, though, that the binaries deployed in production environments are not being built locally on individual developer machines, but rather in the cloud, as reproducible builds securely deployed from the cloud to the cloud.
Modern language tooling (Go, Rust et al) allows one to build and test on any architecture, and the native macOS virtualization (https://developer.apple.com/documentation/virtualization) provides remarkably better performance compared to Docker (which is a better explanation for its fading from daily use).
Your "trend" may, in fact, not actually reflect the reality of how cloud development works at scale.
And I don't know a single macOS developer that "lean(s) on the team member who is on Linux" to leverage tools that are already present on their local machine. My own development environments are IDENTICAL across all three major platforms.
Virtualization and Docker are orthogonal technologies. The reason you use Docker, especially in dev, is to have the exact same system libraries, dependencies, and settings on each build. The reason you use virtualization is to access hardware and kernel features that are not present on your hardware or native OS.
If you deploy on Docker (or Kubernetes) on Linux in production, then ideally you should be using Docker on your local system as well. Which, for Windows or macOS users, requires a Linux VM as well.
It seems that you're trying to "educate" me on how containers and virtualization work, when in fact I've been doing this for a while, on macOS, Linux and Windows (itself having its own Hyper-V pitfalls).
I know you mean well, though.
There is no Docker on macOS without a hypervisor layer - period - and a VM, though there are multiple possible container runtimes not named Docker that are suitable for devops-y local development deployments (which will always, of course, be constrained in comparison to the scale of lab / staging / production environments). Some of these can better leverage the Rosetta 2 translation layer that Apple provides, than others.
I'm sorry that I came across as patronizing; I was trying to explain my confusion and thought process rather than to teach you about virtualization and containers.
Specifically what confused me in your comment was that you were saying Docker on Mac was superseded by their new native virtualization, which just doesn't make sense to me, for the reasons I was bringing up. I still don't understand what you were trying to say; replacing docker with podman or containerd or something else still doesn't have anything to do with virtualization or Rosetta, or at least I don't see the connection.
I should also say that I don't think anyone really means specifically docker when they talk about it, they probably mean containerization + image repos in general.
I don’t know a single engineer who had issues with M chips, and most engineers I know (me included) benefited considerably from the performance gains, so perhaps your niche isn’t that universal?
You must have an unusual setup because, between Rosetta and rosetta in Virtualization.framework VMs (configurable in Docker Desktop or Rancher Desktop), I’ve never had issues running intel binaries on my Mac
what's wrong w/ Rails on M chips? I don't recall having had much trouble with it (except w/ nokogiri bindings right when the M1 was first available, but that's a given for any new release of OSX)
We have to cross-compile anyway, because now we're deploying to arm64 Linux (AWS Graviton) in addition to x86 Linux.
So even if all the developers on your team are using Linux, unless you want to waste money by ignoring arm64 instances on cloud computing, you'll have to set up cross-compilation.
1) Macs are by far the best hardware, and even running Intel code they are faster than the previous Intel Macs were: https://discourse.slicer.org/t/hardware-is-apple-m1-much-fas...
2) they should use Safari to keep power usage low and browser diversity high
It is, provided that the hardware vendor has reasonably decent support for power management, and you're willing to haul around an AC adapter if not. In general, I really like AMD hardware with built-in graphics for this, or alternately, Intel Tiger Lake-U based hardware.
Asahi Linux is shockingly great on Apple Silicon hardware, though.
Apple is selling hardware, and scaling AI by utilizing that hardware is simply a smart move.
Instead of building huge GPU clusters and having to deal with NVIDIA for GPUs (Apple kicked NVIDIA out years ago because of disagreements), Apple is building mainly on existing hardware.
In other words, this utilizes CPU power that is already out there.
On the other hand, this helps their marketing maintain high price points, as Apple is now going to differentiate its CPU power, and therefore hardware prices, via AI functionality that correlates with CPU power. This is also consistent with Apple dropping the MHz comparisons years ago.
Also, Siri. And consider: you're scaling AI on Apple's hardware too, and you can develop your own local custom AI on it; there’s more memory available for linear algebra in a maxed out MBP than the biggest GPUs you can buy.
They scale the VRAM capacity with unified memory and that plus a ton of software is enough to make the Apple stuff plenty competitive with the corresponding NVIDIA stuff for the specific task of running big AI models locally.
> there’s more memory available for linear algebra in a maxed out MBP than the biggest GPUs you can buy.
But this hardly applies to 95%, if not more, of the people running Apple's hardware. The fastest CPU/GPU isn't worth much if you can't fit any even marginally useful LLM in the 8GB (or less on iPhones/iPads) of memory that your device has.
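Back-of-the-envelope numbers support that: weight memory is roughly parameter count times bytes per weight, before you even count the KV cache and runtime overhead. A rough sketch (the model sizes are illustrative):

```go
package main

import "fmt"

// approxWeightGB estimates the raw weight footprint of a model:
// parameters * bits-per-weight / 8 bits-per-byte, in gigabytes.
func approxWeightGB(params, bitsPerWeight float64) float64 {
	return params * bitsPerWeight / 8 / 1e9
}

func main() {
	// A 7B model quantized to 4 bits needs ~3.5 GB for weights alone,
	// leaving little headroom on an 8 GB device once the OS, the app,
	// and the KV cache are counted.
	fmt.Printf("7B @ 4-bit: ~%.1f GB\n", approxWeightGB(7e9, 4))
	// A 70B model at 4 bits (~35 GB) is out of reach entirely.
	fmt.Printf("70B @ 4-bit: ~%.0f GB\n", approxWeightGB(70e9, 4))
}
```

So the big unified-memory configs really only matter on machines that actually ship with the RAM to match.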
>Even in the higher end products like the MacBooks you see a lot of professionals (engineers included) who choose it because of its price-performance-value, and who don’t give a shit about luxury.
Most CS professionals who write code have no idea what it takes to build a desktop, so the hardware they chose is pretty much irrelevant, because they aren't specifically choosing for hardware. The reason Apple gets bought, by tech people and everyone else, is mostly the ecosystem. The truth is, nobody really cares that much about actual specs as long as it's good enough to do basic stuff, and when you are indifferent to the actual difference but all your friends are in the ecosystem, the choice is obvious.
You can easily see this yourself: ask these "professionals" about the details of the Apple Neural Engine, and there's a very high chance they will repeat some marketing material while failing to mention that Apple does not publish any real docs for the ANE, that you have to sign your code to run on the ANE, and that you basically have to use Core ML to utilize it. I.e., if they really cared about inference, all of them would be buying laptops with discrete 4090s for almost the same price.
Meanwhile, if you look at people who came from EE/ECE (who, btw, are on average far better coders than people with a CS background, based on my 500+ interviews in the industry across several sectors), you see a much larger skew towards Android, custom-built desktops, and Windows laptops running Linux. If you lived and breathed Linux and low-level OS work, you tend to appreciate all the power and customization it gives you, because you don't have to go learn how to do things.
Coming from both environments, I'd be wary of making some of these assertions, especially when you consider that any ecosystem that optimizes software and hardware together (from embedded devices all the way to general-purpose computing machines) is generally going to perform well, given the appropriate engineering focus. This applies regardless of (RT)OS / hardware choice, i.e., it's simply common sense.
The signing of binaries is a part of adult developer life, and is certainly required for the platforms you mention as well.
Unquestionably, battery life on 4090-based laptops sucks on a good day, and if you're working long hours, the last thing you want to have to do is park yourself next to your 350W adapter just to get basic work done.
> especially when you consider that any ecosystem that optimizes software and hardware together (from embedded devices all the way to general-purpose computing machines) is generally going to perform well, given the appropriate engineering focus.
Very much not true. Not to make this personal, but this is exactly what I'm talking about: Apple fans not understanding hardware.
Linux has been through the wringer, fighting its way to general use thanks to its open-source nature and constant development. So in terms of working well, it has been optimized for hardware WAY further than Apple's stack, which is why you find it on servers, personal desktops, phones, portable gaming devices, and even STM32 Cortex BLDC control boards, all of which run different hardware.
Apple doesn't optimize for general use, it optimizes for a specific business case. In the case of Apple Silicon, it was purely battery life, which brings more people into the ecosystem. Single-core performance is on par with all the other chips, because the instruction set doesn't actually matter (https://chipsandcheese.com/2021/07/13/arm-or-x86-isa-doesnt-...), multi-core is behind, macOS software is still a pile of junk (Rosetta still isn't good across the board), the computers are not repairable, you have no privacy since Apple collects a shitload of telemetry for themselves, etc.
And Apple has no incentive to make any of this better: prior to Apple Silicon, people were still buying Intel Macs with worse specs and performance for the same price, all for the ecosystem and vanity. And not only was macOS still terrible (and much slower), you also had hardware failures like plugging in the wrong USB-C hub blowing a chip and bricking your Mac, butterfly keyboards failing, and questionable decisions like the virtual Esc key.
>The signing of binaries is a part of adult developer life,
...for professional use, and the private key holder should be the person who wrote the software. I hope you understand how ridiculous it is to ask a developer to sign code using the manufacturer's key to be allowed to run that code on a machine that they own.
>Unquestionably, battery life on 4090-based laptops sucks on a good day,
Well yeah, but you are not buying that laptop for battery life. Also, with Ryzen CPUs and 4090s, most get something like 6-8 hours depending on use thanks to NVIDIA Prime, which is pretty good for travel, especially if you have a backpack with a charging brick.
If you want portability, there are plenty of lighter-weight options like the Lenovo Yoga, which can get 11-12 hours of battery life for things like web browsing.
Any decent laptop from the same era. My parents are currently using both HP ProBooks and Lenovo ThinkPads from that era; they work perfectly and maintenance costs are lower than for MacBooks of the same era...
I own a MacBook Air, I won't be buying another purely because the moment I need to upgrade anything or repair anything it's effectively ewaste.
I've not found any good proxy which works well with Cisco VPN software. Charles and Proxyman work intermittently at best and require disconnecting from the VPN and various such dances.
> Somewhat true but things are changing. While there are plenty of “luxury” Apple devices like Vision Pro or fully decked out MacBooks for web browsing we no longer live in a world where tech are just lifestyle gadgets.
I notice your use of the weasel word "just".
We undoubtedly live in a world where Apple products are sold as lifestyle gadgets. Arguably it's more true today than it ever was. It's also a world where Apple's range of Veblen goods managed to gain footing in social circles to an extent that we have kids being bullied for owning Android phones.
Apple's lifestyle angle is becoming specially relevant because they can no longer claim they sell high-end hardware, as the difference in specs between Apple's hardware and product ranges from other OEMs is no longer noticeable. Apple's laughable insistence on shipping laptops with 8GB of RAM is a good example.
> Even in the higher end products like the MacBooks you see a lot of professionals (engineers included) who choose it because of its price-performance-value, and who don’t give a shit about luxury.
I don't think so; that contrasts with my personal experience. All my previous roles offered a mix of MacBooks and Windows laptops, and new arrivals opted for MacBooks because they were seen as perks, while the particular Windows models on offer (mid-range HP and Dell) were not seen as impressive, even though they out-specced Apple's offering. In fact, in a recent employee review, the main feedback was that the MacBook Pro line was under-specced, because at best it shipped with only 16GB of RAM while the less impressive HP ones already came with 32GB. In previous years, they called for the replacement of the MacBook line due to the rate of keyboard malfunctions. Meaning, engineers were purposely picking the underperforming option for non-technical reasons.
I bought my first Apple product roughly 11 years ago explicitly because it had the best accessibility support at the time (and that is still true). While I realize you only see your slice of the world, I really cringe when I see the weasel-word "lifestyle". This "Apple is for the rich kids"-fairytale is getting really really old.
Apparently you’ve never used Apple Silicon. There’s no PC equivalent in terms of specs.
Also, I think you’re misunderstanding what a Veblen good is and the difference between “premium” and “luxury.” Apple does not create luxury or “Veblen” goods like for example, LVMH.
An easy way to discern the difference between premium and luxury — does the company advertise the product’s features or price?
For example, a Chanel handbag is almost entirely divorced from its utility as a handbag. Chanel doesn’t advertise features or pricing, because it’s not about the product’s value or utility, it’s what it says about your personal wealth that you bought it. That’s a Veblen good.
Apple heavily advertises features and pricing. Because they sell premium products that are not divorced from their utility or value.
Price-performance is not a thing for the vast majority of users. Sure, I'd like a $40k car, but I can only afford a $10k car. It's not nice, but it gets me from A to B on my min-wage salary. Similarly with plenty of friends and family I know: they can either get 4 Macs for $1000 each (mom, dad, sister, brother), so $4k total, or 4 Windows PCs for $250 each, so $1k total.
The cheap Windows PCs suck just like a cheap car sucks (ok, they suck more), but they still get the job done. You can still browse the web, read your email, watch a youtube video, post a youtube video, write a blog, etc.. My dad got some HP celeron. It took 4 minutes to boot. It still ran though and he paid probably $300 for it vs $999 for a mac. He didn't have $999.
I’m not saying one or the other is better for your family members. But MacBooks last very long. We'll see about the M series, but I for instance got the fanless M1 Air, which has the benefit of no moving parts or air inlets, so even better. My last one, a MBP from 2011, lasted pretty much 10 years. OS updates run 8-10 years.
> The cheap Windows PCs suck […], but they still get the job done
For desktop, totally. Although I would still wipe it with Ubuntu or so because Windows is so horrible these days even my mom is having a shit time with only browsing and video calls.
A random laptop however is a different story. Except for premium brands (closer to Apple prices) they tend to have garbage battery life, infuriating track pad, massive thermal issues, and preloaded with bloatware. Apple was always better here, but now with the lower power/heat of the ARM chips, they got soooo much better overnight.
> A random laptop however is a different story. Except for premium brands (closer to Apple prices) they tend to have garbage battery life, infuriating track pad, massive thermal issues, and preloaded with bloatware. Apple was always better here, but now with the lower power/heat of the ARM chips, they got soooo much better overnight.
To the person with no budget, all that doesn't matter. They'll still get the $250 laptop and put up with the garbage battery life (find a power outlet), the infuriating trackpad (buy an external mouse for $10), the bloatware (most users don't know about it and just put up with it), etc.
I agree Apple is better. But if your budget is $250 and not $1k then you get what you can get for $250 and continue to feed your kids and pay your rent.
But also you don't have to buy new. If I had $250, an ancient MacBook might be better than a newer low-end windows laptop. Though for my purposes I'd probably get an oldish Chromebook and root it.
You can get a laptop with a much bigger screen and a keyboard for as little as $100 to $300, and it will be much, much easier to get work done on than an Apple phone. So I think Apple is still very much a luxury product.
Clumsily phrased. What I meant is that iPhones or similar priced smartphones are affordable and common for say middle class in countries with similar purchase power to Eastern European countries. You’d have to go to poorer countries like Vietnam or Indonesia for iPhones to be “out of reach”, given the immense value it provides.
Heck now I see even Vietnam iPhone is #1 vendor with a 28% market penetration according to statcounter. That’s more than I thought, even though I was just there…
Speaking of India, they’re at 4% there. That’s closer to being luxury.
I think US is their main market, though. The rest of the world prefers cheaper better phones and doesn't mind using WhatsApp for messaging, instead of iMessage.
As a single market, US is probably biggest. I’m seeing numbers that say that the “Americas” is a bit less than half of global revenue, and that would include Canada and all of South and Latin America. So the rest of the world is of course very important to Apple, at least financially.
> doesn't mind using WhatsApp for messaging
Well WhatsApp was super early and way ahead of any competition, and the countries where it penetrated had no reason to leave, so it’s not exactly like they settle for less. It has been a consistently great service (in the category of proprietary messaging apps), even after Zuck took over.
It's not about price-performance value at all. The Mac is still the most expensive way to buy performance. And Apple is only particularly popular in the US. Android phones dominate most other markets, particularly poorer markets.
Apple is popular in the US because a) luxury brands hold sway b) they goad customers into bullying non-customers (blue/green chats) and c) they limit features and customizability in favor of simpler interfaces.
It's popular with developers because a) performance is valuable even at Apple's steep cost, and b) it's Unix-based, unlike Windows, so it shares more with the Linux systems most engineers are targeting.
I have never been an Apple fanboy. Till 2022 I was on Android phones, with work-issued ThinkPad or XPS variants. However, I have owned Apple notebooks since 2004, starting in the Panther era. I sincerely believe that Apple provides the best feature and performance combination at a given price for laptops.
Here I feel that the I-hate-Apple crowd is just stuck with this notion of a luxury overpriced brand when that is clearly not the case. Apple has superior hardware at better price points. Last time I was shopping for a laptop, I could get similar features only at a 30%-40% price premium from other brands.
I am typing this on an Apple M2 Air; try finding similar performance under 2000 USD in other brands. The responsiveness, the (mostly) sane defaults, and the superior rendering and fonts make it worth it. The OS does not matter as much as it used to in 2004, and the fact that I have a Unix terminal in 2024 is just incidental. I have turned off auto updates and I do not use much of the phone integration apart from taking backups and copying photos.
I switched to an iPhone in 2022 from a $200 Samsung handset. Here, I would say that not everyone needs an iPhone; my old phone used to do all the tricks I need on this one. However, the camera and the photos it takes are really great. If I buy an iPhone next time, it will be just for the photos.
> > In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and longer term even learning) on edge devices. This is completely coherent with their privacy-first strategy (which would be at odds with sending data up to the cloud for processing).
> Their primary business goal is to sell hardware.
There is no contradiction here. No need for luxury. Efficient hardware scales, Moore's law has just been rewritten, not defeated.
Power efficiency combined with shared and extremely fast RAM, it is still a formula for success as long as they are able to deliver.
By the way, M-series MacBooks have crossed into bargain territory by now compared to WinTel in some specific (but large) niches, e.g. the M2 Air.
They are still technically superior in power efficiency and still competitive in performance in many common uses, be it traditional media decoding and processing, GPU-heavy tasks (including AI), or single-core performance...
This is it. An M-series Air is an incredible machine for most people, people who will likely never write a line of JS or use a GPU. Email, banking, YouTube, etc. on a device with incredible battery and hardware that will likely be useful for a decade is perfect. The average user hasn't even heard of HN.
It's great for power users too. Most developers really enjoy the experience of writing code on Macs. You get a Unix based OS that's just far more usable and polished than a Linux laptop.
If you're into AI, there's objectively no other laptop on the planet that is competitive with the GPU memory available on an MBP.
Depends on your workload. RAM and passive cooling are the most likely issues, but AFAIK an M2/M3 with 16GiB still performs a lot better than a similarly priced x64 laptop. Active cooling doesn't mean no throttling either.
If you don't explicitly want a laptop, a 32GB M2 Pro Mac Mini would be a good choice I think.
Personally i only have used MBPs so far.
But the M-series Air are not remotely comparable to the old Intel Airs, that's for sure :)
The alternative is Google / Android devices and OpenAI wrapper apps, both of which usually offer a half baked UI, poor privacy practices, and a completely broken UX when the internet connection isn't perfect.
Pair this with the completely subpar Android apps, Google dropping support for an app about once a month, and suddenly I'm okay with the lesser of two evils.
I know they aren't running a charity; I have even hypothesized that Apple just can't build good services, so they pivoted to focusing on this fake "privacy" angle. In the end, iPhones are likely going to be better for edge AI than whatever else is out there, so I'm looking forward to this.
No, the alternative is Android devices with everything except firmware built from source and signed by myself. And at the same time, being secure, too.
You just can't have this on Apple devices. On Android side choices are limited too, I don't like Google and especially their disastrous hardware design, but their Pixel line is the most approachable one able to do all these.
Heck, you can't even build your own app for your own iPhone without buying additional hardware (a Mac; this is not a software issue but a legal one, since the iOS SDK is licensed to you on the condition that it's used on Apple hardware only) and a yearly subscription. How is this acceptable at all?
> No, the alternative is Android devices with everything except firmware built from source and signed by myself
Normal users will not do this. Just because many of the people here can build and sign a custom Android build doesn't mean that is a viable commercial alternative. It is great that is an option for those of us who can do it, but don't present it as a viable alternative to the iOS/Google ecosystems. The fraction of people who can and will be willing to do this is really small. And even if you can do it, how many people will want to maintain their custom built OSes?
the main reason the masses don't have privacy and security-centred systems is that they don't demand them and they will trade it away for a twopence or for the slightest increment in convenience
a maxim that seems to hold true at every level of computing is that users will not care about security unless forced into caring
with privacy they may care more, but they are easily conditioned to assume it's there or that nothing can be realistically be done about losing it
I, an engineer, am not doing this myself either. There is a middle ground though: just use a privacy-oriented Android build, like DivestOS. [1]
There are a couple caveats:
1. It is still a bit tricky for a non-technical person to install. Should not be a problem if they know somebody who can help, though. There's been some progress making the process more user friendly recently (e.g. WebUSB-based GrapheneOS installer).
2. There are some papercuts if you don't install Google services on your phone. microG [2] helps with most but some still remain. My main concern with this setup is that I can't use Google Pay this way, but having to bring my card with me every time seems like an acceptable trade off to me.
WebGPU isn't standardized yet. Hell, most of the features people complain about aren't part of any standard, but for some reason there's this sense that if it's in Chrome, it's standard - as if Google dictates standards.
I’ve been using Firefox since the Quantum version came out. It feels slightly slower than Chrome, but negligibly so to me. Otherwise I can't tell a difference (except some heavy web-based Office-like solutions screaming 'Your browser is not supported!' that actually work fine).
Meanwhile, Apple has historically dictated that Google can't publish Chrome for iOS, only a reskinned Safari. People in glass-walled gardens shouldn't throw stones.
Because as you described, the only alternatives that exist are terrible experiences for basically everyone, so people are happy to pay to license a solution that solves their problems with minimal fuss.
Any number of people could respond to “use Android devices with everything except firmware built from source and signed by myself” with the same question.
The yearly subscription is for publishing your app on Apple's store, and it definitely helps keep some garbage out. Running your own app on your own device is basically solved with free third-party solutions now (see AltStore, and since then a newer method I can't recall atm).
Notice that parent never talked about publishing apps, just building and running apps on their own device. "Publishing on AltStore" (or permanently running the app on your own device in any other way) still requires a $100/year subscription as far as I'm aware.
> No, the alternative is Android devices with everything except firmware built from source and signed by myself
I wouldn't bet on this long term, since it fully relies on Google hardware, and Google's long-term strategy is to remove your freedom piece by piece and cash in on it, not to support it.
The real alternative is GNU/Linux phones, Librem 5 and Pinephone, without any ties to greedy, anti-freedom corporations.
> No, the alternative is Android devices with everything except firmware built from source and signed by myself. And at the same time, being secure, too.
There are people who don't know how to use a file explorer; a new generation is growing up in a world of iPhones without ever seeing a file system. Any other bright ideas?
> Heck, you can't even build your own app for your own iPhone without buying another hardware (a Mac, this is not a software issue, this is a legal issue, iOS SDK is licensed to you on the condition of using on Apple hardware only) and a yearly subscription. How is this acceptable at all?
Because they set the terms of use of the SDK? You're not required to use it. You aren't required to develop for iOS. Just because Google gives it all away for free doesn't mean Apple has to.
Sure, as a SWE I'm not going to buy a computer that can't run my own code. A smartphone is an ergonomic portable computer, so I say no to the iPhone, and I'd like to remind others who haven't thought deeply about this to consider it.
Do you have a legal right to write software or run your own software for hardware you bought?
Because it’s very easy to take away a right by erecting artificial barriers, just like how you could discriminate by race at work but pretend you are doing something else.
> Do you have a legal right to write software or run your own software for hardware you bought?
I've never heard of such a thing. Ideally I'd like that, but I don't have such freedoms with the computers in my cars, for example, or the one that operates my furnace, or even for certain parts of my PC.
So you bought "a thing" but you can't control what it does or how it does it, and you don't get to decide what data it collects or who can see that data.
You aren't allowed to repair the "thing" because the software can detect you changed something and will refuse to boot. And whenever it suits the manufacturer, they will declare the "thing" out of support and it stops functioning.
I would say you are not an owner then; you (and I) are just suckers who are paying for the party. Maybe it's a lease. But then we also pay when it breaks, so it's more of a digital feudalism.
> Do you have a legal right to write software or run your own software for hardware you bought?
No, obviously not. Do you have a right to run a custom OS on your PS5? Do you have a right to run a custom application on your cable set-top box? Etc. Such a right obviously doesn’t exist and most people generally are somewhere between “don’t care” and actively rejecting it for various reasons (hacking in games, content DRM, etc).
It’s fine if you think there should be, but it continues this weird trend of using apple as a foil for complaining about random other issues that other vendors tend to be just as bad or oftentimes even worse about, simply because they’re a large company with a large group of anti-fans/haters who will readily nod along.
Remember when the complaint was that the pelican case of factory OEM tools you could rent (or buy) to install your factory replacement screen was too big and bulky, meaning it was really just a plot to sabotage right to repair?
> Remember when the complaint was that the pelican case of factory OEM tools you could rent (or buy) to install your factory replacement screen was too big and bulky, meaning it was really just a plot to sabotage right to repair?
Yes, I do. That was and continues to be a valid complaint, among all other anti-repair schemes Apple have come up with over the years. DRM for parts, complete unavailability of some commonly repaired parts, deliberate kneecapping of "Apple authorized service providers", leveraging the US customs to seize shipments of legitimate and/or unlabeled replacement parts as "counterfeits", gaslighting by official representatives on Apple's own forums about data recovery, sabotaging right to repair laws, and even denial of design issues[1] to weasel out of warranty repair just to name a few.
All with the simple anti-competitive goal of making third party repair (both authorized and independent) a less attractive option due to artificially increased prices, timelines to repair, or scaremongering about privacy.
> Yes, I do. That was and continues to be a valid complaint,
No, it doesn’t - because you can simply not use the tools if you don’t want. You can just order a $2 spudger off Amazon if you want, you don’t need the tools at all.
It continues to be a completely invalid complaint that shows just how bad-faith the discussion about apple has become - it literally costs you nothing to not use the tools if you want, there is no downside to having apple make them available to people, and yet you guys still find a way to bitch about it.
Moreover, despite some “bold” proclamations from the haters… no android vendors ever ended up making their oem tooling available to consumers at all. You have to use the Amazon spudger on your pixel, and you will fuck up the waterproofing when you do your repair, because the android phone won’t seal properly against water without the tools either. IPX sixtywho!?
It’s literally a complete and total net positive: nothing was taken away from you, and you don’t need to use it, and it makes your life easier and produces a better repair if you want it. Apple went out of their way to both make the tooling available to normies who want to rent it or people who want to buy it for real. And people still bitch, and still think they come off better for having done so. Classic “hater” moment, in the Paul Graham sense. Anti-fanboys are real.
Literally, for some people - the pelican cases with the tools are too big and heavy. And that’s enough to justify the hate.
Again, great example of the point I was making in the original comment: people inserting their random hobby horse issues using apple as a foil. You don’t like how phones are made in general, so you’re using apple as a whipping boy for the issue even if it’s not really caused or worsened by the event in question etc. Even if the event in question is apple making that issue somewhat better, and is done worse by all the other vendors etc. Can’t buy tooling for a pixel at all, doing those repairs will simply break waterproofing without it, and you’re strictly better off having the ability to get access to the tooling if you decide you want it, but apple offering it is a flashpoint you can exploit for rhetorical advantage.
> Moreover, despite some “bold” proclamations from the haters… no android vendors ever ended up making their oem tooling available to consumers at all. You have to use the Amazon spudger on your pixel, and you will fuck up the waterproofing when you do your repair, because the android phone won’t seal properly against water without the tools either. IPX sixtywho!?
I think the dirty little secret here is that an iPhone is just about the only phone, apart from maybe some of the really nice Google and Samsung flagships, that anyone wants to repair, because they're bloody expensive. Which is fine and dandy but then do kindly park your endless bemoaning of the subjects of e-waste and non-repairable goods, when Android by far and away is the worse side of that equation, with absolute shit tons of low yield, crap hardware made, sold, and thrown away when the first software update renders it completely unusable (if it wasn't already, from the factory).
Could you chill with the relentless insults? I'd appreciate it.
Perhaps you haven't noticed, but once you tally up overpriced parts together with their oversized, heavy, expensive rental of tools that you don't need, you end up with a sum that matches what you would pay to have it repaired by Apple - except you're doing all of the work yourself.
A curious consumer who has never repaired a device, but might have been interested in doing so, will therefore conclude that repairing their own device is 1. Far too complicated, thanks to an intimidating-looking piece of kit that they recommend, but is completely unnecessary, and 2. Far too expensive, because Apple prices these such that the repair is made economically nonviable.
So yes, I still believe that this is Apple fighting the anti-repair war on a psychological front. You're giving them benefit of the doubt even though they've established a clear pattern of behavior that demonstrates their anti-repair stance beyond any reasonable doubt - although you dance around the citations and claim that I'm being unreasonable about Apple genuinely making the repair situation "better".
Furthermore, I'm not a fanboy or anti-fanboy of any company. The only thing I'm an anti-fanboy of are anti-consumer practices. If Apple changed some of their practices I'd go out and buy an iPhone and a Macbook tomorrow.
The fact that I pointed out that Apple is hostile against repair does not mean that I endorse Google, Samsung, or any other brand - they all suck when it comes to repair, yet you're taking it as a personal attack and calling me names for it.
Excuse me? I'm clearly not the one who crossed into flamewar, please read the previous 1-2 comments.
edit: Describing others as "bitching", "bad faith", and "hater", runs afoul of half of the guidelines on this site. That comment somehow isn't moderated, but mine is called out for crossing into flamewar?
You were both breaking the site guidelines, and I replied to both of you in the same way.
Even if I hadn't, though, we need you to follow the rules regardless of what anybody else is doing, and the same goes for every user here. Pointing a finger at the other person isn't helpful.
I understand it can be difficult to have someone point out that you're not approaching a situation in good faith, but that's not exactly "relentless insults".
It can be difficult to handle the intimation that maybe there is the mirror image of the "brainless apple sheeple" too. Haters exist - people who are not able to approach a situation fairly and are just relentlessly negative because they hate X thing too. Negative parasocial attachment is just as much of a thing as a positive one.
And when you get to the point of "literally making the tools available is bad because the pelican cases are too heavy" you have crossed that rubicon. I am being very very polite here, but you are not being rational, and you are kinda having an emotional spasm over someone disagreeing with you on the internet.
Yes, if you want OEM parts and you rent OEM tooling it's probably going to come close to OEM cost. That isn't discriminatory, if the prices are fair+reasonable, and objectively they pretty much are. $49 to rent the tools, and have them shipped both ways, etc, is not an unreasonable ask.
Not having a viable business model for your startup doesn't mean the world is wrong. It means you don't have a viable business idea. And yeah, if you are renting the tools as a one-off, and counting your personal time as having some value (or labor cost in a business), then you probably are not going to get costs that are economical with a large-scale operator with a chain operation and an assembly-line repair shop with repair people who do nothing but repair that one brand. That's not Apple's fault.
What we ultimately come down to with your argument is "apple is killing right-to-repair by being too good at repair and providing too cheap repairs such that indie shops can no longer make a margin" and I'm not sure that's actionable in a social sense of preventing e-waste. People getting their hardware repaired cheaply is good. Long parts lifetimes are good. Etc.
Being able to swap in shitty amazon knockoff parts is a whole separate discussion, of course. And afaik that is going to be forced by the EU anyway, consequences be damned. So what are you complaining about here?
Actually to be fully clear, in many cases you have an anti-right: literally not only do you not have a right, but it’s illegal to circumvent technological restrictions intended to prevent the thing you want to do.
As noxious as that whole thing is, it’s literally the law. I agree the outcome is horrifying of course… stallman was right all along, it’s either your device or it’s not.
And legally speaking, we have decided it’s ok to go with “not”.
> better for edge AI than whatever is out there, so I'm looking forward to this
What exactly are you expecting? The current hype for AI is large language models. The word 'large' has a certain meaning in that context: much larger than can fit on your phone. Everyone is going crazy about edge AI, what am I missing?
> Everyone is going crazy about edge AI, what am I missing?
If you clone a model and then bake in a more expensive model's correct/appropriate responses to your queries, you now have the functionality of the expensive model in your clone. For your specific use case.
The resulting case-specific models are small enough to run on all kinds of hardware, so everyone's seeing how much work can be done on their laptop right now. One incentive for doing so is that otherwise your approaches to problems are constrained by the cost and security of the Q&A roundtrip.
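A rough sketch of that cloning workflow (everything here is illustrative: `expensive_model` stands in for an API call to the big teacher model, and the JSONL is the usual shape of a fine-tuning set, not any particular vendor's format):

```python
import json

def expensive_model(prompt: str) -> str:
    # Hypothetical stand-in for a round trip to the large teacher model.
    return f"teacher answer for: {prompt}"

def build_distillation_set(prompts, path="distill.jsonl"):
    """Record (prompt, teacher response) pairs to fine-tune a small clone on."""
    records = [{"prompt": p, "completion": expensive_model(p)} for p in prompts]
    with open(path, "w") as f:
        for r in records:
            f.write(json.dumps(r) + "\n")
    return records

# Fine-tuning a small open model on this file bakes the teacher's behaviour
# for your specific use case into something that fits on local hardware.
pairs = build_distillation_set(["summarize this email", "triage this ticket"])
```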
Quantized LLMs can run on a phone, like Gemini Nano or OpenLLaMA 3B. If a small local model can handle simple stuff and delegate to a model in the data center for harder tasks, then with better connectivity you could get an even better experience.
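That delegation idea, sketched in code (all names and the confidence heuristic are made up for illustration; a real router would use the local model's own uncertainty estimate or a task classifier):

```python
def local_model(prompt: str):
    """Tiny on-device model stub: returns (answer, confidence)."""
    known = {"what's 2+2?": ("4", 0.95)}
    return known.get(prompt.lower(), ("", 0.1))

def remote_model(prompt: str) -> str:
    # Stand-in for the round trip to a large model in the data center.
    return f"[cloud answer to: {prompt}]"

def answer(prompt: str, threshold: float = 0.8) -> str:
    # Serve on-device when the small model is confident; otherwise delegate.
    text, confidence = local_model(prompt)
    return text if confidence >= threshold else remote_model(prompt)
```

The threshold trades latency and privacy (everything stays local) against quality (hard questions go to the bigger model).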
> If a small local model can handle simple stuff and delegate to a model in the data center for harder tasks and with better connectivity you could get an even better experience.
Distributed mixture of experts sounds like an idea. Is anyone doing that?
Sounds like an attack vector waiting to happen if you deploy enough competing expert devices into a crowd.
I’m imagining a lot of these LLM products on phones will be used for live translation. Imagine a large crowd event of folks utilizing live AI translation services being told completely false translations because an actor deployed a 51% attack.
I’m not particularly scared of a 51% attack between the devices attached to my Apple ID. If my iPhone splits inference work with my idle MacBook, Apple TV, and iPad, what’s the problem there?
Using RAG, a smaller local LLM combined with local data (e.g. your emails, iMessages, etc.) can be more useful than a large external LLM that doesn’t have your data.
No point asking GPT4 “what time does John’s party start?”, but a local LLM can do better.
This is why I think Apple’s implementation of LLMs is going to be a big deal, even if it’s not technically as capable. Just making Siri better able to converse (e.g. ask clarifying questions) and giving it the context offered by user data will make it dramatically more useful than silo’d off remote LLMs.
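The local-RAG idea above in miniature, with a toy word-overlap retriever standing in for a real embedding index (all names here are illustrative):

```python
def overlap(query: str, doc: str) -> int:
    # Toy relevance score: number of shared lowercase words.
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs, k: int = 1):
    """Pick the k most relevant local documents for the query."""
    return sorted(docs, key=lambda d: overlap(query, d), reverse=True)[:k]

def build_prompt(query: str, docs) -> str:
    # The private data stays on device; only the local LLM ever sees it.
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

messages = [
    "John: party starts at 7pm on Saturday",
    "Alice: don't forget the milk",
]
prompt = build_prompt("what time does John's party start?", messages)
```

A cloud LLM without the message history can't answer this at all; a tiny local model with the right context can.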
In the hardware world, last year’s large has a way of becoming next year’s small. For a particularly funny example of this, check out the various letter soup names that people keep applying to screen resolutions. https://en.m.wikipedia.org/wiki/Display_resolution_standards...
Google has also been working on (and provides kits for) local machine learning on mobile devices... and they run on both iOS and Android. The Gemini app does send data to Google for learning, but even that you can opt out of.
Apple's definitely pulling a "Heinz" move with privacy, and it is true that they're doing a better job of it overall, but Google's not completely horrible either.
Yeah, I was thinking of their non-local model like Gemini advanced though.
In any case, iPhones probably don't have enough memory to run a 3.25B model. The 15 Pro only has 8GB (and Gemini Nano seems to only work on the 12GB Pixel 8 Pro), and the 14 has only 6GB; that hardly seems sufficient for even a small LLM if you still want to run the full OS and other apps at the same time.
Care to cite these subpar Android apps? The app store is filled to the brim with subpar and garbage apps.
>Google dropping support for an app about once a month
I mean if you're going to lie why not go bigger
>I'm okay with the lesser of two evils.
So the more evil company is the one that pulled out of China because they refused to hand over their users data to the Chinese government on a fiber optic silver plate?
Google operates in China albeit via their HK domain.
They also had project DragonFly if you remember.
The lesser of two evils is the company that doesn’t try to actively profile me (in order for their ads business to be better) with every piece of data it can find, and doesn’t force me to share all possible data with them.
>Google operates in China albeit via their HK domain.
The Chinese government has access to the iCloud account of every Chinese Apple user.
>They also had project DragonFly if you remember.
Which never materialized.
>The lesser of two evils is that one company doesn’t try to actively profile me (in order for their ads business to be better) with every piece of data it can find and forces me to share all possible data with them.
Apple does targeted and non targeted advertising as well. Additionally, your carrier has likely sold all of the data they have on you. Apple was also sued for selling user data to ad networks. Odd for a Privacy First company to engage in things like that.
Google has been around for 26 years I believe. According to that link 60 apps were killed in that timeframe. According to your statement that Google kills an app a month that would leave you 252 apps short. Furthermore, the numbers would indicate that Google has killed 2.3 apps per year or .192 apps per month.
>As for the subpar apps: there is a massive difference between the network traffic when on the Home Screen between iOS and Android.
Not sure how that has anything to do with app quality, but if network traffic is your concern there's probably a lot more an Android user can do than an iOS user to control or eliminate the traffic.
> Google has been around for 26 years I believe. According to that link 60 apps were killed in that timeframe. According to your statement that Google kills an app a month that would leave you 252 apps short. Furthermore, the numbers would indicate that Google has killed 2.3 apps per year or .192 apps per month.
Most of the "Services" on that list are effectively apps, too:
VPN by Google One, Album Archive, Hangouts, all the way back to Answers, Writely, and Deskbar.
I didn't touch hardware, because I think that should be considered separately.
The first of 211 services on that site was killed in 2006.
The first of the 60 apps on that site was killed in 2012.
So even apps alone, 4.28 a year.
But more inclusively, 271 apps or services in 17 years is ~16/year, over one a month.
You need to remind yourself of the site guidelines about assuming the worst. Your comments just come across condescendingly.
I think it was Paul Thurrott on the Windows Weekly podcast who said that all these companies don't really care about privacy. Apple takes billions of dollars a year to direct data towards Google via the search defaults. Clearly privacy has a price. And I suspect it will only get worse with time as they keep chasing the next quarter.
Tim Cook unfortunately is so captured in that quarterly mindset of 'please the shareholders' that it is only a matter of time.
I do hope that those working in these companies actually building the tools do care. But unfortunately, it seems that corruption is an emergent property of complexity.
The Google payments are an interesting one; I don't think it's a simple "Google pays them to prefer them", but a "Google pays them to stop them from building a competitor".
Apple is in a position to build a competing search product, but for it to be worth doing, it would have to earn at least what Google currently pays them, and that is improbable even if it means they could set their own search engine as default.
While Apple is first and foremost a hardware company, it has more or less always been about the "Apple experience". They've never "just" been a hardware company.
For as long as Apple has existed, they've done things "their way" both with hardware and software, though they tend to want to abstract the software away.
If it were merely a question of selling hardware, why does iCloud exist? Or Apple TV+, or Handoff? Or iMessage, or the countless other seemingly small life improvements that somehow the rest of the industry cannot figure out how to do well?
Just a "simple" thing like switching headphones seamlessly between devices is something I no longer think about; it just happens, and it takes a trip with a Windows computer and a regular Bluetooth headset to remind me how things used to be.
As part of their "privacy first" strategy, iMessage also fits in nicely. Apple doesn't have to operate a huge instant messaging network, which undoubtedly is not making a profit, but they do, because having one entry to secure, encrypted communication fits well with the Apple Experience. iMessage did so well at abstracting the ugly details of encryption that few people even realize that's what the blue bubble is actually about: it more or less only means your message is end-to-end encrypted. As a side effect you can also send full resolution images (and more), but that's in no way unique to iMessage.
I can't buy a MacBook Air for less than $999, and that's for a model with 8GB RAM, an 8-core CPU and 256GB SSD. The equivalent (based on raw specs) in the PC world runs for $300 to $500.
How is something that is twice as expensive as the competition not a luxury device?
EDIT: Because there's repeated confusion in the replies: I am not saying that a MacBook Air is not objectively a better device. I'm saying it is better by metrics that fall strictly into the "luxury" category.
Better build quality, system-on-a-chip, better OS, better battery life, aluminum case—all of these are luxury characteristics that someone who is looking for a functional device that meets their needs at a decent price won't have as dealbreakers.
> How is something that is twice as expensive as the competition not a luxury device?
You can buy a version of <insert product here> from Walmart at 1/2 price of a "normal" retailer. Does that mean every "normal" retailer is actually a luxury goods dealer?
Is my diner a luxury restaurant because a burger costs twice as much as McDonald's?
When I buy a Rick Owens coat for $3k, sure it's a luxury good. It protects from the elements just the same, I know that I overpay only because it looks nice. But when I pay the same for the device I need for my work and use for 12 hours a day, it’s not luxury — it's just common sense. I've tried working with Windows and Linux, and I know that I'm paying not only for specs, but because the sum of all the qualities will result in a much better experience — which will allow me to work (and earn money) faster and with less headache.
A $1000 laptop that will last 10 years seems crazy to call a luxury, when we have Alienware/Apple laptops that go for $2k to $5k+ and demographics that buy them yearly.
I bought a 300 euro ThinkPad that's going on 9 years now.
It was my only computer for the entire time until about a week ago, when I bought another 300 euro ThinkPad. I also didn't have a smartphone, only a basic dumbphone for a large portion of that time.
So for me, yes, MacBook Airs, like the lovely M1 I'm writing this on (I love my job!) are luxury goods.
Just because Ferraris cost half a million doesn't mean a 50k BMW isn't luxury.
> You can buy a version of <insert product here> from Walmart at 1/2 price of a "normal" retailer. Does that mean every "normal" retailer is actually a luxury goods dealer?
What percent of that retailer's products does that comparison apply to?
If it's more than half then yeah that's probably a luxury goods dealer.
> The equivalent (based on raw specs) in the PC world runs for $300 to $500.
Equivalent device?! Find me a Windows laptop in ANY price category that can match the weight, fanless design, screen quality, speaker quality, and battery life of the Air.
I got a Latitude 9430 on eBay for $520. This thing is an amazing laptop and I'd put it right there with the Macs I have to work with at dayjob, as far as build quality/feel.
That's still ~twice as expensive as the items I linked to below, and that's at clearance prices.
A good deal on a luxury item still gets you a luxury item.
And if we want to compare Walmart to Walmart, this thing currently runs for $359 and has 16GB RAM, 512GB SSD, and a CPU that benchmarks slightly faster than the M2:
No brand new laptop has a 1h battery. Also, battery life mattering in the sense of "I can work a full day unplugged from AC" is something that affects only a subset of laptop users, and mostly under specific conditions (i.e. long travels).
That's more like cheap vs middle of the road. There is no luxury space in laptops - displays, iPads, and workstations maybe but that's it (and those are more pro than luxury).
$999 amortized over 3 years is under $28/mo, which is less than what even middle-class people spend on coffee.
I doubt I am alone in saying that I would gladly pay twice the price to avoid having to use Windows. It's the most user-hostile, hand-holdy, second-guess-and-confirm-my-explicit-command-ey os I've used to date. And bloatware baked in? No thanks.
You're probably right. I am in the middle-class, maybe lower middle-class, and I live in the US. I have advantages and opportunities that many in other circumstances do not and I am sincerely grateful for them.
Oh dear. 16:10 screen with superior resolution, brightness and gamut - and it still gets superior battery life driving all those pixels. That's a headline feature that even a non-propellerhead can observe (I was honestly surprised when I looked up that Acer screen what a dim, narrow piece of shit it is) - notably there are ballpark-priced systems with better screens.
I think you unjustifiably downplay how much of a selling point a screen that looks great (or at least decent) on the floor is. And I know tons of devs that put up with the 45% NTSC abominations on Thinkpads that aren’t even suitable for casual photo editing or web media, just because you make do with that doesn’t automatically make a halfway decent display on a laptop a “luxury”.
Sorry, but I don’t buy the “everything that isn’t a $300 econo shit laptop is luxury” thesis repeated ad nauseam.
"Luxury" often includes some amount of pure status symbols added to the package, and often on what is actually a sub-par experience. The quintessential luxury tech device were the Vertu phones from just before and even early in the smartphone era - mid-range phones tech and build quality-wise, with encrusted gems and gold inserts and other such bling, sold at several thousand dollars (Edit: they actually ranged between a few thousand dollars all the way to 50,000+).
But the definition of luxury varies a lot by product category. Still, high-end and luxury are separate concepts, even when they do overlap.
You just made up the "sub-par experience" as a defining point of a luxury product.
A luxury product is defined by being a status symbol (check for all Apple devices) and especially by its price.
A luxury car like a Bentley will still bring you from point A to point B just like the cheapest Toyota.
I didn't say that sub-par experience was a requirement, I said it was often a part of luxury products. Or, more precisely, I should have said that something being of excellent build quality and offering excellent, top of the line experience is neither sufficient nor necessary for being a luxury good.
It is true though that luxury goods are, often, top of the line as well. Cars and watches are often examples of this. Clothes are a much more mixed bag, with some luxury brands using excellent materials and craftsmanship, while others use flashy design and branding with mediocre materials and craftsmanship.
Exactly where Apple sits is very debatable in my experience. I would personally say that many of their products are far too affordable and simple to be considered luxury products - the iPhone in particular. The laptops I'm less sure about.
Fair enough.
Apple is clearly not in the same luxury league as a Bentley or a yacht, but it's totally like a Mercedes, to continue with the car analogy. You get a "plus" for the extra money, but then it's open for debate whether that "plus" is worth it or not. And it's actually the source of many flamewars on the Internet.
I think the Mercedes comparison (or the more common BMW) one is also useful for getting the idea that not every manufacturer is competing for the same segments but the prices in segments are generally close. No Mercedes is as cheap as a Camry but a Lexus is similar.
This comes up so often in these flame wars where people are really saying “I do/don’t think you need that feature” and won’t accept that other people aren’t starting from the same point. I remember in the 90s reading some dude on Fidonet arguing that Macs were overpriced because they had unnecessary frills like sound cards and color displays; I wasn’t a Mac user then but still knew this was not a persuasive argument.
That would also apply to Apple products then, and especially so to their laptops. I actually bought a MacBook Air recently and the thing that I like most about it is how comfortable the keyboard and especially the trackpad is compared even to high-end ThinkPads. And, on the other hand, the trackpad on my T14s is certainly quite sufficient to operate it, so this comfort that MacBook offers is beyond the bare necessity of function.
By that definition, Zara is a luxury clothing brand, Braun is a luxury appliance maker, and Renault is a luxury car brand. I think it requires significantly more.
The Walmart variant was introduced 6 weeks ago to offload excess stocks of a four year old discontinued model. I'm not sure your argument of "at only 70% of the price of a model two generations newer" is the sales pitch you think it is.
They’re tools. The attempt to treat them as luxury goods doesn’t hold up. It’s entirely common even for people who just want to do some home repair—let alone professionals—and aren’t clueless about DIY to spend 2x the cheapest option, because they know the cheapest one is actually worth $0. More will advocate spending way more than 2x, as long as you’re 100% sure you’re going to use it a lot (like, say, a phone or laptop, even for a lot of non-computer-geeks). This is true even if they’re just buying a simple lowish-power impact driver, nothing fancy, not the most powerful one, not the one with the most features. Still, they’ll often not go for the cheapest one, because those are generally not even fit for their intended purpose.
[edit] I mean sure there are people who just want the Apple logo, I’m not saying there are zero of those, but they’re also excellent, reliable tools (by the standards of computers—so, still bad) and a good chunk of their buyers are there for that. Even the ones who only have a phone.
I didn't go for the cheapest option: I'm typing this on a laptop that I bought a few months ago for $1200. It has an aluminum case, 32GB RAM, an AMD Ryzen CPU that benchmarks similar to the M3, and 1TB SSD. I can open it up and replace parts with ease.
The equivalent from Apple would currently run me $3200. If I'm willing to compromise to 24GB of RAM I can get one for $2200.
What makes an Apple device a luxury item isn't that it's more expensive, it's that no matter what specs you pick it will always be much more expensive than equivalent specs from a non-luxury provider. The things that Apple provides are not the headline stats that matter for a tool-user, they're luxury properties that don't actually matter to most people.
Note that there's nothing wrong with buying a luxury item! It's entirely unsurprising that most people on HN looking at the latest M4 chip prefer luxury computers, and that's fine!
Huh. Most of the folks I know on Apple stuff started out PC (and sometimes Android—I did) and maybe even made fun of Apple devices for a while, but switched after exposure to them because they turned out to be far, far better tools. And not even much more expensive, if at all, for TCO, given the longevity and resale value.
Eh, I have to use a MacBook Pro for work because of IT rules and I'm still not sold. Might be because I'm a Linux person who absolutely must have a fully customizable environment, but MacOS always feels so limited.
The devices are great and feel great. Definitely high quality (arguably, luxury!). The OS leaves a lot to be desired for me.
I spent about a decade before switching using Linux as my main :-) Mostly Gentoo and Ubuntu (man, it was good in the first few releases)
Got a job in dual-platform mobile dev and was issued a MacBook. Exposure to dozens of phones and tablets from both ecosystems followed. I was converted within a year.
(I barely customize anything these days, fwiw—hit the toggle for “caps as an extra ctrl”, brew install spectacle, done. Used to have opinions about my graphical login manager, use custom icon sets, all that stuff)
> no matter what specs you pick it will always be much more expensive than equivalent specs from a non-luxury provider
On the phone side, I guess you would call Samsung and Google luxury providers? On the laptop side there are a number of differentiating features that are of general interest.
> The things that Apple provides are not the headline stats that matter for a tool-user, they're luxury properties that don't actually matter to most people
Things that might matter to regular people (and tool users):
- design and build for something you use all day
- mic and speakers that don't sound like garbage (very noticeable and relevant in the zoom/hybrid work era)
- excellent display
- excellent battery life
- seamless integration with iPhone, iPad, AirPods
- whole widget: fewer headaches vs. Windows (ymmv); better app consistency vs. Linux
- in-person service/support at Apple stores
It's hard to argue that Apple didn't reset expectations for laptop battery life (and fanless performance) with the M1 MacBook Air. If Ryzen has caught up, then competition is a good thing for all of us (maybe not Intel, though). In general Apple isn't bleeding edge, but they innovate with high-quality, very usable implementations: Wi-Fi (1999), gigabit Ethernet (2001), the modern MacBook Pro design (2001), the "air"/ultrabook form factor (2008), Thunderbolt (2011), "retina" displays and standard SSDs (2012), USB-C (2016), and the M1 with its SoC/SiP design, unified memory, ARM, asymmetric cores, neural engine, and power efficiency/battery life (2020). And occasionally dubious features like the Touch Bar and butterfly keyboard (2016).
Looking even further back in Apple laptop history, we find interesting features like rear keyboard placement (1991), 4 pound laptop with dock for desktop use (1992), and trackpad (1994). Apple's eMate 300 (1997) was a Newton laptop rather than a Mac, but it had an ARM processor, flash storage, and 20+ hour battery life, making it something of an ancestor to the Mac M1.
Once the ARM and battery-life shift reaches Linux and Windows, Apple will be on the front foot again with something new. That's the beauty of competition.
>The things that Apple provides are not the headline stats that matter for a tool-user, they're luxury properties that don't actually matter to most people.
Therein lies the rub: ARE those the stats that matter? Or do the screen, touchpad, speakers, battery life, software, support services, etc. matter more?
I feel people just TOTALLY gloss over the fact that Apple is crushing the competition in terms of trackpads + speakers + battery life, which are hardly irrelevant parts of most people's computing experience. Many people hardly use their computers to compute - they mostly use them to input and display information. For such users, memory capacity and processing performance ARE frills, and Apple is a market leader where it's delivering value.
Also, even on compute, Apple is selling computers with a 512-bit or 1024-bit LPDDR5X bus for a lower price than you can get from the competition. Apple also frequently leads the pack in compute per watt. This has more niche appeal, but I've seen people buy Apple to run LLM inference 24/7 while the Mac Studio sips power.
Lenovo ThinkPad P14s (T14) Gen 4: Ryzen 7840U, $1300, 2.8K 400-nit P3 OLED, 64GB RAM, 1TB SSD. Keyboard excellent, speakers shitty (I use Sony WH-1000XM4s), battery life (52.5Wh) not good, not bad; the OLED screen draws a huge amount of power. Weight ~3 lb.
This spec costs 2k euro in NL. A fully specced Air (15-inch) is 2.5k euro, with arguably better everything except RAM, and it's completely silent. Doesn't look that much different to me in terms of price.
Also, those things aren't even true about Apple devices. Apple fanboys have been convinced that their hardware really is way better than everything else for decades. It has never been true and still isn't.
Clean OS install? You haven't used Windows in a while, have you?
I'm a Linux guy but am forced to use Macs and Windows every now and then.
Windows has outpaced macOS for a decade straight.
macOS looks like it hasn't been updated in years. It's constantly bugging me for passwords for random things. It is objectively the worst OS. I'd rather work on a Chromebook.
I think he has different criteria on what bothers him; that's okay, though, isn't it? I get a little annoyed at anything where I have to use a touchpad. Not enough to rant about it, but it definitely increases friction (haha) in my thought process.
What metrics are you using for build quality? Admittedly I don't know a ton of Mac people (I'm an engineer working in manufacturing), but with the Mac people I know, stuff always breaks, and then they brag about how Apple took care of it for free.
My Acer Aspire lasted me for tens of thousands of hours of use and abuse by small children over 6 years until I replaced it this year because I finally felt like I wanted more power. That's the Toyota Camry of laptops.
The features that Apple adds on top of that are strictly optional. You can very much prefer them and think that they're essential, but that doesn't make it so. Some people feel that way about leather seats.
tl;dr is that Walmart is also selling an Acer for $359 that beats that device on every headline metric.
It's nice to know that I could get the old-gen model for slightly cheaper, but that's still an outrageous price if the MacBook Air isn't to be considered a luxury item.
My last Acer lasted me six years until I decided to replace it for more power (which, notably, I would have done with a MacBook by then too). They're not as well built as a MacBook, but they're well built enough for the average laptop turnover rate.
If it was actually bad value they wouldn't sell as high as they do and review with as much consumer satisfaction as they do.
These products may not offer you much value and you don't have to buy them. Clearly plenty of people and institutions bought them because they believed they offered the best value to them.
If people were actually rational that might be true, but they aren't. Apple survives entirely on the fact that they have convinced people they are cool, not because they actually provide good value.
Agreed. I'd definitely make the same arguments here as I would for an Audi. There's clearly a market, and that means they're not a bad value for a certain type of person.
Yes, but having all three of those things (well, specs/performance is probably just one thing, but treating them as separate as you did means that I don't have to do the heavy lifting of figuring out what a third thing would actually be) IS, in fact, a luxury.
Nobody is away from a power source for longer than 18 hours. MOST people don't need the performance a MacBook Air has; their NEEDS would be met by a Raspberry Pi: basic finances, logging into various services, online banking, the things that first-world citizens "rely" on.
The definition of luxury is "great comfort and extravagance", and every current Apple product fits that definition. Past Apple definitely had non-luxury products, as recently as the iPhone 5C (discontinued about 10 years ago)... but Apple has since eliminated all lower-cost options from its lineup.
When you're breaking out SSD speeds you're definitely getting into the "luxury" territory.
As I said in another comment:
The point isn't that the MacBook Air isn't better by some metrics than PC laptops. A Rolls-Royce is "better" by certain metrics than a Toyota, too. What makes a device luxury is if it costs substantially more than competing products that the average person would consider a valid replacement.
They're average. A 512GB M3 MBA gets around 3000 MB/s read/write. A 1TB Samsung 990 Pro, which costs less than the upgrade from 256GB to 512GB on the Air, is over twice as fast. And on base models Apple skimps, so speeds are slower still.
Good question. I think the answer is that even at thousands of dollars, a Windows device's battery can't hit 18-hour specs. Can someone name a Windows device, even at $2k+, that behaves like an M-series chip? In fact, pricier Windows laptops usually mean a discrete GPU, and those have worse battery life than cheap ones (my 4090 machine lasts an hour or so off the charger).
I am all in on Apple, to be clear. Mac Pros, multiple MBPs, Studio, Pro Display XDR, multiple Watches, phones, iPad Pro.
My experiences (multiple) with Genius Bar have been decidedly more "meh" to outright frustrating, versus "luxury", oftentimes where I know more than the Genius.
Logic Board issues where on a brand new macOS install I could reproducibly cause a kernel panic around graphics hardware. There was an open recall (finally, after waiting MONTHS) on this. It covered my Mac. But because it passed their diagnostic tool, they would only offer to replace the board on a time and materials basis.
I had a screen delamination issue. "It's not that bad - you can't see it when the screen is on, and you have to look for it". Huh. Great "luxury" experience.
And then the multiple "we are going to price this so outrageously, and use that as an excuse to try to upsell". Like the MBA that wouldn't charge due to a circuit issue. Battery fine, healthy. Laptop, fine, healthy, on AC. Just couldn't deliver current to the battery. Me, thinking sure, $300ish maybe with a little effort.
"That's going to be $899 to repair. That's only $100 less than a new MBA, maybe we should take a look at some of the new models?" Uh, no. I'm not paying $900 for a laptop that spends 99% (well, 100% now) of its life on AC power.
Is a Wendy’s burger luxury because it costs twice as much as McDonald’s?
Cost comparisons alone are stupid. And “this AMD benchmarks the same as an M2” is a useless comparison since regular people don’t buy laptops for raw compute power.
Really? You can find a laptop with the equivalent of Apple Silicon for $3-500? And while I haven't used Windows in ages I doubt it runs as well with 8 GB as MacOS does.
The point isn't that the MacBook Air isn't better by some metrics than PC laptops. A Rolls-Royce is "better" by certain metrics than a Toyota, too. What makes a device luxury is if it costs substantially more than competing products that the average person would consider a valid replacement.
> the average person would consider a valid replacement
But what is that, exactly? If you look at all aspects of a laptop: CPU, RAM, SSD, battery life, screen quality, build quality, touchpad, OS, and put them in order of importance for the average consumer, what would be on top? I don't think it's the tech specs.
For instance, I would be willing to bet that for a large number of consumers, battery life is far more important than the tech specs, which means that a valid replacement for their MacBook must have equivalent battery life. You also have to consider things like the expected lifespan of the laptop and its resale value to properly compare their costs. It's not simple.
Curious what criteria you're using to qualify something as luxury. It seems to me that the materials, software, and design are all on par with other, more expensive Apple products. The main difference is the chipset, which I would argue is on an equal quality level with the Pro chips but designed for a less power-hungry audience.
Maybe for you, but I still see sales guys who refuse to work on WinTel machines when basically all they do is browse the internet and do spreadsheets. Mainly it's because they would not look cool next to the other sales guys rocking MacBooks.
I'm not sure what your point is. My point (which I failed to make) is that Apple's incentives are changing: their growth depends on services and extracting fees, so they will likely try to make people dependent on those services and find more ways to charge fees (to users and developers).
Providing services is arguably at odds with privacy since a service with access to all the data can provide a better service than one without so there will be a tension between trying to provide the best services, fueling their growth, and privacy.
My point was that it's interesting how we can frame a service business "extracting fees" to imply wrongdoing. When it's pretty normal for all services to charge ongoing fees for ongoing delivery.
It’s about the money; it’s about perverse incentives and the propensity of service businesses to get away with unfair practices. We have decent laws about your rights as a consumer when you buy stuff, but almost no regulation of services.
There is tons of regulation of services? Everything from fraud / false advertising to disclosure of fees to length and terms of contracts. What regulation do you think is missing?
And as someone who presumably provides services for a living, what additional regulations would you like to be subject to?
So the new iPad & M4 was just some weekend project that they shrugged and decided to toss over to their physical retail store locations to see if anyone still bought physical goods eh
I have very little faith in apple in this respect.
For clarity, just install little snitch on your machine, and watch what happens with your system. Even without being signed in with an apple id and everything turned off, apple phones home all the time.
You can block 17.0.0.0/8 (Apple's address block) at the router, opening up only the notification servers. CDNs are a bit harder, but it can be done with dnsmasq allow/deny of wildcard domains. Apple has documentation on the network traffic from their devices: https://support.apple.com/en-us/101555
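A minimal sketch of that dnsmasq approach, for the curious. The domain names here are illustrative examples of Apple-owned domains; check Apple's published network documentation before relying on any particular list:

```conf
# dnsmasq.conf fragment: deny-by-default for Apple wildcard domains,
# while still letting push notifications resolve.

# Return NXDOMAIN for everything under these (example) Apple domains:
address=/apple-dns.net/
address=/aaplimg.com/

# '#' = resolve this domain via the normal upstream servers,
# so APNs push notifications keep working:
server=/push.apple.com/#
```

DNS-level blocking complements the 17.0.0.0/8 route block, since a lot of Apple traffic flows through third-party CDNs outside that range. Expect breakage (App Store, software updates) if you blackhole too broadly.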
As a privacy professional for many, many years this is 100% correct. Apple wouldn’t be taking billions from Google for driving users to their ad tracking system, they wouldn’t give the CCP access to all Chinese user data (and maybe beyond), and they wouldn’t be on-again-off-again flirting with tailored ads in Apple News if privacy was a “human right”.
(FWIW my opinion is it is a human right, I just think Tim Cook is full of shit.)
What Apple calls privacy more often than not is just putting lipstick on the pig that is their anticompetitive walled garden.
Pretty much everybody in SV who works in privacy rolls their eyes at Apple. They talk a big game but they are as full of shit as Meta and Google - and there’s receipts to prove it thanks to this DoJ case.
Apple want to sell high end hardware. On-device computation is a better user experience, hands down.
That said, Siri is utter dogshit so on-device dogshit is just faster dogshit.
At this point call your government representatives and ask for new laws, or if you live someplace with laws, actual enforcement (looking at you EU).
The idea that user behavior or consumer choice will change any of this is basically discredited in practice. It will always be cat and mouse until CEOs go to jail; then it will stop.
> The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
Indeed.
Privacy starts with architectural fundamentals that are very difficult to retrofit...
If a supplier has not built its products this way, it would be naive to bet the bank or the farm on that supplier, even if there were profound motivation to retrofit.
Add to this the general tendency of the market to exploit its customers.
>The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
They failed with their ad-business so this is a nice pivot. I'll take it, I'm not usually a cheerleader for Apple, but I'll support anyone who can erode Google's surveillance dominance.
> The promise of privacy is one way in which they position themselves, but I would not bet the bank on that being true forever.
There are a ton of us out here that consciously choose Apple because of their position on privacy. I have to imagine they know how many customers they'll lose if they ever move on this, and I want to believe that it's a large enough percentage to prevent it from happening. Certainly my circle is not a useful sample, but the Apple people in it are almost all Apple people because of privacy.
When I was at Apple for a short time, there was a small joke I hear from the ex-amazonians there who would say "What's the difference between an Apple software engineer and an Amazon software engineer? The Amazon engineer will spin up a new service on AWS. An Apple engineer will spin up a new app". Or something along those lines. I forget the exact phrasing. It was a joke that Apple's expertise is in on-device features, whereas Amazon thrives in the cloud services world.
Every company is selling one thing or another, and nothing is going to last forever. I really fail to see what, except for generic negativity, your comment adds to anything.
That is not where Apple's growth has been for quite some time; it's services. And because of that, I'll be awaiting the rent-extraction strategy to arrive at any moment.
Nothing is true forever. Google wasn’t evil forever, Apple won’t value privacy forever.
Until we figure out how to have guarantees of forever, the best we can realistically do is evaluate companies and their products by their behavior now weighted by their behavior in the past.
As soon as the privacy thing goes away, I'd say a major part of their customer base goes away too. People avoid Android so they don't get "hacked"; if Apple is the one doing the hacking, I'd just buy a cheaper alternative.
Maybe true for a lot of the HN population, but my teenagers are mortified by the idea of me giving them android phones because then they would be the pariahs turning group messages from blue to green.
And just to elaborate on this: it's not just snobbery about the color of the texts, for people who rely on iMessage as their primary communication platform it really is a severely degraded experience texting with someone who uses Android. We Android users have long since adapted to it by just avoiding SMS/MMS in favor of other platforms, but iPhone users are accustomed to just being able to send a video in iMessage and have it be decent quality when viewed.
Source: I'm an Android user with a lot of iPhones on my in-laws side.
I’m in Europe and everyone uses WhatsApp, and while Android does have higher share over here, the iPhone still dominates the younger demographics. I’m not denying blue/green is a factor in the US, but it’s not even a thing here. It’s nowhere near the only, or even the dominant, reason iPhones are successful with young people.
Interesting that some people would take that as an Apple problem and others would take it as a Google problem
Who’s at fault for not having built-in messaging that works with rich text, photos, videos, etc?
Google has abandoned more messaging products than I can remember while Apple focused on literally the main function of a phone in the 21st century. And they get shit for it
Apple get shit for it because they made it a proprietary protocol for which clients are not available on anything except their own hardware. The whole point of messaging is that it should work with all my contacts, not just those who drank the Apple-flavored Kool-Aid.
Google’s protocol is proprietary too - their encryption extension makes it inaccessible for anyone else and google will not partner or license (companies have tried).
RCS as currently implemented is iMessage but with a coat of google paint. There is no there there.
Google should get plenty of shit too for closing down GTalk in the first place. It's not an either-or. Big tech in general hates open protocols and interoperability for consumer stuff; Apple is just the most egregious offender there.
My take is that it's like a fashion accessory. People buy Gucci for the brand, not the material or comfort.
Rich people ask for the latest most expensive iPhone even if they're only going to use WhatsApp and Instagram on it. It's not because of privacy or functionality, it's simply to show off to everyone they can purchase it. Also to not stand out within their peers as the only one without it.
As another commenter said: it's not an argument, it's a fact here.
I have an iPhone so I guess I qualify as a rich person by your definition. I am also a software engineer. I cannot state enough how bogus that statement is. I've used both iPhone and Android, and recent flagships. iPhone is by far the easiest one to use. Speaking in more objective terms, iPhones have a coherent UI which maintains its consistency both throughout the OS and over the years. They're the most dumbed down phones and easiest to understand. I recommend iPhone to all my friends and relatives.
There's obviously tons of people who see iPhone as a status item. They're right, because iPhone is expensive and only the rich can buy them. This doesn't mean iPhone is not the best option out there for a person who doesn't want to extensively customize his phone and just use it.
Yes, by pure statistics you are probably rich compared to everyone else. The average software developer salary is way bigger than the average salary for the entirety of the US. Let's not even mention compared to the rest of the world.
Sure, some people pick up the iPhone because they like the specs, or the apps, or whatever else. That's why I said the majority picks it up for status, not all. But keep in mind nobody's judging the iPhone's specs or capabilities here. We're talking about why people buy it.
Ask any teenager why they want an iPhone. I'd be very surprised if even one said it's because of privacy. It's because of the stupid blue bubble, which is a proxy for status.
I'm pretty sure if Apple released the same phone again with a new name and design, people would still buy it. For the majority, it's not because of features, ease of use, specs, etc: it's status.
> iPhone and Android, and recent flagships. iPhone is by far the easiest one to use. Speaking in more objective terms, iPhones have a coherent UI
It’s not about whether you’ve used Android; it’s about whether you’ve been poor-ish or stingy.
To some people those are luxuries: the most expensive phone they'll buy is a mid-range Motorola for $300 with a Snapdragon 750G or whatever. It runs all the same apps, after all; it takes photos.
It's not an argument; just ask why people lust after the latest iPhones in poor countries. They do it because they see rich people owning them. Unless you've experienced that, you won't really understand it.
The cheapest point of entry is absolutely not comparable. The cheapest new iPhone on apple.com is $429. The cheapest new Samsung on samsung.com is $199 (They do have a phone listed for $159, but it's button says "Notify Me").
Granted, you may have been leaning very heavily on the dictionary definition of "comparable", in that the two numbers are able to be compared. However, when the conclusion of that comparison is "More than twice the price", I think you should lead with that.
Keep in mind, the iPhone SE uses a 3-year-old processor, while the Samsung A15 was released 5 months ago with a brand-new processor.
According to various sites, the MediaTek Dimensity 6100+ is a 6nm update of a core released 3 years ago (the Dimensity 700, on 7nm). It's 5-10% faster, likely due to the move from 7nm to 6nm, as the cores are the same and run at the same speed. It adds an updated Bluetooth chipset (5.1 to 5.2) and supports a larger maximum camera. The camera on the A15 is well below the previous chipset's maximum, but the increased camera bandwidth should make the camera feel snappier (a common complaint on low-end phones). The process improvement should improve efficiency as well, though there are no benchmarks able to test this yet.
It's fashion and the kids are hip. But there is an endless void of Apple haters here who want to see it burn. They have nothing in common with 99.9% of the customer base.
I was thinking about this for a while, the problem is not about apple, it’s the fact that the rest of the industry is gutless, and has zero vision or leadership. Whatever Apple does, the rest of the industry will follow or oppose - but will be defined by it.
It’s like how people who don’t like US and want nothing to do with US still discuss US politics, because it has so much effect everywhere.
(Ironically, not enough people discuss China at any coherent level of understanding.)
You're absolutely right, I'm so glad that Apple was the first company to release a phone with a touch screen, or a phone with an app store, or a smart watch or a VR headset.
Apple doesn't release new products, they wait until the actual brave and innovating companies have done the exploration and then capitalize on all of their learnings. Because they are never the first movers and they have mountains of cash, they're able to enter the market without the baggage of early adopters. They don't have to worry about maintaining their early prototypes.
Apple doesn't innovate or show leadership, they wait until the innovators have proven that the market is big enough to handle Apple, then they swoop in with a product that combines the visions of the companies that were competing.
Apple is great at what they do, don't get me wrong. And swooping in when the market is right is just good business. Just don't mistake that for innovation or leadership.
This is a prejudiced take. Running AI tasks locally on the device definitely is a giant improvement for the user experience.
But not only that: Apple CPUs are objectively leagues ahead of their competition in the mobile space. I am still using an iPhone released in 2020 with absolutely no appreciable slowdown or loss in perceived performance, because even a 4-year-old iPhone has specs that don't lag much behind equivalent Android phones, because I still receive the latest OS updates, and because, frankly, Android OS is a mess.
If I cared about status, I would have changed my phone already for a new one.
> I am still using an iPhone released in 2020 with absolutely no appreciable slowdown or loss in perceived performance.
My Pixel 4a here is also going strong; only the battery is slowly getting worse. I mean, it's 2024, do phones really still get slow? The 4a is now past Android updates, but that was the promise: 3 years. And at $350, it was about 40% less than the cheapest iPhone mini at the time.
Apple says it made these changes for other reasons, honestly, truly. And if it happened to have the same effect, then that was unfortunate, and unintended.
Only Apple really knows. But there was a slew of changes and reversals following the drama. "Oh, we'll implement notifications now", "Oh, we'll change the peak performance behavior", and "we will change and add additional diagnostics to make sure issues are battery related" certainly has a feel for a bunch of ex post facto rationalization of several things that seem, to me, that if it was truly a battery thing all along, would have been functional requirements.
>Apple CPUs are objectively leagues ahead of their competition in the mobile space
This is a lie. The latest Android SoCs are just as powerful as the A series.
>Because even a 4 years old IPhone still has specs that don't lag behind by much the equivalent Android phones, I still receive the latest OS updates, and because frankly, Android OS is mess.
Samsung and Google offer 7 years of OS and security updates. I believe that beats the Apple policy.
The last iPads to stop getting OS updates (including security updates, to be consistent with what Samsung and Google are pledging) got 7 and 9 years of updates, respectively (5th gen iPad and 1st gen iPad Pro). The last iPhones to lose support got about 7 years each (iPhone 8 and X). The 6S, SE (1st gen), and 7 got 8-9 years of OS support with security updates. The 5S (released in 2013) last got a security update in early 2023, so also about 9 years; the 6 (2014) ended at the same time, so call it 8 years. The 4S (2011) got 8 years of OS support. The 5 and 5C got 7 and 6 years of support, respectively (the 5C was a 5 in a new case, so it was always going to get a year less).
Apple has not, that I've seen at least, ever established a long term support policy on iPhones and iPads, but the numbers show they're doing at least as well as what Samsung and Google are promising to do, but have not yet done. And they've been doing this for more than a decade now.
EDIT:
Reworked the iOS numbers a bit, down to the month (I was looking at years above and rounding, so this is more accurate). iOS support time by device for devices that cannot use the current iOS 17 (so the XS and above are not counted here) in months:
The average is 72.5 months, just over 6 years. If we knock out the first 2 phones (both have somewhat justifiable short support periods, massive hardware changes between each and their successor) the average jumps to just shy of 79 months, or about 6.5 years.
The 8 and X look like regressions, but their last updates were just 2 months ago (March 21, 2024), so there's still a good chance their support periods will stretch past the 7-year mark, like every model since the 5S. We'll have to see whether they get any more updates in November 2024 or later.
>The last iPads to stop getting OS updates (including security, to be consistent with what Samsung and Google are pledging) got 7 and 9 years of updates each (5th gen iPad and 1st gen iPad Pro). The last iPhones to lose support got about 7 years each (iPhone 8 and X). 6S, SE (1st), and 7 got 9 and 8 years of OS support with security updates. The 5S (released in 2013) last got a security update in early 2023, so also about 9 years, the 6 (2014) ended at the same time so let's call it 8 years. The 4S, 2011, got 8 years of OS support. 5 and 5C got 7 and 6 years of support (5C was 5 in a new case, so was always going to get a year less in support).
These numbers are disingenuous and don't tell the complete story. An iPhone 7 getting a single critical security patch doesn't account for the hundreds of security patches it did not receive after it stopped receiving regular support. It received that special update because Apple was likely told, or discovered, that the bug was being exploited in the wild.
Google and Samsung now offer 7 years of OS upgrades and 84 months of full security patches. Selectively patching a phone that is out of the support window with a single security patch does not automatically increase its EOL support date.
I look forward to these vendors delivering on their promises, and I look forward to Apple perhaps formalizing a promise with less variability for future products.
Neither of these hopes retroactively invalidates the fact that Apple has had a much better track record of supporting old phone models up to this point. Even if you do split hairs about the level of patching some models got in their later years, they still got full iOS updates for years longer than most Android phones got any patches at all, regardless of severity.
This is not an argument that somehow puts Android on top, at best it adds nuance to just how much better iOS support has been up to this point.
Let's also not forget that if Apple wasn't putting this kind of pressure on Google, they wouldn't have even made the promise to begin with, because it's clear how long they actually care to support products with no outside pressure.
I agree. This is the type of competition I like to see between these two companies. In the end the consumer wins regardless of which one you buy. Google has also promised 10 years of Chromebook support, so they've clearly got the message on the importance of supporting hardware much longer than many people will actually use it.
They made that pledge for the Pixel 8 (2023). Let's revisit this in 2030 and see what the nature of their support is at that point and how it compares to Apple's support for iPhone devices. We can't make a real comparison since they haven't done anything yet, only made promises.
What we can do today is note that Apple never made a promise, but provided very long security support for their devices anyway. They've already met, or come close to, the Samsung/Google pledge (which so far covers one device) on almost half their devices, and those are all the recent ones. So it's not a downward trend from good support to bad, but rather from mediocre/bad support to increasingly good support.
Another fun one:
The iPhone XS was released in September 2018 and runs the current iOS 17 release. In the absolute worst case, where it does not get iOS 18 this September, it will have received 6 full years of both security and OS updates, and it will still comfortably hit 7 years of security updates. If it does get iOS 18 in September, then Apple will have hit the Samsung/Google pledge 5 years before Samsung/Google can even demonstrate their ability to follow through (Samsung has a chance, but Google has no history of commitment).
I have time to kill before training for a century ride:
Let's ignore everything before the iPhone 4S; those had short support periods, that's just a fact, and they're hardly worth investigating. This is an analysis of devices released in 2011 and later, by which point the phone had mostly matured as a device, so we should expect longer support periods. The figures below count the years during which each phone could run a then-current iOS version; security updates and minor updates delivered after its last major iOS version was deprecated are not counted. As an example, the iPhone 4S had support from 2011 to 2016; in 2016 its OS, iOS 9, was replaced by iOS 10. Here are the numbers:
4S - 5 years
5 - 5 years
5C - 4 years (decreased, 5 hardware but released a year later in a different case)
5S - 6 years
6 - 5 years (decreased, not sure why)
6S - 7 years (hey, Apple did it! 2015 release, lost iOS upgrades in 2022)
SE(1st) - 5 years (like 5C, 6S hardware but released later)
7 - 6 years (decreased over 6S, not sure why)
8 - 6 years
X - 6 years
The 6S is a bit of an outlier, hitting 7 years of full support running the current iOS. The 5C and SE (1st) both got less total support, but their internals were the same as prior phones and they lost support at the same time as them (this is reasonable, if annoying, and does drag down the average). So Apple has clearly trended towards 6 years of full support; the XS (as noted above) will get at least 6 years of support as of this coming September. We'll have to see if they can get it past the 7-year mark; I know they haven't promised anything, but the trend suggests they can.
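For what it's worth, the trend claim is easy to check with a few lines of arithmetic over the spans listed above (a quick sketch; the per-model figures are just the ones from the list, nothing new):

```python
# Years of full support (running a then-current iOS), per the list above.
support = {
    "4S": 5, "5": 5, "5C": 4, "5S": 6, "6": 5,
    "6S": 7, "SE(1st)": 5, "7": 6, "8": 6, "X": 6,
}

# Average across all ten models, dragged down by the rebadged 5C and SE(1st).
overall_avg = sum(support.values()) / len(support)

# Average across the three most recent models in the list.
recent_avg = sum(support[m] for m in ("7", "8", "X")) / 3

print(overall_avg)  # 5.5
print(recent_avg)   # 6.0
```

So even with the rebadged models dragging the mean down to 5.5, the recent models sit at a steady 6 years, which is the trend described above.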
Sure. They also pledged to support Chromebooks for 10 years. My point is that I don't think they'll be clawing back their new hardware support windows anytime soon. Their data indicates that these devices were used well beyond their initial support windows, so it was in their, and their users', best interest to keep them updated as long as they possibly could. 3 years of OS updates and 4 years of security updates was always the weak link in their commitment to security. And this applies to all of their devices, including the A series, something I don't see other Android OEMs even matching.
BTW, my daily driver is an iPhone 13 and I was coming from an iPhone X. So I'm well aware of the incredible support Apple provides its phones. Although, I would still like to see an 8+ year promise from them.
The vast majority of people don’t. They buy because the ecosystem works. Not sure how I get status from a phone that nobody knows I have. I don’t wear it on a chain.
Apple only pivoted to the “privacy” branding relatively recently [1], and I don't think that many people came for that reason alone. In any case, most are now trapped in the walled garden, and the effort to escape is likely too big. And there's no escape anyway, since Google will always make Android worse in that regard…
[1] in 2013 they even marketed their iBeacon technology as a way for retail stores to monitor and track their customers, which…
Circa 2013 was the release of the Nexus 5, arguably the first really usable Android smartphone.
Privacy wasn’t really a concern yet because most people didn’t have the privacy-eroding device. In the years following the Nexus 5, smartphones went into geometric growth, and the slow realization of the privacy nightmare became apparent.
FWIW, I was really excited to get a Nexus 4 at the time; just a few short years later the shine wore off and I was horrified at the smartphone-enabled future. And I have a 40-year background in computers and understand them better than 99 out of 100 users; if I didn’t see it, I can’t blame them either.
Define usable. IMHO, before the Nexus 4 everything was crap, the Nexus 4 was barely enough (4x1.4 GHz), and the Nexus 5 (4x2.2 GHz), plus the software of the time (KitKat and later), was when it was really ready for the mainstream.
I'd say from my experience the average Apple user cares less about privacy than the general public. It's a status symbol first and foremost; 99% of what people do on their phones is basically identical on both platforms at this point.