Hacker News
Apple unveils the new MacBook Pro featuring the M3 family of chips (apple.com)
196 points by mfiguiere on Oct 31, 2023 | 427 comments


Related ongoing threads:

Apple unveils M3, M3 Pro, and M3 Max - https://news.ycombinator.com/item?id=38078063

Apple supercharges 24‑inch iMac with new M3 chip - https://news.ycombinator.com/item?id=38078068


From a marketing/product-positioning perspective, I liked that Apple focused on performance per watt as their main selling metric. Yes, it's fast and you could likely advertise on that alone, but the per-watt performance is just leagues ahead of everyone else, especially with Apple being the first to ship on TSMC's 3nm process.

The 22hr advertised battery life is definitely a side effect of this. I'm super curious how long it ends up lasting in real-world tests.


Historically, though, focusing on performance per watt is a technique that hardware companies use to distract from otherwise unimpressive concrete performance improvements. You'd expect a new process node and chip shrink to bring modest performance-per-watt improvements even with largely identical designs. When it comes to GPUs, this has been combined with actually increasing the power draw to make the generational performance improvement look more impressive; so technically, performance per watt would improve, but the power footprint would still increase, and yet the performance improvement would still be rather lackluster for any given price bracket. (This is especially true because you're competing with last generation's current street prices and used prices: if I can get a used M2 machine that's competitive in performance with a new M3 machine in the same price bracket, it might be more worthwhile.)
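To put toy numbers on that (these figures are invented purely for illustration, not real GPU data): a card can post a big headline generational gain that mostly comes from extra power, while perf per watt only creeps up.

```python
# Hypothetical generational GPU figures, made up for illustration only.
old_perf, old_watts = 100, 200   # previous generation: perf units, power draw
new_perf, new_watts = 140, 260   # new generation: +40% perf, +30% power

perf_gain = new_perf / old_perf - 1                              # 0.40
ppw_gain = (new_perf / new_watts) / (old_perf / old_watts) - 1   # ~0.08

print(f"headline perf gain: {perf_gain:+.0%}")   # +40%
print(f"perf/watt gain:     {ppw_gain:+.0%}")    # +8%
```

Most of the marketed +40% in this sketch is bought with power rather than design; the efficiency story underneath is far more modest.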

That's not to say there's no value in focusing on energy efficiency, especially for laptop chips; Apple Silicon was leagues ahead in energy efficiency when it launched, even if the gap is a lot smaller with newer Ryzen mobile chipsets. And speaking of Ryzen, the 5950X is a paragon of performance per watt in a desktop chip, which is still great, since it makes the chip easier to power, cool and overclock. It's more that a shift toward touting energy efficiency could signal that they may be hitting roadblocks in unlocking substantial performance improvements through better chip design alone, which of course makes sense: it's not like you can just linearly come up with design improvements every year forever, and they already came out swinging with their first-generation chips.


The other point here is that "performance per watt" is how you get "performance" when single-thread performance is hard, because it lets you fit more cores into the same TDP. The top end EPYC 9754 that will stomp basically anything has a 360W TDP but that's for 128 cores, which is less than 3W/core.
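The per-core budget in that EPYC example is just the TDP divided across the cores:

```python
# EPYC 9754 figures quoted above: 360 W TDP spread across 128 cores.
tdp_watts, cores = 360, 128
per_core = tdp_watts / cores
print(f"{per_core:.2f} W per core")  # 2.81 W per core

# The same TDP with half the cores would leave twice the per-core budget:
# the trade-off between many efficient cores and fewer, hotter ones.
print(f"{tdp_watts / (cores // 2):.2f} W per core at 64 cores")  # 5.62 W per core
```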

The trouble for Apple is that although you can do this, they're paying the highest bid for first access to TSMC's new process, and doubling the core count would have used that much more die area, which might have delayed the release, since it takes more time to build up stock. So now we're in this weirdness where they give you a 22-hour battery life (how often are people awake that many hours straight?) but get outperformed by the Ryzen 7000 HX line, on the older-generation process node, because it has more cores.


> Historically, though, focusing on performance per watt is a technique that hardware companies use to distract from otherwise not very impressive concrete performance improvements.

Before Apple came along and made people realize PPW is pretty damn important in the mobile/laptop space.


It's definitely something that vendors at least knew was important. ARM/AArch64 chips were already far ahead of x86 in performance per watt even when x86 chips generally had a process advantage. But ARM laptops generally sucked. Yes, the battery life was excellent: even a crummy Samsung Series 5 Chromebook delivered pretty impressive battery life for its light weight and price. But the performance? Yeah, it was shit. Good enough for a few tabs of Chrome and nothing more, which for the price was honestly not so bad. But the calculus for phones definitely worked out in ARM's favor.

But that's the thing. Sufficient performance per watt is very important. Improvements to performance per watt are rarely unwelcome. However... you can mislead people regarding generational improvements when touting performance per watt metrics. If the design didn't actually improve much, a new process node and pumping more power in can cover a bit. Pretty much every vendor does this at some point, and even if Apple isn't currently doing this, pivoting towards touting the efficiency improvements rather than the raw performance is not a good sign, especially considering that these chips go in desktops too, and Apple still has some ground to cover if it wants to compete fully with high end desktop CPUs from the PC market.


Perf per watt is not new, and it is not just marketing hype; it is the limiting factor not only for notebooks but for desktop GPUs and CPUs. The envelope is power vs. heat vs. performance. Increasing perf per watt lets you do more in that envelope, whether in a notebook running on batteries or in a desktop with giant coolers holding you back.


Anecdotally, I've been able to use my M2 Max MacBook Pro for 12 hours straight with 50% battery at the end of the day. Nothing intensive, just Chrome, Emacs, Spotify, Slack, Discord, etc.


Nothing intensive, just Chrome, Emacs, Chrome, Chrome, Chrome, etc.


Lol spot on. It's actually a bigger flex than you might think to be getting that much battery life with all that Chrome running.


Google Chrome Helper (Renderer), Google Chrome Helper (Renderer), Google Chrome Helper, Google Chrome Helper (GPU), Google Chrome Helper (Renderer)...


My M1 Pro can do damn near two workdays on one charge as long as I'm not doing shit in Docker.

The Lenovo Thinkpad I was also issued loses like 40% of its charge overnight, while "sleeping"....


Is this on very low brightness?

My 14" M1 Pro rarely got more than 7 hours, and my former work 16" M1 Pro no more than 9 on a React workflow. When I introduce VM work or anything fairly heavy to the mix, take 2 hours off each. Typically at 50-60% brightness, with iStat showing load so I can catch any errant processes.

My biggest issue with this release is no published improvement in battery life. 22 hrs is for movie watching. Wireless web, of course a very low load, is still 15 hours for the 16" and 12 for the 14", same as the M2 Pro, and each just 1 hour more than the M1 Pro.


I almost never touch the brightness. It does what it wants. Not sure. Looks like it's at about 1/3 right now, but it'd be brighter if it weren't kinda cloudy here, I bet.

Program choices matter a lot. I gained about 30% battery life switching from Chrome to Safari, years back. I do have Teams and MS Outlook running all the time on that machine though, and they're not exactly slim.


Try OrbStack. Should be much better on battery life.


OrbStack is awesome, but fair warning, its disk image does NOT play nice with backups.


Thanks for the tip. On Colima now, will give that a try when I get a chance.


I second this. Night and day.


Today, we saw independent benchmarks of Qualcomm's new 4nm chip that provided 50% (Geekbench) to 100% (Cinebench) more multicore performance than M2, while using 23W. And it can efficiently encode AV1. Maybe M4 will catch up.


Well, benchmarks are nice. But a processor is not an end-user product. How do you think this is going to fare as a product once you put it in a PC running Windows, especially as Microsoft struggles to get third-party apps onto ARM and relies on its own substandard x86 emulator?


I cannot fathom the incompetence of vendors that want to ship Windows on ARM devices.

1. The only reason you make people deal with Windows is backwards compatibility.

2. When you advertise it as a Windows laptop people will expect to be able to run their apps (they expect backwards compatibility). RIP your reputation and support inbox.

Yes, it will be harder to sell when it runs Linux. But it's the correct expectations management and at least it will suck less.

Oh well. This is what Google will do with their Chromebooks. Windows on ARM has the same future as the Windows Phone.


Have you tried x86 emulation on Windows for ARM? It works just fine for me.


It's slower and one of the main things people want Windows to run is games. Also, games are one of the things emulation systems are most likely to break, because games use all kinds of weird performance hacks and come with heinous anti-cheat systems.


Have you tried it though? I use it primarily for games and I've never had an issue with it


I had a Thinkpad X13s for a while that worked quite well except for a video issue (used pawn shop purchases are a risk like that). Firefox, Edge, and MS Office worked great natively on ARM. LibreOffice worked just fine via the MS x86 emulator. And the X13s had the old Snapdragon 8cx Gen 3 with less than half the performance. Granted, I wasn't doing any heavy lifting with it the month I had it.

I mainly got it to test out my PortableApps.com stuff running under ARM. I'm doing it now on a Macbook Air M1 with Windows 11 running under UTM.


The entire value proposition of Windows is its wide array of application support for every little niche app you could possibly want.

You just named a few apps that are on Macs.


I got it to handle 3 things: a laptop I can use as a laptop for basic stuff and to use to remote in to my development desktop at home, a Windows ARM machine I can test my Windows x86/x64/ARM64 software on, and a Mac to test out my software running under macOS via Wineskin. It's a little clunky but it works for all 3.

My original set was a regular full-fat Windows laptop, a super-cheap used Galaxy Book Go Gen 1, and a used Intel Mac Mini. I later replaced the Windows laptop with the Thinkpad X13s. If it hadn't had video issues, I'd still be using that. But a used base-model Macbook Air M1 serves the purpose for now.

I'll likely switch back to an Apple Silicon Mac Mini and a Windows laptop of some sort later as I much prefer a Windows laptop to my current Macbook.


Intel and AMD should get ahead of the curve and put some real support behind desktop Linux, which has an actual path to ARM adoption for a much larger portion of its software.

But then again at least one of them should have been doing that 20 years ago.


I don't use Windows. The Raspberry Pi has great Linux support, and the Linux Geekbench scores were even higher than in Windows. Unfortunately, I couldn't find Linux Geekbench scores in the 23W configuration.


To a first approximation, no one cares how well a Qualcomm desktop chip runs Linux. The consumer market is Windows and Macs.


The consumer market is the Windows ecosystem.

Without the ecosystem, who wants Windows? And what will it do for the reputation of the vendor, or of Microsoft, when people can't run their apps?

People are already buying non-Apple ARM laptops. They are called Chromebooks. They can run Linux apps and Android apps, and that's more than most consumers would expect.


Chromebooks are basically the hardware equivalent of a SaaS product where the user is not the buyer. Schools are buying Chromebooks for students.


Most people I know have some kind of Chromebook (next to a MacBook for work). They bought them in stores.

Is this a US thing where your school buys you a laptop?


Many schools in the US provide their students with laptops and Chromebook is the overwhelming favorite. The student has to return the laptop at the end of the year. As an anecdote, I know of no one who bought a Chromebook for personal use. My friends, colleagues and acquaintances are buying Macs or Windows machines if they want a laptop, iPads or Android tablets if they just want a tablet.

In a way, Google's strategy of getting Chromebooks into schools may have backfired as they're largely seen as kids' computers.


> Is this a US thing where your school buys you a laptop?

It is a thing that is essentially an equivalent of an employer-provided work laptop, but for students and provided by their school.


I’ve never even seen a Chromebook in real life here in Australia. I’m sure they must exist but I’ve never heard of anyone using one.


They are in use by the millions in schools. I have one and I really like it: it can run Android apps and the Chrome web browser, and in the Crostini Linux system I can run any apps, dev tools, web browsers, emacs, and it is native. I like it better than raw Linux because of the built-in Android support.

I gave ChromeOS laptops to my family because they can't be trusted to maintain a computer. Now they have reliable laptops and don't get virus infections or OS problems.


Yeah ok interesting! Makes sense for the Linux parts - not sure I’d trust Google with my family’s data but that doesn’t mean they’re not useful.


That's a complicated story. No one really wants surveillance capitalism. I don't think Google copies what I'm doing on my Chromebook, but almost every website has Google tracking, and Chrome has Google tracking. You can use non-Chrome browsers on ChromeOS; they are all there via the Linux subsystem. You can also run the Android ones.


Linux is 10% of combined desktop/laptop share, and even higher of laptop share alone, given that very few people use ChromeOS on desktops.


Just a quick Google search shows 3%.


My previous comment explains why you interpreted your Google search incorrectly. ChromeOS is Linux. https://www.gizmochina.com/2023/03/07/windows-losing-market-...


This also shows the power of marketing. ChromeOS is a subset of Linux: it doesn't do anything you couldn't always have done with Ubuntu. But for years people said that normal people don't want Linux, that it doesn't run their apps, that they can't use it.

One company shows up with a marketing budget and suddenly it's got triple the market share, up to the level Mac traditionally held back when that share was enough to support all the things "nobody makes for Linux" because "nobody uses it".

We're also at the point where things like bank websites don't "officially" support Linux, but as a general rule they don't have any problems on it, and if they did have problems it would be a problem the bank has to deal with instead of a problem the customer has to deal with.


That’d be pretty incredible if it plays out in reality once products are released. Though it sounds too good to be true.

I’m skeptical since Qualcomm has failed for years to catch up to Apple processors in the smartphone market. So why would their first effort in desktop processors be so much better?


There's actually a good reason. In short, a large portion of the Apple silicon team left a few years ago to start a new company named Nuvia. Their goal was to produce high-performance chips for the enterprise/server market, and they had some very aggressive performance targets [1].

Then, in early 2021, Qualcomm acquired Nuvia, and these new chips are the first showings of that acquisition. Naturally there's a lot of hype, since that team represents a lot of the talent that made Apple silicon so good in the first place.

[1]: https://images.anandtech.com/doci/15967/N2.png


Apple Silicon was great because Apple had invested huge amounts of resources for a decade on smartphone processors first, not because they had some kind of geniuses on the project.

It is no doubt promising that Qualcomm has brought in more talent, but it still takes time and effort to turn that into a best-in-class product. I’m not saying it’s impossible, though I’ll be skeptical of the hype until I see a real product.


> Apple Silicon was great because Apple had invested huge amounts of resources for a decade on smartphone processors first, not because they had some kind of geniuses on the project.

Right, and the guys who learned all the hard-won lessons along the way walked out the door to start a company, bringing along expert knowledge of Apple's designs and processes. And then Qualcomm bought them.

So on a surface level it seems implausible that QC could produce such a chip. But when you zoom out and go "oh, Qualcomm effectively bought Apple's senior chip engineers" it starts to make more sense.

It would be like if Qualcomm's top modem engineers started a company, which Apple bought. And then a couple years later Apple's long-running modem project mysteriously turned a corner and was ready to launch an exceptional modem. Like yeah, no kidding.

So yes, we need to see independent benchmarks and make sure it's not hype. But it's not so unbelievable that Apple's former top engineers could also produce a good chip for another company. There's nothing magical about the Apple office: it's the engineers.


Engineering something as complex as a CPU is a long process regardless of how smart and experienced your engineers are. I mean, you can certainly speed it up with great talent, but there is still long and hard work to do with any difficult engineering challenge.

I’m not saying there’s something special about Apple other than the scale of their investment over a long period of time.

It’s the same deal for Qualcomm and their 5G modems. Apple no doubt has hired many talented engineers to make a custom 5G modem. But Qualcomm’s modem is still the best one around. It’s hard to catch up because Qualcomm has been investing heavily in that space for a very long time.

Again, that’s not to say Apple won’t ever catch up. Just that I wouldn’t expect that their first effort will be better than Qualcomm’s modems.


Nuvia has been working on this tech for years before being snapped up by Qualcomm. And before that, those same engineers had worked on Apple silicon for years. Why do you keep thinking this is an overnight thing?


For starters, the claim that they were working on high performance computing before getting bought.

Taken at face value, that story is hard to reconcile with a sudden pivot to mobile.


To be fair it’s a bit of a myth that only mobile cares about efficiency and thermal management. It is definitely a factor for HPC and server too.

Apple scaled iPhone first designs up to the M* Ultra chips. Going from HPC to a mid wattage laptop is definitely serious work, but I don’t think it’s impossible. Especially with ARM.


Apple clearly iterated on that process over the course of a decade and multiple generations of chips, eventually achieving that outcome.


Am I taking crazy pills?

The whole point of this thread is that those same Apple engineers made these Qualcomm chips.

Yes. Apple iterated over many, many years. Learning so much along the way about how to make performant, efficient ARM designs.

And then a bunch of the most important of those guys left to start their own company.

And then Qualcomm bought that company.

Y’all are acting like a few college kids from Stanford made Qualcomm a new CPU over their summer internship. “It takes longer than that to make a good CPU.” Yah no shit!


You are taking crazy pills. Making a high performance chip requires more than just having a bunch of talented and experienced engineers. Is it a necessary requirement? Sure! But it’s far from sufficient.

Apple brought on PA Semi and then slowly iterated on actually shipping hardware for years. They didn’t hire PA Semi and have a best-in-class product on the first go.


And those same guys who slowly iterated on shipping hardware for Apple for years are at Qualcomm now.

Are you saying it’s a requirement that these guys ship a crappy chip first? Why? They already know how to make good ones.

Can you tell me what more they need other than their talent and years of experience to make a good chip? Because if it’s just “I demand they make a bad chip now because they’ve changed logos on their corporate polos” I don’t think this conversation has anywhere to go.


Is the assumption here that Apple has developed a business process for building best-in-class CPUs while treating its engineering workforce as fungible commodities? If so, they've succeeded in doing what Intel has been trying to do for decades.


> Apple Silicon was great because Apple had invested huge amounts of resources for a decade on smartphone processors first, not because they had some kind of geniuses on the project.

Given that they were confident enough to leave and start their own company, I'm not sure this is true. Indeed I wouldn't discount the value of high talent density.

> I’ll be skeptical of the hype until I see a real product.

They have shown real hardware demos [1] to reviewers already, and the numbers look solid. Obviously there are no comparisons vs M3 yet, but it seems promising.

[1]: https://www.anandtech.com/show/21112/qualcomm-snapdragon-x-e...


That Anandtech article was full of caveats about how the benchmarks are not typical and about how comparisons are not easy at this point.

Again, it seems promising, but we’ll find out a lot more once a product is actually released that can be fully benchmarked and reviewed.


Not one or two geniuses, but a really good team. They supposedly lost key members of that team, which threw them off track. An organisation can recover from this, but it takes time and money. Not everyone likes to work under Apple-like working conditions towards Apple's goals.


> Apple Silicon was great because Apple had invested huge amounts of resources for a decade on smartphone processors first

Doesn't that also describe what Qualcomm has been doing?


Sure, but they’ve been behind the whole time. It’s not like they’ve been trading blows each generation with Apple.

If the premise is that Qualcomm hired a team of super talented engineers who can build a product that competes with Apple, then those engineers will still need time to develop a product.

Again, maybe their new processors will be everything they claim. It’s possible for this to happen. I’m just not willing to buy into the hype yet.


There's also a pair of massive lawsuits alleging IP theft/infringement from ARM and Apple.


I am going to go out on a limb and suggest that Qualcomm's pre-production benchmarks are of no value and the actual performance of real devices will be worse than Apple.


That's quite a limb to be standing on if you expect performance to decrease by half.


To be fair, I don't think many would have trusted that Apple's M1 chips were going to be as good as they were before they were actually in consumers' hands. I'll reserve judgement too until that's the case.


What you didn't mention is that the Snapdragon X Elite has 12 high-performance cores, making it an M2 Max competitor and not an M2 competitor, at least on the CPU side. The GPU is disappointing, with no ray-tracing support.


The M2 Max has a TDP of 79 W. I gave the Snapdragon's results at 23 W. The 80 W configuration scored even higher.

Where did you see that there was no ray tracing support? Qualcomm introduced hardware accelerated ray tracing in the previous generation.


Are these Qualcomm chips in products shipping next month?


It’s like different parts of the TSMC factory are in competition with each other. What a surreal situation.


TSMC doesn’t care. It’s like an arms dealer, it will happily sell to both sides.


At least 1/2 hour of Teams calls


Perhaps the most interesting thing is the UK pricing. I paid £1899 for my M1 Pro. The M2 Pro was £2149. The M3 Pro is £1899 again for the same spec.

The insults, however, come now. There is now a £1699 8GB option. Seriously, in 2023? 8GB?!?! And you can get an i7 Lenovo T14 Gen 4 with 32GB of RAM and a 1TB disk for £1297. Every 8GB RAM increment with Apple is a complete rip-off.

And the total build price for my silent i5-13500, 32GB, 1TB desktop with an RTX 4070 and a 27" 4K Dell monitor was £50 less than the base-level 8GB MacBook Pro.

No thanks. I'm out.


When it comes down to it, I think you'll find most people buy Mac for the software.

I've tried using Windows and Linux and it's just not as good. If this weren't the case, then I doubt Apple could charge such exorbitant prices for their hardware.


I'm not sure why you'd go for a Mac for the software? The cloud experience sucks and is really, really expensive, most of the provided desktop apps are buggy or so different from the rest of the universe that they're a liability, and everything is pretty portable.

I mean here I am all VSCode + Linux (WSL2) + Office 365 + Adobe + Davinci Resolve + mathematical tools (SciPy/Maxima/Minitab) here. All are portable (apart from Minitab)


I agree that Apple's software is low quality. In too many cases shockingly and consistently so over many years. You are not surprised to see the spinning ball of fail when running Apple software. You expect their developers to make beginners' mistakes.

And Apple has some spectacularly useless systems developers that manage to have at least a couple of daemons regularly blow up and use lots of CPU in every release. It is almost like some elaborate running joke.

I mean, what idiot wrote the calendar daemon. How the heck do you keep a job as a systems developer when you need to burn 70-80% CPU for prolonged periods of time to sync a calendar that hasn't even had significant use for years? Seriously, if anyone knows who wrote it, I'd be interested to understand how things went so wrong for them.

Yes, Apple's software sucks.

However, I don't buy Apple to use their software or their cloud services.

I use macs for the same things you do: run the applications I need to do my job. Applications that are not made by Apple. My list of requirements isn't long.

  - Must be a unix variant
  - Must be decent hardware (most laptops are horrible)
  - Must be able to run the developer tools I use
  - Must be able to run Fusion 360
  - Must be able to run professional photo editing software
  - Nice if it can run Ableton Live
That's it.


One such example is music production. Audio on a Mac just works. You can plug in a microphone from 20 years ago and it will just work. With Windows this is not the case: you need appropriate drivers, which often are not available. And that's not counting all the issues with audio routing etc.

I will agree that native cloud sucks on Mac. I would disagree about the desktop apps, I think they are superior to Windows. Programming is also superior on Mac, in my opinion.

Personally speaking, the big thing I hate most about Windows is the defaults. The default keyboard shortcuts, i.e. using F keys, are in my opinion very poor design. Also, the lack of an additional modifier key makes Windows less ideal from this point of view.

Of course, each OS has their pros and cons. Windows just doesn't work for me.


> Audio on Mac just works.

Except when it doesn’t. Like for example with macOS Ventura, where lots of people had huge problems with audio dropouts on USB interfaces.

The problem was never really acknowledged by Apple (as usual), but “magically” fixed in Sonoma (without any mention in release notes).

Leaving many users with the dilemma that they should upgrade to Sonoma asap to fix the USB problems, but on the other hand should wait with upgrading because maybe their DAWs or plugins are not compatible with Sonoma yet.


Yeah my piano doesn't work with macOS Ventura suddenly. Audio dies after 2 minutes. I'm only using it for monitoring so I just plug bloody headphones in it and be done with the computer.


The HDMI audio output from my M1 MacBook Pro had choppy buffer loops ever since release. That is, until one of the most recent software updates. Finally it 'just works'.


I agree that it's a problem. But also, you don't have to upgrade to the latest OS. Most people wait to see if there are any problems before they upgrade.

I think this applies to most things in life. Especially programming libraries at software companies. It's almost assumed that something will break when you upgrade any kind of system.


If you bought a Mac with Ventura, you pretty much just “had” the USB problem, without upgrading.

And downgrading these machines is somewhere between impossible and a huge pain in the b…


I heard this a lot. I did music production on Linux for 5 years, and tried it on Windows for a few months.

Then I bought my M1 Mac and was shocked to learn that I think it's actually the least sophisticated, by far.

Don't get me wrong, my M1 hardware competes with my 5950x Desktop system... which amazes me.

But, Jack2 on Linux was pretty cool, a little rough around the edges, but worked really well.

Now with Pipewire becoming the standard, I can't imagine why anyone would pick a platform besides Linux for any real time Audio+Visual routing/processing.

It's just so easy to pipe any of your data to any other piece of software, including over the network.


> Now with Pipewire becoming the standard, I can't imagine why anyone would pick a platform besides Linux for any real time Audio+Visual routing/processing.

Software compatibility is almost entirely nonexistent for industry-standard workflows.

> It's just so easy to pipe any of your data to any other piece of software, including over the network.

Not a frequently required feature (or at all) for 99% of music production use cases.

Linux audio is getting there, but it's a million miles away from being a feasible choice for most people in the industry. I mean, I don't think I could find a single person who would trade their MacBook for a Linux laptop when it's running their entire live setup.


I've found the opposite. All my Focusrite and Presonus hardware interfaces rely on OSX software that is no longer supported.


What does the software do that's necessary to make it function? I have a Focusrite and don't rely on software for it to function.


Normally it's the mixing-control software for older hardware that otherwise works fine, like the Saffire Pro (a FireWire interface), or a Tascam USB interface with an internal mixer that is impossible to operate from the hardware itself. E.g. line/instrument switches are done digitally.

I ended up going back to Windows to use these and switched to Studio One instead of Logic Pro X.


What mic were you using 20 years ago in audio production that required drivers? The standard interface was, and still is, XLR.


I keep going back to MacOS because it works fine, stays out of my way, and has the Unix stuff I work with.

I'm ready to pay a lot of money for an OS that neither annoys me nor demands work from me.

The apps themselves... yeah they're mostly the same.


Speaking as someone who bought a Mac for programming reasons but then picked up video editing, iMovie has been a fantastic first-time video editor.


After having so many Windows laptops quite literally fall to pieces, I just get a MacBook now. They last so much longer and you easily get your money's worth.


> most people buy Mac for the software

IMO the last OS release that was extremely smooth and nearly completely bug-free was Mac OS X Snow Leopard.

Nearly everything afterwards was a slow downwards spiral, both in terms of useability and quality.


But Lion let you resize windows from any corner or edge. It does mark a turning point, though, with the loss of Front Row, and iTunes kept getting worse.


I don't. I buy Apple products for the hardware. They last and they work when it comes to iOS. When it comes to Macs, the software usability is shite. I've had to resort to 3rd party apps (Amethyst, AltTab) to be able to effectively organise and switch windows.


No, the only reason I'm gonna buy a Mac is to be able to build & publish an iPhone app.

The Apple ecosystem is closed, you must pay the Apple hardware tax to get in.


For the price of Macs, I get Lenovo and Dell graphic workstation like laptops, e.g. Thinkpad P series.

While I am a big NeXTSTEP fan, I am not paying for the experience to the detriment of less capable hardware, especially when I can just borrow Mac hardware from our R&D pool at work.


I switched from a Lenovo X1 Carbon, totally gave up on Windows, and bought a Mac again. Windows has become some kind of shitty garbage of an OS. Reminds me of the toolbar creep that used to happen in IE 10 years ago. Now the entire OS is fucking like that.


Amen. Just started a job where they expected me to use Windows as a programmer. Not a fucking chance.


I generally disagree with arbitrary hardware comparisons, like "why buy apple when I could build some gaming rig for $5" for obvious reasons that have been beat to death in every comment section ever, but you're very right about the hardware upgrades specifically.

I want a new mac, but I feel like an idiot paying those prices for both RAM and SSD upgrades, more so than ever before. In CAD, at one point, RAM upgrades were already too much at $200 or something, but I'd reluctantly suck it up and be happy with the decision anyway. Now they're literally $500 for each 16GB increment, and $500 for each TB increment. I'm not about to replace my existing mac with some Windows hunk of garbage, but I'll have to wait until I'm either rich or I can directly translate those upgrades into predictable dollar-amount returns on the investment. It's bizarre that I can price out a laptop that's $5k before tax that doesn't have more than a 512GB SSD, has only half the possible RAM, and doesn't even have the largest screen.

On the other hand, the mid-tier option is probably still great, but costs about $3500 and also still only has a 512GB SSD.


Well, price includes membership for the sect of ‘good morning, Amazing’…

Honestly I feel much more in place with a win11 setup recently. For a dev who does not rely on apple’s Xcode etc it makes very little difference. Besides win11 is scary snappy on every decent box, not only Microsoft surfaces.

And except for the speed gains, the gains from using OSX have been diminishing rapidly for years. Sticking with osx12 on some devices does not impact my workflow at all. I guess I can go back to 10.x and … VScode, git, etc will still work the same.

Speaking of security et al - well that’s fine, but some of the most aggressive hacks are targeted at OSX.


I've been using M1 since it came out. Loved it. Fast, snappy, smooth. Prior to that, I had a Dell XPS that ran loud and always felt laggy to use.

I was going to upgrade to M2, but $2k+ for an essentially base laptop. Nah.

I grabbed a new Zenbook 14 OLED w/current gen i7. Metal chassis, nice keyboard, nice trackpad. Nice 120Hz screen. Solid construction. Windows 11 which I have gotten used to. It feels pretty well polished in terms of hardware and OS. Feels snappy. i7 is a wonderful, fast CPU.

While it isn't an M2 Mac, an M2 Mac is no longer wildly better than a good Windows laptop. The $1300 difference for the same performance is a strong selling point for the Zenbook which is 96% as good as an M2.


Why did you need to change computers at all? If you had an M1 mac, seems like a weird choice to want to replace it with what seems like a horizontal move


That sounds great! How's the battery life?

Theoretically using large black terminal windows can maximize it.


Agree on all counts.

I'm mostly just happy since I switched back to windows 11 that I've got an OS I can actually drive from a keyboard that doesn't leave my hands brutally mangled from meta key combinations.


No doubt current-day Windows is a capable OS for all tasks. Too bad it treats you like a free-tier user, clogging your operating system with ads while spying on you like a crazy stalker.


I don't get any ads. But yeah I'll give you that on the Telemetry.


Honestly it doesn't feel any different for me on MacOS. Both OSes push first-party music and cloud services, both OSes beg you to use their browser, both OSes show news with their own built-in ad networks. At this point for Windows 11 and MacOS 13, I'm shocked how little these OSes have meaningfully changed since the early 2000s. Both companies feel like they're taking a detour around user empowerment to focus on Selling Moar Services™.

While we're looking at things from a retro-idealist lens, even Android and iOS seem unnecessarily bloated. It feels like the vast majority of every FAANG's engineering effort goes into crippling the user nowadays.


Almost 24 hours later so likely you won’t see this, but I’m interested what your setup is hardware (monitors) wise and workflow wise that works best on Windows?

I’m pretty platform agnostic. I own and relatively regularly use Windows, Linux, and MacOS machines and have spent time using each as my main work system in the last five years.

I currently use a single 49” monitor and I’ve found MacOS with Amethyst for tiling and window management plays the nicest on it but your comment about hand mangling (and never completely remembered) meta key combos definitely rings true.

I haven’t found anything on Windows or Linux that doesn’t have the same issues though. It’s possible I’m irrationally afraid of mistypes and overly avoid single meta key combos on any system.


Genuine question: have you actually used a modern Mac? OS X was renamed macOS 7 years ago in 2016.


I have used it for the last 15 years and it hasn't got worse and hasn't got better. It has barely changed. It is done. However, there are loads of little annoying bugs (Reminders makes me want to kill people) and the developer story on the platform is a mess now. Literally minor releases will break things I use on a regular basis. Last month they broke something that broke Maxima (a mathematical CAS) for me. I have to use that, so off to Windows it is.


Also, it was never called “OSX”.


> I paid £1899 for my M1 Pro. The M2 Pro was £2149. The M3 Pro is £1899 again for same spec.

Just take a look at the last several years of GBP/USD. Our currency was in the crapper for a while and that impacted prices.


I fucked up. The spec is worse.


I got (for my wife) an entire Framework Ryzen 7840U system, plus a 2TB SSD and 32GB RAM purchased separately, for THE UPGRADE COST on an M2-class MacBook between its base spec and 32GB RAM/2TB SSD. It's a farce. I don't want to leave the Apple/OSX ecosystem after 20 years, but I simply can't justify it any longer. I bought a €300 used ThinkPad. Fedora's pretty good these days, and I'm mostly in a terminal or Firefox. I'll live. Wife's been mostly running Windows on Macs since 2011 anyway.


to be fair that upgrade price also includes a jump from m2 to m2 pro and the improved design with better screen


As ever when new models come out I fully spec'd one out on the UK store. 128GB M3 Max, 8TB SSD. £6999 14", £7299 16". Ouch


Humans need at least 8 kidneys these days.


More scary are the lower-spec prices: 64GB RAM, best CPU because it's required for 64GB, 512GB SSD (no SSD upgrade), $5k CAD before tax.


It is a major failure on their part not to put pressure on everyone else by doubling memory. LLM models would massively benefit. As a manufacturer, they could probably buy 32GB for less than 100 USD.


Why do you compare a self-built desktop PC with a Mac notebook? I fail to see anything they have in common.


Or ~£1000 for a Ryzen 7 32GB 2TB T14s Gen 3 if you buy your own SSD (saving £200+) and drop Windows (~£80).


No thanks. I had a Ryzen T495s. Nothing but aggro that thing. I religiously avoid AMD because in the last decade I've had nothing but trouble from all of their hardware. Intel and NVidia only these days.

I've got an i5-1235U T14 gen 3 which I paid £730 for (new, but damaged box) and stuffed another 8Gb stick and a 1TB WD SSD in it. That is fine for my personal use as it's mostly an Office and light code machine.


This caught my interest. What problems have you had? I had a T495 and was happy with it running Fedora. So much so, just ordered the latest P14s ryzen model as its replacement.

The T495 developed a fan problem just recently, but since it is out of warranty it was easier to get my employer to order a new machine than to figure out how to repair it. If it were my personal machine, I probably would have searched for parts and tried to repair it myself and keep using it...

Edit to add: I am in your mirror universe because it was bad experiences with Intel + NVIDIA in Thinkpads which motivated me to want to try AMD again. I don't mind pure Intel iGPU models, but find the combination nothing but trouble.


Same for me, the Ryzen GPU is good enough to run recentish games at 1080p 30fps which is great, and avoids the bugginess of a discrete GPU setup.


Up until the most recent Intel gen, Ryzen has been the clear leader in both battery life and integrated graphics, blowing Intel away. Pretty much any internet forum will advise you to buy AMD for the T series.


Don't need integrated graphics. Battery is not that important to me.

Reliability is.


There's no other computer that has this performance on an integrated GPU, unified memory, and battery life.

But if you don't need any of that, of course even a used windows laptop would do and can have 32GB RAM.

I paid 2200€ for the M2 Macbook Pro in Sweden, but PC hardware sales have been tanking hard, so I think the pricing reflects that.


But as I found, it has too many drawbacks.

Firstly, you can't add any more RAM to it. This turned out to be a BIG problem for me as I handle a lot of fairly huge RAW files in Adobe Lightroom.

Then there's the fact that it actually spends 95% of its time docked to my Studio Display and using the BT mouse and keyboard which fuck out all the time.

And the battery life isn't as great as people make it out to be. If I am on the road, which is basically never, then I'm lucky if I get 4 hours out of it.

So I thought sod it, why shouldn't I just have a proper desktop instead, which solves all of these problems? I looked at the M2 Studio, went pfffft at the price, and threw together a Windows desktop. I've got hard-wired keyboard, mouse, ethernet and headset, and I can throw another TB in it or double the RAM to 64GB for less than $100 each.

And Excel isn't a piece of shit on it.


> I handle a lot of fairly huge RAW files in Adobe Lightroom.

Define “huge”.

I edit 45-megapixel RAW files on an M2 MacBook Air all the time and the only way that makes any sense is if you cheaped out and bought the base model. 16GB of RAM is still plenty for Lightroom and 32GB is more than enough. The battery life is also amazing unless I'm dumb enough to do something like generate 1:1 previews for thousands of photos all at once. Yeah, you can't upgrade it later, but that's why you need to make sure to buy enough hardware when you're getting it in the first place.
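For a sense of scale, here's a rough sketch of the memory footprint, assuming an uncompressed 16-bit RGB buffer after demosaicing (Lightroom's actual caching behavior will differ):

```python
# Approximate in-memory size of one demosaiced RAW frame:
# pixels * channels * bytes per channel.
def decoded_frame_mb(megapixels: float, channels: int = 3,
                     bytes_per_channel: int = 2) -> float:
    return megapixels * 1e6 * channels * bytes_per_channel / 1e6

mb = decoded_frame_mb(45)  # a 45 MP frame
print(f"~{mb:.0f} MB per decoded 45 MP frame")  # ~270 MB
```

Even a few dozen such frames in flight fit comfortably in 16GB, which matches the experience above.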


It's the price for a one-of-a-kind portable device. I used to think like you, but then I realized I'll be better off just accepting the price than suffering with an inferior Windows device.


I've got an M1 Pro MBP, Studio Display, iPad Pro, Apple Watch, iPhone. Meh. It cost a lot and makes you feel special. That is all. It's not objectively better at achieving your goals.

At this point I don't agree that windows is inferior. It is just different. And quite frankly from a developer and productivity perspective it's actually a lot better.


> At this point I don't agree that windows is inferior. It is just different. And quite frankly from a developer and productivity perspective it's actually a lot better.

I think it depends a lot on what you're developing.

For a lot of open source stuff, even OSX is a 2nd class citizen, and windows isn't even considered. Of course, you can do everything in a VM, but that can have its own problems.

If you live out of Visual Studio, though, I'm sure the experience is nice.


> For a lot of open source stuff, even OSX is a 2nd class citizen, and windows isn't even considered. Of course, you can do everything in a VM, but that can have its own problems.

While it is true that Windows is not considered, with WSL2 that doesn't matter anymore. Everything (99%) just works seamlessly with the Linux environment. And yes, that's not truly native and is technically a VM, but I prefer WSL to a VM on macOS every day of the week. Ironically, if I had to switch to Linux, I'd buy a MacBook.


In my case mostly mathematical modelling tools, simple desktop apps etc. VScode is where it all takes place. I mean I even typeset in VSCode!


What laptop would you recommend as a substitute for mbp then?


But, then you've got a Lenovo.


Lenovo Lenovo Lenovo.... where do I start...

My Motorola (owned by Lenovo now) One Power phone got a dead pixel on its screen :)

My Lenovo monitor not only got a dead pixel, within a year it also developed a horizontal dead line. I was very lucky that it happened a month before the warranty expired. And after the repair, the replacement screen also got a dead pixel :)

My new Thinkpad died within days of my receiving the machine, and it took over a month for the repair to complete :)

The monitor and Thinkpad were bought directly from Lenovo, shipped straight out of their warehouses, so no one else to blame.

Oh, and did I forget the Thunderbolt controller firmware issue in my previous Thinkpad T480 :)

It's safe to say I'm not very happy with Lenovo.

... But on the other hand, for anyone who's planning to buy a MacBook, I usually recommend they watch these videos first just to lift the spirit: https://www.youtube.com/watch?v=yR7m4aUxHcM (this is part 3; there are also parts 1 and 2)

I mean, guess why I'm still with Thinkpads :)

BTW: It wasn't me who downvoted you :)


yeah Lenovo support is shit. It's why I just buy two of them most of the time :)


The sad thing is that we all know this. Normies buy the cheap version and you have to explain to them: Yes, it gets slower quick, it's shit, you shouldn't have bought it, no you can't upgrade anymore. Then they feel bad.


A “normie” would only need an M1 MacBook Air and that will be good for 10 years. I am using a 2015 MacBook Air, and since I only browse/do basic spreadsheet stuff/pdf things, it still works fine.

If you need anything more than a Macbook Air, you are not a “normie”, and should be capable of doing better research before you buy.


Unlikely. My daughter is a normie and hit a wall with the basic M1 Air.


Doing what?


University work. Pretty much what Apple promotes it for.


I was hoping for more specifics.

Our definitions of normie are probably different, because there is definitely university work that requires performance that “normal” (majority) of people will never need.


ah, yes, that's truer than it seems. it's literally bait for non-technical people.

but then again, you don't get to 1+ trillion valuation just like that...


Nailed it.


> I paid £1899 for my M1 Pro. The M2 Pro was £2149. The M3 Pro is £1899 again for same spec.

How do you think Apple has become a $1+ trillion company?


Due to the iPhone.


Apple showcased their continued silicon innovation with the new M3 chips. Here's what's new:

Performance Leaps:

  * CPU: up to 15% faster than M2
  * GPU: up to 30% faster than M2
  * Neural Engine: up to 60% faster than M1

New Capabilities:

  * Hardware ray tracing support
  * Up to 128GB unified memory (M3 Max)

Efficiency:

  * Built on a 3nm process
  * MacBook Pro: up to 22-hour battery life

Value:

  * 14" MacBook Pro starts at $1599
  * 24" iMac starts at $1299


Also negative 50gbps on memory speed


I was wrong, it's minus 50GB/s


Does that have a big impact on use-feel if you aren't doing data-write intensive tasks?


LLM inference speed is bandwidth limited.
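A back-of-envelope sketch of why: during decode, every model weight has to stream through the memory bus once per generated token, so bandwidth sets a hard ceiling on tokens per second. The 200 vs 150 GB/s figures are the commonly cited M2 Pro / M3 Pro numbers, and the 7B-parameter, 8-bit model is a hypothetical example, not anything from the announcement:

```python
# Decode-phase LLM inference reads every weight once per generated token,
# so memory bandwidth caps throughput:
#   tokens/sec <= bandwidth_bytes_per_sec / model_bytes
def max_tokens_per_sec(bandwidth_gb_s: float, params_billions: float,
                       bytes_per_param: float) -> float:
    model_bytes = params_billions * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / model_bytes

# Hypothetical 7B-parameter model quantized to 8 bits (~7 GB of weights):
print(f"200 GB/s ceiling: {max_tokens_per_sec(200, 7, 1):.1f} tok/s")  # ~28.6
print(f"150 GB/s ceiling: {max_tokens_per_sec(150, 7, 1):.1f} tok/s")  # ~21.4
```

So a 25% bandwidth cut translates directly into a ~25% lower ceiling on generation speed, regardless of compute.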


The baseline M3 model is a complete joke: 8GB of RAM and support for only one external display.

This should have been a MacBook Air.

Edit: Oh, and one less thunderbolt port, nice.


It does have a much better screen and relevant ports compared to an Air.


The craziest thing is that that single external display can be up to 6K/60Hz! Surely for those ~20.4 million pixels you could run two 4K displays at ~8.3 million pixels each.
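For what it's worth, the raw pixel arithmetic supports this (assuming 6016×3384 for Apple's 6K panel, as on the Pro Display XDR):

```python
# Pixels per frame: one 6K panel vs. two 4K (UHD) panels at the same refresh rate.
def pixels(width: int, height: int) -> int:
    return width * height

six_k = pixels(6016, 3384)         # Apple 6K panel (Pro Display XDR)
dual_4k = 2 * pixels(3840, 2160)   # two UHD panels

print(f"6K: {six_k / 1e6:.1f} MP per frame")        # 20.4 MP
print(f"dual 4K: {dual_4k / 1e6:.1f} MP per frame") # 16.6 MP
```

Two 4K streams actually need fewer pixels per frame than one 6K stream, so the single-display limit looks like a display-engine count restriction rather than a raw bandwidth one.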


It's not that easy to turn one fast clock into two slower clocks.


lol I don't think lack of clocks is an obstacle. Probably price differentiation.


> This should have been a MacBook Air.

Great, let's compare it to the MacBook Air, then. Compared to the 8GB/512GB M2 MacBook Air, at $1399, the M3 MacBook Pro has:

- M3 chip

- Slightly larger display with higher brightness, HDR, and ProMotion support

- SD Card and HDMI ports

For only $200 more!

Sounds like an amazing "Pro" MacBook Air.


Ports and display alone make this worth it. I wouldn’t recommend someone get 8GB ram but it’s pretty impressive when you consider it has everything else the other MBPs have (minus 1 port).

Also I’m glad they finally discontinued that embarrassment of the last touchbar Mac.


The passively-cooled Air will also hit thermal throttling sooner vs a Pro.

(Yes, for most users just browsing the internet and writing documents, this won't be an issue.)


I imagine it will go into the MBA in the near future. This offers a low price point for an MBP.


Wait. Isn't it 8GB of _unified_ RAM?


I really don't know why Apple insists on making the entry specs something like 512GB SSD and 8GB RAM for a ~$1600 device, and worse for their MacBook Air line, where it starts with a 256GB SSD for $1000 (M1) or $1100 (M2). That's already the price tag of some of the high-end alternatives.


I was close to getting the Air for $900 from Costco but was stopped in my tracks by the low storage, and upgrading just the storage is +$500. A frankly insane markup for 256GB of storage.


>and upgrading just the storage is +$500

The base model airs all have 256GB now. Upgrading to 512GB is $200, not $500. I don't think any Mx laptop has ever had a $500 charge for a one-level storage upgrade.


The base model is offered at a discount at Costco, but there's no such discount for the next level up in storage, so the markup must take this discount into account.


I think the worst thing is that many people don't even know upgrading is a possibility, if they want it. I am not sure if it's the same in all countries, but third-party sellers only have access to the base models. I talked a nice couple out of buying a MacBook Air with 8GB of RAM at a mini Apple Store inside a major department store recently. Of course the sales people don't mention that you can go to the real Apple Store and get models with 16GB (or 24GB in the case of the M2 MacBook Air).


Right at the end, at the beginning of the credits, it said it was shot on an iPhone 15 Pro and edited on a Mac. I think it might have been an MBP M3 Max… it flashed by so fast. It's fascinating that it was shot entirely on a phone.


here’s the PR video on how they made that happen:

https://youtu.be/V3dbG9pAi8I?si=XqzYSZVpegOFMDTN


Thanks for posting this. At the end of that video is this tidbit of information:

Shot on iPhone Edited on Mac

Additional pro software and hardware used. Special thanks to Blackmagic Design and Beastgrip.


I get why they are calling out vs M1/Intel, as this is primarily targeted at getting folks to upgrade, but it is kind of annoying that they aren't emphasizing the incremental gains vs the last generation. Also, the callout to AI developers to get an OK GPU but with 128GB of unified RAM is pretty smart.


Agreed, though they at least put M2 comps on screen in most of the places they did the M1 comps.


Will these M3 chips be competitive against cheap Nvidia cards for training small and medium sized networks?

The M2 wasn't, and Metal support in Torch was sketchy. It's been getting better lately, though.


For training you want the best NVIDIA card you can afford. Doesn't make much sense to use a laptop for training IMHO. There is an argument that the M3 Max is the best non-datacenter chip for inference with the ability to scale to 128 GB of memory.


Sure, I am thinking a bit ahead. That is, a Mac Mini / Studio / Pro with a M3 Max / Ultra could be interesting.


The base M3 model (whose price is a bit over €2100 with sales tax in the EU) has 8 GB of non-upgradeable ram. Yikes.


It’s pre-recorded so they don’t have to censor the audience’s boo


Think different.


It's great Apple realized cutting down on ports makes zero sense. My MBP has only two thunderbolt ports. I can see the new MBP also has HDMI, SD, and even microjack. I can finally throw my dongles and hubs away. That's the end of an ugly era.


You get back ports, but those base-model M3 MBPs only support a single external display. You get HDMI and two 40Gbps Thunderbolt ports, but can only use one of them at a time for screens. You need to bump up to the M3 Pro models to get multiple external display support.


They realized?

This new “basic” “pro” has only two thunderbolt 3 ports but “pro” “pro” has four thunderbolt 4 ports.

Basic Pro even supports only one external monitor.

The basic Pro is very misleading to people when they buy a MacBook Pro.


> This new “basic” “pro” has only two thunderbolt 3 ports but “pro” “pro” has four thunderbolt 4 ports.

My understanding is that it can't be branded Thunderbolt 4 because it only supports one external monitor, but that it's effectively Thunderbolt 4 in every other respect.

The technical explanation for the external display limitation is that the base SoCs have one Thunderbolt bus that supports two display outputs, one of which is used by the integrated display.

Still, there are workarounds: https://www.macworld.com/article/675869/how-to-connect-two-o...


That's not new, this laptop chassis launched in 2021. It's also thicker and brought MagSafe back!


Bit disappointed that they're still using the same mini-LED displays. After the iPhone announcement, I was looking forward to proper OLED with 1600-nit peak for the MacBook Pros.

Don't get me wrong, I'm very happy with my 16" Max display, but the dimming zones are quite visible, especially with poorly brightness-mastered content (some of which is on Apple TV+!).

I can't watch the whole event, but the continued complete lack of marketing toward gamers I must say is also a bit disappointing. You have these fancy GPUs, Apple; go for the jugular!


An OLED would actively put me off buying it for now, I think. Laptop screens are expected to keep operating a very long time; it seems likely that burn-in would shorten the product lifetime a lot.

My current personal laptop is a 2016 13" Pro, and it's basically _fine_. People keep laptops for quite a long time now; I don't think OLED is there on durability.


OLED image retention makes them a bad match for desktop operating systems that have a lot of static elements like menu bars, docks, taskbars, etc.


It's especially annoying that they announced ray tracing support but didn't show a real title using it. I wonder what happened internally because I'm sure they were working with partners to get it working... I mean, RE Village launched today on iOS and macOS and would've been a great demo for graphics and to boost sales.


Switching display tech is a massive ordeal for Apple.

Because a huge point of pride for Apple is that regardless of the Apple device you are using, color accuracy & picture are the same and matched across devices (between iPhone / iPad/ Studio Display / MacBooks).


A first in tech - skipping a generation in performance comparisons! M3 always being compared to M1, not M2!


They had comparisons to both.


I feel like it's Apple realizing (I mean they likely already knew it, but this is sorta publicly saying it) that people are not going to upgrade their Macs every year like many do for iPhones. Or likely even every 2 years.

So I think the focus on Intel and M1 (but still showing M2 but just not saying it out loud) was the right call.

I wouldn't be surprised if when the M4 rolls around we are still hearing about M1, but maybe they drop Intel by that point.


Virtually no-one upgrades their phone every year. For the US you're looking at about 2-10% depending on which poll you believe.


Even 2% I would not call "virtually no one". 232 million iPhones were sold in 2022; 2% of that is still 4 million iPhones.

However, looking at another survey of specifically iPhone users, I see numbers as high as 36%.

Going further, of those that upgrade every year, how many are getting the highest Pro models?

Or are subscribed to the Apple iPhone Upgrade Program? Even if they are a smaller portion of the population, they are a consistent and important part of it.


This is either to make the performance improvement look bigger, or to target users with M1 because they know most of their users are still on M1 (me included, with an M1 Pro and M1 Max).

The perf difference is still impressive compared to M2, though. It is bigger than the M1-to-M2 upgrade.


it is to target intel hold-outs.

this year’s model is this many percentage points faster than the first chance they had to ditch the intel mac, and the mental math for converting whatever they internalized to decide to hold out to a comparison against the m3 should be easy.

some early adopters will salivate, some will pull the trigger, but the holdouts should be near their breaking points. the fear of shipping dates slipping due to early adopters with fomo will drive holdouts further towards the cliff.


I think they are targeting us M1 Pro upgraders. An M2 Pro/Max wasn't necessary, then AI hit, and now they have a perfect message for a perfect customer.


Apple’s been doing this for a while. Hardly a first.

Charitably, it's more realistic: not many people are upgrading from M2 to M3.


Also comparing it to an Intel-based MacBook instead of Intel's newest offering.


I noticed that... kind of weird. But I'm guessing that for marketing copy, "60% faster!" is an easier sell than 18% or whatnot.


Apple is trying to get people to upgrade from the M1 family, not people who have recently bought a laptop with an M2 chip.


Both, and also some to the old Intel machines. This is probably more relevant for actual purchasers; approximately no-one would be going from M2 to M3.


I figure it's part marketing and also part targeting the individuals most likely to upgrade. With an M2 I'm far less likely to pull the trigger.


And Intel CPUs


This makes sense to me. Very few people are upgrading the M2 laptops they bought earlier this year. I'm finally in the market to move from Intel because Vision Pro dev requires Apple Silicon. Also, the last Intel laptops were less than 4 years ago. Even as a power user I only upgrade every 3-5 years; regular users even less.


Not just Intel CPUs -- old Intel CPUs in old discontinued MacBooks.


People with those Macs and the M1 are the most likely to be considering an upgrade.


They compared to both in all the charts I saw. I expect most people with any interest in upgrading are on M1 or lower, so it makes sense from that POV.


Per the benchmarks, the M3 family of chips is 15% faster than the M2 family of chips.

That doesn't seem as much of a performance gain as expected given the move from 5nm to 3nm.


Whatever opinions I have against Apple, I just cannot deny their ability to not only innovate but also deliver every year. I wonder just how far into the future they're working on stuff internally. Well done to this company.


It’s like they can do nothing wrong. Everything seems so meticulously worked out and competent. Nothing good can last forever. Wonder how many more good years Apple has left.


Their hardware org is amazing. The software org, decidedly less so.


Spend some time using Siri and you might change your mind.


Here's a Web result for "change your mind."


> Basecalling for DNA sequencing in Oxford Nanopore MinKNOW is up to 20x faster than the fastest Intel-based MacBook Pro and up to 36 percent faster than the 16‑inch MacBook Pro with M1 Pro.

What does the average consumer know about basecalling?

I mean, they are still selling just consumer electronics. Scientists and industry use Linux and Windows, and speed isn't the main reason.


Ironically, a parallel comment laments that they showed off checking the weather. Can't win if you're Apple!


There's an entire industry of consumer products that takes massive amounts of compute, called "gaming". Apple could have shown off the chips' performance at that.

Or at least something like photo or video editing?


> I mean, they are still selling just consumer electronics. Scientists and industry use Linux and Windows

Anyone who's worked at a research university knows Macs are popular among professors, researchers and students. These folks also start companies and work in industry and they generally don't switch platforms unless it's absolutely necessary.


A lot of marketing is aspirational though. People want what the pros use. If it's their top hobby or they aspire to becoming one of those pros... then they want the best, no limits if they can afford it. Even if many of those buyers realistically never will hit those limits.

I suspect that impulse sells a lot of MBPs, DSLRs back in the day, Photoshop licenses, sports cars, high end ski gear, etc. etc.


The advert at the start seemed more targeted at creatives, not 'average' consumers - whatever they are.

Also, are 'average' consumers watching this kind of marketing? I suspect not, only enthusiast nerds like us are?


"Liquid Retina XDR display" I'm thinking ol' Steve wouldn't have liked this branding direction.


Don't forget the MagSafe(TM) port, Force Touch(TM) trackpad, and Magic Keyboard(TM) with Touch ID(TM).

(though tbh this is not a new thing from Apple, they did this even back when Jobs was around...)


Maybe they’ll soon up the resolution and have a Quadra XDR Display


mixed metaphors here


I believe the whole event contained 0 actual camera footage. Everyone on screen, down to Tim Cook, was an animated prop.


The event ended saying the entire event was shot on iPhone and edited on a Mac.


I was sitting here thinking to myself, "There's no way it's that dark in Cupertino right now??"

I guess they never said it was a live-streamed presentation...


First thing I noticed since I'm sitting here in Mountain View looking at a nice sunset just above the horizon (or specifically, the Santa Cruz mountains), so nope, not close to dark yet. Maybe another hour.


These events have not been live-streamed in like three years.


Prerecorded presentations are prerecorded.


I can't find any details on Dynamic Caching. Anyone else?

I'd be interested in instrumentation on the memory allocation.


Also, the base 14" M3 only supports one external monitor vs two on the M3 Pro, and has 2 Thunderbolt 3/USB-C ports vs 3 Thunderbolt 4/USB-C ports.


This is such a terrible way to segment the line.


Yes, and they do not make this clear, and then people come crying to Reddit when they find out.

It gets even better with Apple steadfastly refusing to support MST well over a decade after its introduction, so even if you have an M\d Pro or Max CPU, many docks and chained monitors will not work.


How do they not make it clear? It's literally on https://www.apple.com/macbook-pro/

even on the order page https://www.apple.com/shop/buy-mac/macbook-pro


I just bought a 14" Macbook Pro on sale with the top spec M1 Pro and 1TB storage. I'm already amazed at how powerful and efficient this laptop is. I can't see myself wanting/needing to upgrade for another couple of years at least.


M3 is absolutely impressive, but the event overall was... short? Uneventful? Underwhelming?


It went as expected given the weird event timing + no in-person showoff.

Oddly, no gaming demos, which people expected.


Yeah, I was hoping for Mother Nature S01E02


So. Tempted. Decent ports again, decent keyboard again, plus the M3 goodness, and it comes in black :)

But the MacOS shitshow is a deterrent. Anyone know if you can install Linux on them?


There is Asahi Linux, but I wouldn't buy a Mac to install Linux. Either buy a laptop with Linux preinstalled, or a PC that you know is well supported.


> So. Tempted. Decent ports again, decent keyboard again

Maybe I'm missing something but both of those things are unchanged from the M2 model right? Maybe even the M1 MB Pro had the same as well, can't remember


Yeah, but version 3. I have learned over the decades to not consider a new tech product until it hits v3 ;)


Why is MacOS a shit-show? And why would Linux be better?


to me, every release is more locked down and dumbed down.

and privacy. I remember when macOS used to ask you before sending data back to Apple. Now it does it all day, every day, with 1000 pages of privacy policy explaining what they do, but little or no ability to prevent it.


> every release is more locked down and dumbed down.

Do you have some specific examples, that are still a problem?

You can still switch over to "unsafe" mode, and turn off SIP.


People say this every time. It's no more locked down than it was 10 years ago.


pretty much this.

It's a consumer OS, same as Windows, designed to be as idiot-proof as possible and protect the user from making stupid mistakes. I'm not in that target market (though I still make a lot of stupid mistakes).

Neal Stephenson said it better: http://www.team.net/mjb/hawg.html


I'm using a 14" M1 macbook pro for work and frankly, the hardware is good/okay but MacOS is awful.

I'm keeping an eye on Asahi Linux, i might pull the trigger and use that on my macbook.

But I'm skeptical about this, I might just buy a second hand thinkpad (as usual) or maybe a framework laptop (waiting for reviews on the 16" model) and call it a day.


That's pretty much where I am too. Using a Purism 13 laptop at the moment which is great but showing its age, and I don't think there's a processor upgrade path for it. So at some point in the next couple of years I'm getting a new laptop, and Framework is the leading contender at the moment, but the Apple chips are looking mighty tasty.


I'm surprised there's still no pro chip for the iMac, and I guess at this point we should assume there never will be. It's a shame. Give me a 27" M3 Pro with target display mode, and I'd be awfully tempted.

As it is, there's nothing (other than that new color) that tempts me to update from my 16" M1 Pro, but I didn't expect there to be. I suspect I won't be eyeing upgrades until at least M5 and more realistically M6.


Mac mini + Studio Display. It's expensive but I don't know that an iMac Pro would be any cheaper.


> I'm surprised there's still no pro chip for the iMac, and I guess at this point we should assume there never will be.

I wouldn't assume that. It's likely there will be a 32-inch iMac next year running on the M3 Pro or M3 Max.


that, and the missing 27" iMac, seemed to very keenly indicate that if you want more power or more pixels, Apple will push you to buy a Mini + a Studio Display.


The 13.3" MacBook Pro is discontinued in favor of the 14" MacBook Pro with M3.

I say good riddance.

Would love it if they made the bezel smaller on an iMac though, alongside the M3 upgrade.


Pretty underwhelming, a bit like the A16 release. They basically upped the frequency in the A16, and that's how they got their single-core performance improvement; not much of an IPC increase to speak of. Given the numbers they quoted, the same story will repeat for the M3. And the A16 also came with a power consumption increase, so TSMC's 3nm process is disappointing as well.


Going to be looking for outside validation of that Max GPU performance before making a buy decision, I think. Biggest hopes for this generation were dramatically better GPU performance--and Apple says they've got it--but we'll have to see how that actually pans out.

I was also hoping for a fourth USB4/TB4 port, which doesn't seem to be coming. Oh well.


MacBook Pro product page/tech specs is up: https://www.apple.com/macbook-pro/specs/

Prices for the old Pro/Max comparable SKUs seem changed: the Pro SKU is $100 lower, the Max SKU $100 higher?

Notably, starting at 18GB RAM for M3 Pro, 36GB RAM for M3 Max (+4GB bump for each)

Pro/Max GPUs have lower memory bandwidth?

Up to 12 hours battery life using wireless web on M3 Pro and Max 14 inch, 15 hours on 16 inch.


The full 40c Max GPU is back up to the full 400GB/s bandwidth, but you're right, the Pro has significantly lower memory bandwidth, and the 30c Max does too.


I wonder what fps Baldur's Gate 3 gets on the M3, M3 Pro, and M3 Max.

A lot of users are reporting inconsistent 30fps on the M1, even after Larian Studios worked with Apple engineers to port BG3 to the Mac.


To be honest, Act 3 has ridiculous frame drops on a dedicated GPU. Act 2 is also heavy but Baldur’s Gate (the town) is just very graphically heavy for some reason. I wouldn’t be surprised if it’s just that same performance issue.


I used to buy Pros for years until I more or less accidentally ended up with an Air for a period and never went back. Lighter, enough performance and cheaper.


I've been using an M2 Air professionally for around a year now; it's excellent, and it makes me seriously question why developers need MacBook Pros.

The only limitation I notice is that I can only have a single external display.


> makes me seriously question why developers need macbook pro's

For work I am developing a massive JVM monolith server application, and even on the M1 Pro 16" it takes minutes just to start up. Compilation is also slow with millions of lines of code. I have SQL Server running on an x86 Linux VM with Rosetta in Docker. I need all the CPU I can get. The MacBook Air simply has less CPU performance and is thermally throttled. It would waste too much of my time.

For my personal hobby C++ coding, the Macbook Air 8GB would be enough.
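For anyone curious what the Rosetta-in-Docker setup above looks like in practice, here's a sketch, assuming Docker Desktop on Apple Silicon with the "Use Rosetta for x86/amd64 emulation" setting enabled (the image tag and password below are illustrative placeholders):

```shell
# Run the x86-64 SQL Server container image under Rosetta on an Apple
# Silicon Mac. --platform forces the amd64 image; Docker Desktop then uses
# Rosetta 2 (when enabled in its settings) instead of QEMU for emulation.
docker run --platform linux/amd64 \
  -e "ACCEPT_EULA=Y" \
  -e "MSSQL_SA_PASSWORD=ChangeMe_Str0ng!" \
  -p 1433:1433 \
  --name sql1 -d \
  mcr.microsoft.com/mssql/server:2022-latest
```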


Yeah, the single external display is the biggest limiting factor for me. The performance is generally there (except for more heavy compilations). I'm thinking of picking up a M3 Pro Mac Mini when those eventually release to combat the display limitations.


RAM and displays, for sure.

But also, Airs used to be absolutely AWFUL devices in terms of performance when they were still intel. I think the stigma has stuck around.


Honest question, but how do you live with 8GB of RAM? I have 16 now and I run out regularly without doing anything too crazy (though in large part because Emacs and CIDER leak memory like crazy).

Isn't having to close background applications constantly absolutely maddening?


You can get up to 24GB RAM on the Air.


Did not know that! The Apple webpage somehow threw me off. Thanks. It would be a bit of a hard downgrade from a 16-inch 3:2, but I'll keep it as an option if I need to get a new laptop at some point.


I would never buy an Apple device, but damn, the 128GB memory option is tempting. I am still looking for a decent (business) laptop that has 64GB, but the options I've found so far are limited... The Lenovo P-series offers 64GB max, and the max I could get on Dell is 32GB. (If someone has alternatives: please enlighten me)


Lenovo has been offering 128 GB since P52, that is for around 6 years. The current model with 128 GB RAM is P16 Gen 2.

https://www.lenovo.com/us/en/p/laptops/thinkpad/thinkpadp/th...


128GB models have been around for quite a while... Essentially any laptop with four DDR4 ram slots should be able to take 128GB ram.


AFAIK 64GB SO-DIMM DDR4 modules are not available, so you need a laptop with 4 ram slots, that's just a few laptops in the workstation / gaming series of a few vendors.


Weird. There are Dell laptops with 64gb memory on Amazon.

https://www.amazon.com/Dell-XPS-Business-Laptop-Processor/dp...


Yeah, have a 64GB Dell here (Precision 7670).


> If someone has alternatives

If you want notebooks with 128 GB memory, you want to look at the highest-end workstations.

Dell Precision 7680 and 7780 (comes with new CAMM memory form factor)

HP ZBook Fury 16 G10

Both of these have options for up to 128 GB (ECC) memory.


Good lord are those things designed like shit.


Could you elaborate on what you think is 'shit'? Do they look bad, are they functionally worse, or worse on performance/thermals/etc?


Yes, they look bad. They look like products that were designed by passionless bean counter MBA types that create products off a spec sheet with zero regard for the end user. It’s a 1.2” thick lump of plastic (somehow heavier than an all metal 16” MacBook Pro) that will creak and bend when you pick it up reminding you how little care was given to it.

http://imgur.com/a/T2ZVZ8R

Horrific.


> 1.2" thick lump of plastic

I don't know what that measurement means. Neither the Precision nor the ZBook is plastic; they have metal chassis. That's what magnesium, aluminium, and titanium look like without anodising them with fancy colours like Apple does.

These notebooks are also thicker and heavier than usual because they have to dissipate something like 250 W of heat. Good on Apple for designing such power-efficient SoCs; many high-end Windows notebooks have to make do with Intel CPUs and NVIDIA cards that will perform better than that MacBook Pro, but unfortunately also draw about twice as much power.

Finally, these are business workstations, not fashion statements. People—or rather, companies—buy them to get work done on them, not show them off in Starbucks. They are meant to be serviced easily and quickly, and come with up to 5 years of next-business-day onsite support with 24/7 telephone service.

Both the ZBook and the Precisions have 4 DDR5 slots, 3-4 NVMe SSD slots and a WWAN NVMe slot, a replaceable (though possibly not upgradeable) GPU card, an easily-replaceable display assembly and battery, and a productive keyboard with a keypad, and a reasonably large and accurate trackpad with buttons. They have Ethernet ports, USB-A ports, and can support up to five high-resolution displays.

These workstation product lines have also come with service manuals for the past two decades. Apple released service manuals for its own products this year, after significant regulatory and consumer pressure.

The MacBook has its advantages, and I fully understand why someone might buy one. But your argument is in extremely bad faith—it 'looks ugly', but you haven't seen it in person. Having actually used these machines for 5+ years, I can tell you they are pretty damn solid.

I daresay these don't even look that different to Framework's notebooks, which are the apple (pun not intended) of HN readers' eyes.


Businesses ostensibly buy them "to get work done on them" but they actually tend to do so regardless of the work, and they're often truly awful day to day form factors loaded with so much crap software that getting work done on them could be considered lucky. Everything has tradeoffs; rapid serviceability, good deals, and high specs, are all useful traits, but if they come at the cost of introducing friction between the operator and the task, then that's usually ignored by business. Sometimes that's a great tradeoff and no friction is introduced, but that's rarely part of the consideration in a top-down hierarchy where purchasing decisions are made by some other department en-masse.

There is some threshold of computational demand and cost past which the specs dominate other factors, but before that I personally consider friction in various circumstances to be a higher priority.


Most of these notebooks also come with fairly spartan Windows images. They can also be configured from the factory with Linux (usually Ubuntu).

Plus, any company worth its salt that deploys these notebooks would re-image them anyway.


They might come from the factory that way, but often they're re-imaged with much less spartan Windows installations. My personal reference for this is ages ago in real time, but relatively recent in corporate years


The Dell Precision 7780 is a terrible machine. It's heavy, gets extremely hot, and is just not good to hold. You might find some use for it if you intend to work it into a cold, desktop-like setting.


Honest question: What tasks are you doing that require 128GB RAM?


> What tasks are you doing that require 128GB RAM?

You’re thinking about it the wrong way.

The more RAM you have, the faster and smoother your experience using an application will be, and this will be much more noticeable if you usually run multiple applications at the same time, like virtually everyone in the world does.

This is especially true now that we have options to run local-only AI models. The next couple of years will be interesting.


> The more RAM you have, the faster + smoother your experience using an application will be,

This is the mentality that Gigabyte targets with the $1300 AORUS Z790 Xtreme X motherboard. More is clearly better, right? Nope. Not at all. There is an amount of RAM an app will consume, and once you have that much, the rest will not do much. Even buffering/caching can only consume so much. It's very hard to fill even 64GB in a laptop, and 128GB is near impossible.


While I do agree that 8GB is a little dated, even the 8GB MacBook Air from 2020 runs most applications smoother than a modern Windows machine with twice as much RAM. Apple has the smoothness part figured out, without the need for more memory (for the average consumer at least).

Your point about local AI is pretty interesting. It does seem highly likely that computers with limited memory could "age" faster than they have done in the past ten years, with the advent of more AI workloads.


An 8GB Air can exist without an 8GB Pro.


Software development. Microservices running locally (Docker/Podman, etc.)


What do you mean, Llama 70b at full precision will barely fit ;-)


Large neural net inference.


That's probably a threshold where I'd start looking at a well integrated solution for remoting into a decent server. Do you actually need/want to use 128GB locally? Not disputing that if you do have a requirement, but have you considered a smaller machine and spending the rest on a non-mobile decent setup? The battery and weight are really affected once you go into the mobile workstation territory.


Framework (https://frame.work) offers up to 2x32GB RAM support.


ThinkPad P1 gen 6 offers 96GB max. [1] I usually buy the minimum version of memory and SSD, and then upgrade myself.

[1] https://psref.lenovo.com/syspool/Sys/PDF/ThinkPad/ThinkPad_P...


> I would never buy an Apple device

Why not?


Good question. I would like to hear OPs answer.

Personally, I've been tempted by the hardware, but I don't care for Apple's software bundle (macOS, iTunes, etc.). With the M-series they have shown indifference/reluctance/hostility toward working with other vendors and the open source community. Asahi Linux has made major progress, but without support from Apple.

Another (minor) frustration is the keyboard layout. I'm used to having backspace, delete, insert.

Finally, Apple's offerings are a bit heavier than the ~1kg ultrabooks I like to work with.

[Summary] I wouldn't say I would never buy Apple, but so far they haven't presented products appealing to me. And at the root that is caused by their approach to stewarding their "ecosystem".


I am a linux fanboy. I have worked on an Apple macbook before and I did not like it. Apple is too expensive for what they offer (my opinion).


No OP, but the UK keyboard layout on Apple devices sucks, separately from overriding decades of muscle memory reaching for cmd instead of ctrl. I could never get used to the UI/workflow. No native window snapping (back in the day at least). Workarounds required for a lot of my tooling, no native containers, and until recently no native VM solution (I believe UTM solves this). Brew.

As a disclaimer I do think Linux with Gnome provides the absolute best developer experience so you can take all my opinions with that in hand.


> No OP, but the UK keyboard layout on Apple devices sucks, separately from overriding decades of muscle memory reaching for cmd instead of ctrl.

Is cmd somewhere else on UK Apple keyboards than on US ones?

If it's like the US ones: I spent my first 25ish computing years on Windows and Linux (and Sun, etc.; point is, not Mac), but after getting used to the Mac's shortcuts, I gotta hand it to them: what they've done is more correct than what everyone else is doing. Ctrl-shortcuts involve more stretching and wrist movement, so they are slower and push one closer to RSI, and putting more common shortcuts on cmd than ctrl leads to fewer collisions (no awkwardness in the terminal, for example).


It's why I rebind ctrl to caps lock on all my devices.
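On macOS the caps-lock-to-ctrl rebind can even be done without third-party tools. A sketch using the built-in `hidutil`, assuming the standard HID usage IDs (0x700000039 is Caps Lock, 0x7000000E0 is Left Control); note the mapping resets at reboot unless you install it as a LaunchAgent:

```shell
# Remap Caps Lock to Left Control with macOS's built-in hidutil.
# 0x700000039 = HID usage for Caps Lock, 0x7000000E0 = Left Control.
hidutil property --set '{"UserKeyMapping":[{"HIDKeyboardModifierMappingSrc":0x700000039,"HIDKeyboardModifierMappingDst":0x7000000E0}]}'
```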


>overriding decades of muscle memory reaching for cmd instead of ctrl

You get used to this in three days.


Then you go back to your other devices and get it wrong there instead.


If that's the issue then configure your mac to switch command and control. It's somewhere in settings.


Then you do some simple remap on your other devices and live a consistently happy life after that


I don't get it, the button swap is trivial to configure, how can it be such an important criticism? (though the cmd key is more comfortably located, so it's better to do the swap on Windows)


I completely agree with you. As a developer working with containers, Linux is the only viable option.


Personally: you pay for a thing that you can't modify in any way, and it ends up being not yours.

I don't trust the software either.


Because despite being currently at the forefront of technology, Apple has shown time and time again to be anti-user.


“Apple bad” probably.


Through several comparisons and careful word choices, it seemed clear that Apple is strongly hinting that Intel Mac users should seriously consider updating to M3, or Apple Silicon in general.

This may indicate that the next version of macOS could support Apple Silicon exclusively.


That will only happen when all the Intel Macs are deprecated and no longer receive major OS updates, which will take a couple of years (macOS Sonoma's minimum supported MBP is from 2018).


This release cadence seems too much. I don't have insight into why, but I'd be shocked if there's this much demand. Even a 2018 Mac works fine for most information work.


Looks like they are reusing almost everything from last years model. If they have the new chip ready, it makes perfect sense to start sticking it in the MacBooks they are putting out.


What release cadence? One a year?


This is the second generation of MacBook Pro this year. The M2 Pro/Max were introduced only in January.


As the other person said, it's been 9 months since the M2.

You don't need to defend Apple here. They'll be fine. It's OK to question this.


Why should anyone care how often a business sells a newer product?


Well, it's frustrating that no matter how close to launch you buy a device, it's almost immediately invalidated. I got an M2 Max 3 months ago and it's already last gen...


What is the definition of invalidated? I am assuming your machine is still capable of doing what it did 3 months ago.


My guess: It’s not the shiniest, newest toy model anymore. Many people want the latest greatest just for the sake of having it.


I do understand my machine is still (more than) capable. But it is frustrating when the release cycle is so rapid; it's not just a desire for "latest and greatest", it's an almost immediate loss of bang for buck unless you buy basically at launch.

They released the M2 max less than a year ago.


I wasn’t defending anything, I didn’t realise it was that recent. Way to assume, though. Really engaging in good faith, mate.


As a customer, I very much like more frequent updates. This removes the problem of having to time your purchases right to get recent hardware.


> Better for the Environment

I wonder if Apple's new support for repairability has translated into them making their new laptops more repair-friendly.

That would certainly be much better for the environment.


The thin gradient text they used is out of step with most modern advertising, and even their own, I wonder what kind of advertising/design philosophy they're trying to push


Earlier submission:

https://news.ycombinator.com/item?id=38078063

(483 points/6 hours ago/590 comments)


I heard mini-LED might be announced this time, but I guess not?


They’ve had mini LED displays since the M1 MacBook Pro.


Is this why the M2 MacBook Pro has such terrible pixel response times? Swiping workspaces becomes a blurry mess; I haven't seen displays this bad since we moved from CRT to LCD.


Ah, I misread the MacRumors article.


No mention of the Ultra.

Should we expect an M3 Ultra, with the interposer, or have they ditched that idea?

I can't really tell from the die shot if it's been cropped out, or is just missing.


The Ultra will come when the Studio (and Pro, but that was just released) is spec-bumped, I am sure.


It's weird they mentioned that you can do 8K video editing with Premiere Pro but didn't mention Final Cut Pro X. Are they killing FCPX?


nah they just ported it to iPads, no way they are killing it


256GB of RAM coming up! I wonder how this will affect Nvidia, with machine learning practitioners having so much unified RAM for the GPU to work with.


As expected, this aged perfectly well for the 3-year M3 prediction [0], and the competition has already been left in the dust before they even caught up to the M1.

Now that Apple Silicon has stabilized for software support, it looks like a great upgrade for those stuck on Intel laptops from 2020 or earlier.

[0] https://news.ycombinator.com/item?id=25549796


Not really? Qualcomm just showed off benchmarks beating the M2, and all we have to go on is Apple's dubious relative graphs, which have essentially been lies before (3090). We will see how they really stack up when both are in shipping products.


As I said before, let’s see a product first running Windows, with native software and a decent x86 emulator.


??

It already exists? You can buy ARM Windows devices right now. I'm typing this on one. The x86/x64 emulator is pretty good but performance is limited by the slower CPUs.


I'm referring to a Windows ARM device using the latest Qualcomm chip that is supposed to be so good.

All indications are that the Windows x86 emulator has relatively bad performance.


We can already use the Windows x86/x64 emulator. Unless you have specifics showing the Snapdragon X Elite performs worse under it than current chips, we can expect about 60% of native performance. Rosetta 2 is about 75%. Based on Qualcomm's Geekbench 6 numbers, it'll have about the same x86/x64 performance as the M1.
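As a back-of-the-envelope sketch of those ratios: only the 0.60 and 0.75 efficiency factors come from the comment above; the native scores below are made-up placeholders, not real Geekbench results.

```python
# Rough estimate of emulated x86 performance from a native score and an
# emulation-efficiency factor. The 0.60 (Windows x86/x64 emulator) and
# 0.75 (Rosetta 2) factors echo the comment above; the native scores are
# hypothetical placeholders.
def emulated_score(native_score: float, efficiency: float) -> float:
    """Estimate an emulated x86 score as a fraction of the native score."""
    return native_score * efficiency

snapdragon_native = 2800  # hypothetical Snapdragon X Elite native score
m1_native = 2300          # hypothetical M1-class native score

print(emulated_score(snapdragon_native, 0.60))  # via Windows x86/x64 emulator
print(emulated_score(m1_native, 0.75))          # via Rosetta 2
```

The point of the arithmetic is that a faster native chip can still end up with roughly M1-level x86 performance once the larger emulation overhead is applied.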


Where is the DirectX 12 conversion?


They keep getting heavier. Not even sure I can reliably handle the 16" MBP with one hand.


Wow, way to drop the Baldur's Gate 3 spoilers, Apple!


Cool, the M2 should go on sale soon then!


Unfortunately this appears to be a drop-in replacement: you can't click "buy" on any of the M2 MBPs, if you can even get to the model page.


Retailers still have M2 MBPs..


You can now spec a 14‑inch MacBook Pro up to $6,899.00. You’ll be so pissed when you leave that bad boy at Starbucks.


22 hour battery life.. Fuck me!


They only compare it to the M1, does anyone know how the M3 would compare to the M2?


If you watched the video, you'd have seen most of the slides also featuring M2 comparisons. Though the changes aren't very impressive. Not enough for me to upgrade, at least.


And it still only does 1 external monitor on base M3...

A $1600 laptop that can only use a single monitor. Wild.


Yeah, it is indefensible to me at this point that the base chips are still limited to one external display.

I was willing to cut them some slack with the original M1 chip, thinking there might be a decent technical reason for it with their first generation of laptop/desktop chips, but three years later it is absurd.


> it is indefensible to me at this point that the base chips are still limited to one external display

They’re the base chip for a reason. Why would you add a feature a small fraction of that chip’s users would use when you can use that space for something else?


> Why would you add a feature a small fraction of that chip’s users would use when you can use that space for something else?

Because it is a basic feature! I am not asking for the moon here, I have no idea why the idea that a laptop should support more than one external monitor is controversial on this site.


> Because it is a basic feature

Not according to Apple. And, frankly, not according to me. Most laptop users will never hook up an external monitor. They don't need to. That's a fair base case.


Most users will never hook up a thunderbolt device so why not just have USB?


...Great counterargument.

Also for the SD card: I bet many MacBook users who don't regularly work on photos/videos (e.g. developers) have very little use for an SD card reader. And indeed it doesn't exist on the MacBook Air. Somehow it is deemed essential on the MacBook Pro.

But for $1,599 you must be really deep in Apple's pocket to say 8GB and one external monitor is OK.


If it doesn't include an overpriced dongle accessory, is it really an Apple product?

Actually it would; my decade-old MBP 13 has 3(!) ports for video out.


No dongle will help with only one native external monitor.


The 2019 Intel MBP has no native external monitor port. It's dongle hell all the way down.


But at least it can use a dongle to utilize 2 or more external monitors. The point is the base M1/M2/M3 cannot use 2 external monitors. At all. No matter how many dongles you buy.


It has native support for 4 external thunderbolt displays.


It’s not complicated.

If you benefit enough from the 2nd external monitor then you can easily afford and justify the upgrade.

If you’d just like the 2nd external monitor then go nuts and spend your money on the upgrade. Or don’t and buy a different laptop.


I agree that it isn't complicated, for completely different reasons though.

Supporting multiple external monitors is such a basic feature, it is absurd that apple insists on using it to differentiate products.

It's akin to making me shell out an extra $400 if I want the laptop to have a trackpad, or if I want both the left and right halves of the screen to work.


It’s not akin to that at all. A laptop has one screen. You may like having 2 extra monitors but to call it a basic feature is clearly wrong.

It’s a basic feature to you. To Apple, for whatever reason, it’s a premium feature.

You’d probably be better off with a different laptop if you’re looking for a budget option that meets your needs.


It is a basic feature. I dare you to find a single laptop not made by Apple with an MSRP of $1000+ that only does a single external monitor that's currently sold. If all your competitors have it, it's table stakes.

Sure SOME people can tolerate it or are dumb enough to reward Apple for it, but it doesn't make a basic feature premium.


> If all your competitors have it, it's table stakes

If all you do is tail the competition. Most laptop users don’t use a second monitor. This is a silly gripe.


Lol "most"

Go to any software company and see how many people are using dual monitors with their company-issued laptops.

And many of those developers have dual monitor at home

Same for many other people who work on data/finances/etc who need multiple, large monitors to work efficiently


Presumably most people using two external monitors close the laptop lid?


What does that have to do with anything lol


Quite a lot. A laptop screen + external monitor is an awkward solution. Display size and pixel density issues make it annoying to navigate the combined desktop environment. I have very little interest in doing so, personally -- if I'm using an external monitor, I'm generally never using my laptop screen.


You still can't use a second external monitor if you close the laptop screen on a base M1/M2/M3. So that's why they're asking how is it relevant here.


Right, my point was that wanting to connect two monitors isn't some niche thing, unlike laptop screen + 2 monitors, which indeed would be more niche.


It's a $1600 laptop. It's sort of excusable on a $400 one. Not $1600 or even $700. Should $2000 be the bar for 2 external monitors? The OLD 2016 13" $1500 could do 2 external monitors. The 2020 Intel i3 MBA which was $999 could do two.


My 2015 MBP could do two external monitors.


> Should $2000 be the bar for 2 external monitors

For Apple laptops the bar is whatever Apple decides it is.

> The 2020 Intel i3 MBA which was $999 could do two

Neat! Buy one of those.


I have an honest question for people like you:

Do you get a buzz for defending the $2tn mega-corp? Why do you do it? What's the point?


Yes but to be fair I'm quite stupid.


The bar is where money is.

That's why I don't give my money to Apple.


From your own logic it sounds like Apple decided the bar was $999.


bars are movable


meh, they need to do something with the industrial design or OS. Feels like they made a big bang with M1 and then kind of stagnated.


If we're to believe them, the new M3 Max is "up to 11x faster" than the maxed out 2019 16" Pro. That doesn't seem like they stagnated.


Didn't they intentionally gimp the thermals on those machines though?


What do you expect? A 2x improvement in performance every year? There are limits to physics that would hit pretty quickly at that rate.


Something like a better screen, wireless, camera, weight reduction, ports, gaming initiative, I don’t know, something more exciting than just a spec bump.


They did give the MacBook Pros better screens and a better 1080p camera. It’s also way better at gaming with the improved performance. What ports do you want them to add?


It's just so strange to me that they can release some of the most powerful and efficient hardware we have ever seen, and show it off by telling us we can now (checks notes…) check off tasks in a widget on our desktop.


After several minutes of scientific computing, 3D rendering, game development, movie production, and Xcode, among others.


They showed music production in Logic/Pro Tools in about 50% of the human scenes too, which can be demanding.


I was just reading something related to that (can't remember the text), but the gist was that improvements in hardware are not followed by improvements in software, as vendors try to limit the control the user has over that hardware.

My own impression after 4 years with Apple devices is that there is no expert mode on the Mac. It, the iPhone, and the iPad are computing appliances, something you use to get work done. They either work for you, or they don't. The Mac is more flexible than the others, but the whole ecosystem is trending towards making your workflows fit the tools, not the tools fit the workflow.


So exactly how does the Mac keep you from running software you want to run?


- Notarization

- Opaque apps

- Locked down hardware

- SIP,...

You can go and find a solution for most of the shackles, but then you're likely to break the whole system, as it was not designed to be used that way. The Mac is fragile in terms of customizability. It's not that it prevents running software; it's that it may break if you do! (Not that the others are better, but at least they're trying to be more resilient.)


You can easily bypass notarization, and SIP too. By opaque apps do you mean everything is not open source? What do you want to be able to do, "unsolder the memory from the chip"?


Transparent apps, for me, means configurable up to the option of deleting them and replacing them with something else. Instead of the tight coupling we have now in macOS, I'd like to have federated applications bound by protocols, instead of needing to update the whole system to add a single feature or fix a bug. I don't think you can delete stock apps without disabling SIP or something.

I don't need to unsolder the chips, or do anything hardware-wise, but I'd like to be able to use the hardware without the OS if need be. That's why I say it's an appliance. It makes sense in terms of business, but it's not like users are free to use the device how they want.


What the heck are you talking about? What types of apps do you think you can’t find alternatives for on the Mac?

And why wouldn’t you want SIP by default? This is what can happen when you disable SIP.

https://community.gigperformer.com/t/be-careful-if-you-use-a...

This was a known issue that Google later fixed.

Do you run your Linux box using root?


> What the heck are you talking about? What types of apps do you think you can’t find alternatives for on the Mac?

It's not about finding alternatives. I did find them for a lot of the Mac's defaults. It's about replacing the default option. On Ubuntu, you can remove the default login screen and use KDE's panel alongside the i3 window manager, because they're not tightly coupled together. Imagine if you could replace the macOS top bar, because it was just a program answering to some IPC protocol. Or if the window manager exposed windows and their placements, so you could script a layout without the accessibility workaround.
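To make the decoupling idea above concrete, here's a toy sketch in Python: a hypothetical "menu bar" component that the rest of the desktop talks to only through a tiny JSON-over-Unix-socket protocol, so either side could be swapped for another implementation. The protocol, operation names, and socket path are all invented for illustration; none of this is actual macOS machinery.

```python
import json
import os
import socket
import tempfile
import threading

def serve(srv, state):
    # Loop forever, answering one JSON request per connection.
    while True:
        conn, _ = srv.accept()
        with conn:
            req = json.loads(conn.recv(4096))
            if req.get("op") == "set-title":
                state["title"] = req["value"]
                reply = {"ok": True}
            elif req.get("op") == "get-title":
                reply = {"ok": True, "value": state["title"]}
            else:
                reply = {"ok": False, "error": "unknown op"}
            conn.sendall(json.dumps(reply).encode())

def call(sock_path, **req):
    # Any program speaking this protocol could drive (or replace) the bar.
    c = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    c.connect(sock_path)
    c.sendall(json.dumps(req).encode())
    reply = json.loads(c.recv(4096))
    c.close()
    return reply

# Bind and listen before starting the thread, so clients can't race it.
sock_path = os.path.join(tempfile.mkdtemp(), "bar.sock")
srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
srv.bind(sock_path)
srv.listen(1)
state = {"title": ""}
threading.Thread(target=serve, args=(srv, state), daemon=True).start()

call(sock_path, op="set-title", value="battery: 97%")
print(call(sock_path, op="get-title")["value"])  # prints: battery: 97%
```

The interesting property is that the "bar" owns no policy about who talks to it: a shell script, a tiling layout manager, or a replacement bar could all speak the same few operations.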

> And why wouldn’t you want SIP by default?

I want it, which is why it's not disabled on my computers. But I'd like to set my own snapshot of what I want to preserve, instead of Apple's.

> Do you run your Linux box using root?

I don't, but I'm always a real sudoer and I can `su root` anytime.

As I've said, it's about an expert mode: not something on by default, but something that can be enabled, because the system was built to support it.


With no disrespect to you, what you desire is literally the antithesis of Apple

Steve loved talking about the intersection of humanities with technology. I think he'd proudly agree Apple computers are meant to be appliances


It’s more like something that would be nice to have, because experts will always want to build tools that fit them, not just take something off the shelf. It’s like the Kindle reader, which is perfectly fine for reading books, but some people want the perfect experience (according to them). So they will jailbreak it and install KOReader, which is the expert mode of reading ebooks.

I like Apple’s hardware. And I use the software. But it’s always a convenience for me, not something that fits how my mind would like to use it.


With no disrespect to Steve, we might have to settle on a better standard of computation than one that was designed to reinforce a trillion-dollar business model.


> Imagine if you could have replace the MacOs top bar, because it is a program that just answers to some IPC protocol?

That is irrelevant in the bigger scheme of things. I grew out of that tinkering phase after I was a student running Gentoo for a few years. Eventually most of us just need to get shit done. The default top bar is fine. It works. Virtually no one gives a f*** about replacing that. You are in the 0.0000001% and Apple will never cater to that. You can install Asahi Linux on a Mac and play with the login screen there.


Again why would you ever trust a third party for your login screen?

You can script window layout with AppleScript or JavaScript. You’ve literally been able to do that since the ’90s.

If you had the programming chops, there is nothing stopping you from replacing the menu bar.


If you want to spend your time being productive and in the zone, that’s what you want. Your workflows live in the music or movie production or graphic/3D app, etc. Macs were always about running apps and doing all your work there.

It’s like the difference between having a car for mechanics projects and racing vs actually needing to go from A to B reliably when you need it.


Looks like it’s raining outside, time to fire up the three AAA games available on this thing.


Apple's lack of serious long-term dedication to 3D gaming is such a self-own that makes their 3D gaming benchmarks feel weak & thin.

I'm hopeful the Game Porting Toolkit plus the advances in this year's macOS (Game Mode, better support via MoltenVK and SPIRV-Cross, etc.) start a publishing shift toward macOS as a target platform, but I can absolutely see Apple taking its foot off the gaming gas pedal again.


Macs are great to run Geforce Now.


And Xbox Cloud


But then... why would you need a Mac? Any PC can do.


Because I would pay money to not use Windows 11, and the Macbook hardware is generally better than the alternatives.


Good screen, silent, long battery.


Really hard to do ios dev on a non mac.


I think it's great computing hardware has finally met the threshold to run emacs as it was intended.


Yes the very first Mac OS feature mentioned was “now you can quickly see the weather.”


They really shouldn't mention the weather until they fix that app. That app is so inexplicably bad, it feels like an intern's summer project.


My biggest pet peeve: Apple has decided to convey low/high temperatures differently than the standard. Every major weather agency and app advertises the daily low as the low temperature during the night following the advertised high. Apple, on the other hand, advertises the lowest temperature during that calendar day.

This is a small but crucial difference. If I want to know the coldest temperature it will reach tonight, I need to look at the hourly forecast, because it's quite possible that today's low is from 1am this morning (the past), while tomorrow's low is for 11pm tomorrow, with no advertised daily low for tonight.
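The two conventions the parent describes can be made concrete with a toy computation over invented hourly data (every number below is made up for illustration):

```python
from datetime import datetime, timedelta

# Toy hourly forecast: (timestamp, temperature in °C), all invented.
start = datetime(2023, 10, 31, 0, 0)
hourly = [(start + timedelta(hours=h), t) for h, t in enumerate(
    [2, 1, 1, 2, 4, 6, 9, 12, 14, 15, 15, 14,   # midnight-11am
     13, 12, 11, 10, 9, 8, 7, 6, 5, 4, 3, 2])]  # noon-11pm

# Apple-style "today's low": the minimum over the calendar day --
# here a temperature that already happened at 1am.
calendar_low = min(t for ts, t in hourly if ts.date() == start.date())

# Conventional "tonight's low": the minimum from this evening onward
# (say, from 6pm), which should keep scanning past midnight; this toy
# dataset simply ends at 11pm.
evening = start.replace(hour=18)
tonight_low = min(t for ts, t in hourly if ts >= evening)

print(calendar_low, tonight_low)  # prints: 1 2
```

With a full dataset extending into tomorrow morning, the overnight minimum would continue past midnight, which is exactly the window a calendar-day low never reports as "tonight".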


That app is like a series of contradictions.

At times it seems very averse to telling me about precipitation on the main screen ... EXCEPT in notifications, where it sends me double notifications about rain, snow, whatever, all the time, and it's right maybe 15% of the time, maybe.


> They really shouldn't mention the weather until they fix that app.

After your edit: What a pity! I thought I had just read a great one-liner about the climate crisis and how tech people want to solve everything by applying tech.


and watch Monarch: Legacy of Monsters.


Finally, the promise of Active Desktop might be realized.


Don’t forget it can also play Myst.


Myst with ray tracing is just upgraded Crysis.

No one intends to play it.

That it can be played is the important part.


Did you miss the opening video?


Can't wait to see how AMD and NVIDIA respond with their new ARM offerings. It's going to be an interesting next few years!


My point of view as well! Several months ago I purchased the M2 MBA I'm currently using, which I plan to use for the next 3-5 years. By the time I'm ready for a new machine, it's exciting to think about how far not only performance but also efficiency will have increased. Perhaps we'll have reached a point where battery life is measured in days instead of hours.


meh


Less space than a Nomad?



