From a marketing/product positioning perspective, I liked that Apple was focusing on performance per watt as their main selling metric. Yes it's fast and you can likely advertise on that alone, but the per-watt performance is just leagues ahead of everyone else, especially with them being the first to release with TSMC's 3nm fab.
The 22hr advertised battery life is definitely a side effect of this. I'm super curious how long it ends up lasting in real-world tests.
Historically, though, focusing on performance per watt is a technique that hardware companies use to distract from otherwise unimpressive concrete performance improvements. You'd expect a new process node and chip shrink to bring modest performance-per-watt improvements even with largely identical designs. When it comes to GPUs, this has been combined with actually increasing the power draw to make the generational performance improvement look more impressive; so technically, performance per watt would improve, but the power footprint would still increase, and yet the performance improvement would still be rather lackluster for any given price bracket. (This is especially true because you're competing with last generation's current street prices and used prices: if I can get a used M2 machine that's competitive in performance with a new M3 machine in the same price bracket, it might be more worthwhile.)
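To make the GPU point concrete, here's a toy calculation (all numbers are invented purely for illustration): a generation that raises power draw alongside a modest efficiency gain can advertise a big raw-performance jump even though perf/watt barely moved.

```python
# Hypothetical old vs. new GPU generation; all figures are made up
# purely to illustrate the argument above.
old_perf, old_watts = 100, 250   # arbitrary performance units, board power
new_perf, new_watts = 150, 330   # "+50% faster!" headline, but more power too

old_ppw = old_perf / old_watts   # 0.40 perf per watt
new_ppw = new_perf / new_watts   # ~0.45 perf per watt

print(f"headline speedup: {new_perf / old_perf - 1:.0%}")   # 50%
print(f"perf/watt gain:   {new_ppw / old_ppw - 1:.0%}")     # 14%
```

The headline number looks great, but most of it came from the bigger power budget, not the design.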
That's not to say there's no value in focusing on energy efficiency, especially for laptop chips; Apple Silicon was leagues ahead in energy efficiency when it launched, even if the gap is a lot smaller with newer Ryzen mobile chipsets. And speaking of Ryzen, the 5950X is a paragon of performance per watt in a desktop chip, which is of course still just as great, since it makes the chip easier to power, cool and overclock. It's more to say that a shift toward touting energy efficiency could signal that they may be hitting roadblocks in unlocking substantial performance improvements through better chip design alone. That makes sense, of course: it's not like you can just come up with linear design improvements every year forever, and they already came out swinging with their first-generation chips.
The other point here is that "performance per watt" is how you get "performance" when single-thread performance is hard, because it lets you fit more cores into the same TDP. The top-end EPYC 9754 that will stomp basically anything has a 360W TDP, but that's for 128 cores, which is less than 3W/core.
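The per-core figure quoted above is simple division:

```python
# EPYC 9754 figures from the comment above: 360 W TDP across 128 cores.
tdp_watts = 360
cores = 128

print(f"{tdp_watts / cores:.2f} W/core")  # 2.81 W/core, i.e. under 3 W/core
```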
The trouble for Apple is that although you can do this, they're paying the highest bid for first access to TSMC's new process, and doubling the core count would have used that much more die area, which might have delayed the release, since it takes more time to build up stock. So now we're into this weirdness where they give you a 22-hour battery life (how often are people awake for that many hours straight?) but they're getting outperformed by the Ryzen 7000 HX line on the older-generation process node because it has more cores.
> Historically, though, focusing on performance per watt is a technique that hardware companies use to distract from otherwise not very impressive concrete performance improvements.
Before Apple came along and made people realize PPW is pretty damn important in the mobile/laptop space.
It's definitely something that at least vendors knew was important. ARM/AArch64 chips already decimated x86 in performance per watt even when x86 chips generally had a process advantage. But ARM laptops generally sucked. Yes, the battery life was excellent: even a crummy Samsung Series 5 Chromebook delivered pretty impressive battery life for its light weight and price. But the performance? Yeah, it was shit: good enough for a few tabs of Chrome and nothing more, which for the price was honestly not so bad. The calculus for phones, though, definitely worked out in ARM's favor.
But that's the thing. Sufficient performance per watt is very important, and improvements to it are rarely unwelcome. However, you can mislead people regarding generational improvements when touting performance-per-watt metrics: if the design didn't actually improve much, a new process node and pumping more power in can cover for it. Pretty much every vendor does this at some point, and even if Apple isn't currently doing this, pivoting toward touting efficiency improvements rather than raw performance is not a good sign, especially considering that these chips go in desktops too, and Apple still has some ground to cover if it wants to compete fully with high-end desktop CPUs from the PC market.
Perf per watt is not new, and it is not just marketing hype; it is the limiting factor not only for notebooks but for desktop GPUs and CPUs. The envelope is power vs. heat vs. performance. Increasing perf per watt lets you do more within that envelope, whether in a notebook limited by its battery or in a desktop held back by giant coolers.
Anecdotally, I've been able to use my M2 Max MacBook Pro for 12 hours straight with 50% battery at the end of the day. Nothing intensive, just Chrome, Emacs, Spotify, Slack, Discord, etc.
My 14" M1 Pro rarely got more than 7 hours, and my former work 16" M1 Pro rarely more than 9 on a React workflow. Introducing VM work or anything fairly heavy to the mix takes 2 hours off each. Typically at 50-60% brightness, with iStat showing load so I can catch any errant processes.
My biggest issue with this release is no published improvement in battery life. 22 hrs is for movie watching. Wireless web, which is of course a very low load, is still 15 hours for the 16" and 12 for the 14", the same as the M2 Pro, and each just 1 hour more than the M1 Pro.
I almost never touch the brightness. It does what it wants. Not sure. Looks like it's at about 1/3 right now, but it'd be brighter if it weren't kinda cloudy here, I bet.
Program choices matter a lot. I gained about 30% battery life switching from Chrome to Safari, years back. I do have Teams and MS Outlook running all the time on that machine though, and they're not exactly slim.
Today, we saw independent benchmarks of Qualcomm's new 4nm chip that provided 50% (Geekbench) to 100% (Cinebench) more multicore performance than M2, while using 23W. And it can efficiently encode AV1. Maybe M4 will catch up.
Well, benchmarks are nice, but a processor is not an end-user product. How do you think this is going to fare as a product once you put it in a PC running Windows, especially as Microsoft struggles to get third-party apps on ARM and MS's own x86 emulator remains substandard?
I cannot fathom the level of incompetence it takes for vendors to want to install Windows on ARM devices.
1. The only reason you make people deal with Windows is backwards compatibility.
2. When you advertise it as a Windows laptop people will expect to be able to run their apps (they expect backwards compatibility). RIP your reputation and support inbox.
Yes, it will be harder to sell when it runs Linux. But it's the correct expectations management and at least it will suck less.
Oh well. This is what Google will do with their Chromebooks. Windows on ARM has the same future as the Windows Phone.
It's slower and one of the main things people want Windows to run is games. Also, games are one of the things emulation systems are most likely to break, because games use all kinds of weird performance hacks and come with heinous anti-cheat systems.
I had a Thinkpad X13s for a while that worked quite well except for a video issue (used pawn shop purchases are a risk like that). Firefox, Edge, and MS Office worked great natively on ARM. LibreOffice worked just fine via the MS x86 emulator. And the X13s had the old Snapdragon 8cx Gen 3 with less than half the performance. Granted, I wasn't doing any heavy lifting with it the month I had it.
I mainly got it to test out my PortableApps.com stuff running under ARM. I'm doing it now on a Macbook Air M1 with Windows 11 running under UTM.
I got it to handle 3 things: a laptop I can use as a laptop for basic stuff and to use to remote in to my development desktop at home, a Windows ARM machine I can test my Windows x86/x64/ARM64 software on, and a Mac to test out my software running under macOS via Wineskin. It's a little clunky but it works for all 3.
My original set was a regular full-fat Windows laptop, a super-cheap used Galaxy Book Go Gen 1, and a used Intel Mac Mini. I later replaced the Windows laptop with the Thinkpad X13s. If it hadn't had video issues, I'd still be using that. But a used base-model Macbook Air M1 serves the purpose for now.
I'll likely switch back to an Apple Silicon Mac Mini and a Windows laptop of some sort later as I much prefer a Windows laptop to my current Macbook.
Intel and AMD should get ahead of the curve and put some real support behind desktop linux, which has an actual path to ARM adoption for a much larger portion of its software.
But then again at least one of them should have been doing that 20 years ago.
I don't use Windows. The Raspberry Pi has great Linux support, and the Linux Geekbench scores were even higher than in Windows. Unfortunately, I couldn't find Linux Geekbench scores in the 23W configuration.
Without the ecosystem, who wants Windows? And what will it do for the reputation of the vendor, or of Microsoft, when people can't run their apps?
People are already buying non-Apple ARM laptops. They are called Chromebooks. They can run Linux apps and Android apps, and that's more than most consumers would expect.
Many schools in the US provide their students with laptops and Chromebook is the overwhelming favorite. The student has to return the laptop at the end of the year. As an anecdote, I know of no one who bought a Chromebook for personal use. My friends, colleagues and acquaintances are buying Macs or Windows machines if they want a laptop, iPads or Android tablets if they just want a tablet.
In a way, Google's strategy of getting Chromebooks into schools may have backfired as they're largely seen as kids' computers.
They are in use by the millions in schools. I have one and I really like it: it can run Android apps and the Chrome web browser, and in the Crostini Linux system I can run any apps (dev tools, web browsers, emacs) natively. I like it better than raw Linux because of the built-in Android support.
I gave ChromeOS laptops to my family because they can't be trusted to maintain a computer. Now they have reliable laptops and don't get virus infections or OS problems.
That's a complicated story. No one really wants surveillance capitalism. I don't think Google copies what I'm doing on my Chromebook, but almost every website has Google tracking, and Chrome has Google tracking. You can use non-Chrome browsers on ChromeOS; they are all there via the Linux subsystem. You can also run the Android ones.
This also shows the power of marketing. ChromeOS is a subset of Linux -- it doesn't do anything you couldn't always have done with Ubuntu. But for years people said that normal people don't want Linux, it doesn't run their apps, they can't use it.
One company shows up with a marketing budget and it's got triple the market share, now up to the level that the Mac traditionally held back when all of the things "nobody makes for Linux" because "nobody uses it" were still being made for the Mac.
We're also at the point where things like bank websites don't "officially" support Linux, but as a general rule they don't have any problems on it, and if they did have problems it would be a problem the bank has to deal with instead of a problem the customer has to deal with.
That’d be pretty incredible if it plays out in reality once products are released. Though it sounds too good to be true.
I’m skeptical since Qualcomm has failed for years to catch up to Apple processors in the smartphone market. So why would their first effort in desktop processors be so much better?
There's actually a good reason. In short, a large portion of the Apple Silicon team left a few years ago to start a new company named Nuvia. Their goal was to produce high-performance chips for the enterprise/server market, and they had some very aggressive performance targets [1].
Then, in early 2021, Qualcomm acquired Nuvia, and these new chips are the first showings of that acquisition. Naturally there's a lot of hype, since that team represents a lot of the talent that made Apple Silicon so good in the first place.
Apple Silicon was great because Apple had invested huge amounts of resources for a decade on smartphone processors first, not because they had some kind of geniuses on the project.
It is no doubt promising that Qualcomm has brought in more talent, but it still takes time and effort to turn that into a best-in-class product. I’m not saying it’s impossible, though I’ll be skeptical of the hype until I see a real product.
> Apple Silicon was great because Apple had invested huge amounts of resources for a decade on smartphone processors first, not because they had some kind of geniuses on the project.
Right, and the guys who learned all the hard-won lessons along the way walked out the door to start a company, bringing along expert knowledge of Apple's designs and processes. And then Qualcomm bought them.
So on a surface level it seems implausible that QC could produce such a chip. But when you zoom out and go "oh, Qualcomm effectively bought Apple's senior chip engineers" it starts to make more sense.
It would be like if Qualcomm's top modem engineers started a company, which Apple bought. And then a couple years later Apple's long-running modem project mysteriously turned a corner and was ready to launch an exceptional modem. Like yeah, no kidding.
So yes, we need to see independent benchmarks and make sure it's not hype. But it's not so unbelievable that Apple's former top engineers could also produce a good chip for another company. There's nothing magical about the Apple office; it's the engineers.
Engineering something as complex as a CPU is a long process regardless of how smart and experienced your engineers are. I mean, you can certainly speed it up with great talent, but there is still long and hard work to do with any difficult engineering challenge.
I’m not saying there’s something special about Apple other than the scale of their investment over a long period of time.
It’s the same deal for Qualcomm and their 5G modems. Apple no doubt has hired many talented engineers to make a custom 5G modem. But Qualcomm’s modem is still the best one around. It’s hard to catch up because Qualcomm has been investing heavily in that space for a very long time.
Again, that’s not to say Apple won’t ever catch up. Just that I wouldn’t expect that their first effort will be better than Qualcomm’s modems.
Nuvia has been working on this tech for years before being snapped up by Qualcomm. And before that, those same engineers had worked on Apple silicon for years. Why do you keep thinking this is an overnight thing?
To be fair it’s a bit of a myth that only mobile cares about efficiency and thermal management. It is definitely a factor for HPC and server too.
Apple scaled iPhone-first designs up to the M* Ultra chips. Going from HPC down to a mid-wattage laptop is definitely serious work, but I don't think it's impossible, especially with ARM.
The whole point of this thread is that those same Apple engineers made these Qualcomm chips.
Yes. Apple iterated over many, many years. Learning so much along the way about how to make performant, efficient ARM designs.
And then a bunch of the most important of those guys left to start their own company.
And then Qualcomm bought that company.
Y’all are acting like a few college kids from Stanford made Qualcomm a new CPU over their summer internship. “It takes longer than that to make a good CPU.” Yah no shit!
You are taking crazy pills. Making a high performance chip requires more than just having a bunch of talented and experienced engineers. Is it a necessary requirement? Sure! But it’s far from sufficient.
Apple brought on PA Semi and then slowly iterated on actually shipping hardware for years. They didn’t hire PA Semi and have a best-in-class product on the first go.
And those same guys who slowly iterated on shipping hardware for Apple for years are at Qualcomm now.
Are you saying it’s a requirement that these guys ship a crappy chip first? Why? They already know how to make good ones.
Can you tell me what more they need other than their talent and years of experience to make a good chip? Because if it’s just “I demand they make a bad chip now because they’ve changed logos on their corporate polos” I don’t think this conversation has anywhere to go.
Is the assumption here Apple has developed a business process for building best-in-class CPUs while treating its engineering workforce as fungible commodities? If so they've succeeded in doing what Intel has been trying to do for decades.
> Apple Silicon was great because Apple had invested huge amounts of resources for a decade on smartphone processors first, not because they had some kind of geniuses on the project.
Given that they were confident enough to leave and start their own company, I'm not sure this is true. Indeed I wouldn't discount the value of high talent density.
> I’ll be skeptical of the hype until I see a real product.
They have shown real hardware demos [1] to reviewers already, and the numbers look solid. Obviously there are no comparisons vs M3 yet, but it seems promising.
Not one or two geniuses, but a really good team. They supposedly lost key members of that team, which threw them off track. An organisation can recover from this, but it takes time and money. Not everyone likes to work under Apple-like working conditions toward Apple's goals.
Sure, but they’ve been behind the whole time. It’s not like they’ve been trading blows each generation with Apple.
If the premise is that Qualcomm hired a team of super talented engineers who can build a product that competes with Apple, then those engineers will still need time to develop a product.
Again, maybe their new processors will be everything they claim. It’s possible for this to happen. I’m just not willing to buy into the hype yet.
I am going to go out on a limb and suggest that Qualcomm's pre-production benchmarks are of no value and the actual performance of real devices will be worse than Apple.
To be fair, I don't think many would have trusted that Apple's M1 chips were going to be as good as they were before they were actually in consumers hands. I'll reserve judgement too until that's the case.
What you didn't mention is that the Snapdragon X Elite has 12 high performance cores, making it a M2 Max competitor and not an M2 competitor, at least on the CPU side. The GPU is disappointing with no ray tracing support.
Perhaps the most interesting thing is the UK pricing. I paid £1899 for my M1 Pro. The M2 Pro was £2149. The M3 Pro is £1899 again for the same spec.
The insults, however, come now. There is now a £1699 8GB option. Seriously, in 2023? 8GB?!?! Meanwhile you can get an i7 Lenovo T14 Gen 4 with 32GB of RAM and a 1TB disk for £1297. Every 8GB RAM increment with Apple is a complete rip-off.
And the total build price for my silent i5-13500, 32GB, 1TB desktop with an RTX 4070 and a 27" 4K Dell monitor was £50 less than the base-level 8GB MacBook Pro.
When it comes down to it, I think you'll find most people buy Mac for the software.
I've tried using Windows and Linux and it's just not as good. If this weren't the case, then I doubt Apple could charge such exorbitant prices for their hardware.
I'm not sure why you'd go for a Mac for the software. The cloud experience sucks and is really, really expensive, most of the provided desktop apps are buggy or so different from the rest of the universe that they're a liability, and everything else is pretty portable.
I mean, here I am, all VSCode + Linux (WSL2) + Office 365 + Adobe + DaVinci Resolve + mathematical tools (SciPy/Maxima/Minitab). All are portable (apart from Minitab).
I agree that Apple's software is low quality. In too many cases shockingly and consistently so over many years. You are not surprised to see the spinning ball of fail when running Apple software. You expect their developers to make beginners' mistakes.
And Apple has some spectacularly useless systems developers that manage to have at least a couple of daemons regularly blow up and use lots of CPU in every release. It is almost like some elaborate running joke.
I mean, what idiot wrote the calendar daemon? How do you keep a job as a systems developer when you need to burn 70-80% CPU for prolonged periods just to sync a calendar that hasn't even had significant use in years? Seriously, if anyone knows who wrote it, I'd be interested to understand how things went so wrong for them.
Yes, Apple's software sucks.
However, I don't buy Apple to use their software or their cloud services.
I use macs for the same things you do: run the applications I need to do my job. Applications that are not made by Apple. My list of requirements isn't long.
- Must be a unix variant
- Must be decent hardware (most laptops are horrible)
- Must be able to run the developer tools I use
- Must be able to run Fusion 360
- Must be able to run professional photo editing software
- Nice if it can run Ableton Live
One such example is music production. Audio on Mac just works. You can plug in a microphone from 20 years ago and it will just work. With Windows this is not the case: you need appropriate drivers, which are often not available. And that's not including all the issues with audio routing, etc.
I will agree that native cloud sucks on Mac. I would disagree about the desktop apps, I think they are superior to Windows. Programming is also superior on Mac, in my opinion.
Personally speaking, the big thing I hate most about Windows is the defaults. The default keyboard shortcuts, i.e. using F keys, are in my opinion very poor design. The lack of an additional modifier key also makes Windows less ideal from this point of view.
Of course, each OS has their pros and cons. Windows just doesn't work for me.
Except when it doesn’t. Like for example with macOS Ventura, where lots of people had huge problems with audio dropouts on USB interfaces.
The problem was never really acknowledged by Apple (as usual), but “magically” fixed in Sonoma (without any mention in release notes).
Leaving many users with a dilemma: upgrade to Sonoma ASAP to fix the USB problems, or hold off because their DAWs or plugins may not be compatible with Sonoma yet.
Yeah, my piano suddenly doesn't work with macOS Ventura. Audio dies after 2 minutes. I'm only using it for monitoring, so I just plug bloody headphones into it and am done with the computer.
The HDMI audio output from my MacBook M1 Pro always had choppy buffer loops since release. That is, until one of the most recent software updates. Finally it "just works".
I agree that it's a problem. But also, you don't have to upgrade to the latest OS. Most people wait to see if there are any problems before they upgrade.
I think this applies to most things in life. Especially programming libraries at software companies. It's almost assumed that something will break when you upgrade any kind of system.
I heard this a lot. I did music production on Linux for 5 years, and tried it on Windows for a few months.
Then I bought my M1 Mac and was shocked to find that, for this, I think it's actually the least sophisticated of the three, by far.
Don't get me wrong, my M1 hardware competes with my 5950x Desktop system... which amazes me.
But, Jack2 on Linux was pretty cool, a little rough around the edges, but worked really well.
Now with Pipewire becoming the standard, I can't imagine why anyone would pick a platform besides Linux for any real time Audio+Visual routing/processing.
It's just so easy to pipe any of your data to any other piece of software, including over the network.
> Now with Pipewire becoming the standard, I can't imagine why anyone would pick a platform besides Linux for any real time Audio+Visual routing/processing.
Software compatibility is almost entirely nonexistent for industry-standard workflows.
> It's just so easy to pipe any of your data to any other piece of software, including over the network.
Not a frequently required feature (if required at all) for 99% of music production use cases.
Linux audio is getting there, but it's a million miles away from being a feasible choice for most people in the industry. I don't think I could find a single person who would trade the MacBook that's running their entire live setup for a Linux laptop.
Usually the problem is the mixing-control software for older hardware that otherwise works fine, like the Saffire Pro (a FireWire interface), or a Tascam USB interface with an internal mixer that is impossible to control from the hardware itself, e.g. line/instrument switches are done digitally.
I ended up going back to Windows to use these and switched to Studio One instead of Logic Pro X.
After having so many Windows laptops quite literally fall to pieces, I just get the MacBook now. They last so much longer and you easily get your money's worth.
I don't. I buy Apple products for the hardware. They last and they work when it comes to iOS. When it comes to Macs, the software usability is shite. I've had to resort to 3rd party apps (Amethyst, AltTab) to be able to effectively organise and switch windows.
For the price of Macs, I get Lenovo and Dell graphic workstation like laptops, e.g. Thinkpad P series.
While I am a big NeXTSTEP fan, I am not paying for the experience at the cost of less capable hardware, especially when I can just rent Mac hardware from our R&D pool at work.
I switched from a Lenovo X1 Carbon, totally gave up on Windows, and bought a Mac again. Windows has become some kind of shitty garbage of an OS. It reminds me of the toolbar creep that used to happen in IE 10 years ago. Now the entire OS is fucking like that.
I generally disagree with arbitrary hardware comparisons, like "why buy apple when I could build some gaming rig for $5" for obvious reasons that have been beat to death in every comment section ever, but you're very right about the hardware upgrades specifically.
I want a new Mac, but I feel like an idiot paying those prices for both RAM and SSD upgrades, more so than ever before. In CAD, at one point, RAM upgrades were already too much at $200 or something, but I'd reluctantly suck it up and be happy with the decision anyway. Now they're literally $500 for each 16GB increment, and $500 for each TB increment. I'm not about to replace my existing Mac with some Windows hunk of garbage, but I'll have to wait until I'm either rich or I can directly translate those upgrades into predictable dollar-amount returns on the investment. It's bizarre that I can price out a laptop that's $5k before tax yet doesn't have more than a 512GB SSD, has only half the possible RAM, and doesn't even have the largest screen.
On the other hand, the mid-tier option is probably still great, but it costs about $3500 and also still only has a 512GB SSD.
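For what it's worth, the implied per-GB rates from the upgrade prices quoted above ($500 per 16GB of RAM, $500 per TB of SSD) can be put next to rough retail component prices; the retail figures below are ballpark assumptions, not quotes:

```python
# Upgrade pricing quoted in the comment above (USD).
apple_ram_per_gb = 500 / 16      # $500 per 16 GB increment -> $31.25/GB
apple_ssd_per_gb = 500 / 1024    # $500 per 1 TB increment  -> ~$0.49/GB

# Ballpark retail prices (assumptions for comparison, not quotes):
retail_ram_per_gb = 100 / 32     # ~$3/GB for a 32 GB DDR5 SO-DIMM kit
retail_ssd_per_gb = 100 / 2048   # ~$0.05/GB for a 2 TB NVMe drive

print(f"RAM markup: ~{apple_ram_per_gb / retail_ram_per_gb:.0f}x")  # ~10x
print(f"SSD markup: ~{apple_ssd_per_gb / retail_ssd_per_gb:.0f}x")  # ~10x
```

Under those assumptions the upgrades run about an order of magnitude above component cost, which is the "farce" complaint in a nutshell.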
Well, the price includes membership in the sect of "good morning, Amazing"…
Honestly, I've felt much more at home with a Win11 setup recently. For a dev who does not rely on Apple's Xcode etc., it makes very little difference. Besides, Win11 is scarily snappy on every decent box, not only Microsoft Surfaces.
And except for the speed gains, the gains from using OSX have been diminishing rapidly for years. Sticking with OSX 12 on some devices does not impact my workflow at all. I guess I could go back to 10.x and VSCode, git, etc. would still work the same.
Speaking of security et al.: that's fine, but some of the most aggressive hacks are targeted at OSX.
I've been using M1 since it came out. Loved it. Fast, snappy, smooth. Prior to that, I had a Dell XPS that ran loud and always felt laggy to use.
I was going to upgrade to the M2, but $2k+ for an essentially base laptop? Nah.
I grabbed a new Zenbook 14 OLED w/current gen i7. Metal chassis, nice keyboard, nice trackpad. Nice 120Hz screen. Solid construction. Windows 11 which I have gotten used to. It feels pretty well polished in terms of hardware and OS. Feels snappy. i7 is a wonderful, fast CPU.
While it isn't an M2 Mac, an M2 Mac is no longer wildly better than a good Windows laptop. The $1300 difference for similar performance is a strong selling point for the Zenbook, which is 96% as good as an M2.
Why did you need to change computers at all? If you had an M1 Mac, it seems like a weird choice to replace it with what looks like a horizontal move.
I'm mostly just happy since I switched back to windows 11 that I've got an OS I can actually drive from a keyboard that doesn't leave my hands brutally mangled from meta key combinations.
No doubt current day Windows is a capable device for all tasks. Too bad it treats you like a free user by clogging your operating system with ads while spying on you like a crazy stalker.
Honestly it doesn't feel any different for me on MacOS. Both OSes push first-party music and cloud services, both OSes beg you to use their browser, both OSes show news with their own built-in ad networks. At this point for Windows 11 and MacOS 13, I'm shocked how little these OSes have meaningfully changed since the early 2000s. Both companies feel like they're taking a detour around user empowerment to focus on Selling Moar Services™.
While we're looking at things from a retro-idealist lens, even Android and iOS seem unnecessarily bloated. It feels like the vast majority of every FAANG's engineering effort goes into crippling the user nowadays.
Almost 24 hours later, so you likely won't see this, but I'm interested in what setup, hardware-wise (monitors) and workflow-wise, works best for you on Windows?
I’m pretty platform agnostic. I own and relatively regularly use Windows, Linux, and MacOS machines and have spent time using each as my main work system in the last five years.
I currently use a single 49” monitor and I’ve found MacOS with Amethyst for tiling and window management plays the nicest on it but your comment about hand mangling (and never completely remembered) meta key combos definitely rings true.
I haven’t found anything on Windows or Linux that doesn’t have the same issues though. It’s possible I’m irrationally afraid of mistypes and overly avoid single meta key combos on any system.
I have used it for the last 15 years and it hasn't got worse and it hasn't got better. It has barely changed. It is done. However, there are loads of little annoying bugs (Reminders makes me want to kill people) and the developer story on the platform is a mess now. Literally, minor releases will break things I use on a regular basis. Last month they broke something that broke Maxima (a mathematical CAS) for me. I have to use that, so off to Windows it is.
I got (for my wife) an entire Framework Ryzen 7840U system, plus a 2TB SSD and 32GB RAM purchased separately, for THE UPGRADE COST on an M2-class MacBook between its base spec and 32GB RAM/2TB SSD. It's a farce. I don't want to leave the Apple/OSX ecosystem after 20 years, but I simply can't justify it any longer. I bought a €300 used ThinkPad. Fedora's pretty good these days, and I'm mostly in a terminal or Firefox. I'll live. Wife's been mostly running Windows on Macs since 2011 anyway.
It is a major failure on their part not to put pressure on everyone else by doubling the memory. LLM models would benefit massively. As a manufacturer, they could probably buy 32GB for less than 100 USD.
No thanks. I had a Ryzen T495s. Nothing but aggro, that thing. I religiously avoid AMD because in the last decade I've had nothing but trouble from all of their hardware. Intel and NVidia only these days.
I've got an i5-1235U T14 Gen 3 which I paid £730 for (new, but with a damaged box) and stuffed another 8GB stick and a 1TB WD SSD into. That is fine for my personal use, as it's mostly an Office and light-coding machine.
This caught my interest. What problems have you had? I had a T495 and was happy with it running Fedora. So much so that I just ordered the latest P14s Ryzen model as its replacement.
The T495 developed a fan problem just recently, but since it is out of warranty it was easier to get my employer to order a new machine than to figure out how to repair it. If it were my personal machine, I probably would have searched for parts and tried to repair it myself and keep using it...
Edit to add: I am in your mirror universe because it was bad experiences with Intel + NVIDIA in Thinkpads which motivated me to want to try AMD again. I don't mind pure Intel iGPU models, but find the combination nothing but trouble.
Up until the most recent Intel gen, Ryzen has been the clear leader in both battery life and integrated graphics, blowing Intel away. Pretty much any internet forum will advise you to buy AMD for the T series.
Firstly, you can't add any more RAM to it. This turned out to be a BIG problem for me as I handle a lot of fairly huge RAW files in Adobe Lightroom.
Then there's the fact that it actually spends 95% of its time docked to my Studio Display and using the BT mouse and keyboard which fuck out all the time.
And the battery life isn't as great as people make it out to be. If I am on the road, which is basically never, then I'm lucky if I get 4 hours out of it.
So I thought, sod it, why shouldn't I just have a proper desktop instead, which solves all of these problems. I looked at the M2 Studio, went pfffft at the price, and threw together a Windows desktop. I've got a hard-wired keyboard, mouse, Ethernet and headset, and I can throw another TB in it or double the RAM to 64GB for less than $100 each.
> I handle a lot of fairly huge RAW files in Adobe Lightroom.
Define “huge”.
I edit 45-megapixel RAW files on an M2 MacBook Air all the time, and the only way that makes any sense is if you cheaped out and bought the base model. 16GB of RAM is still plenty for Lightroom and 32GB is more than enough. The battery life is also amazing unless I'm dumb enough to do something like generate 1:1 previews for thousands of photos all at once. Yeah, you can't upgrade it later, but that's why you need to make sure to buy enough hardware when you're getting it in the first place.
It's the price for a one-of-a-kind portable device. I used to think like you, but then I realized I'll be better off just accepting the price than suffering with an inferior Windows device.
I've got an M1 Pro MBP, Studio Display, iPad Pro, Apple Watch, iPhone. Meh. It cost a lot and makes you feel special. That is all. It's not objectively better at achieving your goals.
At this point I don't agree that windows is inferior. It is just different. And quite frankly from a developer and productivity perspective it's actually a lot better.
> At this point I don't agree that windows is inferior. It is just different. And quite frankly from a developer and productivity perspective it's actually a lot better.
I think it depends a lot on what you're developing.
For a lot of open source stuff, even OSX is a 2nd class citizen, and windows isn't even considered. Of course, you can do everything in a VM, but that can have its own problems.
If you live out of Visual Studio, though, I'm sure the experience is nice.
> For a lot of open source stuff, even OSX is a 2nd class citizen, and windows isn't even considered. Of course, you can do everything in a VM, but that can have its own problems.
While it is true that Windows is not considered, with WSL2 that doesn't matter anymore. Everything (99%) just works seamlessly with the Linux environment. And yes, that's not truly native and technically a VM, but I prefer WSL to a VM on macOS every day of the week.
Ironically, if I would need to switch to Linux, I'd buy a Macbook.
My Motorola (owned by Lenovo now) One Power phone got a dead pixel on its screen :)
My Lenovo monitor not only got a dead pixel; within a year it also developed a horizontal dead line. I was very lucky that it happened a month before the warranty expired. And after the repair, the replacement screen also got a dead pixel :)
My new Thinkpad died within days of my receiving the machine, and it took over a month for the repair to complete :)
The monitor and the Thinkpad were bought directly from Lenovo, shipped straight from their warehouses, so there's no one else to blame.
Oh, and did I forget the Thunderbolt controller firmware issue in my previous Thinkpad T480? :)
It's safe to say I'm not very happy with Lenovo.
... But on the other hand, for anyone planning to buy a MacBook, I usually recommend watching these videos first, just to lift the spirits: https://www.youtube.com/watch?v=yR7m4aUxHcM (this is part 3; there are also parts 1 and 2)
The sad thing is that we all know this. Normies buy the cheap version and you have to explain to them: yes, it gets slow quickly; it's shit; you shouldn't have bought it; no, you can't upgrade anymore. Then they feel bad.
A “normie” would only need an M1 MacBook Air and that will be good for 10 years. I am using a 2015 MacBook Air, and since I only browse/do basic spreadsheet stuff/pdf things, it still works fine.
If you need anything more than a Macbook Air, you are not a “normie”, and should be capable of doing better research before you buy.
Our definitions of normie are probably different, because there is definitely university work that requires performance that “normal” (majority) of people will never need.
The craziest thing is that that single external display can be up to 6K/60Hz! Surely for those 21.2 million pixels you could run two 4K displays at 8.2 million pixels each.
Ports and display alone make this worth it. I wouldn’t recommend someone get 8GB ram but it’s pretty impressive when you consider it has everything else the other MBPs have (minus 1 port).
Also I’m glad they finally discontinued that embarrassment of the last touchbar Mac.
I really don't know why Apple insists on making the entry spec something like a 512GB SSD and 8GB of RAM for a ~$1600 device, and worse for their MacBook Air line, where it starts with a 256GB SSD for $1000 (M1) or $1100 (M2). That's already the price tag of some of the high-end alternatives.
I was close to getting the Air for $900 from Costco but stopped in my tracks at the low storage, and upgrading just the storage is +$500. A frankly insane markup for 256GB of storage.
The base model airs all have 256GB now. Upgrading to 512GB is $200, not $500. I don't think any Mx laptop has ever had a $500 charge for a one-level storage upgrade.
The base model is offered at a discount at Costco, but there's no such discount for the next level up in storage, so the markup must take this discount into account.
I think the worst thing is that many people don't even know it's possible to upgrade if they want to. I am not sure if it's the same in all countries, but third-party sellers only have access to the base models. I talked a nice couple out of buying a MacBook Air with 8GB of RAM at an in-store mini Apple Store in a major department store recently. Of course the salespeople don't mention that you can go to a real Apple Store and get models with 16GB (or 24GB in the case of the M2 MacBook Air).
Right at the end, at the beginning of the credits, it said it was shot on an iPhone 15 Pro and edited on a Mac. I think it might have been an MBP M3 Max… it flashed by so fast. It's fascinating that it was shot entirely on a phone.
I get why they are calling out vs M1/Intel as this is primarily targeted at getting folks to upgrade but it is kind of annoying that they aren't emphasizing the incremental vs the last generation. Also, the callout to AI developers to get an ok GPU but with 128 GB of unified RAM is pretty smart.
For training you want the best NVIDIA card you can afford. Doesn't make much sense to use a laptop for training IMHO. There is an argument that the M3 Max is the best non-datacenter chip for inference with the ability to scale to 128 GB of memory.
It's great Apple realized cutting down on ports makes zero sense. My MBP has only two Thunderbolt ports. I can see the new MBP also has HDMI, an SD slot, and even a headphone minijack. I can finally throw my dongles and hubs away. That's the end of an ugly era.
You get the ports back, but those base-model M3 MBPs only support a single external display. You get HDMI and three 40Gbps Thunderbolt ports, but can only use one of them at a time for screens.
You need to bump up to the M3 pro models to get multiple external display support.
> This new “basic” “pro” has only two thunderbolt 3 ports but “pro” “pro” has four thunderbolt 4 ports.
My understanding is that it can't be branded Thunderbolt 4 because it only supports one external monitor, but that it's effectively Thunderbolt 4 in every other respect.
The technical explanation for the limitation of the built-in display support is that the base SoCs have one Thunderbolt bus that supports 2 display outputs, one of which is used by the integrated display.
Bit disappointed that they're still using the same mini-LED displays. After the iPhone announcement, I was looking forward to a proper OLED with 1600-nit peak for the MacBook Pros.
Don't get me wrong, I'm very happy with my 16" Max display, but the dimming zones are quite visible, especially with poorly brightness-mastered content (some of which is on Apple TV+!).
I can't watch the whole event, but the continued complete lack of marketing toward gamers I must say is also a bit disappointing. You have these fancy GPUs, Apple; go for the jugular!
An OLED would actively put me off buying it for now, I think. Laptop screens are expected to keep operating a very long time; it seems likely that burn-in would shorten the product lifetime a lot.
My current personal laptop is a 2016 13" Pro, and it's basically _fine_. People keep laptops for quite a long time now; I don't think OLED is there on durability.
It's especially annoying that they announced ray tracing support but didn't show a real title using it. I wonder what happened internally because I'm sure they were working with partners to get it working... I mean, RE Village launched today on iOS and macOS and would've been a great demo for graphics and to boost sales.
Switching display tech is a massive ordeal for Apple.
Because a huge point of pride for Apple is that regardless of the Apple device you are using, color accuracy & picture are the same and matched across devices (between iPhone / iPad/ Studio Display / MacBooks).
I feel like it's Apple realizing (I mean, they likely already knew it, but this is sort of publicly saying it) that people are not going to upgrade their Macs every year like many do for iPhones. Or likely even every two years.
So I think the focus on Intel and M1 (but still showing M2 but just not saying it out loud) was the right call.
I wouldn't be surprised if, when the M4 rolls around, we are still hearing about M1, but maybe they drop Intel by that point.
Even 2% I would not call "virtually no one". 232 million iPhones were sold in 2022, 2% of that is still 4 million iPhones.
However looking at another survey of specifically iPhone users I see as high as 36%.
Going further, of those that upgrade every year how many of those are getting the highest pro models?
Or are subscribed to Apple iPhone Upgrade Program. Even if they are a lower portion of the population they are a consistent and important part of the population.
This is either to make the performance improvement look bigger, or to target users still on M1 because they know most of their users are still using M1? (Me included, with an M1 Pro and an M1 Max.)
The perf difference compared to M2 is still impressive, though. It is bigger than the M1-to-M2 upgrade.
this year’s model is this many percentage points faster than the first chance they had to ditch the intel mac, and the mental math for converting whatever they internalized to decide to hold out to a comparison against the m3 should be easy.
some early adopters will salivate, some will pull the trigger, but the holdouts should be near their breaking points. the fear of shipping dates slipping due to early adopters with fomo will drive holdouts further towards the cliff.
Both, and also some to the old Intel machines. This is probably more relevant for actual purchasers; approximately no-one would be going from M2 to M3.
This makes sense to me. Very few people are upgrading the M2 laptops they bought earlier this year. I'm finally in the market to move from Intel because visionPro dev requires Apple Silicon. Also - last Intel laptops were less than 4 years ago. Even as a power user I only upgrade every 3-5 years - regular users even less.
They compared to both in all the charts I saw. I expect most people with any interest in upgrading are on M1 or lower, so it makes sense from that POV.
Whatever opinions I have against Apple I just cannot deny their ability to not only innovate but also deliver every year. I wonder just how far into the future they're working on stuff internally at the company. Well done to this company.
It’s like they can do nothing wrong. Everything seems so meticulously worked out and competent. Nothing good can last forever. Wonder how many more good years Apple has left.
> Basecalling for DNA sequencing in Oxford Nanopore MinKNOW is up to 20x faster than the fastest Intel-based MacBook Pro and up to 36 percent faster than the 16‑inch MacBook Pro with M1 Pro.
What does the average consumer know about basecalling?
I mean, they are still selling just consumer electronics. Scientists and industry use Linux and Windows, and speed isn't the main reason.
There's an entire industry of consumer products that take massive amounts of compute called "gaming". Apple could have showed off the chips performance at that.
Or at least something like photo or video editing?
> I mean, they are still selling just consumer electronics. Scientists and industry use Linux and Windows
Anyone who's worked at a research university knows Macs are popular among professors, researchers and students. These folks also start companies and work in industry and they generally don't switch platforms unless it's absolutely necessary.
A lot of marketing is aspirational though. People want what the pros use. If it's their top hobby or they aspire to becoming one of those pros... then they want the best, no limits if they can afford it. Even if many of those buyers realistically never will hit those limits.
I suspect that impulse sells a lot of MBPs, DSLRs back in the day, Photoshop licenses, sports cars, high end ski gear, etc. etc.
First thing I noticed since I'm sitting here in Mountain View looking at a nice sunset just above the horizon (or specifically, the Santa Cruz mountains), so nope, not close to dark yet. Maybe another hour.
Yes and they do not make this clear and then people come crying to reddit when they find out.
It gets even better with Apple steadfastly refusing to support MST well over a decade after its introduction, so even if you have an M\d Pro or Max CPU, many docks and daisy-chained monitors will not work.
I just bought a 14" Macbook Pro on sale with the top spec M1 Pro and 1TB storage. I'm already amazed at how powerful and efficient this laptop is. I can't see myself wanting/needing to upgrade for another couple of years at least.
> So. Tempted. Decent ports again, decent keyboard again
Maybe I'm missing something but both of those things are unchanged from the M2 model right? Maybe even the M1 MB Pro had the same as well, can't remember
to me, every release is more locked down and dumbed down.
and privacy. I remember when macos used to ask you before sending data back to apple. Now it does it all day every day, with 1000 pages of privacy policy explaining what they do, but little or no ability to prevent it.
It's a consumer OS, same as Windows, designed to be as idiot-proof as possible and protect the user from making stupid mistakes. I'm not in that target market (though I still make a lot of stupid mistakes).
I'm using a 14" M1 macbook pro for work and frankly, the hardware is good/okay but MacOS is awful.
I'm keeping an eye on Asahi Linux, i might pull the trigger and use that on my macbook.
But I'm skeptical about this, I might just buy a second hand thinkpad (as usual) or maybe a framework laptop (waiting for reviews on the 16" model) and call it a day.
That's pretty much where I am too. Using a Purism 13 laptop at the moment which is great but showing its age, and I don't think there's a processor upgrade path for it. So at some point in the next couple of years I'm getting a new laptop, and Framework is the leading contender at the moment, but the Apple chips are looking mighty tasty.
I'm surprised there's still no pro chip for the iMac, and I guess at this point we should assume there never will be. It's a shame. Give me a 27" M3 Pro with target display mode, and I'd be awfully tempted.
As it is, there's nothing (other than that new color) that tempts me to update from my 16" M1 Pro, but I didn't expect there to be. I suspect I won't be eyeing upgrades until at least M5 and more realistically M6.
that, and the missing 27" iMac, seemed to very keenly indicate that if you want more power or more pixels, Apple will push you to buy a Mini + a Studio Display.
Pretty underwhelming, a bit like the A16 release. They basically upped the frequency in the A16, and that's how they got their single-core performance improvement; not much of an IPC increase to speak of. Given the numbers they quoted, the same story will repeat for the M3. And the A16 also came with a power-consumption increase, so TSMC's 3nm process is disappointing as well.
Going to be looking for outside validation of that Max GPU performance before making a buy decision, I think. Biggest hopes for this generation were dramatically better GPU performance--and Apple says they've got it--but we'll have to see how that actually pans out.
I was also hoping for a fourth USB4/TB4 port, which doesn't seem to be coming. Oh well.
The full 40c Max GPU is back up to the full 400GB/s bandwidth, but you're right, the Pro has significantly lower memory bandwidth, and the 30c Max does too.
To be honest, Act 3 has ridiculous frame drops on a dedicated GPU. Act 2 is also heavy but Baldur’s Gate (the town) is just very graphically heavy for some reason. I wouldn’t be surprised if it’s just that same performance issue.
I used to buy Pros for years until I more or less accidentally ended up with an Air for a period and never went back. Lighter, enough performance and cheaper.
> makes me seriously question why developers need macbook pro's
For work I am developing a massive JVM monolith server application and even the M1 Pro 16" takes minutes for it just to start up. Compilation is also slow with millions of lines of code. I have SQL Server running on an x86 Linux VM with Rosetta in Docker. Need all the CPU I can get. The Macbook Air simply has less CPU performance and is thermally throttled. It would waste too much of my time.
For my personal hobby C++ coding, the Macbook Air 8GB would be enough.
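The "SQL Server on an x86 Linux VM with Rosetta in Docker" setup above is a standard pattern on Apple Silicon: Docker Desktop can run amd64 images through Rosetta 2 if you pin the platform. A minimal sketch, not the commenter's actual configuration — the image tag and password are placeholders, and it assumes Docker Desktop with "Use Rosetta for x86/amd64 emulation" enabled:

```yaml
# docker-compose.yml — run the x86-64 SQL Server image on an ARM Mac.
services:
  mssql:
    image: mcr.microsoft.com/mssql/server:2022-latest
    platform: linux/amd64      # force the amd64 image; executed via Rosetta
    environment:
      ACCEPT_EULA: "Y"
      MSSQL_SA_PASSWORD: "ChangeMe_Str0ng!"   # placeholder
    ports:
      - "1433:1433"
```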
Yeah, the single external display is the biggest limiting factor for me. The performance is generally there (except for more heavy compilations). I'm thinking of picking up a M3 Pro Mac Mini when those eventually release to combat the display limitations.
Honest question, but how do you live with 8GB of RAM? I have 16 now and I run out regularly without doing anything too crazy (though in large part because Emacs and CIDER leak memory like crazy).
Isn't having to constantly close background applications absolutely maddening?
Did not know that! The Apple webpage somehow threw me off. Thanks. It would be a bit of a hard downgrade from a 16-inch 3:2, but I'll keep it as an option if I need to get a new laptop at some point.
I would never buy an Apple device, but damn, the 128GB memory option is tempting. I am still looking for a decent (business) laptop that has 64GB, but the options I've found so far are limited... Lenovo's P-series offers 64GB max, and the max I could get on a Dell is 32GB. (If someone has alternatives, please enlighten me.)
AFAIK 64GB SO-DIMM DDR4 modules are not available, so you need a laptop with 4 ram slots, that's just a few laptops in the workstation / gaming series of a few vendors.
Yes, they look bad. They look like products that were designed by passionless bean counter MBA types that create products off a spec sheet with zero regard for the end user. It’s a 1.2” thick lump of plastic (somehow heavier than an all metal 16” MacBook Pro) that will creak and bend when you pick it up reminding you how little care was given to it.
I don't know what that measurement means. Neither the Precision nor the ZBook is plastic; they have metal chassis. That's what magnesium, aluminium, and titanium look like without anodising them in fancy colours like Apple does.
These notebooks are also thicker and heavier than usual because they have to dissipate something like 250 W of heat. Good on Apple for designing such power-efficient SoCs; many high-end Windows notebooks have to make do with Intel CPUs and NVIDIA cards that will perform better than that MacBook Pro, but unfortunately also draw about twice as much power.
Finally, these are business workstations, not fashion statements. People—or rather, companies—buy them to get work done on them, not show them off in Starbucks. They are meant to be serviced easily and quickly, and come with up to 5 years of next-business-day onsite support with 24/7 telephone service.
Both the ZBook and the Precisions have 4 DDR5 slots, 3-4 NVMe SSD slots and a WWAN NVMe slot, a replaceable (though possibly not upgradeable) GPU card, an easily-replaceable display assembly and battery, and a productive keyboard with a keypad, and a reasonably large and accurate trackpad with buttons. They have Ethernet ports, USB-A ports, and can support up to five high-resolution displays.
These workstation product lines have also come with service manuals for the past two decades. Apple released service manuals for its own products this year, after significant regulatory and consumer pressure.
The MacBook has its advantages, and I fully understand why someone might buy one. But your argument is in extremely bad faith—it 'looks ugly', but you haven't seen it in person. Having actually used these machines for 5+ years, I can tell you they are pretty damn solid.
I daresay these don't even look that different to Framework's notebooks, which are the apple (pun not intended) of HN readers' eyes.
Businesses ostensibly buy them "to get work done on them" but they actually tend to do so regardless of the work, and they're often truly awful day to day form factors loaded with so much crap software that getting work done on them could be considered lucky. Everything has tradeoffs; rapid serviceability, good deals, and high specs, are all useful traits, but if they come at the cost of introducing friction between the operator and the task, then that's usually ignored by business. Sometimes that's a great tradeoff and no friction is introduced, but that's rarely part of the consideration in a top-down hierarchy where purchasing decisions are made by some other department en-masse.
There is some threshold of computational demand and cost past which the specs dominate other factors, but before that I personally consider friction in various circumstances to be a higher priority.
They might come from the factory that way, but often they're re-imaged with much less spartan Windows installations. My personal reference for this is ages ago in real time, but relatively recent in corporate years
The Dell Precision 7780 is a terrible machine. It's heavy, gets extremely hot, and is just not good to hold. You might find some use for it if you intend to use it in a cold, desktop-like setting.
> What tasks are you doing that require 128GB RAM?
You’re thinking about it the wrong way.
The more RAM you have, the faster and smoother your experience using an application will be, and this will be much more noticeable if you usually run multiple applications at the same time, like virtually everyone in the world does.
This is especially true now that we have options to run local-only AI models. The next couple of years will be interesting.
> The more RAM you have, the faster + smoother your experience using an application will be,
This is the mentality that Gigabyte targets with the $1300 AORUS Z790 Xtreme X motherboard. More is clearly better, right? Nope. Not at all. In this case, there is an amount of RAM an app will consume, and once you have that much, the rest will not do much. Even buffering/caching can only consume so much. It's very hard to fill even 64GB in a laptop, and 128GB is near impossible.
While I do agree that 8GB is a little dated, even the 8GB MacBook Air from 2020 runs most applications more smoothly than a modern Windows machine with twice as much RAM. Apple has the smoothness part figured out, without the need for more memory (for the average consumer at least).
Your point about local AI is pretty interesting. It does seem highly likely that computers with limited memory could "age" faster than they have done in the past ten years, with the advent of more AI workloads.
That's probably a threshold where I'd start looking at a well integrated solution for remoting into a decent server. Do you actually need/want to use 128GB locally? Not disputing that if you do have a requirement, but have you considered a smaller machine and spending the rest on a non-mobile decent setup? The battery and weight are really affected once you go into the mobile workstation territory.
Personally, I've been tempted by the hardware, but I don't care for Apple's software bundle (macOS, iTunes, etc.). With the M-series they have shown indifference/reluctance/hostility toward working with other vendors and the open-source community. Asahi Linux has made major progress, but without support from Apple.
Another (minor) frustration is the keyboard layout. I'm used to having Backspace, Delete, and Insert keys.
Finally, Apple's offerings are a bit heavier than the ~1kg ultrabooks I like to work with.
[Summary] I wouldn't say I would never buy Apple, but so far they haven't presented products appealing to me. And at the root that is caused by their approach to stewarding their "ecosystem".
Not OP, but the UK keyboard layout on Apple devices sucks, separately from overriding decades of muscle memory reaching for Cmd instead of Ctrl. Could never get used to the UI/workflow. No native window snapping (back in the day at least). Workarounds required for a lot of my tooling, no native containers, until recently no native VM solution (believe UTM solves this). Brew.
As a disclaimer I do think Linux with Gnome provides the absolute best developer experience so you can take all my opinions with that in hand.
> Not OP, but the UK keyboard layout on Apple devices sucks, separately from overriding decades of muscle memory reaching for Cmd instead of Ctrl.
Is cmd somewhere else on UK Apple keyboards, than on US ones?
If it's like the US ones: I spent my first 25-ish computing years on Windows and Linux (and Sun, etc.—point is, not Mac), but after getting used to Mac's shortcuts, I gotta hand it to them: what they've done is more correct than what everyone else is doing. Ctrl shortcuts involve more stretching and wrist movement, so they are slower and push one closer to RSI, and putting more common shortcuts on Cmd than Ctrl leads to fewer collisions (no awkwardness in the terminal, for example).
I don't get it, the button swap is trivial to configure, how can it be such an important criticism? (though the cmd key is more comfortably located, so it's better to do the swap on Windows)
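On "trivial to configure": on macOS there's a GUI (System Settings → Keyboard → Modifier Keys), and the swap can also be scripted with the built-in `hidutil` tool. A minimal sketch, assuming the standard HID keyboard usage IDs (0xE0 = left Ctrl, 0xE3 = left Cmd); note the mapping resets on reboot unless reapplied, e.g. from a LaunchAgent:

```shell
# Swap left Ctrl and left Cmd via a user key mapping.
hidutil property --set '{"UserKeyMapping":[
  {"HIDKeyboardModifierMappingSrc": 0x7000000E0,
   "HIDKeyboardModifierMappingDst": 0x7000000E3},
  {"HIDKeyboardModifierMappingSrc": 0x7000000E3,
   "HIDKeyboardModifierMappingDst": 0x7000000E0}
]}'
```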
Through several comparisons and careful word choices, it seemed clear that Apple is strongly hinting that Intel Mac users should seriously consider updating to M3, or Apple Silicon in general.
This may indicate that the next version of macOS could support Apple Silicon exclusively.
That will only happen when all the Intel Macs are deprecated and no longer receive major OS updates, which will take a couple of years (macOS Sonoma's minimum supported MBP is from 2018).
This release cadence seems too fast. I don't have insight into why, but I'd be shocked if there's this much demand. Even a 2018 Mac works fine for most information work.
Looks like they are reusing almost everything from last years model. If they have the new chip ready, it makes perfect sense to start sticking it in the MacBooks they are putting out.
Well, it's frustrating that no matter how close to launch you buy a device, it's almost immediately superseded. I got an M2 Max three months ago and it's already last gen...
I do understand my machine is still (more than) capable. But it is frustrating when the release cycle is so rapid, it's not just a desire for 'latest and greatest', it's an almost immediate loss of bang for buck unless you buy it basically immediately.
The thin gradient text they used is out of step with most modern advertising, and even their own, I wonder what kind of advertising/design philosophy they're trying to push
is this why the m2 macbook pro has such terrible pixel response times? swiping workspaces becomes a blurry mess; i've not seen displays this bad since we moved from CRT to LCD
As expected, this aged perfectly well for the M3 three-year prediction [0], and the competition has already been left in the dust before even catching up to the M1.
Now that Apple Silicon has stabilized in terms of software support, it looks like a great upgrade for those stuck on Intel laptops from 2020 or earlier.
Not really? Qualcomm just showed off benchmarks beating the M2, and all we have to go on is Apple's dubious relative graphs, which have been essentially lies before (3090). We will see how they really stack up when both are in shipping products.
It already exists? You can buy ARM Windows devices right now. I'm typing this on one. The x86/x64 emulator is pretty good but performance is limited by the slower CPUs.
We can already use the Windows x86/x64 emulator. Unless you have specifics that Snapdragon X Elite performs worse under it than current chips we can expect about 60% the performance. Rosetta 2 is about 75%. Based on Qualcomm's Geekbench 6 numbers it'll have about the same x86/x64 performance as the M1.
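The comparison above is just arithmetic on translation overhead, so it's easy to sanity-check. A back-of-envelope sketch — the native scores below are illustrative placeholders, not measured Geekbench 6 results; only the ~60% (Windows x86/x64 emulator) and ~75% (Rosetta 2) efficiency factors come from the comment:

```python
# Estimate x86/x64-under-emulation performance from native ARM scores.
def emulated_score(native_score: float, efficiency: float) -> float:
    """Estimated score when running x86/x64 code under binary translation."""
    return native_score * efficiency

# Hypothetical native single-core scores (placeholders, not benchmarks).
snapdragon_native = 2900
m1_native = 2350

snapdragon_x86 = emulated_score(snapdragon_native, 0.60)  # Windows emulator
m1_x86 = emulated_score(m1_native, 0.75)                  # Rosetta 2

print(f"Snapdragon under emulation: ~{snapdragon_x86:.0f}")
print(f"M1 under Rosetta 2:         ~{m1_x86:.0f}")
```

With these placeholder inputs the two estimates land within a few percent of each other, which is the shape of the "about the same x86/x64 performance as the M1" claim: a higher native score can be cancelled out by a less efficient translation layer.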
If you watched the video then you'd have seen most of the slides also featuring M2 comparisons. Though, not very impressive changes. Not enough for me to upgrade at least.
yeah it is indefensible to me at this point that the base chips are still limited to one external display.
I was willing to cut them some slack with the original M1 chip, thinking there might be a decent technical reason for it with their first generation of laptop/desktop chips, but three years later it is absurd
> it is indefensible to me at this point that the base chips are still limited to one external display
They’re the base chip for a reason. Why would you add a feature a small fraction of that chip’s users would use when you can use that space for something else?
> Why would you add a feature a small fraction of that chip’s users would use when you can use that space for something else?
Because it is a basic feature! I am not asking for the moon here, I have no idea why the idea that a laptop should support more than one external monitor is controversial on this site.
Not according to Apple. And, frankly, not according to me. Most laptop users will never hook up an external monitor. They don't need to. That's a fair base case.
Also the SD card slot. I bet many MacBook users who don't regularly work with photos/videos (e.g. developers) have very little use for an SD card reader. And indeed it doesn't exist on the MacBook Air. Somehow it is deemed essential on the MacBook Pro.
But at $1,599, you must be really deep into Apple's ecosystem to say 8GB and one external monitor is OK.
But at least it can use a dongle to utilize 2 or more external monitors. The point is the base M1/M2/M3 cannot use 2 external monitors. At all. No matter how many dongles you buy.
I agree that it isn't complicated, for completely different reasons though.
Supporting multiple external monitors is such a basic feature, it is absurd that apple insists on using it to differentiate products.
it's akin to making me shell out an extra $400 if I want the laptop to have a trackpad, or if I want both the left and right halves of the screen to work
It is a basic feature. I dare you to find a single laptop not made by Apple with an MSRP of $1000+ that only does a single external monitor that's currently sold. If all your competitors have it, it's table stakes.
Sure, SOME people can tolerate it or are dumb enough to reward Apple for it, but that doesn't make a basic feature premium.
Quite a lot. A laptop screen + external monitor is an awkward solution. Display size and pixel density issues make it annoying to navigate the combined desktop environment. I have very little interest in doing so, personally -- if I'm using an external monitor, I'm generally never using my laptop screen.
It's a $1,600 laptop. It's sort of excusable on a $400 one, not on a $1,600 or even a $700 one. Should $2,000 be the bar for two external monitors? The old 2016 13" MacBook Pro at $1,500 could drive two external monitors. The 2020 Intel i3 MBA, which was $999, could do two.
Something like a better screen, wireless, camera, weight reduction, ports, gaming initiative, I don’t know, something more exciting than just a spec bump.
They did give the MacBook Pros better screens and a better 1080p camera. It’s also way better at gaming with the improved performance. What ports do you want them to add?
It's just so strange to me that they can release some of the most powerful and efficient hardware we have ever seen, and show it off by telling us we can now (checks notes…) check off tasks in a widget on our desktop.
I was just reading something related to that (can't remember the text), but the gist was that improvements in hardware are not followed by software, as vendors try to limit the control the user has over that hardware.
My own impression after 4 years with Apple devices is that there is no expert mode on the Mac. It, the iPhone, and the iPad are computing appliances, something you use to get work done. They either work for you or they don't. The Mac is more flexible than the others, but the whole ecosystem is trending toward making your workflows fit the tools, not the tools fit your workflow.
You can go and find a workaround for most of the shackles, but then you're likely to break the whole system, as it was not designed to be used that way. The Mac is fragile in terms of customizability. It's not that it prevents you from running software, it's that it may break down if you do! (Not that the others are better, but at least they're trying to be more resilient.)
You can easily bypass notarization, and SIP too. By opaque apps do you mean everything that is not open source? What do you want to be able to do, “unsolder the memory from the chip”?
Transparent apps, for me, means configurable up to the option of deleting them and replacing them with something else. Instead of the tight coupling we have now in macOS, I'd like to have federated applications bound by protocols, instead of needing to update the whole system to add a single feature or fix a bug. I don't think you can delete stock apps without disabling SIP or something similar.
I don't need to unsolder the chips or do anything hardware-wise, but I'd like to be able to use the hardware without the OS if need be. That's why I say it's an appliance. It makes sense in terms of business, but it's not like users are free to use the device how they want.
> What the heck are you talking about? What types of apps do you think you can’t find alternatives for on the Mac?
It's not about finding alternatives. I did that for a lot of the Mac's defaults. It's about replacing the default option. On Ubuntu, you can remove the default login screen and use KDE's panel alongside the i3 window manager, because they're not tightly coupled together. Imagine if you could replace the macOS menu bar because it was just a program answering to some IPC protocol. Or if the window manager exposed windows and their placements so you could script a layout without the Accessibility workaround.
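To make the Ubuntu example concrete: the decoupling is just configuration, because the login screen, panel, and window manager are separate programs wired together by plain files. A minimal sketch, assuming LightDM is the display manager and an i3 session file is installed under /usr/share/xsessions (the values here are illustrative, not defaults you should expect on every system):

```ini
# /etc/lightdm/lightdm.conf — hypothetical example values
[Seat:*]
# Swap the login screen (greeter) for a different one
greeter-session=lightdm-gtk-greeter
# Start an i3 session instead of the stock desktop after login
user-session=i3
```

Nothing in that stack needs to know about the other components beyond these entries; that loose coupling is exactly what the comment is wishing macOS had.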
> And why wouldn’t you want SIP by default?
I want it, which is why it's not disabled on my computers. But I'd like to set my own snapshot of what I mean to preserve instead of Apple's.
> Do you run your Linux box using root?
I don't, but I'm always a real sudoer and have the ability to run `su root` anytime.
As I've said, it's about an expert mode: not something on by default, but something that can be enabled, with the system built to support it.
It’s more like something that would be nice to have, because experts will always want to build tools that fit them, not just take something off the shelf. It’s like the Kindle, which is perfectly fine for reading books, but some people want the perfect experience (according to them). So they jailbreak it and install KOReader, which is the expert mode of reading ebooks.
I like Apple’s hardware. And I use the software. But it’s always a convenience for me, not something that fits how my mind would like to use it.
With no disrespect to Steve, we might have to settle on a better standard of computation than one that was designed to reinforce a trillion-dollar business model.
> Imagine if you could have replace the MacOs top bar, because it is a program that just answers to some IPC protocol?
That is irrelevant in the bigger scheme of things. I grew out of that tinkering phase after I was a student running Gentoo for a few years. Eventually most of us just need to get shit done. The default top bar is fine. It works. Virtually no one gives a f*** about replacing that. You are in the 0.0000001% and Apple will never cater to that. You can install Asahi Linux on Mac and play with the login screen there.
If you want to spend your time being productive and in the zone, that’s what you get. Your workflows live in the music or movie production or graphics/3D app, etc. Macs were always about running apps and doing all your work there.
It’s like the difference between having a car for mechanics projects and racing vs actually needing to go from A to B reliably when you need it.
Apple's lack of serious long-term dedication to 3D gaming is such a self-own that it makes their 3D gaming benchmarks feel weak and thin.
I'm hopeful the Game Porting Toolkit + advances in this year's macOS (Game Mode, better support with MoltenVK and SPIRV-Cross, etc.) start a publishing shift toward macOS as a target platform, but I can absolutely see Apple taking their foot off the gaming gas pedal again.
My biggest pet peeve: Apple has decided to convey low/high temperatures differently than the standard. Every major weather agency and app advertises the daily low as the low temperature during the night following the advertised high. Apple, on the other hand, advertises the lowest temperature during that calendar day.
This is a small but crucial difference. If I want to know the coldest temperature it will reach tonight, I need to look at the hourly forecast, because it's quite possible that the low for today will be from 1am this morning (the past) while the low for tomorrow will be at 11pm tomorrow, with no advertised daily low for tonight.
At times it seems very averse to telling me about precipitation on the main screen ... EXCEPT in notifications, where it sends me double notifications about rain, snow, whatever, all the time, and it's right maybe 15% of the time.
> They really shouldn't mention the weather until they fix that app.
After your edit: what a pity! I thought I had just read a great one-liner about the climate crisis and how tech people want to solve everything by applying tech.
My point of view as well! Several months ago I purchased the M2 MBA I'm currently using, which I plan to use for the next 3-5 years. By the time I'm ready for a new machine, it's exciting to think how far not only performance but also efficiency will have increased. Perhaps we'll have reached a point where battery life is measured in days instead of hours.
Apple unveils M3, M3 Pro, and M3 Max - https://news.ycombinator.com/item?id=38078063
Apple supercharges 24‑inch iMac with new M3 chip - https://news.ycombinator.com/item?id=38078068