I guess the cooling lets them tweak the CPU clocks accordingly? Wonder if we can hack the Mac mini with water blocks and squeeze out higher clocks. The memory limitation makes it a dud, though.
Wouldn't be surprised if the cooling solution were serialised, with a check that the cooler originally programmed for that particular unit is present, like they do now with cameras and other peripherals (check the iPhone 12 teardown videos). I bet the logic would check the expected temperature for a given binning and then shut down the system if it runs too cool or too hot. Apple knows better than the users what hardware should work with the unit.
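Purely as an illustration of the kind of check I mean - every name, bin and threshold below is invented, and none of it is known Apple behaviour:

```python
# Hypothetical cooling lockout, invented for illustration only.
# Each silicon bin would ship with an expected temperature band.
EXPECTED_TEMP_RANGE_C = {
    "bin_fast": (40.0, 95.0),
    "bin_slow": (35.0, 90.0),
}

def cooling_looks_original(bin_id: str, temp_under_load_c: float) -> bool:
    low, high = EXPECTED_TEMP_RANGE_C[bin_id]
    # Too cool would suggest an unauthorised (better) cooler was fitted;
    # too hot, a missing or failed one. Either would trip the lock.
    return low <= temp_under_load_c <= high

if not cooling_looks_original("bin_fast", 30.0):
    print("halting: cooling does not match the factory profile")
```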
For a while, the fan was broken in my 2017 MacBook Pro 13". Didn't spin at all. The MacBook never complained (except when running the Apple hardware diagnostics). It didn't overheat or shut down unexpectedly. It just got a bit slower due to more thermal throttling.
I expect it would work the other way, too. Improve the cooling and performance under load would improve.
This is a video from Linus Tech Tips that demonstrates that no matter how much you cool it, they've physically prevented the chip from taking advantage of it.
And if it could be fixed with software, they would have worked out how, they're into that kinda tweaking.
Intel chips, on the other hand, are designed to work under a wide range of thermal conditions because Intel doesn't control the laptops they're put in. Apple, by contrast, can potentially get more creative with its approach to thermals because it controls the entire hardware stack.
Intel processors use Intel-designed throttling solutions... which exist to keep their own processors from overheating, because Intel has no control over the final implementation.
These new M1 laptops are the first laptops that have complete thermal solutions designed by a single company.
As an example, there is the potential to design a computer with no throttling at all if you are able to control the entire thermal design.
> As an example, there is the potential to design a computer with no throttling at all if you are able to control the entire thermal design.
This is not true. A laptop needs to work in a cold room, in a hot room, when its radiator is dusty, etc. If your CPU is not willing to throttle itself then a company with Apple's scale will have machines overheating and dying left and right.
For a computer to never _need_ to throttle, either (1) the cooling system has to be good enough to keep up with the max TDP of the CPU, or (2) you "pre-throttle" your CPU by never delivering it more power than the cooling system can handle. Apple refuses to accept solution 1, so they went with solution 2. If you watch the video I posted, it shows that even with adequate cooling, the new MacBooks will not deliver more power to the CPU. In effect, the CPU is always throttled below its limit.
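To put toy numbers on the two options (everything below is invented, and real firmware uses far more elaborate models than a single thermal resistance):

```python
# Toy steady-state thermal model: T_steady = T_ambient + P * R_thermal.
# All constants are made up for illustration.
T_AMBIENT_C = 25.0
T_MAX_C = 100.0      # junction temperature limit
R_THERMAL = 3.0      # deg C per watt; lower = better cooling
CHIP_MAX_W = 35.0    # what the silicon would draw unconstrained

def steady_temp_c(power_w: float) -> float:
    return T_AMBIENT_C + power_w * R_THERMAL

# Option 1 requires cooling good enough that full power stays under T_MAX:
print(steady_temp_c(CHIP_MAX_W))  # 130 C -> this cooler fails option 1

# Option 2 "pre-throttles": cap delivered power so the limit is unreachable,
# which means the chip never runs as fast as the silicon could.
power_cap_w = (T_MAX_C - T_AMBIENT_C) / R_THERMAL
print(power_cap_w, steady_temp_c(power_cap_w))  # 25.0 W -> exactly 100 C
```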
If Apple actually did that, Louis Rossmann would be out of a job.
No, not in the sense that the cooling lockout would make him unable to fix MacBooks - he clearly has the industry connections to get whatever tools he needs to break that lockout. Yes, in the sense that many Apple laptops have inadequate cooling. Apple has been dangerously redlining Intel chips for a while now - they even install firmware profiles designed to peg the laptop at 90°C+ under load. The last Intel MBA had a fan pointed nowhere near the heatsink, probably because they crammed a fan into the hypothetical fanless Mac they had wanted to make.
Apple trying to lock the heatsink to the board would, if anything, indicate that Apple is finally taking cooling seriously and is probably engineering less fragile hardware, at least in this one aspect.
So, essentially their new Macbook line is a glorified iPhone/iPad but with a foldable display (on a hinge)?
Not too far-fetched when you see the direction macOS is headed, UI-wise. And it sounds nice, but if it means that repairability suffers then we'll just end up with a whole wave of disposable laptops.
To be fair to Apple, people keep their MacBooks for years and years, keeping them out of landfill longer. They are well made and the design doesn't really age. Written on my 2015 MacBook Pro.
To be fair to the rest of the world, this comment is written on a 20 year old PC. It has had some component upgrades, but works like a champ after 20 years.
If you keep replacing failed/failing components or give needed upgrades to the system every few years, is it fair to call it 'working like a champ for 20 years'?
I'll take it a step further. Is it fair to even call it the same system after 20 years of changes?
Like the Ship of Theseus thought experiment, at what point does a thing no longer have sufficient continuity to its past to be called the same thing? [1]
Yeah, but it does mean it is no longer the same bike. If you replace every part of a bike, even one at a time over years, it is no longer the same bike. So it all depends on what GP means by "replacing some parts". Is it entirely new computer in a 20 year old case? Or is it a 20 year old computer with a couple sticks of RAM thrown in?
Regardless, I have a hard time believing a 20-year-old computer is "working like a champ". I've found that most people who say their <insert really old phone or computer> works perfectly have just gotten used to the slowness. Once they upgrade and try to go back for a day, they realize how wrong they were. Like how a 4k monitor looks "pretty good" to someone who uses a 1080p monitor every day, but a 1080p monitor looks like "absolute unusable garbage" to someone who uses a 4k monitor every day.
A typical "Upgradable" PC is in a box 10 times the size of the mini. If you upgrade the GPU on a PC, you toss out an older GPU because it has pretty much zero resale value. Typical Apple hardware is used for 10-15 years, often passing between multiple owners.
It's a shame we don't have charities that would take such parts and distribute them to less fortunate countries. Ten years ago, a ten-year-old graphics card was barely usable any more; these days a ten-year-old card works just fine for most tasks, except more advanced gaming.
I don't see the point. There is nothing to put it into. It's far cheaper to just ship modern CPUs with integrated graphics which will be faster and more efficient than that 10 year old GPU. The era where computer components were big enough for it to make sense for them to be discrete parts is coming to a close.
This is particularly true on the lower end where a 10 year old part is even interesting.
It's not difficult to replace RAM or an SSD with the right tools (which may be within reach of an enthusiast). The problem is that you often cannot buy spare chips, as manufacturers may only sell them to Apple, or the parts are serialised - programmed to work only with that particular board, so the unit has to be reprogrammed after the replacement. I think they started doing this after rework tools became affordable to a broader audience: you can get a trinocular microscope, a rework station and an oven for under $1,000 these days.
You can get a screwdriver (allowing you to replace RAM and SSDs in most laptops, including older macs) for $5. There's really no excuse for them to do this all the while claiming to be environmentally friendly.
Depends on the model. My 2012 mbp15r uses glue and solder, not screws. Maxed out the specs when I got it, which is why it's still usable. I would've been delighted for it to be thicker and heavier to support DIY upgrades and further improve its longevity while reducing its environmental impact, but that wasn't an option. Needed the retina screen for my work, so I bit the bullet. Someday maybe there will be a bulletproof, user-serviceable laptop form factor with a great screen, battery life and decent keyboard that can legally run macOS... glad to say my client-issued 2019 mbp16r checks most of those boxes. /ramble
Something like ATX standard but for laptop shells would be awesome - imagine being able to replace a motherboard etc, just like you can with a desktop PC.
Intel tried this more than a decade ago. The designs were as horrible as you might imagine, and a few OEMs did come out with a handful of models and parts.
As I recall, consumers didn’t care or wouldn’t live with the awful designs that they initially brought out. I don’t remember. I remember thinking I wouldn’t touch one after seeing a bunch of engineering samples.
Mmm... it's certainly better than they had before. But really they ought to be designing repairable machines. If that makes them a little slower then so be it.
Sure, but you add the option to ignore the serialization, or options to reset the IDs as part of the firmware or OS. That way the machine owner can fix it after jumping through some security hoops, rather than requiring an authorized repair store.
Mostly because it's doubtful that state-level actors (or even organized crime) wouldn't just pay off an employee somewhere to lose the reprogramming device/etc. Meaning it's only really secure against your average user.
I don't believe those reasons are more important than open access and reducing the environmental impact of planned obsolescence, outside of the kind of government agencies that are exempt from consumer electronics regulations anyway.
Surely there is a better (and I'd bet, more effective) way to handle environmental regulations than mandating specific engineering design patterns within the legal code.
Perhaps instead, it might be a better idea to directly regulate the actions which cause the environmental impact? i.e. the disposal of those items themselves?
Engineers tend to get frustrated with laws that micromanage specific design choices, because engineering practices change over time. Many of the laws that attempt to do so, backfire with unintended consequences.
It is quite possible that your solution might be just that -- many industries with high security needs are already very concerned with hardware tampering. A common current solution for this is "burner" hardware. It is not uncommon for the Fortune 500 to give employees laptops that are used for a single trip to China, and then thrown away. Tech that can give the user assurance that the device hasn't been compromised decreases the chance that these devices will be disposed of.
As a side note, I don't think serialized components are even one of the top 25 factors that do (or would) contribute to unnecessary electronics disposal.
I think resetting instead of bricking doesn't compromise security, but it saves a burner laptop from ending up in landfill. I get your point, but I think a company would have to demonstrate that e.g. serialising meets a particular business need distinct from planned obsolescence. That could be part of the certification processes products have to go through before being marketed.
In practice, such a law could resemble right-to-repair bills like the one recently passed in Massachusetts, which requires auto manufacturers to give independent repair stores access to all the tools they themselves use. A bill like this for consumer electronics could practically ban serialized components, even without mentioning them explicitly.
Why beat around the bush? If the function of the extra tax is to stop producers from implementing planned obsolescence, then why not just stop them directly and require, as part of the certifications products need to go through, that components are not serialised, etc.? If you add a tax, all you do is restrict access to such products for people with lower incomes.
The point is to push the market in the correct^Wdesired direction without outright banning anything. Non-serialized parts would be cheaper, hence more accessible. There are use cases where serialized parts are desirable (e.g. if I don't want somebody swapping my CPU for a compromised part).
Normally I prefer nudges to bans, but I'm not sure they work on giant monopolies. Unless the tax were high enough to have no chance of passing, Apple would dodge it or write it off as cheaper than being consumer-friendly.
> So, essentially their new Macbook line is a glorified iPhone/iPad but with a foldable display (on a hinge)?
This isn't something new. Since day 1, the iPhone has always been a tiny computer running a forked version of OS X.
> but if it means that repairability suffers then we'll just end up with a whole wave of disposable laptops.
Laptops have been largely "disposable" for some time. In the case of the Mac, that generally means the laptop lasts for 10-15 years unless there is some catastrophic issue. Generally after that long, when a failure happens, even a moderate repair bill is likely to trigger a new purchase.
I had a quad-core Mini with 16GB in 2011. Almost 10 years later we should be much further along, especially as the Intel Mini allows up to 64GB (which you would probably only use if you upgraded the memory yourself).
Bleeding-edge-clocked DRAM is a lot more costly per GB to produce than middle-of-the-pack-fast DRAM. (Which is weird, given that process shrinks should make things cheaper; but there's a DRAM cartel, so maybe they've been lazy about process shrinks.)
Apparently DRAM and NAND do not shrink as well because in addition to transistors in both cases you need to store some kind of charge in a way that is measurable later on - and the less material present, the less charge you are able to store, and the harder it is to measure.
> The M1's memory is LPDDR4X-4266 or LPDDR5-5500 (depending on the model, I guess?) which is about double the frequency of the memory in the Intel Macs.
That's a high frequency, but having two LPDDR chips means at most 64 bits are being transmitted at a time, right? Intel Macs (at least the one I checked), along with most x86 laptops and desktops, transfer 128 bits at a time.
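The peak-bandwidth arithmetic makes the stakes concrete (MT/s figures from the quote above; both bus widths shown, since that's exactly the open question):

```python
# Peak theoretical bandwidth = transfers/sec x bytes per transfer.
def peak_bandwidth_gb_s(mega_transfers_per_s: int, bus_width_bits: int) -> float:
    return mega_transfers_per_s * 1e6 * (bus_width_bits / 8) / 1e9

for mt_s in (4266, 5500):
    for width_bits in (64, 128):
        gbs = peak_bandwidth_gb_s(mt_s, width_bits)
        print(f"{mt_s} MT/s on a {width_bits}-bit bus: {gbs:.1f} GB/s")
# 4266 MT/s: ~34 GB/s at 64-bit vs ~68 GB/s at 128-bit
# 5500 MT/s: ~44 GB/s at 64-bit vs ~88 GB/s at 128-bit
```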
> Apparently, this alone seems to account for a lot of the M1's perf wins — see e.g. the explanation under "Geekbench, Single-Core" here
That's a vague and general statement that site always says, so I wouldn't put much stock into it.
I missed that, I assumed virtualisation was dependent on Intel VT.
Then again I would have expected them to have discussed it as much as the video editing.
I am guessing that they'd need an M2-type chipset for accessing more RAM for that. Or maybe they've got a new way to do virtualisation, since that is such a key thing these days.
Edit: thanks for pointing that out though, that’s why I mentioned it
How well this fits in with current virtualisation would be interesting to find out; I guess this will be for a later version of Big Sur, with a new beefier M2 chip.
Are they virtualizing x86, though? Having Docker run arm64 on laptops and x86 on servers completely nullifies the main use case of Docker, imo.
The intel Mac Mini is still available with the same 8GB in its base model, but configurable up to 16/32/64. RAM is definitely the biggest weakness of these new Macs.
On iOS they can get away with less RAM than the rest of the market by killing apps, relaunching them fast, and having severely restricted background processes. On Mac they won't have that luxury. At least they have fast SSDs to help with big pagefiles.
With the heterogeneous memory, your 8GB computer doesn't even have its whole 8GB of main system memory.
When the touchbar MBP launched in 2016 people were already complaining that it couldn't spec up to 32GB like the competition. Four years later, and it's still capped at 16GB.
Hopefully they can grow this for next year's models.
And the Intel Mac Mini had user-replaceable RAM. Tired of fan noise and slow response, I went from a 4 Thunderbolt 2018 MacBook Pro with only 8GB of RAM to a 2018 Mac Mini with 32GB of RAM (originally 8GB, bought the RAM from Amazon and upgraded it).
It doesn't make sense for the system not to 'grab' a big chunk of your RAM. That is what it is there for. You want stuff to be preloaded into RAM so you can access it quickly if needed. You only want to leave some of it free so that if you launch a new application it has breathing room.
For example Chrome will scale the amount of RAM it reserves based on how much you have available.
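Roughly this idea, sketched with the real psutil library (the 25% policy is made up; Chrome's actual heuristics are more involved):

```python
import psutil

# Size an in-process cache off the RAM that is available right now,
# the way a browser might. The fraction is an invented policy.
def cache_budget_bytes(fraction: float = 0.25) -> int:
    available = psutil.virtual_memory().available  # bytes usable without swapping
    return int(available * fraction)

print(f"cache budget: {cache_budget_bytes() / 2**30:.2f} GiB")
```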
> It doesn't make sense for the system not to 'grab' a big chunk of your RAM. That is what it is there for. You want stuff to be preloaded into RAM so you can access it quickly if needed. You only want to leave some of it free so that if you launch a new application it has breathing room.
Cache is excluded from just about any tool that shows RAM use, at least on desktops. If the RAM shows as in use, the default assumption should be that it's in active use and/or wasted, not cache/preloading.
> For example Chrome will scale the amount of RAM it reserves based on how much you have available.
Which features are you thinking of that reserve RAM, specifically? The only thing I can think of offhand that looks at your system memory is tab killing, and that feature is very bad at letting go of memory until it's already causing problems.
I have chrome, Firefox, photoshop, vs code, docker and a few other things running. As a kid I had to manage RAM. As an adult, I buy enough RAM to not need to think about it.
I was committed to buying an M1 on day one. I won’t buy a machine with only 16gb of RAM.
Another note on the Mini and MacBook Pro (in higher-end SKUs) - these both used to have four USB-C ports, and now have only two. The Mini at least keeps its pair of USB-A ports, but on the MBP you're back on the road to dongle-hub-land.
I'm betting this is due to Thunderbolt controller and PCIe lane capacity. They couldn't do four Thunderbolt ports with the M1 SoC, so they dropped the ports. Having four USB-C ports but only two supporting Thunderbolt would be a more obvious step back from the previous MacBook Pro. This way people can just blame it on Apple doing Apple things, instead of seeing a technical limitation.
Yes, based on my experience on a Mac, I would not buy any Mac with less than 32GB of RAM (I personally use 64GB and it's so much nicer)...
Yes, it seems crazy, and yes, it's a lot of RAM, but I like to be able to run VMs locally and not have to boot up instances on AWS (insert provider of choice), I like to keep tabs open in my browsers, I like not having to close apps while I'm working, and I like my computer to be snappy. 64GB allows all that; 16GB doesn't, and 32GB barely does.
Having read a bit more about the new M1, I really think it is designed and specced for the new Air. The RAM is on the package, which makes it very performant, and 16GB is a reasonable limit for an Air-like computer. The Mini got cheaper and more powerful, so it is not a bad trade-off. I strongly assume that there will be variations/successors to the M1 which will support more memory and also more IO (more USB-4 ports, more screens).
From their schematic, the two DRAM modules sit directly on the SoC package - possibly to improve bandwidth etc. So it looks like this cannot be upgraded or replaced. That said, it might be worth looking past the specs and just using your applications on these machines to see how they perform. SSD storage is much faster these days, and if the new OS has decently optimized paging, performance will be decent as well.
That is fine with the Air. But for a small desktop computer not to support more than 16GB in 2021? Its predecessor allowed up to 64GB (and possibly more with suitable modules).
You can get an external M.2 USB 3.1 Gen 2 (10Gbps) enclosure plus a 1TB M.2 SSD for $130, and a 2TB for $320. That makes the 16GB/256GB Mac Mini a decent buy at $970, imo.
For the mini sure, but it's a massive pain having an external drive for a laptop. I use one all the time and as well as being waaaaay slower even with a good drive, I lose it all the time.
Yeah, it's not feasible at all to use external storage on a two-port laptop. Dongles that let you plug in power and a monitor are still just not reliable enough for storage; the only reliable connection I can get on my two-port MBP is with a dedicated Apple USB-C to A adapter.
Shocked they're still selling the two-port machine; it's been nothing but hassle for me as someone who has to use one.
There are two lines of 13" MacBook Pro, the two-port and four-port versions. The two-port always lagged behind the four-port, with older CPUs, less RAM, etc. The four-port (which has not yet been replaced) is configurable to 32GB of RAM.
Web developers and photographers are the opposite of 'prosumers', kind of by definition. Plus, think of the size of a full res photo coming out of a high-end phone, never mind a DSLR.
Most of the professional photographers that I work with have PC workstations with 64gb to 256gb of RAM. Retouching a 48MP HDR file in Photoshop needs roughly 800MB of RAM per layer and per undo step.
Old undo steps could be dumped to SSD pretty easily.
And while I understand that many people are stuck on photoshop, I bet it would be easy to beat 800MB by a whole lot. But so I can grasp the situation better, how many non-adjustment layers do those professional photographer use? And of those layers, how many have pixel data that covers more than 10% of the image?
From what I've seen, quite a lot of layers are effectively copies of the original image with global processing applied, e.g. different color temperature, blur, bloom, flare, hdr tone mapping, high-pass filter, local contrast equalization. And then those layers are being blended together using opacity masks.
For a model photo shoot retouch, you'd usually have copy layers with fine skin details (to be overlaid on top) and below that you have layers with more rough skin texture which you blur.
Also, quite a lot of them have rim lighting painted on by using a copy of the image with remapped colors.
Then there's fake bokeh, local glow for warmth, liquify, etc.
So I would assume that the final file has 10 layers, all of which are roughly 8000x6000px, stored in RGB as float (cause you need negative values) and blended together with alpha masks. And I'd estimate that the average layer affects 80%+ of all pixels. So you effectively need to keep all of that in memory, because once you modify one of the lower layers (e.g. blur a wrinkle out of the skin) you'll need all the higher layers for compositing the final visible pixel value.
Huh, so a lot of data that could be stored in a compact way but probably won't be for various reasons.
Still, an 8k by 6k layer with 16 bit floats (which are plenty), stored in full, is less than 400MB. You can fit at least eleven into 4GB of memory.
I'll easily believe that those huge amounts of RAM make things go more smoothly, but it's probably more of a "photoshop doesn't try very hard to optimize memory use" problem than something inherent to photo editing.
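Putting numbers on both estimates (assuming 8000x6000 px per layer and RGB plus an alpha mask, per the thread above):

```python
# Memory per layer = width x height x channels x bytes per value.
WIDTH, HEIGHT = 8000, 6000
CHANNELS = 4  # RGB + alpha mask

def layer_bytes(bytes_per_value: int) -> int:
    return WIDTH * HEIGHT * CHANNELS * bytes_per_value

print(f"float32 layer: {layer_bytes(4) / 1e6:.0f} MB")  # 768 MB, near the ~800 MB figure
print(f"float16 layer: {layer_bytes(2) / 1e6:.0f} MB")  # 384 MB, under the 400 MB figure
print(f"10 float32 layers: {10 * layer_bytes(4) / 1e9:.1f} GB")  # ~7.7 GB before undo history
```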
So why are you blaming the end user for needing more hardware than you'd prefer, when some third-party software vendor they are beholden to makes inefficient software?
Also, your "could be stored in a compact way" is meaningless. Unless your name is Richard and you've designed middle out compression, we are where we are as end users. I'd be happy if someone with your genius insights into editing of photo/video data would go to work for Adobe and revolutionize the way computers handle all of that data. Clearly, they have been at this too long and cannot learn a new trick. Better yet, form your own startup and compete directly with the behemoth that Adobe is and unburden all of us that are suffering life with monthly rental software with underspec'd hardware. Please, we're begging.
> Also, your "could be stored in a compact way" is meaningless. [...]
That's getting way too personal. What the heck?
I'm not suggesting anything complex, either. If someone copies a layer 5 times and applies a low-cpu-cost filter to each copy, you don't have to store the result, just the original data and the filter parameters. You might be able to get something like this already, but it doesn't happen automatically. There are valid tradeoffs in simplicity vs. speed vs. memory.
"Could be done differently" is not me insulting everyone that doesn't do it that way!
I should wait for a 64GB option. I've already got 16GB in all my older laptops, so when buying a new gadget the RAM and SSD should have better specs (in many usage scenarios you notice more RAM more than more cores).
It was surprising to see essentially the same form factor and the same operating system, with not much to distinguish the three machines presented (lots of repetition like "faster compiles with Xcode").
BTW, what's the size and weight of the new Air compared to the MacBook (which I liked, but which was killed before I could get one)?
Seeing two machines that are nearly identical reminds me of countries with two mainstream political parties - neither clearly communicates what its USP is...
Apple's solution for upgradability for their corporate customers, is their leasing program. Rather than swapping parts in the Mac, you swap the Mac itself for a more-powerful model when needed — without having to buy/sell anything.
Apple doesn't care about your upgradability concerns on the notebook lineup. Once you get past that, it has traditionally done fairly well at covering a wide spectrum of users from the fanless MacBook to the high-powered MacBook Pros.
I have a late-2013 13" MBP with 16GB of memory. Seven years later I would expect a 13" MBP to support at least 32GB. I can get 13" Windows laptops that support 32GB of memory. The Mini is a regression, from 64GB to 16GB of memory. The only computer worth a damn is the new MBA.
Pretty sure my 2014-ish 13-inch MBP with 16GB and 512GB of storage cost me around £1200; today, speccing an M1 13-inch MBP to the same six-year-old specs would cost almost £2000.
They already disappeared, I switched to Windows in 2019.
I use MacStadium for compiling and testing iOS apps. I was wondering if the ARM machines would be worth a look, but they are disappointing. If I was still using Macs as my daily driver, I would buy the new MBA for a personal machine.
The memory is on the package, not way out somewhere on the logic board. This will increase speed quite a bit, but it limits the physical size of the memory modules, and thus the amount. I think they worked themselves into a corner here until the 16", which has a discrete GPU and a reconfigured package.
That's fair, but if they chose fast yet expensive and unexpandable technology, the choice may turn out to be a mistake from some perspectives. I think most people who buy a Mini would prefer more RAM capacity over a faster iGPU.
Can you actually link to a product, not a search? Because none of the items coming up there are DDR5-5500; they're all DDR4-3600 or worse, as far as I can see.
I went to Apple's website right after I finished watching the keynote with the intention of buying a new Mac mini ... the lack of memory options above 16GB stopped that idea dead in its tracks though.
Also no 10G networking option. The combination of those feature exclusions makes it a dud for me; I don't want a $150 TB3 adapter hanging off the back, not when the previous gen had it built in.
I bet “pros” never bought it and it’s only been viable as a basic desktop. Probably nobody ordered the 10 gigabit upgrade.
I bet they’re only upgrading it because it was essentially free. They already developed it for the developer transition kit.
I commend the idea of a small enthusiast mini desktop like a NUC, but I don't think the market is really there - or if it is, it isn't interested in a Mac.
I think it is notable the new mini’s colour is light silver, rather than the previous dark ‘pro’ silver. Presumably there will be another model a year from now.
It's not normally paging but thermal throttling, which involves the machine appearing to 'spin' while it's actually just the kernel keeping the cycles to itself - this typically gives you beachballs as a side effect.
And one tip: use the right-hand USB-C ports for charging, not the left-hand ones, as for some reason they tend to make the machine heat up more...
The right-hand ones are the only ones that can drive external monitors (on mine, anyway). I feel like I'm the only one who has this - I had an MBP 2019 (first batch) and I thought I'd read that one side was different from the other re: power. Power works on both sides, but monitors won't run from the left USB-C ports, and it's not documented anywhere. :/