Been extremely happy with mine the past couple months. The little modular port attachments seemed like a novelty at first, but now it feels absurd that you'd buy a laptop with a bunch of "hardcoded" ports that you can't ever change.
The only real Linux-related quirk I've run into so far is that you have to disable panel self-refresh (it's on by default and causes stuttering). Other than that tiny thing, I pretty much just installed my stuff and started using it.
One little anecdote: I got a card in the mail from Framework saying that there was a problem with the cable for the touchpad, and it had instructions on how to fix it. Contrast that to my experience with Apple where they would delete forum threads for laptop problems and spend years denying issues until legal action forced them to acknowledge it.
Anyway, I'm a fan. I'm really looking forward to when the marketplace opens up with some new parts. I really want my blank keyboard. I'm hoping 2021 will be the year I can own a laptop without a god damn windows logo emblazoned on the keys.
Maybe I need some convincing here. How is the "modular port" concept any different from a universal port with dongles (i.e., how Macs have been since 2016)? To me, the fact that the port attachments are recessed is little more than a gimmick. Especially as all of my devices have transitioned to USB-C anyway, dongles/"modular ports" feel more like a stopgap measure than something that warrants a permanent form-factor change.
I find it hilarious that we spend multiple thousands of dollars on sleek, elegant hardware and then hook up chunky plastic dongles to overcome their bad hardware interfaces.
So I love the idea of these ports (agreed, they're basically "recessed dongles").
I couldn't lose them / forget them. They wouldn't take up space in my bag while I'm traveling. I could "set and forget" them to perfectly match whatever desktop / docking setup I'm using. In five years when my wireless VR system uses some as-yet-unknown hardware interface, I can swap a single component out to support it. Seems like brilliant design to me.
The crazy thing here is that it’s not so hard to hit the standard set of “pro” ports —
USB-C x 4 (new standard blabla)
USB-A x 2
SD x 1
HDMI x 1
Phone/Mic
I applaud the modular approach but Apple’s donglevision was the pure distillation of user-hostility between the Bean-Counter in Chief and the SVP, Thin Stuff.
And all the industrial sheep who followed them. May we all recover…
IMHO it’s a shame that Lightning didn’t become the standard connector instead of USB-C. There’s a reason I say that: socket fragility.
The socket is the most expensive part of the connection and when it breaks, it’s bad news. If you’re lucky and the bean counters didn’t overrule engineering over a microcent saving, the socket is on a daughter card otherwise you’re stuck with a one-port-down device, or an expensive motherboard replacement.
Instead, the Lightning connector is as simple as it gets; the worst issue is pocket lint, which you can easily remove with a toothpick.
I'd disagree with that one: with Lightning the pins are in the port, not the cable. I've broken my iPhone before trying to get lint out of the charging port and bending the pins by accident.
USB-C, on the other hand, has all the pins cable-side, so there isn't much to worry about when ramming whatever you fancy into your phone or laptop, since the device side is just a PCB tongue with pads on it rather than anything you can bend.
In my experience a sewing pin is just about the perfect thickness to get into an USB-C port. If you're even halfway careful you can dig out all the lint without damaging anything.
I do have to say that USB-C seems to be much more lint-prone than micro- or mini-USB. I have never needed to dig out any lint on my previous phones, but have had to do so a fair few times on my latest phone.
Thanks for sharing your anecdote. Mom & Dad didn't give me a call about your phones not charging yet, but knowing that it's doable is a relief.
I don't remember digging lint out of any previous USB generations either. Only a couple of USB-A connectors, which were integrated to some smaller MP3 players (yes, I remember them!).
The receptacle housing generally prevents you from bending the connector tongue, and even if you manage to do it somehow, AFAIK it's almost never FR4 in the receptacle (although USB-C receptacles printed directly on 0.8mm PCBs work great!). It'll generally be some sort of injection molded thermoplastic that's fairly flexible, so even if you manage to bend it, it'll spring back.
As for pins vs pads, you can make pads almost arbitrarily more durable by increasing the gold plating thickness, whereas it's really hard to make pins not bend.
I disagree. In my experience the lightning port/connector is the main pain point of the iphone. Even if you clean it out with a toothpick, it stops charging well (you have to fiddle with it and even hold it in a specific position to get it to charge), the cord falls out easily, etc. I truly hate the lightning connector. I'm not sure if USB-C will be better, but it certainly can't be worse.
Apple can figure out how to put USB-C on a phone, but perhaps does not because Lightning's connector has a preferable RF sensitivity profile.
Not sure if it's measured as significant, but it's the sort of thing that RF engineering concerns itself with (preventing anything from detuning antennas or otherwise raising the noise floor).
> perhaps does not because lightning's connector has preferable RF sensitivity profile
Given that literally every other smartphone on the market has a USB-C port, i'd say this is not the reason why Apple used a non-standard connector, failing (voluntarily) to comply with European interoperability laws/standards.
FYI, the interoperability standards passed recently cover charging: every device has to charge via a USB-C port, and that’s why Apple switched.
Apple are slowly migrating to solutions that feature USB-C at some level, even if they keep Lightning on the phone itself. That's because EU authorities have signalled that they know Apple are taking the mickey, and will continue making more stringent rules until Apple play ball. E.g. https://www.bbc.co.uk/news/technology-58665809
Micro-USB was the standard back then. All phone manufacturers had custom proprietary (though mostly jack-based) connectors, but due to European regulations they all switched to micro-USB to comply with standards... all except Apple, of course, who went another way.
Apple was not exactly unaware of these developments, as they have repeatedly signed the memorandums of understanding surrounding charger interoperability, according to wikipedia.
For what it's worth, Lightning is dramatically better than Micro-USB ever was, and IMHO it's still better than USB-C in terms of form factor (it's thinner, allowing for thinner ports, so thinner devices) but not compatibility (I have to have a bunch of C-to-lightning cables and a bunch of C-to-C cables).
> but perhaps does not because lightning's connector has preferable RF sensitivity profile
... or because Lightning has a commercial "preferable profile" - it's another form of lock-in, and at the hardware level no less; such an extremely desirable feature, from a commercial perspective, is very, very hard to give up. It would open the door to a world where phone accessories are effectively universal, and surely we can't have that.
To your point, Apple also earned money from a 2+ year head start on shipping thin-connector phones: Lightning shipped in Sep 2012, whereas USB-C only reached a stable design in Aug 2014, and would have taken even longer to ship.
Didn't Lightning ship (Sep 2012) before usb-c was standardized (Aug 2014)?
So I have no idea about the RF engineering comparison, but Apple seems to have stopped waiting for consensus on what USB-C would be, wanting to ship something thin before Samsung and Google and LG could even design something thin, by almost 2 years.
I wonder if some research suggests a port change would stall phone upgrades among enough users of earlier models, and that phone upgrades are lucrative.
In itself, the Lightning connector is apparently better than USB-C. But just look at the market of USB-C devices (and USB-A, which adapts to USB-C easily), compare it to that of Lightning, and it becomes obvious which one is better for you, and why Apple wants the other.
I think one reason why lightning works great is because it is made by Apple. Apple is expensive, and therefore, it can afford to spend the couple of extra cents needed to manufacture a good connector and install it properly.
USB has to go on devices where the port already represents a sizable fraction of the cost, and there is a race to the bottom to see who will produce the least expensive parts, and of course, it is shit. If Lightning were the standard, we would probably see a lot more failures, simply because not everyone has the same quality requirements as Apple.
Of course, USB doesn't have to be terrible, but when you compare USB to Lightning, you compare a mixed bag of good and bad parts to only good parts. To be fair, you should only compare USB implementations from reputable, expensive brands against Lightning.
Changing a connector is not trivial if you have hundreds of millions of customers and a massive ecosystem of third-party vendors, each with their own roadmap.
Of course, Apple could pull it off if they really wanted. They’ve done it with iPads. But please don’t frame it as customer trolling. We’re better than that here.
Before USB-C, the universal, international standard was micro-USB, and every phone manufacturer (at least in Europe) was bound by law to implement it. Apple has changed its connectors since socket interoperability became effective, and they could have adopted the standard. They just purposefully ignored the consumer-respecting standards in order to keep their $40-connector business flowing.
According to European regulations, Apple's actions are strictly illegal, but if any law enforcement actually cared to protect people from wealthy corporations, we probably wouldn't have any climate change, tax evasion, planned obsolescence, science-denial smoking ads, corporate land grabs, companies stealing water supplies from local populations... As always, laws that protect the weak from the powerful are betrayed, while laws that protect the powerful from the weak are strongly enforced.
> Before USB-C, the universal, international standard was micro-USB and every phone manufacturer (at least in Europe) was bound by law to implement it.
That is false. The EUC program was about PSUs, not device ports, and Apple was compliant by providing PSUs with detachable cables. Furthermore the EUC never legislated on the subject, they considered that the voluntary covenant worked well enough and no legislation was necessary.
> Apple has changed its connectors since socket interoperability became effective and they could have adopted the standard.
They were already complying and the “standard” at the time (micro-usb) was bad, not using it was a good thing.
> They just purposefully ignored the consumer-respecting standards in order to keep their 40$-connector business flowing.
At this point you’re just outright lying.
> According to European regulations, Apple's actions are strictly illegal
You are, and I want to make it clear that this is an objective affirmation, high as a kite.
> The EUC program was about PSUs, not device ports
Are we talking about the same thing? You seem to reference this memorandum of understanding [0] promoted by the European Commission (and signed by Apple), whereas i reference further developments such as this vote [1] which was widely advertised in the press at the time.
I am unaware whether that vote was actually turned into a regulation, but i am fully aware that the European Commission is not the entity deciding on regulations in the EU (although it has way too much power to override the EU parliament).
> the “standard” at the time (micro-usb) was bad, not using it was a good thing
OK, micro-USB was not the best. Still much better than custom proprietary connectors overall. Just look at how much money and how many resources were saved by reusing existing cables: do you remember the hot mess of the early 2000s, when a phone charger broke and you had to hunt down a compatible spare?! Now i can't remember the last time i had to buy a phone charger, because there's an abundance of standard cables. It's a net win for me and my wallet, and a net win for the environment.
Also, not going with a standard you deem bad is fine... if you're working to either improve the standard or replace it with another one. Which Apple never did, as they were happy to have their custom hardware which their fanatic customers would buy no matter the price.
> At this point you’re just outright lying.
I may be misinformed on specifics, but i'm for sure not lying. If you're implying that Apple (or any multinational corporation for that matter) acts in good faith, you have some research to do on how industrial capitalism operates and its actual consequences on people.
>> According to European regulations, Apple's actions are strictly illegal
> You are, and I want to make it clear that this is an objective affirmation, high as a kite.
OK, i'm high as a kite, maybe? Does that make my message wrong on every aspect? Apple has been known to break, and been condemned for breaking, many european regulations already [2] [3] [4] [5], often engaging in actions they knew were illegal. I'm not a lawyer, so i can't comment on the technical legality of their Lightning connectors, but as a european citizen i can for sure say that they knowingly and willingly violated the spirit of the law to further their profit.
And as a pseudonymous person on a random orange forum, i can say you should take more time to correct facts with actual sources, instead of defending evil corporations while accusing your peers of lying.
Let's be charitable here: the Lightning connector appears to be more durable than USB-C, at least on the device side. There's no protrusion, whereas with USB-C the contacts are on a very thin tongue that can be damaged if something small enough manages to get inside the connector.
I'm not aware of such issues, but i'm personally still running micro-USB devices only so i have zero clue. Let me know if you have links/resources on this issue.
However, i'm fully aware these were not the arguments presented by Apple when they refused the USB standards. If Apple cared for durability, which they definitely don't [0], i'm sure a lot of people would appreciate that and maybe standards could be improved across the industry.
The fact that Apple never cared for any form of standard that i know of [1] does not give them a lot of credit.
[0] They pioneered making it very hard to replace your own battery and flipped the finger on everyone by using non-standard screws on purpose. Seriously, how can it be legal to sell a product which requires any form of tooling to change a battery?! Let's not even get started on software obsolescence on iOS/macOS...
[1] USB and VGA, sure, because they were forced on them. Maybe FireWire? But even then i'm not sure it was a standard back when Apple started using it... On the software side, apart from email, DNS and WWW clients they also don't respect any standard protocols: AirPlay, iCloud, etc.
> I follow smartphone world quite closely and have never heard/read about USB-C issues.
This isn't an effective point as the "smartphone world" is plagued by ephemeral devices which are either susceptible to programmed obsolescence or are caught in an upgrade treadmill due to a myriad of reasons (non-replaceable batteries failing, screen problems, camera issues, hardware failing due to wear, blocked software updates, fads, etc..)
> Changing a connector is not trivial if you have hundreds of millions of customers and a massive ecosystem of third-party vendors, each with their own roadmap.
And yet not only has Apple already done exactly that for iPhones (specifically: migrating from the iPod connector to Lightning), but so has virtually every Android vendor done exactly that for Android devices (specifically: migrating from USB micro-B to USB C).
> And yet not only has Apple already done exactly that for iPhones (specifically: migrating from the iPod connector to Lightning)
That’s the point though is it not? Apple was just out of a connector switch, which required users to throw out all their old accessories and get new ones. They were not going to do that again within just a few years.
> so has virtually every Android vendor done exactly that for Android devices (specifically: migrating from USB micro-B to USB C).
Historically, Android had nowhere near the accessories ecosystem of Apple.
I believe that is why Apple is starting the switchover to USB-C, with the new iPad using USB-C:
* the dock connector lived for about 10 years, we’re approaching the 10th year of Lightning, that’s a pretty good lifecycle for a connector
* the universality of USB-C amongst Android manufacturers means there now is a large ecosystem of accessories and Apple won’t have to rebuild their ecosystem from scratch
I wouldn’t be surprised if the iPad was basically a warning shot, and Apple switched the rest of their mobile devices over to USB-C with the 2022 releases.
> I feel that only confirms my point. The switch happened ten years ago, yet many people are still being mad at Apple today over it.
It disproves your point from multiple directions:
1. It demonstrates that Apple has no qualms about abandoning proprietary connectors and leaving an entire connector ecosystem stranded overnight.
2. It demonstrates that said connector ecosystem has no qualms about adapting to a new proprietary connector - let alone a standardized one.
And no, I know of precisely zero people upset about switching away from the iPod connector. The only thing anyone is upset about is the fact that Apple chose a different proprietary connector instead of using that opportunity to standardize.
> I think I’m failing to see your point. None of those vendors has any amount of control over the USB accessory ecosystem, or do they?
The bigger players absolutely do manufacture their own accessories, but that's secondary to my point: that the accessory market readily adapted to phone manufacturers switching connectors on its own. Apple, if anything, would have an easier time for the exact reason you indicate: Apple has control over the Apple accessory ecosystem, and can use that control to put additional pressure on accessory makers.
What do people think they're saying by saying this? No, this has literally happened here by one of "us" so "we" are clearly not better than this. Heck, "we" have done and are continuously doing far worse than this.
In case it’s not entirely clear: I was referring to HN guidelines.
Accusing others of acting in bad faith is never helpful. I will continue to remind others of the rules, no matter how often they’ve been broken in the past.
Cool. Yet this behavior (and worse) is widespread and not being punished by moderators except for the most blatantly obnoxious cases. Guidelines are only relevant to the extent that they are enforced, otherwise they're just a wish list, not a code of conduct.
Speaking of good faith, a good faith reading of their comment would be that they think Apple is intentionally maintaining a non-standard connector for their smartphone range despite knowing that switching to USB-C would be beneficial to their users.
Alleging that Apple acts in bad faith hardly seems like a violation of HN guidelines. If anything, the claim that they put the needs of the users first would seem the preposterous one as one would expect them to be beholden to their shareholders (and thus profit) above all, not their customers, and there are plenty of reasons why maintaining their own connector might be more profitable.
Accusing people of acting in bad faith is impolite. Accusing companies of acting in bad faith when there is ample evidence of their wrongdoing is one's duty as a consumer.
No Ethernet? WiFi is nice and all, but when I get a docker-compose project that decides to pull down the internet, I really love the fact that I'm on a gigabit network.
This is where it goes wrong. Everyone thinks their particular favourite port is a 'pro' essential, and we end up with Homer-cars with a thousand ports. Just use USB-C. Almost everything can go through USB-C.
But not everything can go through the USB-C cable you have on hand.
That's the annoying bit with USB-C. We may have (almost) standardized on a single plug/socket shape, but we didn't escape the essential complexity - the fact that one type of connection cannot handle all the use cases we'd like it to. We just pushed that complexity into cables. Instead of having to deal with separate data, network and graphics ports, users now have to deal with potentially separate data, network, graphics and charging cables. I'm not convinced this is an improvement, because USB-C cables are a bottom-feeder market that will not hesitate to outright scam the buyer.
At this point I'm not sure it's an improvement. I feel like the optimum point would be a small amount of standards targeting mutually incompatible applications. That, or forcing some specification requirements on USB-C, and standardize some capability labels.
I hope that this is what USB4 will bring, since iiuc, USB4 is basically the IF's name for Thunderbolt-4-capable USB-C. This was enabled by Intel contributing the TB4 spec to the committee, in a shockingly benevolent move that I guess may have been the greatest internal political feat Intel staff pulled off in the last decade.
Edit: Oh and presumably the ports on the Framework are USB4, they just can't say that yet because the certification is still in the works.
Ah it seems I was slightly off, it's TB3 not TB4, "The USB4 specification is based on the Thunderbolt 3 protocol specification." [0] But it does require: USB-PD, PCIe & DP tunneling, minimum 20 Gbit speed, max 40 Gbit speed. Stated goals to "minimize end-user confusion".
I've seen some peripherals and such with it, no laptops yet though. The spec was released in 2019, so considering hardware cycle time we should start to see more devices soon. It's pretty cool that Framework will likely be on the leading edge of that wave.
Is Dell still gimping them to 10 Gbit/s like they used to? I considered a 13" XPS for years, but then turned to Apple because of this ridiculous decision.
I do not want all my USB-C cables to be able to handle 90W. That would make them very thick and expensive.
I do not want all my USB-C cables to support the maximum 40 Gbps speed (or whatever it is). That would require them to have all 19 wires plus shielding and all, and again, it would make them expensive and short.
And just imagine how much a 90W maximum speed 3 meter cable would cost...
I prefer having one power cable, one fast cable and then a bunch of disposable cables for general use cases.
I prefer all my cables with the same heads to be exactly the same. Who thought it was a good idea to make them different? As if someone buying the cable will know the difference.
But then you lose the flexibility of using one port in many different ways. What we need is some standard color coding or other clear visual indicator on cables to reflect their capabilities.
I've opined the same before. Just put standard-colored rings (with textures, if we want to be sight-flexible) on the cables, when they're shipped from the factory.
Standardize the colors through the IF, and bam, you can tell at a glance what a cable is capable of.
Like resistors, except I don't think cables are likely to shrink too much in the future.
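Something like this toy mapping, maybe (the colors and tiers below are completely made up for illustration; the USB-IF defines no such scheme today):

```python
# Purely hypothetical sketch of the ring-color idea above: encode a
# cable's capabilities as a pair of standardized rings, the way
# resistor bands encode value. None of these colors are standardized.

DATA_RINGS = {480e6: "white", 10e9: "blue", 40e9: "red"}   # bits/s -> ring color
POWER_RINGS = {60: "green", 100: "orange", 240: "black"}   # watts  -> ring color

def rings(data_bps, max_watts):
    """Return the (data, power) ring colors for a cable's rated capabilities."""
    return DATA_RINGS[data_bps], POWER_RINGS[max_watts]

print(rings(40e9, 100))  # a 40 Gbps, 100 W cable -> ('red', 'orange')
```

A glance at two rings would then tell you whether the cable in your hand can drive your dock or just trickle-charge your phone.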
You'll never get Apple to do that though. They didn't with USB-3, they didn't with mouse/keyboard, they won't put them on their cables. And since they're the premium brand, everyone else will have to follow them.
I'm not sure why you're being downvoted, those examples are all absolutely true. There's no chance of Apple complying with a spec which doesn't meet their sense of industrial design (which there's no way this would).
I will say, they're not alone. Look at Razer, for instance. Electric green is not exactly a part of the USB 3 standard.
Which is a great solution if all you have are Apple cables, but colored rings fall off or get broken. It's like the US solution to healthcare: "don't be poor", it's not practical in real-world sense. People are going to buy (and make) whatever shit cable they want and regulations and standards don't mean a thing.
I'd have to go with the parent comment on this one, although I see what you're saying about it reducing cost.
I think what we've learned throughout the years is that color coding doesn't work out in practice, precisely because (as an earlier comment put it) the USB-C market is a bottom-feeder market. There has to be one exact, rigid specification that all USB-C cables follow (i.e., USB4/TB4). Anything more complicated than that, like color coding, results in giant scams by manufacturers, outright wrong labeling, or impossible-to-find cables on an e-commerce search engine. I just don't want to deal with any of that anymore. It feels so much nicer right now to look up TB4 on AliExpress and be done with it: no more worrying or guessing.
USB-PD permits 100 W with 20 V at 5 A. If that's carried over just two round copper wires (I don't know whether it is in USB-C), they would need to be 18-gauge or thicker for safety, about 1.0 millimeter in diameter. If they're copper, that's about 7.3 grams of copper per meter per conductor, 14.6 grams including the return path. Copper is expensive: almost US$7/kg. So a 3-meter 5-amp DC or two-phase cable would weigh about 45 grams and contain roughly 30¢ worth of copper. You could drop both the cost and the weight by going to aluminum. If you were designing the system from scratch, you could use 3-phase AC to cut the weight by half again, and use 48 volts to cut the weight by another 58%.
I don't have any idea how thick the 19 wires have to be for USB at 40 Gbps, but I imagine the answer is "not nearly that thick".
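For anyone who wants to check my arithmetic, here's the back-of-the-envelope in Python (assuming a solid AWG 18 conductor, copper density 8.96 g/cm³, and copper at roughly US$7/kg):

```python
import math

# Back-of-the-envelope copper mass and cost for a 3 m, 5 A USB-PD cable.
# Assumptions: two solid AWG 18 conductors (supply + return), no shielding
# or data wires counted, copper at ~US$7/kg.

diameter_mm = 1.024          # AWG 18 nominal diameter
density_g_per_mm3 = 8.96e-3  # copper
price_usd_per_kg = 7.0

area_mm2 = math.pi / 4 * diameter_mm ** 2               # ~0.82 mm^2 cross-section
grams_per_meter = area_mm2 * 1000 * density_g_per_mm3   # ~7.4 g per conductor per meter

cable_length_m = 3
conductors = 2
mass_g = grams_per_meter * cable_length_m * conductors  # ~44 g of copper
copper_cost_usd = mass_g / 1000 * price_usd_per_kg      # ~US$0.31

print(round(grams_per_meter, 1), round(mass_g), round(copper_cost_usd, 2))
```

So the raw copper really is a rounding error next to what these cables retail for.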
Bottom-of-the-barrel vendors absolutely won't give a fuck and will cheap out without telling you.
The advantage of USB 2 is that it's so simple that it's very hard to screw it up. You pretty much have to intentionally do it if you want to create a dangerous cable. Even the shittiest cable will work with the vast majority of devices (it might slightly heat up, voltage may sag at the receiving end meaning it will charge slower, but it'll somewhat work).
USB-C is significantly more complex and requires active electronics in the cable itself in some cases, and the potential for higher voltages means a faulty/recklessly-designed cable could request higher voltage from the charger and blow up whatever's connected at the other end.
It sounds like you could maybe make substantial progress by just separating the differential pairs (?) by a millimeter or two of dielectric, giving you a ribbon cable, with much lower crosstalk than the round kind. Bonus points if you color the dielectric rainbow colors.
I feel like it's harder than shielding and tolerances, given that longer passive cables don't seem to exist despite the very high prices people are paying for active cables.
There might be issues of attenuation; we're talking about signals in the GHz range, where you have to use waveguides instead of wires to get low losses.
> I do not want all my USB-C cables to be able to handle 90W. That would make them very thick and expensive.
The difference between the minimum and 100W is that the cables need to support 5 amps instead of 3. That's not much difference at all considering there are data wires too.
Supporting 240W requires a couple tiny components in the plug. That's also barely anything.
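The wattage tiers really are just volts times amps (the voltage/current pairs below are from the USB-PD spec as I understand it; the 240 W tier is the newer EPR mode):

```python
# USB-PD wattage tiers fall out of simple V * I arithmetic.
# Pairs per the USB-PD spec as I understand it; 240 W is EPR mode.

def max_power_w(volts, amps):
    return volts * amps

print(max_power_w(20, 3))   # baseline 3 A cable at 20 V -> 60
print(max_power_w(20, 5))   # 5 A e-marked cable -> 100
print(max_power_w(48, 5))   # EPR: 48 V at 5 A -> 240
```

Note that the jump from 100 W to 240 W comes entirely from raising the voltage, so the conductors don't need to get any thicker.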
By “expensive”, are you talking the $20-30 range for a single sufficiently long cable? I don’t replace cables that often, but I don’t see the big deal in paying a reasonable price for a high-throughput cable when needed.
They are if you buy the right cables. Just buy cables which have the capabilities you want, and are obvious to you. It's pretty easy, as long as you're willing to put about 5 minutes into the effort one time.
It's definitely an improvement, because you can still carry the one cable that does it all, and use it for everything, even the things that don't actually require it.
Honestly? Presentation. That's why I consider it a dumb argument in general. People bring up "Homer's car" or equivalent memes from works of fiction as some kind of ridiculous contraption, but don't bat an eye when a show like Star Trek does the same thing. The big difference, IMO, is that Homer's car is delivered to you up front, a solution looking for problem(s). Star Trek's tricorder or runabout or starship only happens to show a different one-off feature every episode, so the realization that the equipment is deeply multipurpose, and has all those features already present, kind of flies past people who aren't into this sort of thing.
The issue for me is how silly it is to hard-code these arbitrary and often single-purpose connectors in the laptop.
A laptop should be a general computing device. So why hard-code something as weirdly specific as an SD card reader into it? Give it the functionality to have any IO device attached (USB-C) instead.
How many additional watt-hours of battery would they have been able to fit in the laptop if they didn’t have the carve-outs for such dongles?
Say what you want about the MacBook’s lack of user-replacability, but it’s basically a tiny chip board about the same size as the iPhone’s, with a big box of batteries holding it.
> How many additional watt-hours of battery would they have been able to fit in the laptop if they didn’t have the carve-outs for such dongles?
Looking at the insides, I'm going to guess about 2 watt hours. Or they could have made it unmeasurably thinner.
> Say what you want about the MacBook’s lack of user-replacability, but it’s basically a tiny chip board about the same size as the iPhone’s, with a big box of batteries holding it.
Framework has 55 watt hours. The obsolete macbooks have 41. Both intel and M1 macbook pros have 58. Both intel and M1 macbook airs have 50.
Sounds like that lack of user-replacability isn't necessary.
Man I loved the idea of PCMCIA back in the day. Mobile network access (EDGE, if I remember correctly) via one of them on my chunky Toshiba was awesome.
I was fascinated by PCMCIA, because I had a laptop I was trying to put OpenBSD on in 2001, and its ethernet port was not working, so the card was a workaround. I always wondered why it didn't really take off in Europe; it was a simple and pretty compact way (for the time) to get very advanced stuff into a laptop. I guess it was expensive to produce, and the name was atrocious. I believe it got more popular in Japan.
I very much enjoy having multiple USB 3 ports, ethernet, and a card reader on my laptop, and I don't have to carry any dongles. I can easily hook up 2 x 4K 60p screens using the built-in HDMI and mini-DP ports. It also has Thunderbolt 3, so I can still hook up anything extra should I ever wish.
That argument would make far more sense if these anorexic laptops at least compensated for the removed ports with more USB ports. But no, you get the same pathetic 4 (at most) as always.
Even worse when they're the pathetic failure (host-side) that is USB-C, so nothing fits without a dongle anyway. Bonus points if you have to waste one of them for charging the laptop, yay!
From what I have seen, it is more the hobby/semi professional range that has all the possible connectors built in whereas in the high end it is more modular and you buy different modules dependent on the connectivity you need. Especially if you need to fit it into a rack. E.g. you might only have some DSUB 25 pin connectors, but they cover dozens of analog I/O channels on minimal amount of space.
I guess some of those are analogue? I guess you can't squeeze those all through the same physical form factor connector. You can with digital, so let's reduce the clutter and do it!
I'm not a musician, but I've seen plenty of DJs set up their gear. It is clear that the connectors and cables are designed to be physically durable. They work in environments where even a beefed up USB cable would only last a few gigs, since building compact connectors for consumer grade electronics is at odds with the day to day reality of commercial applications.
I'm sure that other factors play a role. The economics of going digital would be terrible if it meant replacing a significant amount of equipment every time a new standard took over the market. Again, pointing to USB (since that is what everyone seems to associate with universal digital connections), we have seen three major iterations and a number of minor ones over the past 30 years. That's hardly the type of cycle that businesses want to hop onto given that even a tiny operation requires thousands of dollars of equipment, where any given component may be anywhere from a couple of years old to over a decade old.
> we have seen three major iterations and a number of minor ones over the past 30 years.
... None of which broke existing functionality. I can plug a full-speed device from 2000 into a USB-3 A port and it will work perfectly (as long as there is still software support for the vendor-specific drivers that might have been necessary for non-class-compliant devices).
Except for USB, they are all analogue, and some are mutually interchangeable.
3.5mm TRS, dual 3.5mm TS, dual RCA, 1/4" TRS, dual 1/4" TS, XLR cables transfer the same kind of signal, and you can easily convert between the connector with dongles.
Mixers have all of these so that you wouldn't have to.
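To make the "same signal, different plug" point concrete, here's a toy Python sketch (connector names taken from the comment above; the grouping logic is my own illustration, not any real library): connectors carrying the same analog line-level signal only ever need passive rewiring, while crossing into digital needs an active converter.

```python
# Hypothetical sketch: analog audio connectors grouped by the signal they
# carry. Any two connectors in the same group need only a passive adapter.
ANALOG_CONNECTORS = {
    "3.5mm TRS", "dual 3.5mm TS", "dual RCA",
    "1/4in TRS", "dual 1/4in TS", "XLR",
}

def adapter_needed(src: str, dst: str) -> str:
    # Same analog signal family: a dumb passive adapter suffices.
    if src in ANALOG_CONNECTORS and dst in ANALOG_CONNECTORS:
        return "passive adapter"
    # Crossing into digital (e.g. USB) means an active converter (DAC/ADC),
    # which is why that direction is a box with electronics, not a cable.
    return "active converter"

print(adapter_needed("dual RCA", "XLR"))  # passive adapter
```

The asymmetry is the whole argument: within the analog family, "conversion" is just rewiring pins, which is why mixers can afford to offer every jack at once.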
The utility is not thinking about where the f***ing dongle is when you just want to plug something in.
Yes, exactly, let's tell musicians to use USB-C, and every third cable won't work, and they'll be able to put on a concert but with no guitar, exactly like the devices in front of us when we try to work.
The only insurance against “the USB-C downtime” is a subscription to Amazon Prime 24hrs delivery and another $68 (no kidding) Apple cable.
There's thunderbolt, USB 3, USB 4. External adapters of varying quality and capabilities are often inferior to even budget integrated stuff.
For example, getting a 4K 60FPS HDMI dongle was going to cost me >$100, and the cheap ones I had overheated. Meanwhile a budget laptop with HDMI and integrated graphics works fine. Getting a dock with gigabit ethernet, high-res HDMI, a decent SD reader and a fast hub was >$200 last time I checked - and not that portable either.
Cannot agree more. USB-C is a big mess. I've got quite a few cables in different specs. Some can do 100W PD, some support DP alt mode, some are Thunderbolt 3, some are USB 3.0 and allow a maximum of 2A, and some only support USB 2 yet can deliver 5A. All that mess aside, some started to fail after being used just a couple of times.
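The mess this comment describes can be sketched as data. A minimal Python model (the cable names and spec combinations are made up for illustration; real cables vary even more): each cable carries several independent capabilities, and whether it "just works" depends on which subset your use case happens to need.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Cable:
    name: str
    data: str        # data tier: "usb2", "usb3", or "tb3"
    max_amps: float  # current rating of the cable
    pd_watts: int    # Power Delivery ceiling, 0 if none
    dp_alt: bool     # DisplayPort alt mode support

# Four plausible cables from one drawer -- identical connectors, wildly
# different capabilities (hypothetical examples).
CABLES = [
    Cable("charge-only", "usb2", 5.0, 100, False),
    Cable("cheap data",  "usb3", 2.0, 0,   False),
    Cable("monitor",     "usb3", 3.0, 60,  True),
    Cable("tb3",         "tb3",  5.0, 100, True),
]

def works_for(cable: Cable, need_data: str, need_watts: int, need_dp: bool) -> bool:
    """True if the cable meets every independent requirement at once."""
    order = {"usb2": 0, "usb3": 1, "tb3": 2}
    return (order[cable.data] >= order[need_data]
            and cable.pd_watts >= need_watts
            and (cable.dp_alt or not need_dp))

# Which cables can drive a 60W USB-C display (data + power + video)?
ok = [c.name for c in CABLES if works_for(c, "usb3", 60, True)]
print(ok)  # ['monitor', 'tb3']
```

Half the drawer fails silently for any given job, which is exactly why "just grab a USB-C cable" so often ends in stuttering monitors or slow charging.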
That covers pretty much every common scenario when travelling.
USB-C for a connecting to a dock, HDMI for a meeting room screen, USB-A for reading a flash drive.
The Homer car is a MacBook with a bunch of stupid HDMI and USB-C-to-USB-A dongles hanging off it so you can read a flash drive or connect to a meeting room screen.
My newest laptop gets a fairly consistent 700-800Mbps on WiFi.
Don't get me wrong I still prefer ethernet to avoid packet loss and reduce latency but download throughput isn't a problem I notice on WiFi anymore (since I'm also only on a 1Gbit/s line)
What a bizarre retort. It should also be fairly obvious that if you carry your laptop somewhere else you're not going to be able to reach it with the ethernet cable, either.
If you have a good WiFi network at home, that's great when your laptop is at home, but if you carry it outside your home you are at the mercy of whatever infrastructure you find there.
Usually if you need to transfer large amounts of data you can still plug in an Ethernet cable in e.g. an office.
Because usually, you can't just throw a 100m cable through a building and plug it into some Ethernet socket. And who wants to carry around a spool of cable?
I don’t know really… yes Ethernet is nice to have but in 16 years of using an MBP as my “pro” machine in a big company I needed it like twice, 14 years ago. So yeah in principle, you’re right.
Why should anyone waste the precious space in an ultraportable laptop on the newfangled and time-unproven technology which Ethernet is? I want my Token Ring port back to connect to my ring in a box with a Boy George connector – to celebrate the diversity of the computing I have filled up the basement and the attic of my house with.
I'm not disagreeing with the value of a wired Ethernet connection, but both my new ThinkPads (P1 and X1E Gen 3) have gigabit Wi-Fi. Connected to my Asus RT-AX86U, I got a 935Mbps download on a speed test over Comcast.
I had heard that 802.11ax (Wi-Fi 6) was pretty good, and it sure looks that way so far. I have some good Cat 8 Ethernet cables, so I will experiment with that too.
I have fiber optics internet connection at home, and my 4 year old MacBook Pro does consistently over 500Mbps (peaks close to 700Mbps) over WiFi. Granted, it’s still not 1Gbps, but I can’t think of any regular scenario where it would make a significant difference.
I live in an apartment building, while I have a 5g router in my living room my work room is separated by a bearing wall, but even in the same room I often get random interference where the internet starts stuttering.
My desktop with ethernet is way more stable than my MBP WIFI. Also ping is noticeably lower for games.
Hmm, I think the RJ-45 fits well enough on my 2019 Acer Aspire laptop. With the spring-loaded flap shut, it ends up being no thicker than the HDMI port next to it.
Apple did one good thing, which is make every USB-C port have the same capabilities (charging, thunderbolt). Windows laptops, especially once you get down into the budget section, are absolutely atrocious at this, you have to read little lightning symbols and can only charge from a special port...
Which came at the cost of just having fewer ports. 2 USB-C ports is a joke even if they are both thunderbolt 3 capable. One is taken by charging if you don't have a thunderbolt dock with power delivery, leaving you with effectively a single port.
It's always been that way on Macbooks. It's also simplified with USB 4, which means the newest Macbooks just support everything under Thunderbolt / USB 4 on every port. Older Macbooks may have had some Display Port shenanigans because of differences between DP 1.2 and DP 1.4 and whether it was over Thunderbolt 3 or USB 3.1, but all modes were basically supported.
Any port on any Macbook with USB-C can be used as the charging port, which is a big deal all on its own compared to most non-Macbook laptops that use USB-C charging.
USB-C is much more complicated than most of us would anticipate, so I would prefer to be more specific: 2 Thunderbolt and 2 USB 3.2 Gen 1. And I don't know when I last used SD, so let's save that slot for something else. And on a computer, I would prefer DP or mini-DP over HDMI.
A laptop is a computer you use on the go. I don't see how this usage pattern includes that much of external hardware to use all those ports. Smartphone, data stick - that's it.
There may be a kinda-permanent place where one uses their laptop most of the time. I don't see any problem having a docking station there with all the ports and power routed via a single USB-C or Thunderbolt port.
The problem is not the industry. The problem is people using laptops where they should use desktop computers. Which are, coincidentally, modular and expandable through the roof.
Not everyone is rich enough to also buy a desktop computer or have space for it? Not to mention that the hassle of duplicating software and data files between a laptop and a desktop is too much work for anyone who does not actually like to spend time on tech.
If I am on a tight budget, getting a desktop instead of a laptop is a no-brainer. There are very narrow fields where a laptop is a must, and most of those apply to employed individuals, so the burden of providing the hardware is on the employer.
> The crazy thing here is that it’s not so hard to hit the standard set of “pro” ports —
Given that the cheapest USB hub allows you to plug in half a dozen USB-A devices and SD cards, and given that frequently they are not used at all by anyone, why would it be preferable to add 3 dedicated ports instead of just using one of the four available USB-C ports?
The same goes to the HDMI and phone/mic ports.
In fact, nowadays you have monitors that not only support video over USB-C but also serve as USB-A hubs, which means that with a single USB-C connector you can get everything you mentioned in your example.
Insulting all Apple users by calling them "industrial sheep" may put you into conflict with having your views given reasonable consideration, not to mention the site guidelines. It's not generally okay here to call people names for disagreeing with your views.
I'm not positive, but I think they were talking about Apple's competitors rather than their users. Samsung, for example, dropped the aux port for dongles soon after Apple.
I accepted the correction from others here in accordance with a site guideline about this exact scenario:
> Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.
I was unable to come up with a good faith and plausible interpretation; others found one that I'd missed, and thus I retracted my objection. The author apparently later confirmed their interpretation, but that was not factored into my retraction, and is not relevant to the guideline I'm trying to adhere to.
To your comment about "disingenuous", I've spent most of my life being misunderstood for making perfectly logical statements that other people decided were some sort of slander instead of trying to understand in good faith given the context that I'm a nerd with social disorders. So I'd prefer to avoid being upset with someone else over a misinterpretation when I wish others would be less upset with me about them.
Good approach, and I guess I can see my comment to be along the lines of those "perfectly logical" ones, but ignoring the social context: I try, though :)
I don't understand the logic. When the original usb-c MBP came out I spent $30 on Monoprice for usb-c to whatever cables and never looked back. I even still have many of those cables 3 laptops later.
People would actually comment about dongle gate in Meetups and I'd show them my usb-c to micro-usb cable... ...oh the look of shock in their eyes... "You mean... you never bought a dongle?". The concept of a cable with usb-c at one end and anything else at the other was completely foreign.
I had that complaint when working in a 5 story building and spending a third of my awake hours in meetings here and there.
That $30 dongle becomes a dangling bit you'll have on your laptop all day; it will hit stuff, get under the laptop, or, worst-case scenario, get stuck between the screen and the keyboard when you aren't paying attention. As it's always dangling, it also becomes loose over time and gets flaky accordingly.
Back then having a HDMI port was standard, no dongle being the norm. So yeah, having the choice between needing a permanent dongle or not, the answer is obvious.
What changed for me is WFH; otherwise I think I'd still wish for no dongle until USB-C projectors and displays rule the world.
I have a monitor that acts as a USB hub and power source; everything is plugged into it, and then one single cable connects it, all of that, and power to my company-issued MacBook. Every meeting room used to have HDMI, DP and Thunderbolt connectors, but no more, because every company-issued laptop is now capable of Thunderbolt (MacBook, or Dell Precision series if you opt in for Linux).
If the company officially used many laptops with USB-C ports, it would make sense to have a USB-C to HDMI in every room with a projector, rather than making everyone carry their own.
Isn't plugging unknown USB-C-anything a huge security risk? It would be easy for a visitor to "forget" an evil adapter in a meeting room, and if employees are in habit of using them, boom.
Yes, I'm sure you could make one which acted as a USB-HDMI converter as well as a rubber ducky. Sprinkle a few around, maybe bribe a cleaner to leave one in a meeting room, and you're set.
I think it's even easier than that; we now have cables that are normal shape and size of a USB connector that have an embedded system in them with a webserver, keyboard emulation, mass storage and wifi for a remote attacker to connect to.
It's also easier to just ask to quickly use the worker's computer to get a presentation going.
"Oh yeah sorry, my presentation is made in PowerShell instead of PowerPoint".
Yes. Transition periods are always painful in that respect, worsened this time as a ton of “business” line windows laptops still have a HDMI port, same for Dell’s linux offering for instance.
When the first full USB-C Macs went out they definitely were the odd ones out in the company, and even now there’s still that split between run of the mill windows laptops and macs. Adaptors are more common, but it’s still not great.
It plays more on the "why don't you just ... ?" question that arises when you ask for adapters to be standard in every room.
It reminds me of asking to include decaffeinated pods in our recurring coffee orders for the espresso machine. The person had no opinion on coffee, but wasn't convinced they needed to accommodate for the minority that was concerned.
Luckily, we could always make it worse; there were times when full-sized ports were thought of as standard, yet we had PCs and Macs with mini-versions that were specific to the manufacturer (like mini-composite, AV jacks, mini-VGA, mini-DVI).
I concur! I have about 3-4 different usb-c to whatever cables and one usb-c to female A port for thumb drives. My thinking has always been that having all usb-c “future-proofs” for future configurations… maybe I will have two HDMI external monitors in the future, rather than DisplayPort and DVI? Easy, just get two usb-c to HDMI cables when that scenario arises. Cables allow for so many different configurations, rather than proprietary modular adapters that any given company might give up on, or sunset older versions of for new ones with more features. After living what you just described for the last few years I can't for the life of me fathom how this modular approach will gain mass appeal. USB-C with cables seems far more flexible to me.
Similarly, I bought some adapters that I carry around. I travel between a couple of locations, and I bring just one charging wall plug, and one 10-foot USB-C cable.
I have adapters that convert the usb-c to micro and lightning, to also charge my airpods, flashlight, etc. Each adapter is about 3/4" (2cm), female USB-C end, and male end of lightning/micro. I've glued them together so that it's just one little thing to take.
I hated carrying around 3+ cables, so this has been a welcome change.
It's true that I can only charge one thing at a time, but that's not an issue for me except in rare circumstances.
This sounds like a much better solution than mine. Rather than cables I should have gone with little adapters. Then I just need to take a couple usb-c cables and I can work with any legacy port.
As it stands I typically have four cables in my briefcase but at least they are still smaller than a mouse collectively.
I love the modular laptop concept, but not for ports. For those who don’t want them hanging, these are perfectly color matched, made of the same kind of aluminum as the Air, and sit flush. I prefer it to having extra bulk to the base laptop.
Example: my external USB mic (much better than the internal one) and my USB disk for daily local backups, connected to two different ports this morning (and many other days.)
Your unwillingness to understand or empathize is a form of dishonesty. Just because you personally never needed a dongle doesn't mean such situations don't exist or that they are somehow boundary conditions.
I don’t see a problem with them. The majority of users never need one and even when I use them I usually use them infrequently. I often leave them on the ends of cables. My display port cable has a usb c dongle left on it so it’s like it’s natively usb C anyway.
Sure, if you do some weird stuff or have an extreme use case, I can see why you would want more built in ports, but for the majority of users, they only plug in the charging cable and maybe video out.
Not fair. Very few laptops have a serial port (the GPD Micro PC comes to mind) and maybe zero have a modem, but some laptops have a VGA port. Manufacturers know that VGA is still used but serial/modem aren't.
These days I suspect that something like a toughbook might be the only option for those sort of ports - although that GPD Micro PC does look quite fun.
Even 10-15 years ago, proper serial ports were becoming extremely rare, but there are times when you need a proper one.
Around that time we resorted to pc card/express card serial ports for occasions when USB to serial isn't good enough, although they were relatively expensive (3-4 times more than a USB serial dongle).
(The use case in that scenario was field engineers connecting to a very wide variety of odd equipment, like fire alarm panels and door entry systems, that sort of thing - USB dongles were massively inconsistent and unreliable - different dongles would be compatible/incompatible with different kit, was a right mess).
Obviously, these days, express card slots are also quite rare.
(The alternative is hauling out my old IBM T22, which I think maybe came with Win98... mostly still works apart from the battery).
Serial is less useful as you need a serial cable, so if you're going to carry a serial cable you might as well have one with USB on the end.
If you're going into an RJ45 serial connection (like I am at the moment), then an ideal laptop would have multiple RJ45s which could be used as either 10G or serial with a standard cat5 cable (not a specially wired one).
In the Japanese market, some recent models support VGA, but mostly from domestic brands. Some models are made by Clevo or whatever, so they're possibly also available in other markets. Here's a list: https://kakaku.com/pc/note-pc/itemlist.aspx?pdf_Spec047=1
I use a USB-Serial cable about once a week. I use an ethernet cable dozens of times a day.
Framework means in theory I could have a laptop with, say, 4 ethernet/serial ports (switchable) and SDI, and that's far more useful to me than USB-C.
All the conferences I’ve been to had dongles readily available. Most used HDMI. Once I had a hastily arranged breakout room that had VGA for the projector. I think this is a non-issue?
I always carry it with me but usually they have some sort of screen casting tech around already.
I do make it clear from the planning stages that they need to provide either one of 3 video inputs to their selected system (HDMI/DP/Screen casting) or they need to provide the computer that I can use to remote into my 13"(this is what they usually choose if they have older screens or projectors).
Then you'll take a dongle with you to that conference? Are you telling me you would always waste one of the 4 Framework ports on Display Output X that you only use once a year?
Ethernet and serial dongles are a requirement for emergency maintenance inside of datacenters. But of course not many people on HN spend time in datacenters anymore...
It's also needed for just making sure your internet is set up properly at home. Nobody cares about your speed test over WiFi, but ISPs sometimes care if you can't get anywhere close to the rated speeds over Ethernet.
And of course that setup still requires Ethernet. Can't set up a WiFi AP over WiFi.
I'm curious as to how those work out. The modules are too short to fit a VGA, serial, or ethernet port and be flush with the laptop, but I think you could make one that extends further out and above, and would still have some benefits over a dongle.
> The modules are too short to fit a VGA, serial, or ethernet port and be flush with the laptop...
I'd be fine with a pop out style port for those ports. Won't be flush while in use, but I'd happily accept that to trade off having to carry around dongles. I'd rather pack and carry a small "stick" of these modules stacked together than a bundle of dongles.
I give it three months tops before someone starts selling a Pez-like "dispenser" that stores these modules. If the module bodies were designed to stick together though, that would spark joy in my inner Marie Kondo.
I would argue that a USB-C to VGA or HDMI cable is just a longer dongle. What if you take your USB-C-only laptop to a remote office to do a presentation, but your six foot USB-C to HDMI cable isn't long enough to reach the port because the projector is mounted in the ceiling and has a standard HDMI cable routed to the lectern? I'd much rather have the Framework with a HDMI port on the device than struggle with a common situation like that.
> What if you take your USB-C-only laptop to a remote office to do a presentation, but your six foot USB-C to HDMI cable isn't long enough to reach the port because the projector is mounted in the ceiling and has a standard HDMI cable routed to the lectern?
I personally really like the idea of what Framework is doing and wish more laptops followed suit, but that is a trivially solved problem you identified:
Of course it's trivially solved...with a dongle for your dongle! Or you could avoid dongle-ception by using a modular laptop like the Framework, or even a standard laptop with an HDMI port; even current-gen models from Dell, Lenovo, and HP still have it as an option especially on business-oriented machines. It all comes down to what your everyday requirements and tolerances allow for.
But again, the "dongle" argument is moot and not really a reason to either consider or avoid the Framework, for me at least. It's more about the device being open and repairable, and arguments about dongles are just attempts to justify one's current USB-C only device.
> It all comes down to what your everyday requirements and tolerances allow for
Agree completely. For me, Apple's USB-C only ports isn't an issue as everything I use plugs in via one or two TB3 cables (depending on personal vs work laptop) and daisy chains from the monitor or a TB3 dock so no dongles needed at all, but I still appreciate the design choice Framework made and think it's a good strategy.
I don't see how putting a cable in my backpack is going to be better than a dongle, and I'm certainly not going to a client for the first time then complain they don't have the right cable.
Not to mention the dongle supports several ports.
But you know what is better than either ?
The framework laptop solution of letting me configure the port I want before going to my client.
Theoretically you could emulate the signal with software/drivers given that USB-C has 24 pins. But there's actually display standards/signals built into USB-C so you "just have to" convert the digital signal to analog for VGA, but then it's no longer a stupid cable and more like a dongle.
Never have I seen a greater push against good design. The laptop ships with USB-C, if you didn't pick that up.
YOU CAN USE YOUR USBC TO VGA CABLE IF YOU WANT.
Or, if you don't want, you can grab A VGA module out of your drawer you store all your retired dongles in, slide it into your laptop, and there you have it.
This is not about using a module vs a cable. My comments refer to using a cable instead of a dongle. People make it seem as though using a dongle is the ONLY way to, for example, connect your MacBook Pro to a TV when you could just use a cable for it.
I was going to present from my phone to a projector the other day, but (probably due to the wear and tear of putting in the charger every day for several years) it was glitchy, so I asked if I could borrow a newer phone and got a few-month-old one, still glitchy, so I had to use a PC anyway. The plan was that I was going to walk around with my phone during the presentation...
What I'm trying to say with this story is that monitor cable connectors, for example, are designed to fit tightly (VGA and DVI even having screws) to give a constant signal, which you don't get from USB-C unless you stand still.
I carry a battery powered projector for this reason for talking with customers, providers or partners. I use standard airport suitcases for that.
It just makes no sense spending lots of time trying to adapt to obsolete infrastructure for every person you visit. If necessary I even have a blackboard and color chalks in my car and get away with them.
When I go to the meeting room, if I don't need to use my projector, great, but I will never use VGA, too much hassle.
- some conf room don't have a projector, but flat screens, a smart white boards or some remote conf setup that needs you to plug in, and/or no walls that fits the bill for projection
- some conf rooms don't have a place to put your projector and get a good picture. Theirs is on the ceiling.
- unless you buy a very good one, some conf rooms won't have the light for your projector to be readable
- it addresses only the projector problem, not ethernet, sd card, usb A, etc
- a good projector is way more expensive than a few dongles, easier to break, and harder to replace if lost/broken or forgotten at home
Not to say it's a bad idea to _also_ have a projector.
In 2019 my new employer sent me a new MacBook pro. I couldn’t connect it to my home office monitors which had vga and dvi ports, so I asked for dongles.
Rather than try to sort out the cable confusion, they simply shipped me brand new monitors (which I wasn’t asking for). I also needed dongles to attach my keyboard and mouse, dongles for same were provided by IT.
My point: Dongles are still an issue, not everyone throws out their displays/keyboard/mouse every time apple comes out with some new version. My 2010 dell displays still work just fine, and it would be great if I could plug them straight into my laptop.
While I'm not a big fan of dongles, they make them slim enough to just leave them attached to the device. I've had one attached to my mouse for 2 years now and it doesn't add much bulk. My Samsung phone came with one so small that you can't even tell it's there (other than the extra width for the USB-A part).
I have one, it’s USBC on one end & regular USB on the other, just flips around in the protective sheath to whatever one you need. It’s also super fast, though I’ve never spent much for high end thumb drives to compare to.
I got it at Target. Love how easy it makes going between my USBC only MBP and other random computers.
You're being downvoted which seems a bit weird, but I agree at least for myself. I went USB-C only in my house, and it's been pretty excellent, up until my new job gave me a Windows laptop that has exactly one USB-C port and requires Mini-DisplayPort 1.4 for its display output.
Well, not everyone wants to carry a brick. Sure, Thinkpads are great because they have each port ever invented, but I still prefer a thin laptop (if you have USB-A or Ethernet ports you can't have a thin laptop) with the option of using a dongle once a month if I need it.
There are plenty of thin laptops with USB-A ports. Thinkpad X1 is both thinner than a macbook and has two USB-A ports. And there are laptops just 2mm thicker than a Macbook Air that have ethernet via some clever mechanical engineering.
> I find it hilarious that we spend multiple thousands of dollars on sleek, elegant hardware and then hook up chunky plastic dongles to overcome their bad hardware interfaces.
I don't understand how anyone can come up with that conclusion. Your "cheap plastic dongles" jab is actually a testament to how superb its interoperability is. You're for some reason complaining that we are free to plug even "cheap plastic dongles" into a high-end device, when in reality this means we can plug in the cheapest "plastic dongles" and expect them to work. How is this a bad thing?
Back to the "cheap plastic dongles" complaint, I do use one from time to time, and the reason is quite simple: I have USB-A devices which I've used for years, but I also have a couple of laptops which only pack USB-C ports. Should I throw away perfectly good hardware just because a random guy on the internet dislikes cheap plastic dongles? Should I base my purchasing decisions on whether a laptop supports legacy ports? Or should I just spend $10 on a dongle intended for occasional use and stop worrying about inane details?
Those of us who prefer spending our time on relevant things don't see why the proper etiquette of pairing peripherals with computers is worth anyone's time. Why do you?
The point of the original comment seems to have gone right over your head.
> So I love the idea of these ports (agreed, they're basically "recessed dongles").
It's juxtaposing two types of dongles: the common, cheap, plastic ones, and the sleek, integrated ones from Framework's laptop, in order to show that Framework's aligns better with the design ethos of the device itself.
> Should I throw away perfectly good hardware just because a random guy on the internet dislikes cheap plastic dongles? Should I base my purchasing decisions on whether a laptop supports legacy ports?
Again, you're missing the point. Both of these types of dongles will support your USB-A devices without you throwing anything away in your pique. One will just look good, feel good, and integrate with the machine you're using, while the other won't.
I wouldn't think of them as recessed dongles. After installing them I haven't changed them out at all so far. It's more that you can configure things how you want. If you wanted 4 usb-c ports and that's all; just do that.
I would love it if all my devices had transitioned to usb-c, but they haven't. I still occasionally need usb-a and sometimes I need an hdmi out. So... that's what I have. And if I stop needing usb-a I'll get rid of it and put in another usb-c. You could even do a single usb-c port and then 3 storage attachments if you wanted. Nobody is ever going to sell a laptop like that, but for someone who really needs storage and doesn't care about connectivity that might be perfect.
If you're okay with dongles then you're probably fine. I'm not. They clutter up the workspace, occupy permanent space in my bag which is annoying, and often enough aren't around when I actually need them.
It also goes a little bit beyond that: if one of your ports stops working (e.g. rust, water damage, etc), you can just buy a replacement port and you're back on track, as opposed to "welp, I guess I'll have to do without it..."
The modular port is a universal port with dongles. Except that (1) it doesn't take extra space outside of your computer, (2) it is cheaper than mainstream dongles (Apple sells USB-C to HDMI for $70 while the Framework HDMI expansion is $20), and (3) it is fully open source and you can actually print/sell your own.
Luckily, USB-C means you don't have to buy any accessory from Apple, and have the world of low cost peripherals at your disposal, like the $13 version on Amazon [1].
Quite the opposite, it means you can start with plenty of USB-A ports, and when you don't need them anymore, switch them to USB-C, without changing your laptop.
It means when one port wears out, fixing it is easy, cheap, and doesn't immobilize your machine.
It means you can change your port to fit an hdmi or ethernet as needed, without having the stuff coming out your laptop, all ugly and taking space on the desk.
What upsets me about the solution is that the whole laptop is limited to 4 ports. They don't even offer the obvious "2 in 1" dongles where one dongle would contain e.g. two USB-A ports.
Many other laptops aren't modular - but they offer more than enough ports to make up for it.
You need at least 1 USB-C port for charging. If you want to be able to use an external mouse and keyboard without a hub, that's 2 USB-A ports that you need. That leaves you the choice between having one USB-C port OR HDMI for the last port. There isn't even an Ethernet option at all. And you still get the potential downsides of the ports being adapters from USB-C, if I understand correctly.
If you grab a Lenovo P14s, you can't swap one of the two RAM sticks (limiting you to 48 GB), opening it takes a little bit more work, and replacing some of the more integrated components is going to be harder (SSD is trivial). In exchange, you get 2x USB-A, 2x USB-C, HDMI, MicroSD, and full-sized (not flip-out/break-off) Ethernet. Plus an optional built-in smartcard reader, plus some proprietary docking port extension around one of the USB-C ports. Looking at the Gen 2, you can get that at around 3/4 of the price with a similar or better config (including a somewhat serious GPU) as long as you order on the weekend (when Lenovo's non-ripoff pricing is in effect), and you can also add a fingerprint reader and NFC if you want.
The Framework laptop has upgradeable RAM, but you will have to upgrade it yourself (discarding the RAM that comes with it) if you want more than 32 GB, and no matter what, it won't ever support more than 64 GB. Is supporting at most 64 GB really that much better than a laptop with 64 GB soldered in?
By the time you want to upgrade the CPU, the mainboard won't be compatible, so what's really there to upgrade?
> Is supporting at most 64 GB really that much better than a laptop with 64 GB soldered in?
Yes. Some people (like me) can't afford to buy the highest-spec laptop. So I'll go with a lower memory and storage version of a machine, and then upgrade the RAM and storage when I've saved up enough. That's exactly what I did with my current Thinkpad T440. And for a current-gen Apple, I can either buy an 8 or 16GB machine, and the price difference is pretty significant for me.
> The Framework laptop has upgradeable RAM, but you will have to upgrade it yourself (discarding the RAM that comes with it) if you want more than 32 GB
FWIW, AIUI you can order the "DIY" Framework with no RAM at all, so there's no need to discard anything--but also, I just looked and you can order the DIY edition with 64GB too.
> By the time you want to upgrade the CPU, the mainboard won't be compatible, so what's really there to upgrade?
The mainboard! (Well, that's Framework's plan at least.)
And the mainboards can run standalone too, so in theory you can use it to automate your house in the future or something. :)
(With regard to the 2-in-1 aspect, I think it's important to remember that this is Framework's first product range, they need to limit their scope to not spread themselves too thin.)
And people are already starting to experiment with hacking together their own modules, e.g. https://www.youtube.com/watch?v=0_uOzNt-xwY who was prototyping with off the shelf "magsafe" style adapters & essentially "rehousing" a wireless mouse dongle.
So with a "universal" wireless mouse + keyboard dongle rehoused in a module you could get two of your ports at least. :)
I don't agree with most of your points, but I do (at least on initial review) agree with this:
> They don't even offer the obvious "2 in 1" dongles where one dongle would contain e.g. two USB-A ports.
I would have thought that by making the expansion ports slightly wider, including 2 USB-A would be possible.
At my side gig running livestreams, I often end up with more than 4 USB-A devices connected, unfortunately, so with USB-C charging and HDMI, I'd need a dongle to use this device even with such a 2x USB-A expansion.
- USB-A dongle for wireless mouse
- External USB-A sound card (to connect the mixer board)
- External USB-A camera
- External USB-A flash drive to load up PowerPoints etc provided by the presenter
- ... and then maybe I need to plug in my USB-A Yubikey to authenticate. Or a 2nd flash drive.
That said, using a USB-C hub/dongle for cases like mine isn't the end of the world.
I'll give a reply not seen here yet. I haven't bought the Framework laptop (yet) but I can see the modules' appeal. There's all sorts of hacker-ish ideas that I could imagine stuffing in there, and the fact that I don't need a dongle means they're always attached and ready to be thrown in a bag. My first idea:
Framework offers a 1TB storage module for their ports! I back up my root OS via ZFS snapshot to USB every so often now. How great would it be to have a storage port holding all the recent snapshots of your important datasets. And the possibilities are endless. The fact that they don't change the form factor of the laptop and that they're always attached is actually a big deal.
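The idea could be sketched roughly like this (a sketch only -- the pool and dataset names "rpool/ROOT" and "usbpool" are made up, and the storage module just appears as an ordinary USB disk you'd create a pool on):

```python
# Hypothetical snapshot-to-storage-module backup: send a ZFS snapshot of
# the root dataset into a pool living on the storage expansion card.
import subprocess

def backup_commands(dataset, snapshot, backup_pool):
    """Build the zfs send | zfs receive pipeline for one snapshot."""
    send = ["zfs", "send", f"{dataset}@{snapshot}"]
    recv = ["zfs", "receive", "-F", f"{backup_pool}/{dataset.split('/')[-1]}"]
    return send, recv

def run_backup(dataset, snapshot, backup_pool):
    send, recv = backup_commands(dataset, snapshot, backup_pool)
    sender = subprocess.Popen(send, stdout=subprocess.PIPE)
    subprocess.run(recv, stdin=sender.stdout, check=True)
    sender.stdout.close()
    sender.wait()
```

Run something like that from cron after taking the snapshot, and the always-attached module quietly accumulates your recent history.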
Good point. For what it’s worth my last MacBook had a TB3 port physically wear out. Thankfully it was covered by warranty, but the connector saver concept is definitely compelling.
This was something that was more prominent during the Micro-USB era. The little metal "tabs" on the male end of microUSB connectors would start to wear out after a thousand+ plug/unplugs resulting in a loose connection that wasn't reliable.
With USB-C, the connector was designed with a bunch of factors in mind, one of which I would assume is the lifespan of the end connectors - USB-C has thicker, more resilient plastic hooks built into the inside of the male plug and stronger mating latches in the female end of the connector.
It's when you need to adjust the cable multiple times before a connection happens, and then work very carefully not to move anything.
Ports can be quite fragile. My old laptop has only 2 of its 4 USB ports working.
Connectors are rated for a given number of connect/disconnect cycles. For USB-A it's a minimum of 1500[1].
If your laptop has a cheap connector which isn't rated for more, and you do two cycles a day (start/end of day, start/end lunch), then you'll go through the rated number of cycles in less than two years.
Doesn't mean the connector will fail right away but it might start to act up. Connectors are not forever.
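As a quick sanity check of the numbers above (a sketch; the 1500-cycle figure is the USB-A minimum cited in the comment):

```python
# A connector rated for a minimum of 1500 mating cycles, used for two
# full plug/unplug cycles per day.
rated_cycles = 1500
cycles_per_day = 2          # start/end of day plus start/end of lunch

days_until_rated = rated_cycles // cycles_per_day
print(days_until_rated)     # 750 days of daily use until the rated minimum
```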
The modularity means you can change your workflow or peripherals without needing to find a dock or dongles long term. You can travel with the HDMI dongle in for putting a movie on a hotel TV or use with an external monitor somewhere. Maybe you need that SD card reader for most of your photo work but only on weekends or trips when you go process images immediately or need to offload them from the SD card.
Regardless, the port can be what you need it to be or just a useful USB-C, you aren't tied to whatever ports the OEM thinks you'll need forever even though it may only be valued by a small number of consumers.
That small number is still enough to drive sales for Framework. I'm interested if I ever need a better laptop, though I sit at a desk with a desktop in use almost all the time. This appeals to those who are interested in more control over their device's configuration, expansion, or modifications, and the various ports and IO options. I can't say I'd buy many of the USB modules (I'd rarely use most anyway), but the mentality is there and so far I have trust in the product. It's not meant to appeal to GAMERS or enterprise execs, just those that want more control over their devices.
My devices aren't USB-C (e.g. headset, tablet) and I use HDMI cables a lot. So it's great for me. Turns out thanks to the modular port design it's also great for you!
The main advantage is that the dongles are all built into your laptop. With dongles I have a pile of them in my bag (that takes up more room), I have to remember to carry them around with me if I'm in a conference room, etc.
The thing is that the dongles are integrated into the laptop body. For a portable machine, this is huge. I love my 16" MacBook Pro and do think that 4x USB-C is great for connectivity, but having to carry up to 4 dongles with me any time I move between working at home and the office is pretty much a nightmare. The built-in dongle ports, if you might call them that, are a much more elegant solution - something Apple should have invented (and could have sold for a lot of money). On top of that, they seem to provide even more "dongles" than laptop ports out of the box.
It is also probably mechanically much more robust, if you plug cables into your dongle box instead of into the (motherboard mounted) port directly. If one of the dongle boxes breaks, it should be cheap to exchange.
> How is the "modular port" concept any different than a universal port with dongles ...
At least the "modular port" adapters are not dangling from the side of the laptop as dongles do. Dongles totally ruin the aesthetics of an otherwise slick MacBook for me.
The thing about dongles is you don't always have them with you. The biggest change I noticed when I got a USB-C based Mac is that I couldn't just plug it into every projector through HDMI, so I had to start planning ahead more. Same with getting photos off of an SD card.
Abstracting the ports makes a ton of sense. I have some hardware laying around - headphones, e-reader - which is perfectly good, but part of me wants to replace it just because it would be much nicer to have USB-C everywhere instead of micro usb. I could see this as something which could significantly extend the lifetime of the hardware by removing those types of compatibility concerns.
Absolutely agree. If I ever had to go back to a "docking station" which I'm sure many on here have used in the past I'd be ripping out my hair.
A single multi-port dongle with HDMI, extension USB-C and USB-A means I connect one thing to my laptop at my desk. And you don't have to press the whole laptop onto some weird device that can scrape the back of your laptop.
If more people had experienced docking stations of 10 years ago they would also be excited for these dongles.
I assume if all your devices are usb c you realistically have no horse in this game - and that's great.
For those who have different needs than vendor provides - permanent or temporary - recessed or not, or in other words part of computer or something I need to carry/lose/forget/misplace can be a huge huge difference.
I would say it's quite similar to having a touch device with an integrated pen holder where pen disappears in the device (so it can't fall out) and not.
It is a "gimmick", but if you are frequently shuffling your laptop around, having a bag clear of dongles and "floating" stuff is a world of difference.
I mean, even thinness in laptops is a gimmick (it's actually the first thing I'd do away with to get maintainability, battery life and better cooling/performance/noise — fanless, anyone?), but it sells like hot cakes.
I don't get it either. I dock my MacBook and my monitor supports USB-C which also provides power and a bunch of other standard USB connections. I don't even really need a dongle anymore.
I think this kind of thinking is what keeping Apple sailing high.
Headphone jack gone? Get AirPods. Oh, Bluetooth drains an already under-capacity battery even faster? Why don't you have a power bank yet? And yeah, keep it on you always. Isn't that normal? Or Apple has a shiny battery pack. Maybe buy two.
Glass back breaks? Well, you gotta lose something for wireless charging. But I don't do wireless charging. Why not? Go buy another thing.
There was this TV show and there was something like “happy to comply” in that.
Yeah - I think Apple's approach is the right tradeoff, though I admit I think it's cool from a nerd that likes gadgets perspective.
If there's enough of a market for that that they can survive that's cool, but I think there's a reason it's not the default design (that isn't some cynical one about planned obsolescence).
I think the proliferation of usb-c has finally solved the universal port issue, you can just buy a 3rd party dongle with HDMI/displayport, usb-c (with pass through charging), sd-card, usb, etc. and it works great. Meets my needs pretty well.
External dongles? Apple did this specifically to extend the revenue from their boxes. I can't stand Apple any longer. Currently I prefer the HP Omen (the support from the executive escalation support team is stellar).
But the idea of needing a FN dongle whenever I want to do something is FN archaic. Plus they are over-priced, bulky, and much more prone to eventual failure of either the port or the dongle (from flexing about when you're on a soft surface like a bed or something).
I have Two AOC USB screens that I use - so I have one laptop, three screens and it all fits into my backpack. the external USB screens are the only "dongles" I want.
The modular ports are my least favorite part as well. The fact that you have to buy one USB type-C module just to be able to easily plug in your charger is crazy. Another just to have a reachable USB port. Maybe 1 or 2 modular spots would be nice, but put in some standard type-C ports and monitor connectors without an upcharge, or include at least 2 type-C modules free.
It has 4 type C ports, that's what the modules plug into. So the type C port module is basically a one inch extension cable. They do recommend you buy 4 so it's $80 and not $20. Having ANY upcharge to be able to have an expectedly reachable port to plug your laptop's power supply into seems like a design flaw.
Unless they've done research that shows the type C port on the main board is a common point of failure and needs soldering to fix I don't see the point. I've had to clean lint out of ports but I've never broken a type C port on a computer.
The USB-C passthroughs are nominally $9 each, so it's not $80 to fill out the bays.
More importantly, they're also built into the price estimate already, so when it says "$999" or whatever, that's including four $9 cards. It doesn't cost you any more to switch some of them for USB-A's instead, and other choices like HDMI, Micro SD, whatever, will be an upcharge.
A more savvy (sneaky?) approach might be to say that 4 cards are included and then only quote the increase over the base price for the things like HDMI that cost more, but I suppose they wanted it to be seamless in terms of how the pricing appears if you want to order more than 4.
Obviously there's a real sense in which engineering went into having this system and the things take up space, so there's a cost to having them, but I don't think calling them an upcharge is really legitimate; they're built in to the quoted prices.
Sorry, my mistake on the price for the USB-C passthroughs, they are $9 along with the type-A modules. All others are $19.
I was looking at the DIY edition and the price is not included in the estimate so it is an increase in price to get any modules. Looks like you are looking at the prebuilt options and I see those do include 4 type C modules in the standard configurations and price. I think they should do the same for the DIY editions and let you remove them if you want.
No the type C ports are $9 each, so it is less than $20 for two. You can reach the port without a module, but sure they could throw in one or two for free I guess. You should suggest that. I know users do not like to be nickel and dimed, and the two modules probably do not increase the BOM much.
That said they may be planning for a situation where you bring your own modules or buy from a third party if you wish -- it is an open design after all. Consider the enterprise use case. An org could have a batch of USB-C modules for replacement that they source from a third party for a cheap price, and then order the laptops themselves on demand.
Framework is all about reducing waste, so only giving users what they ask for is part of that.
Not trying to be snarky, serious question: How does a design that requires everyone to purchase at least one type C module reduce e-waste? It adds manufacturing overhead, shipping overhead, etc to every laptop.
It only reduces waste if mainboard/laptops are discarded due to a failed charging port. Does that happen often with type C connections on mainboards?
How is the state of linux power saving (i.e., mobile battery life) in 2021? Last time I checked you could expect to get almost 2x the battery life on equivalent hardware with Windows or Mac OS.
With tlp and a 5.10ish kernel, my T480 handles light web browsing at 3-5 W depending on display brightness (wqhd display). Heavier websites, high-res video decoding, etc. will bring it up to 7-10 W. Maybe 20 W when compiling something, or when attempting to use the (worthless) Nvidia GPU (which I leave powered down most of the time).
It has a dual battery setup with a 24 Wh internal battery and a 72 Wh external battery (I use the bulbous 6-cell, but you can get a thinner one if you prefer), so even when setting the charge stop thresholds at 80% to preserve cell lifetime, the computer easily lasts all day on battery.
That being said, it would be easy to find a random laptop that either idles at something stupid or that can't go to sleep properly because of just one wacky peripheral, making Linux power management seem bad.
Half the battle is picking the right starting point (e.g. a Thinkpad), and then setting everything up so you can actually measure the power in all of the different states so that you can actually confirm that your setup works properly before relying on it and ending up disappointed at the airport.
Start with this. If there are peripherals you don't use (something as small as a Bluetooth adapter or as large as a GPU), figure out how to power them down, and get the computer to put them in that state by default.
On ThinkPads (and probably on others), you can monitor the instantaneous power consumption by poking around in:
/sys/class/power_supply/*
There are widgets that can monitor this for you and alert you if anything seems amiss, but I just have a script that puts some text and symbols in my menu bar:
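For reference, a minimal sketch of that kind of script (the BAT0 path and the micro-watt unit of power_now are the usual ThinkPad layout, but check your own machine -- some batteries expose current_now/voltage_now instead):

```python
# Read instantaneous battery power draw from sysfs and format it for a
# menu bar. Falls back to "?? W" if the file is missing or unreadable.
from pathlib import Path

BAT = Path("/sys/class/power_supply/BAT0")

def read_microwatts(bat=BAT):
    try:
        return int((bat / "power_now").read_text())
    except (OSError, ValueError):
        return None

def format_watts(microwatts):
    if microwatts is None:
        return "?? W"
    return f"{microwatts / 1_000_000:.1f} W"

if __name__ == "__main__":
    print(format_watts(read_microwatts()))
```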
Thanks, it looks neat. I'll check out your scripts - I'm especially interested in the power draw part. I had set up the 80% cycling before, but that was on my n-1 installation and I haven't had a chance to put it back on.
I have a friend I convinced to buy a Thinkpad and stick Ubuntu 20.04 on it. Unfortunately he got a more expensive X1 with an Nvidia graphics card, and it strangely overheats when he plugs it into AC to charge. I suspect the power settings are all over the place. I've not had a chance to get a log from him yet. How did you "power down" your Nvidia card?
First I made sure X was working the way I wanted with Intel graphics. There are a handful of ways to do hybrid graphics on Linux, depending on what you want to be able to do, and how old your hardware is (including "I didn't even want this thing, just keep it powered off forever"). The Arch wiki has some pretty good guides [1] that are helpful even if you're using a different system.
bbswitch seems to work on my system, and the interface is really simple [2]. But there's also this page on acpi_call [3], which suggests that bbswitch is old and unmaintained and that newer systems do something different. From a quick scan, it looks like the Arch wiki also mentions this approach.
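If it helps, the bbswitch interface amounts to reading and writing a single proc file (sketch below; the status-line format is from the bbswitch README, and writing requires root with the module loaded):

```python
# Query/toggle a discrete Nvidia GPU via bbswitch's /proc interface.
BBSWITCH = "/proc/acpi/bbswitch"

def gpu_is_on(status_line):
    """Parse a bbswitch status line like '0000:01:00.0 OFF'."""
    return status_line.strip().endswith("ON")

def set_gpu(on):
    # Writing "OFF" cuts power to the card entirely; "ON" restores it.
    with open(BBSWITCH, "w") as f:
        f.write("ON" if on else "OFF")
```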
As far as drivers go, I know everyone likes to dump on Nvidia for their closed source mess, but on every system I've used with Nvidia hardware (desktops and laptops), I've found that the Nvidia drivers have universally been more reliable than nouveau, so that's what I use.
When I bought my T440s in 2013 (8 years ago now), I could push it to 18 hours with wifi off. And could get 10 hours plus with wifi. I seriously doubt Windows or Mac OS would have gotten much more juice out of it.
I currently use a T440S, with Arch Linux, and fairly recently replaced batteries. I feel like I only get 5 hours or so. Although if I’m on my work computer, with my T440s open next to me and I’m lightly using it throughout the day then it easily lasts all day, with like 30-40% remaining at the end of the workday. My biggest beef is that the replacement batteries these days seem bad - even brand new Lenovo brand ones degrade really fast. My assumption is that they’re not being manufactured anymore, and so a “new” battery is actually new old stock and has been sitting around for years. I am tempted to upgrade to a newer laptop.
New batteries won't lose capacity over time. A lot of "new" batteries are just knockoffs, but there's no way of knowing without opening them.
If you want good aftermarket batteries, KingSener has a solid reputation. Only go for aftermarket batteries that are upfront about being aftermarket and are open about the cells they use and well reviewed, or for sealed Lenovo batteries in original packaging.
I recently bought a brand new, sealed 72Wh original battery. It sat sealed for so long the batteries went down below 3.1V per cell and the pack locked itself. Looking at the self discharge curves I'd expect this to be the case for any actually new battery.
To make it work I had to open the battery and use a specialized charger to give it 0.1A of current until it got to 3.3V per cell, and I then programmed it to charge at 6A, after which the battery pack decided to start up.
Thanks. I’ll try that brand. I buy mine from Encompass, the official seller of Lenovo batteries. So they shouldn’t be knockoffs, nor unsealed. But, one battery went from 90% health down to 30% in the course of six months. And they’re not cheap!
Can't comment on mobile, but I've set my laptop to sleep after 10 min of inactivity and 3 min after lid close, and it lasts the whole day (4.1Ah). I'm not using it for 8h straight - more like a combined 3h probably - but with these aggressive sleep settings it easily stretches some 8h and I'm not afraid of it dying at the end of the day.
ExpressCard was supposed to replace it. It's literally a pin-for-pin PCI Express slot. But the "market" spoke (more like Apple decided they were too good for useful ports), and laptops got slimmer and less expandable.
Thank you, early adopter. Hopefully people like you will make enough noise that manufacturers come to their senses and put 'repairability' back on the feature set.
It's not just compute devices. I've been waiting for weeks for a Samsung repair technician to come fix a year-old fridge. I remember going to the showroom as a kid with my dad, and the salesperson would pitch the availability of parts and repair centers for consumer electronics.
> absurd that you'd buy a laptop with a bunch
> of "hardcoded" ports that you can't ever change.
This is great for multi-monitor users. Depending where in _my setup_ the other monitors are, I don't want the power and video cables, or docking port cable, to cover part of the external monitor. Being able to move the ports' locations is genius.
I really wish people talked about this stuff more when it comes to laptops. Hot temps and loud fans are a super turn-off for me. I'm willing to compromise performance for a quieter, cooler runtime, but usually all I can find out about a laptop is its processor clock speed and a horribly inaccurate battery life estimate.
Computer thermals are weird. You can ultimately choose between a gimped device that stays cool under load or a spectacularly hot, dynamically clocked CPU. I normally cut the clock speed of my CPU in half on any new laptop, since I'm really only going to use it for text editing/SSH. That alone is enough to keep it below 40°C, but there are other ways to achieve a similar effect.
Loud fans and hot temps are fine for me, in isolation. But knowing those hot temps (and to a degree those loud fans) are slowly killing my laptop, and likely very quickly killing my battery, much faster than you'd imagine, yeh, it suddenly becomes a bigger priority.
For me, I've never found a laptop as good as the older thinkpads at handling temps.
I had a 16" MacBook Pro that was so hot and loud all the time that I just couldn't stand it anymore and sold it to buy an M1 MacBook Air, which thankfully doesn't even have a fan.
I don't understand how it didn't come up in any reviews, because you don't have to push it super hard to make it happen.
I rely on notebookcheck [1] for stuff like this--best notebook reviews I've ever read. If you go down to Emissions there's a "Temperature" subsection with pretty detailed info.
Yeah, being limited to only 4 ports is a complete non-starter for me especially when you are required to use one of them to charge the laptop. So practically everybody is going to end up filling one of those modular ports with a USB-C module, when they could have easily fit two dedicated USB-C ports in the same space, with no practical loss in expandability.
I think the issue with this is the number of thunderbolt ports provided by the chipset. They have to choose between having only 4 ports that all work the same way or more than that where they don't all have the same features.
In the future if another alternative to USB-C PD arrives, you could presumably swap out to that module (of course, presuming it's otherwise compatible with the internals).
Also, is there any reason to believe someone wouldn't make a two-port USB-C module provided the market for such a thing exists? Maybe off-the-shelf ICs to do this aren't available (which would make this more expensive and thus less likely), but that doesn't mean that will always be the case.
> In the future if another alternative to USB-C PD arrives, you could presumably swap out to that module (of course, presuming it's otherwise compatible with the internals).
The module itself works over USB-C PD, so the only effect of such a module would be to convert the new standard back to USB-C PD with all its limitations, plus the cost of conversion.
True, but that still might be desirable if there's a new cable standard that is otherwise compatible -- beats making the entire laptop obsolete.
As a thought experiment: If the Framework laptop existed prior to USB-C PD, it would have been a very cool feature to be able to add a USB-C PD module and effectively upgrade the laptop to support it.
It'll be interesting to see if the internal USB-C design works out in the long run. My instinct would have been to build a larger (proprietary) internal connector for the modules that included charging, constant and switched power, USB, and as many PCIe data lines as possible.
I was reading some forums on their website a little while ago and they said that seemed unlikely with a decent amount of space being used by the mounting mechanism and whatnot. It would be nice, though.
No one wants all the different ports though, they usually just want the correct ports. If you're plugging more than 4 things into your laptop at once you should be using a dock.
If Framework opens this up for other companies to make modules, you might also see something like a USB port module with an integrated Logitech wireless mouse receiver. So then your mouse isn't even taking up a port.
I know Bluetooth is a thing but you get the point. All of those tiny devices that people keep plugged into their laptop ports 24/7? This is a much better form factor for them, and you don't necessarily need to sacrifice a port for them either.
I don't want all the different ports at once, but I definitely use more than 4 in total. My current laptop has the following (and I'd love to have more):
- 2 USB-C (1 of which is used for charging, I'll probably have reason to use the other at some point with increasing USB-C adoption)
- 2 USB-A (1 for a wireless mouse, 1 frequently used for flash drives and whatnot)
- SD (used occasionally - cameras and with an adapter for micro SD in phones)
- RJ-45 (used occasionally, probably more often soon)
- HDMI (used somewhat regularly)
- Headphone jack (also built into the Framework)
So with the Framework I'd be missing out on 3 ports. I could survive with that, but it'd be pretty sub-optimal. Thankfully I shouldn't be in the market for a new laptop for 5+ years, so hopefully Framework will have more options by then.
If you don't want all the ports at once, then it sounds like modular ports is perfect? You can just attach whichever combination you need on any given day / moment. Or am I missing something?
Fair enough, that wasn't really something I was thinking of. Of course, at that point you're really just changing from having to carry around a bag of dongles to a bag of ports. Not having to remember to have to put in/bring my HDMI port to a presentation is a little bit convenient.
My first laptop bought way back in 1993, had a scheme where either or both of the two(!) removable batteries could be replaced with a plug-in module. I had one that gave me a SCSI port and another that gave me a 2400 baud(!) modem.
It was better in theory than practice. You couldn't hot-swap modules (because this was 1993, after all) and driver support was iffy. I sold that laptop a few years later and didn't own another laptop until I bought my first PowerBook in 2002. I've kept dongles in my bag from time to time for connecting to external monitors, but most of the time I never really bothered with it.
I did like my late-90s Dell Latitude where I could replace a CDROM module bay with a 2nd hard drive or battery. It was even hot-swap if I recall correctly.
That said, these days I know exactly what I want in a laptop and most mid-spec+ laptops have an abundance of what 90% of users need.
What I LOVE about the Framework (don't have one yet) -- it's the "cyberpunk-ness" of the thing....
Imagine the day when our kids are rummaging through a pile of various modules looking for just the right one to plug into their Deck.
This, to me, truly feels like the "deck" from Neuromancer of olde!
What will be great is once a third-party aftermarket for modules blasts off..
Fiber interfaces, all sorts of other modules and the inevitable future HN post about fake dongles with spy-hardware from china, 'beware of keyloggers on foreign modules' etc...
I hope that certain elements are attached by magnets.
---
Can you directly attach two machines side-by-side with a USB-C cable? What if you could chain multiple of these boxes together and have a second deck, which is headless, and just swipe between the two desktops on the screen... one keyboard and screen, two decks? I have always wanted this.
I truly think it's absurd that we haven't yet been able to use machines like Legos - I think these decks offer a path to that, with multiple decks.
How does the touchpad perform? I'd love to move away from Apple, but don't want to spend the money on a laptop to be disappointed with that aspect. My experience with non-Apple touchpads has been sub-par so far (looking at you Dell), and it's one of the factors that keeps me ensconced in the Apple ecosystem.
I am also wondering. I have seen lots of reviews on how cool the modularity and repairability is (and I love those things, too!), but not much about the day-to-day usefulness. Is the touchpad responsive? Functioning multitouch? Does the fingerprint sensor actually work?
And of all these, the touchpad is by far the most important, and like you, keeps me to my Macbook.
The baseline, preassembled model starts at $1000. Windows 10 Home, quad-core i5, 8 GB RAM, 256 GB storage, a nice 2256x1504 display, thin and light (1.3kg, 11.7" x 9" x 0.6"). Compare that to your other thin-and-light options at this price point.
XPS 13, $1020
* i5
* 8 GB RAM
* 256 GB storage
* 1920 x 1200 display
* 1.2 kg, 11.6" x 7.8" x 0.6"
MacBook Pro: $1300
* M1
* 8 GB RAM
* 256 GB storage
* 2560 x 1600 display
* 1.4kg, 12" x 8.7" x 0.6"
It's almost a no-brainer, even without considering the repairability, unless you like macOS. Unfortunately, not many people see repairability as a feature yet due to the toxic status quo, but this could change. I think that after brand recognition is established, this laptop could legitimately be competitive in the laptop market, and not just appeal to hardcore techies.
It doesn't look like a no-brainer to me (already a Mac user). The MBP has a better screen, better battery life and performance, a metal body, is completely silent, and has nicer software.
That said, you should get the Air instead which is $999 with almost exactly the same specs, minus the annoying touch bar, and a tad lighter at 1.29kg.
For the record, I also own a Mac: the MacBook Pro, 2019 baseline model. I used to enjoy the battery life (though it has been slowly getting worse to the point where I have considered reaching out to Apple about it since the battery is glued in and I can't replace it myself). I like the metal body. I also like macOS. My MacBook is not completely silent because it's an Intel mac, but the M1 would be, that's true. As far as "better screen", I'm not so sure... on paper, the resolution is nearly the same. I also like the trackpad. But anyway, you need to decide whether these things are worth the complete lack of repairability. As a reminder, your Mac's battery will die at some point, and your Mac will become useless. I'm seriously considering selling my MBP and buying this instead.
Word, I'm pretty fed up with throw-away culture, especially with big ticket items.
I've used pretty much all the developer laptop contenders as daily drivers at some point. You get used to what you get used to, so marginally better isn't so important, I've found.
I'm going to pick one of the framework laptops because they look decent, but my secondary objective is to support a company that at least is walking some of the walk I want to see.
> I'm pretty fed up with throw-away culture, especially with big ticket items.
I love what Framework are doing, but integrated hardware does not immediately equal throwaway culture (speaking as someone who still uses their late-2013 MacBook Pro).
I think it's fair to say, though, that even if you don't have a throwaway mindset, Apple certainly has a throwaway culture and has been cultivating it heavily in their users (often by integrating their devices more than is necessary). Seems you managed to escape their wiles, but many haven't.
No, I think there are certain aspects of Apple products that are definitely not throwaway culture, mainly software support. The hardware also holds up for a really long time (I also use a MBP 2013, and I have several iPhone 6s users in my family). The main issue is that in case of defects, repairs are not economical, and I wish Apple would recognize that. Given the long viability of the hardware and software, I have reasonable doubt that it's a decision made to maximize profit; it seems more like a legacy Jony Ive-sized blind spot.
Apple makes hardware that is extremely durable, I and others use MacBooks that are getting close to 10 years old. I really doubt people will use the Framework for longer.
The main reason people buy new phones and macbooks is not that the old ones stop working, it's that the new ones are desirable, they are even slicker than the one you've got.
Apple adds features in attempt to lure people to 'upgrade' from a nearly identical product. Touchbar, face-id, 'apple silicon'. Most of this stuff doesn't change the value proposition of the core product. I'd say that the majority of Apple's user base is largely there for status reasons, and those people upgrade to keep the status they feel the brand grants them. As someone who isn't an Apple fan, but who uses some of their products, I am very aware there are exceptions, but I'd suspect that, like me, these exceptions also don't follow the press release driven upgrade cycle.
It's this type of consumerism that I'm personally not interested in. I want a tool, and I want it to be under my control. I want to be able to maintain it for its natural useful lifetime. That thinking does not belong in the culture Apple is fostering.
(Edit: and yeah, I get that Apple isn't the only culprit here, it's just that if a company seems to be fostering a culture that fits my ideals, I'm willing to compromise on 'slickness')
Apple Silicon was a very significant upgrade, my M1 Air is just insane. It's a ridiculously good computer.
But if you want a tool, why do you want it to be "under your control"? I think most people are more like me, I just want a really nice computer that works well, I have exactly zero desire to replace the RAM or upgrade the SSD.
I own an M1 Air. It is a macbook, and seems fine so far, but it's over-hyped.
I'll agree that it seems that currently most people are like you. They don't think about maintainability when it comes to belongings. If they care about the effects their actions have on the environment, they might want to start.
If you are buying a new phone and laptop every two years, where are you putting the old one? What happens to the toxic materials in the batteries? Did you need to have a new machined aluminum shell, or could you have just replaced the part that was bad inside it?
That all said, I don't have any illusions about changing the course of the average person's thoughts or habits via an HN post.
Like I said in the original post, I'm done with buying throwaway goods if I have an alternative. I'm happy that an option like the framework exists.
I definitely don't buy a new laptop every two years, few people do, and when I do buy one, like most people, I sell or hand it down. Same with a phone, I don't think many people throw away 2 year old phones.
Using a laptop intensively for 5-6 years cannot be called "throwaway culture".
When it's finally unusable, Apple will recycle it for free, at least if you buy a new one at the same time. I don't know what they do with the battery, but I can't see why it would help if it was easily user replaceable.
Maybe you feel that the M1 is overhyped because you only had your previous laptop for a year or two? Coming from a 2016 MBP the difference was huge. And of course that laptop is still in use 5-6 hours a day.
Most importantly, assuming Framework is still around in 10 years, do you think you'll use it for significantly longer than 8 years, which I suppose will be the lifetime of my old MBP?
I have three old macbooks laying around that people just gave me because they didn't know what to do with them - all broken in some way. I wonder how many people actually recycle their laptops, and am worried about the answer.
AirPods die within a few years and then get tossed in the trash. Modern MacBooks aren't built like they used to be. I don't see any reason you couldn't use the Framework Laptop for that long. Just replace anything that wears down over time.
My 2016 MBP is in perfect shape, except for the keyboard, which was overly sensitive to dirt from the outset. Not sure if that counts as modern? My new MBA seems extremely well built too; not sure what you are referring to.
I've been using my AirPods Pro at least 5 hours a day since they were released; also in perfect shape. Same with the AirPods 2 I handed down. I'm sure they won't last forever, but they've already provided a hell of a lot of usage for 200 bucks.
How is the battery life on your 2016 MBP? Like I said in one of my other posts, my 2019's battery is already slowly degrading. I'm at ~300 charge cycles after ~2 years of moderate usage. Apple says the battery is "consumed" after 1000 cycles[1], so it seems my battery is already 30% consumed, which doesn't make me very hopeful for the longevity of this device.
I wish it was easier to find actual data on battery life of 2016+ MBP models. Googling for "how long does MacBook Pro battery last" yields mostly useless results. There's a macOS app called "Coconut Battery" that tracks battery health over time, with an optional reporting feature that uploads the data to a server for comparison purposes.[2] Unfortunately the online viewer only allows you to view data by each individual model. It might be more helpful to aggregate the data from all 2016+ models, since the battery is the same, afaik. The online viewer also doesn't show how many reports the averages are based on. I wonder if it would be possible to scrape their data, to produce an evidence-based answer to the question "how long will my MacBook probably last?", rather than relying mostly on anecdotal evidence from Mac users.
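The "X% consumed" arithmetic above is easy to sanity-check yourself. A minimal Python sketch: the ~1000-cycle rating is Apple's published figure, but the capacity numbers below are made up for illustration.

```python
# Crude battery-wear arithmetic. Apple rates MacBook batteries for
# ~1000 charge cycles, so cycles/1000 gives a rough "consumed"
# fraction. A capacity-based estimate (current full-charge capacity
# vs. design capacity, which is what tools like Coconut Battery
# report) is usually a better indicator of real wear.

def consumed_by_cycles(cycle_count, rated_cycles=1000):
    """Fraction of the rated cycle life used, capped at 1.0."""
    return min(cycle_count / rated_cycles, 1.0)

def health_by_capacity(full_charge_mah, design_mah):
    """Remaining capacity as a fraction of design capacity."""
    return full_charge_mah / design_mah

print(consumed_by_cycles(300))         # 0.3 -> the "30% consumed" above
print(health_by_capacity(5100, 5800))  # ~0.88, hypothetical capacities
```

On macOS the raw inputs (CycleCount, MaxCapacity, DesignCapacity) are visible via `ioreg -r -c AppleSmartBattery`, so aggregating real data wouldn't require scraping anyone's server.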
Anyway, I think the idea that a $1300-$6500+ laptop ships with a glued-in, non-replaceable component that Apple themselves admits is "consumable" is just absurd, regardless of how long it may last.
> Apple offers a battery replacement service for all MacBook, MacBook Air, and MacBook Pro notebooks with built-in batteries.
I believe you probably mean "user replaceable" in your comments above. I do see the value in that for myself, which is why I still have my 2012 MacBook Pro. Nevertheless, I continue to use my iPhone 6s after 3 battery replacements (I paid Apple for two, and one was covered by them due to a recall).
I point this out because I think your criticism of Apple designing disposable hardware is less true of them than their competitors, at least in my experience. My Apple laptop and phone are the first Apple ones I’ve bought, and have outlasted my PC laptops and Androids by at least 2x.
If you want to make a criticism about Apple designing hardware that’s costly to repair, that’s something I could agree with.
The batteries on 2016+ MacBook Pro are not Apple-replaceable, either. They replace the entire top case because they can't replace just the battery.[1] (If anyone has any source for the contrary, please provide it.) Assuming it's not user-replaceable nor Apple-replaceable, can we agree that it cannot be called a "replaceable" battery? Yes, technically it may be possible, but if even the company who made it thinks it's too difficult, it's not very useful or accurate to call it "replaceable". To be honest, I disagree that it should be called "replaceable" even if only the company who made it can replace it. It's a linguistic debate at this point.
By the way, the 2018+ MacBook Air does have a replaceable battery, in that Apple can replace it, so by extension, users can too (if they can track down a replacement).[2] Apple must adopt the adhesive pull tab approach on the MacBook Pro. The glued-in battery is completely unacceptable.
> I think your criticism of Apple designing disposable hardware is less true of them than their competitors, at least in my experience.
Keep in mind that, as the most profitable tech company in the world, Apple deserves to be held to a very high standard. Time and time again other companies follow their lead. Also keep in mind that Apple uses this profit to actively lobby against right to repair, making things worse for everyone in the long run.
Though most laptop makers don't care about repairability, and any repairability their machines have is usually an accident, I don't think most other laptops on the market are quite as egregiously anti-repair as the MacBook Pro.
As a user, my main concern with a battery replacement is that I get a new battery, that the device continues to work, and that it doesn’t take too long. If Apple decides the best way to meet those objectives is to swap the top case, I’d still consider the battery replaced.
> Assuming it's not user-replaceable nor Apple-replaceable, can we agree that it cannot be called a "replaceable" battery?
I could agree to calling these batteries “non-user replaceable”. What you’ve described is that Apple can do it, but they replace more components at the same time.
For what it’s worth, I like MacBooks but I skipped that generation of MacBook Pros because it didn’t seem worth it to me. I bought a non retina MacBook because I didn’t like the soldered on comments. After having 3 iPhone 6s battery replacements done by Apple, I guess I’m somewhat okay with glued in components as long as _someone_ can service it.
> If Apple decides the best way to meet those objectives is to swap the top case, I’d still consider the battery replaced.
The only reason Apple "decides" this is because they designed the laptop with a non-replaceable battery. It would cost less money and be much less wasteful for them to replace just the battery, but they can't, because it's too difficult, because they designed it that way. This isn't the "best" way, it's most likely the only way for them to do it at their scale. They replace just the battery in the MacBook Air because they designed it with a replaceable battery.
The top case is replaceable -- Apple can replace it, and so can the user, if they can source a replacement. But the battery, individually? It's not replaceable. That iFixit guide you sent is not news to me. This is what I was referring to when I said:
> Yes, technically it may be possible, but if even the company who made it thinks it's too difficult, it's not very useful or accurate to call it "replaceable".
Most people (including myself) would never consider performing this procedure on their device. Apple won't do it. You would be hard-pressed to find a repair shop that would. It's far too risky and time-consuming.
Your definition of "replaceable" seems to be "theoretically possible to replace, even if extraordinarily difficult (by design)". Under this definition, basically any part in any product is "replaceable", so it is not a useful definition. Apple could probably not have made it any more difficult than it already is to replace the battery, and they don't even do this themselves, so it's perfectly fair to call it non-replaceable. Definitions are subjective; we must collectively decide how to use the word "replaceable" in this context, and I see no reason to adopt the almost meaningless, Apple-friendly definition. For example, depending on the definition, you might even be able to say the MacBook Pro battery is "user-replaceable". Does "user-replaceable" mean "designed to be replaced by the user", or "easily replaceable by the user", or "theoretically possible to replace by the user"? If the latter, then the MacBook Pro battery is user-replaceable, I guess.
I think a better definition of "replaceable" would be "feasible to replace, at least by the company who made it", and an even better definition would be "feasible to replace by the user". Either way, your definition is not useful.
> After having 3 iPhone 6s battery replacements done by Apple, I guess I’m somewhat okay with glued in components as long as _someone_ can service it.
The iPhone 6S battery uses adhesive pull tabs like the MacBook Air, so it is (easily) replaceable (by Apple, or by users, or by independent repair shops). The screen is lightly glued on, but that's standard procedure for phones these days, and the removal process isn't too difficult, risky, or time-consuming with the right tools [1].
In fact, it has been relatively easy to replace the battery in every iPhone released since at least the iPhone 4, and every iPhone since then has scored at least a 6/10 on iFixit [2]. So, afaik, the non-removable battery is currently only a problem with these Apple products: MacBook Pro (1/10 iFixit), AirPods (0/10), AirPods Pro (0/10).
The battery life was never stellar, compared to previous and of course current Airs, but my daughter can use it a full school day without plugging in.
>Anyway, I think the idea that a $1300-$6500+ laptop ships with a glued-in, non-replaceable component that Apple themselves admits is "consumable" is just absurd, regardless of how long it may last.
This is where my view differs, since I can just hand it in for replacement, I don't have any issues with that. To me it doesn't matter at all if I can do it myself or not. Well except if you have to leave it at the Apple store for several days, then that's very inconvenient.
In general I just see it as a practical and financial question, and since I use my laptop 10 hours a day, and earn a very good salary from it, 200 dollars more or less every three years is really insignificant.
In years past you had to bring up the MBP, but now the MacBook Air tips the scales.
I got my 16 GB MacBook Air for $1000 at Micro Center. For development purposes it's the same as a Pro, except it doesn't have a fan (and has never throttled) and doesn't have a touch bar (which is great).
The MBA is an insane value with the advent of the M1, a complete turnaround from the old days.
Also, it has a trick up its sleeve when it comes to this:
> the complete lack of repairability. As a reminder, your Mac's battery will die at some point, and your Mac will become useless.
First off, Apple will replace the battery in a MBA for $129 (vs $200 for the MBP), so a little alarmist...
But more importantly, the new Air uses stretch-release adhesive (think Command strips) and doesn't require removing the logic board for battery replacements. The MBP didn't inherit this improvement.
It's no Framework but it makes it the value proposition that much sweeter...
The battery is indeed doable, but the keyboard and screen on the MacBook Air are more of a challenge, although the parts shouldn't be hard to find in a few years. With the Framework laptop those parts are easy to replace but bespoke, so Framework's repairability will be determined by parts availability in half a decade.
This is a good point. "Repairability" is a lot less important to me than lifetime. Right now, I have a MacBook with AppleCare. The author's criticism with AppleCare is that it can take a week for Apple to repair the computer. That's reasonable for me.
So the real test of the Framework computer is not in the first week. (Although the initial impression is very impressive!) The real test is whether I can, 3 years after buying the computer, replace the battery more easily than Apple could replace the battery in a Mac. Modular hardware is limited by the availability of the parts, and Framework doesn't have the brand to convince me that they'll be around for longer than I can get my computer serviced by Apple.
Do you mean the criticism is reasonable or 1 week repair time? Because the latter is absolutely not for anyone that uses their laptop as their primary working device. A week downtime is completely unacceptable.
I'm a bit conflicted about this, because on the one hand: yeah being unable to work for a week is obviously bad, but, devices are going to fail and it could sometimes be the difference between _you_ wasting a week debugging or not.
I like the idea of being able to fix my gear if I'm away on vacation or there's no apple store around. In fact, I'm overall a big fan of repairability because I'm a bit of a tinkerer.
But if it's a work machine? Then I'd rather have a replacement on standby- and Apples time capsule/time machine stuff works better than other backup/restore systems I've used.. replacing a machine is a 2hr process if you have something usable in stock.
Either way: I work in Europe, so these machines literally cost more than a monthly salary; it's cheaper for the company to have me sit on my hands for a week than to keep a spare laptop for me.
There are a lot of likely laptop failures which shouldn't require more than an hour of downtime, because they should be trivial to fix, but which can take your Apple device out of business for weeks. As much as I love my Apple devices, this alone might drive me to the Framework laptop.
Things which should be fixable by the user/IT personnel on site:
- the battery
- the fan
- the keyboard
- storage
Luckily I have discovered an independent certified service provider for Apple devices, which is as flexible as Apple's rules allow for repairs, but my experience of getting the fan of my Mac Mini fixed by Apple was horrible.
If you were to buy a proper business computer like a ThinkPad or a Dell Vostro/XPS, you would have a next business day or sometimes even 4 hour on-site warranty where a technician would come to wherever you are in the world and fix the part for you right there.
1 week is an unacceptable wait for a business computer.
Agreed. I had a tech come to my hotel room in Hong Kong and fix my Alienware (Dell) computer: he replaced the screen and keyboard under warranty in 2016. I had another Dell that I dropped off for same-day service.
I have a ThinkPad T430u I bought back in 2012, and it is running great. Very solid. I run Kali Linux on it, and I do a lot of coding and writing on it. I love the keyboard, and I am one of those fans of the thumb nub! I have owned all sorts of computers since 1978, and this is by far my most rock-solid one. The original battery is still in it; it is not holding the charge it did, but hey, 9 years is a long life for a battery in these things.
I am now using an MSI GS65 Stealth, and it is great, but let's see how it holds up. I've dropped it twice in two years, but it seems fine. I use it for my 3D work, CAD, and UE4 fun.
I may buy the Framework to see how it goes, although like some others, I worry the parts will not be available two years down the road, and if they are, whether it will be at a normal price. It's a chicken-or-the-egg scenario: people need to buy them to create the market for this to happen.
>I'm a bit conflicted about this, because on the one hand: yeah being unable to work for a week is obviously bad, but, devices are going to fail and it could sometimes be the difference between _you_ wasting a week debugging or not.
These devices are sold as "Pro" presumably meaning Professional. Not being able to work for a week is not professional. There are companies that give you a repair time in hours, not days.
>it's cheaper for the company to have me sit on my hands for a week than to have a spare laptop for me.
I'm self-employed so a week downtime cost me a week's salary. But even so, I wouldn't want to sit on my hands for a week, because I consider that unprofessional.
Resources can be shared so even for a company with 10 employees it quickly becomes viable to have one spare laptop/Mac mini/whatever at hand just in case.
If you can't wait a week, go buy a new one and restore to it. When your repair comes back return it. Apple's return policy is 2 weeks and is actually 45 days when you ask nicely.
Another thing to consider is that, if everyone is reluctant to try the Framework laptop because they're not sure about long-term support, it will never receive said support. I'm willing to take the risk here, for the greater good! I don't even perceive the risk to be that high, to be honest.
> if everyone is reluctant to try the Framework laptop because they're not sure about long-term support, it will never receive said support. I'm willing to take the risk here, for the greater good!
I dunno, this whole idea of 'trickle down innovation' that ends up coming at the expense of the working class is a very bad deal when you consider the amount of advanced technologies available today yet which have been enclosed/commoditized as 'intellectual property'.
Another example is Musk's scammy 'secret master plan' hustle that tells a feel-good (yet misleading) story to the propertied class that they should 'buy a $170,000 Tesla to help make mass-produced Teslas possible for poor people'. Which is a rich story when you see that big oil companies, together with governments, suppressed viable electric car technologies for years ('Who Killed The Electric Car?' documentary). My point is that these are deep systemic issues that should be remedied at their root, instead of being presented as something the non-propertied class should plan and pay for, especially when you consider that we have actually already paid for it (remember we gave them those big juicy low-interest government loans).
At this point in time, buying a modular laptop like a Framework computer should have no risks involved with it.
We need to just grow many more open source standards. All that proprietary hardware and software does is remove valuable feedback loops and lessons from the commons. It criminalizes cooperation, interoperability, repairing and repurposing. Only the propertied class wins here.
All the above arguments only compound and multiply when you consider that most of the technology that exists today was developed with taxpayer backed government loans (Mazzucato: The Entrepreneurial State), meaning that, as she puts it, "we have ended up creating an ‘innovation system’ whereby the public sector socializes risks, while rewards are privatized".
Even if Framework goes out of business, most crucial parts that are likely to fail are standard (hard drive, memory, battery), so you'd still be able to replace them yourself. And even for the ones that aren't (like the modular ports), the schematics are available [1], so anyone would be able to make new ones.
> The author's criticism with AppleCare is that it can take a week for Apple to repair the computer. That's reasonable for me.
One of my biggest reasons for using Apple laptops was the now-defunct Joint Venture program. I enrolled all our company laptops under that program expressly for the loaner laptop during repairs and priority support. The ability for employees to walk into the nearest Apple Store to the current client account and walk out with a loaner the same day to restore from backups and return to the client the following day was a no-brainer business expense.
Now that program has no functional replacement for that benefit, switching to Framework looks extremely attractive, especially as more of our work finds ourselves in containers.
I'm currently working in China. My 2015 XPS has been getting slow.
So I took it to a nearby Dell repair shop and had them replace the 9550 motherboard with a 9570 one for roughly $300, keeping all of the other components unchanged.
I've honestly been shocked at the level of service.
Sadly, the 9500 has moved to a different chassis, so further upgrades are impossible.
Does anyone know if Apple is even capable of replacing the battery, or if they replace the entire top case assembly, on 2016+ MacBook Pro models? Apparently the Air's battery is indeed replaceable (...by Apple) but I'm not sure the Pro's is, which is incredibly wasteful, if true.
> The real test is whether I can, 3 years after buying the computer, replace the battery more easily than Apple could replace the battery in a Mac
No, the real test is whether you, 3 years after buying the computer, can replace the battery more easily than you could replace it on the Mac. Applecare just covers the cost of a new laptop when your Mac breaks during warranty. "Repair" is a generous way of putting that.
Agree with everything said here, as a MBP user (2010->2015->16->19) for the last however many years (PC before then and as a secondary now)
The closed ecosystem and continued hardening of even access to inside of my machine is troubling and obviously anti-consumer. Those points coupled with the lack of even 32GB of RAM in an M1 model (not to mention the cost of the brand) makes my next computer a probably-not-apple machine.
I also have the 2019 MBP, although not the base model, because I'm not a monster. Didn't the base model that year still have a comically small 128 GB SSD or something? (I'm only jokingly being judgemental here, although it does baffle me why someone wouldn't upgrade at least a tiny bit, to a level of practicality, if they had the choice to buy the computer in the first place.) I'm kind of satisfied but not at all pleased with it, and I only have it because my 2018 non-touchbar model had a hardware issue, which they replaced it for, and the 2013 model before that was stolen. It's marginally upgraded to a 256 GB SSD and 16 GB of RAM. I'm not pleased specifically because of how loud the damn thing is. I was able to score a deal on an old gaming PC that runs quieter (or at least less annoyingly) on more demanding tasks; meanwhile I can't browse Instagram or watch video without creating a ridiculous noise. The 2013 model did not have this problem, and yes, I'll acknowledge that video is a demanding task and modern websites are garbage, but c'mon.
I did at least upgrade to 256 GB storage (...for like $200 or something ridiculous), haha. I just didn't think to mention that minor upgrade.
Edit: I'll also mention that the fan noise hasn't been a problem for me (I rarely notice it), but that may be because I don't use too many demanding programs.
Ah I see. A friend of mine who definitely has the money and makes it as a developer bought the real base model with baseline storage, and it's a bit funny how little he can do with it. With 256 it gets me by most of the time, but I'm out of luck if I need VMs or games really. I can pick one. Didn't have the budget at the time to upgrade past that though, for the absurd prices you stated.
Sometimes the fans will spin up in the middle of the night for some kernel process presumably.
> As a reminder, your Mac's battery will die at some point, and your Mac will become useless.
You can replace the battery yourself for like 50 bucks, or have Apple do it for 2-3x that. We have MacBooks from 2013 in perfect working order at home.
You technically can, in 66 easy steps and about 1-3 hours [1], all the while risking damaging your laptop. Compare that with 3-6 minutes [2] for the actually replaceable battery on the Framework.
Replacing the battery is a completely different experience on an MBA than on an MBP.
I did the same with an MBA, replacing its battery in less than an hour by virtue of the battery not being glued to the case, nor having to worry about hidden clips built into the case, nor having to carefully dig out the guts of the system and putting them back together.
The MBP is in a totally different class -- as iFixit shows, you don't start removing the battery until step 51! And they're not kidding: to get to the battery, you have to do things like remove the trackpad assembly and pry out the logic board assembly.
Sure, you can amortize the 1-3 hours of labor over years of device ownership, but at every step, you're dealing with delicate parts, putting you in danger of turning your expensive, trusty daily driver into a brick.
Batteries are a wear item, guaranteed to have to be replaced. Apple's managers, designers, and engineers could show more empathy for their customers by making it easier and less risky to replace their wear items.
The thing is, I don't even know if you're paying Apple 100-200 bucks to replace the battery, or if you're paying them that much to replace the entire top case assembly along with the battery; since it's so damn difficult to replace just the battery, I don't know if they even try. This is at least what they used to do.[1] I tried to research whether this is still their practice, but it proved difficult to find out. Very wasteful if so, and all likely just to make it difficult for the user to repair their own machine. (For what it's worth, recent MacBook Airs reportedly have Apple-replaceable batteries.)
Whatever else, I can guarantee you that they are not doing it on purpose in order to make it harder to repair them. They have just been prioritizing thinness.
> or if you're paying them that much to replace the entire top case assembly
The link says that "previous" MBAs had replaceable batteries, and I know that the 2012-2014 models did, so I don't think it was ever the case.
> For comparison, the previous-generation MacBook Air has a screwed-down battery that can be removed and replaced by Apple and its service providers without a top case replacement, in line with other non-Retina notebooks.
Also, this article was written in November 2018, and here's what it says:
> the battery can be individually replaced in the new MacBook Air [...] In all other MacBook and MacBook Pro models with a Retina display released since 2012, when a customer has required a battery replacement, Apple has replaced the entire top case enclosure, including the keyboard and trackpad.
This implies that, at least from 2016-2019, they were replacing the entire top case assembly in MBP. I have no reason to doubt they still are. The article is mainly about the MBA so it is a little confusing, to be fair.
One final thing I'd like to add is that you can have a slim computer with a replaceable battery. You probably typed this comment from one [1]. I don't see why they couldn't adopt this adhesive strip approach for the MBP. Perhaps it would leave a tiny bit less room for the battery, shaving 30 seconds off the battery life of a laptop that already has significantly better life (when new) than most? I don't think this """trade-off""" (if it even exists) is worth it. There are also plenty of non-Apple examples of laptops with replaceable batteries and a MacBook-level slim design. The XPS 13 apparently beats it: 15.35 mm vs 15.6 mm. The Framework Laptop is not far off at 15.85 mm.
Of course, having the ability to replace everything else is great, but the battery should be a given. None of the other components are "consumable", though they might still fail eventually, or you might want to upgrade them.
Consider also that anything that Apple does to make their product less repairable by the end-user also makes it less repairable by independent repair shops. I'm not sure why anyone would waste their time and money taking their Framework Laptop to a repair shop just to have them replace the battery when they could do it themselves in a few minutes (and you don't need to "live on HN" to follow basic instructions), but it's still an option. Whereas this is usually not an option with MacBooks.
I don't know if this was directed at me, but I'm not saying it isn't easier and cheaper to replace a Framework battery. My point is that almost nobody cares. You replace your battery at most once, after like 4 years. 100 dollars every 4 years is insignificant for almost all laptop owners.
I agree that they don't care. Perhaps they should, though? It's just unnecessarily wasteful (assuming they replace the entire top case assembly, which they probably still do) and expensive. By the way, it is currently $200 [1], but it could change. I don't think it's too unrealistic to imagine that Apple does this intentionally in order to get you to buy the latest model. Why keep investing in an old, dying laptop rather than just get a new one? It makes sense to invest in the Framework Laptop because everything is replaceable, including the mobo/CPU. But it doesn't make sense to invest in an old MacBook that might have other unforeseen, unfixable issues (unless by Apple for a fortune) in the future, even after a battery replacement.
If MacBook users could either replace the battery themselves (or take it to any repair shop if they somehow don't have a few minutes to spare), they wouldn't have to face the "repair or upgrade" dilemma until much later in the laptop's life. For Framework users, it isn't a problem at all.
> I'm seriously considering selling my MBP and buying this instead
Apple's trade-in program is pretty good. My 2012 MacBook Pro was still tradeable in 2019 for something like 400 bucks. I would consider that also if you don't want to abandon the Apple ecosystem.
Before having to use a Mac for work I preferred a P53 and an X1 as work laptops, but honestly now that I'm forced to use macOS again I don't miss Linux at all. Would be great to be able to choose though, but I'm kinda locked on Xcode because it's our build system.
Do they go arbitrarily far back, or cut off at some point? (I have a 2013 Air I've barely used for years... Should've thought of it sooner.) The site just lists model names for the Macs, which is oddly non-granular compared to the iPhones by number: https://www.apple.com/uk/shop/trade-in
There's an estimator below where it says "Select your device for an estimate", and it walks you through a serial number thing and then gives you an accurate estimate. I think it does cut off at a point but I'm not sure when, and it's not the same for all devices.
Perhaps, but I think it’s a fair comment to say that more effort has been put into making it nice, objectively, than any Linux desktop alternative.
Maybe some folk prefer the alternatives, and that’s fair, but if you did some pretty objective metrics of bugs, visual inconsistencies, user workflows for standard tasks (like adding new hardware…) well, it’s impossible to deny that Apple has significantly invested in making these things “nice”.
It's really quite difficult to argue that the software is the compelling offering with Framework laptops.
You can already get that on other systems if that’s your thing, and that’s never been enough to do more than barely raise a few eyebrows from folk who are particularly keen.
I consider package management and the selection of packages to be something nice. I also consider interoperability and freedom to be something nice. A lot more effort has been put into those in the Linux personal computing ecosystem than in macOS.
... By subjective, I do not mean KDE vs GNOME vs the Windows desktop vs the macOS desktop and all the desktop apps. FOSS is more than these types of things; I don't care about that stuff, it's just noise to me, and the less of it, the better... Values are subjective, and I value different types of things, but at the same time I am aware that the primary values DEs and big desktop apps concern themselves with are the most important to a large set of users, and that's ok - but you are ignoring the values outside of that area.
It's like I'm using my bike to ride down a mountain... you are using your bike to do city racing and don't understand why skinny tires and fixed gear and foot straps are not important to me.
I think you'll find that the effort is measurable, and not subjective.
How can you possibly argue that the army of programmers Apple has devoted to working on this one stack in one domain do less work than the widely distributed work done by a far smaller number of programmers on a far larger number of desktop stacks?
Don't be ridiculous.
I didn't say that the outcomes were better, just that more effort has been put in, in terms of hours-of-human effort.
> GP: Its really quite difficult to argue the software is really the compelling offering with Framework laptops.
That's a very different statement to "Apple has more person-years." In addition, by that measure, Windows is just as good, and works just fine on a Framework.
Also comes from a company that does its best to make products irreparable, irrespective of the environmental damage it causes, to pad their bottom line.
If we don't vote with our wallet, for the right thing, we deserve the worse future that we are going to get.
I do like MacOS more than Ubuntu (which I also use), and lately, customizability of Ubuntu has gone down too (As an old Mac-user, I want a menu bar, I actively dislike menus on each window) and the M1/M1X look real sweet compared to intel.
But Apple is the most anti-consumer company there is and I consider buying a Framework to give them a boost and hopefully purchase a laptop that will not need a full replacement for years. The ecological footprint of electronics is bad enough as it is, no need to reward companies that want you to throw away whole computers after 3y.
The fact that an i7 scores the same as the M1 in those benchmarks already tells you they are not very representative of reality. Plus, power consumption will be massively different between them, which is crucial for a laptop.
I like Mac OS and apple products ecosystem. For me, those things plus the nice premium build quality are key and worth the extra costs.
I don’t use Windows, and I'm not really a fan of spending like 24 hours a year just dealing with Linux driver issues and other headaches only to have a worse user experience.
That’s just my preference though so I’m probably never getting the foundation laptop but think it’s a cool product.
I mentioned in another comment that the MacBook Air is the same price, has similar enough performance for all practical purposes (better than an i5 by miles), no touch bar, and a more user-friendly battery replacement procedure.
If it were 1920 wide at any height of at least 1080, I could display 1080p media with no scaling (it's really hard to get players not to scale while full-screening, and scaling from 1920 wide to 2560 wide can never work well, even with high-quality Jinc or even cutting-edge machine learning scalers); but also, just streaming desktops that are already on standard monitors.
But really the TLDR is that “non standard resolution” is a bit of a red herring. There are many sizes and shapes of monitor available and the concept of “standards” is an anachronism.
Even something as seemingly obvious as 1080p is a flawed standard, because many TVs enlarge the input slightly to crop away junk on the edges, resulting in a physical 1080p screen showing an approximately 1060p image.
The single-thread performance of the M1 doesn't get close to the competition at its power level, even a year after its release. [0] The top item in the list is the M1, a 10W CPU. The second is an Intel requiring 125W. The highest-scoring i5 also requires 125W, and is 15th in the list.
Just a reminder that the M1 MacBook Air has no fan, and is still at the top.
When choosing a laptop you of course look at more factors than just performance, but for many, that alone will be an extremely important consideration. Not to mention that, incredibly, there isn't even a conventional battery life tradeoff for that top performance. In that sense, the M1 is a no-brainer.
M1 does beat every other laptop processor atm in single core speed, but the latest Ryzen 5X00U (and 4800U) are better in multithreaded perf (both absolute and per watt) - https://www.cpubenchmark.net/power_performance.html. Hopefully the Framework Laptop offers mainboard upgrades with these at some point.
I very nearly bought an M1 Macbook, but realized I didn't want to live with currently early stage linux compatibility. I'm still thinking of buying an M1 mac mini as a little home server though - that tiny power consumption combined with such good CPU performance is perfect for that use case (although it would be nice if I could somehow attach a bunch of hot swap HDD bays to a mac mini too).
To me the M1 one is the differentiator not even because of its performance, but because of what it allows the laptop to be: Absolute silence with 20 hour battery life. To have it also be screamingly fast is very nice to have though.
You forget the computer is there! I have been so overwhelmingly happy with my M1 (with RAM upgrade) as a dev machine. It's THAT good. My co-workers cry when they see how fast my docker containers build.
Would love to get some feedback on M1 for development.
I'm used to working in a Linux VM (VirtualBox and Vagrant on macOS) and do mostly PHP/Python web development. It seems that VirtualBox won't be supported, and there is only one Linux VM option available: [UTM](https://github.com/utmapp/UTM).
I would hate to invest too much time for a new dev environment just for M1. How's your experience?
That's very odd, I compile a TypeScript react project practically all day while I'm working on it and haven't noticed anything. Could be legitimately faulty if you're seeing a load of issues.
Performance of the M1 is indeed impressive, and I don't mean to imply that these CPUs all perform relatively the same. But I have a 2019 Intel MacBook Pro with an i5, and it does everything I need a laptop to do: take notes, browse the web, and do some light programming and gaming (e.g. Minecraft at 60 FPS). It also runs x86 programs directly rather than through an emulation layer (though this will become less and less of a problem as time goes on).
I have thought a little about this: it is not _that_ much better than the last-gen Ryzens. How much of that is because Apple bought more or less all of TSMC's 5nm capacity?
Early adopters, hardcore techies may love repairability but they will buy whatever comes out next.
What counts is the majority of buyers after these techies.
It's similar to the early buyers of a Tesla who could say "ahh, it will be so eco, I won't have to buy another car for 10 years" and then, 3 years later, buy themselves the new model, just as they normally would have with a combustion engine car. Consumers first.
The key point is that repairability is important as a marketing ploy but isn't a real fundamental issue for wealthy techies. They will just get another laptop in a year or so.
It's the non techies that are important, not us! It's the majority and long tail, not the early adopters that this laptop should be for!
As a "wealthy techie", I have the opposite view: as much as it's not difficult to me to buy the things I need or want, I hate the amount of waste generated by replacing things before they need to be replaced. I currently have a 2018 Dell XPS13. I really want a Framework laptop, but my current laptop works well enough (I wish it had more RAM and a larger SSD, though, which is a big part of why I want a new one), and it's hard to justify that waste, even if I can sell or give my old laptop away to someone else who needs it.
The 51nb "Thinkpads" achieve three goals you may find useful: reuse, upgrade & repairability. In effect, you are recycling while upgrading!
In addition to repurposing some of the old electronics & the entire chassis, they have massive performance, complete repairability & upgradability, a max of 64GB RAM & 3 storage disks, all in a smallish (but thick) Thinkpad form-factor. Oh, and they come with two power supply options (USB-C and barrel jack) and have functional VGA ports and ethernet jacks. Batteries are a bit of a problem though.
So you're not a techie in the sense it's being used.
Techie can mean someone who's an expert on technology, but when it comes to talking about techies as consumers, it's about people who always chase the latest and greatest.
It's tautological, you're not a techie in this context if you don't consume new tech.
> It's the majority and long tail, not the early adopters that this laptop should be for!
Why?
This is the way we end up having only mass-market, lowest-common-denominator products. Not fighting to get quality tooling for our niche is one of the important reasons we don't get any.
The real problem with the Framework laptop is that the ordinary consumer will not assemble their own laptop or upgrade it later. They will just go and buy a $500 cheap laptop with a 15" screen and if it breaks, buy a new generation.
> Unfortunately, not many people see repairability as a feature yet due to the toxic status quo
I’m struggling to see the feature in the OP's article, considering that they have gone through more laptops since I bought my MacBook Pro 2016 than I’ve owned laptops in my life.
Why was it necessary to replace a thinkpad every year, and what makes you think that the author isn’t just going to replace this one?
I have a feeling I’m going to have my MacBook Pro 2016 for longer than the author has this one.
I do support repairability, and I really don’t see why we let companies get away with ducking it over, but at the same time, there are now 3 different versions of the fairphone, sort of defeating the point of it. I’m sure frame.work is better than the fairphone, but I think you get the point I’m trying to make.
Unless you get faulty hardware, you’re probably better off taking good care of it. I mean, I still have an old 14” iBook PowerPC that works perfectly well with Linux. Its hardware is obviously not up to my current needs, but I don’t see why my current MacBook Pro 2016 won’t live with me for another 5 years at least. And by then I hope legislation has forced Apple into making things more repairable.
But there isn’t actually a very good reason to replace your hardware unless it stops working. At least not in my mind, and none of my Apple products have stopped working on their own. I’ve been luckier than some people in that regard, but I also still own a Sony tv from the early 00s that’s perfectly fine when hooked up with the MacBook so sometimes it’s also about buying the things that are tested so you don’t end up in one of the famous Apple recalls or the issues that come from them.
It’s very anti-consumerist of me not to buy the newest thing, I know, but I sort of think that swapping to frame.work for the sake of buying something repairable defeats the entire purpose of wanting something repairable, unless you make the swap after your old machine literally breaks.
It would be a no-brainier if Macs were still on Intel. But post M1, I’m not so sure…
The Air gives this thing a run for its money at an (albeit only $21) lower price.
That said, I’m 100% behind the Framework concept though - I like the direction they’re heading in. If I needed a Linux/Windows laptop it would be a serious contender for me.
As a hardware nerd and standard laptop user I love the M1.
As a dev, I feel it's not ready yet on the software support side. It will eventually come, but if we follow the "don't buy now on the promise of future updates" principle, right now not being on AMD/Intel is a downside.
So I’m currently on Intel Macs and looking to move to an M1 imminently (holding out to see what MacBook announcements come this autumn).
What are the biggest issues you’ve seen so far from a dev perspective?
The Framework laptop has seriously got me considering moving away from macOS to Linux as a daily dev machine and keeping my Macs for home use (and music production for which I don’t want to leave Logic Pro).
But by the same token I’d love the crazy battery life, no fan noise and raw speed of an M1 for my main coding machine….
I also own an M1 Mac, and it has subtle or large compatibility bugs with specific development tools and applications.
Two cases where my work was impacted:
1) I work with Laravel running on a couple of Docker containers. There's a complicated bug where a memory leak, exclusive to Rosetta, is caused by some low-level package that all PHP Docker containers rely on. It caused all my PHP containers to be permanently out of memory.
It's not an issue I had in the beginning, and I don't know the exact cause, but I wasn't able to fix it once it started happening, so I had to move away from Docker for development (which luckily was not problematic).
2) It can't build OBS. OBS, like many other open-source software packages, needs a maintainer to add support for M1 build steps. There was a major dependency that didn't have M1 build support yet, so building it simply wasn't possible until that got fixed.
This is the case for many packages, it just needs time before everyone catches up.
But for the rest it's perfect imo. The battery life is insane, the no-noise fan is insane, the no-heat is insane. Only battery/heat issues I ever get is when I game on it, which it isn't meant for necessarily either but just saying.
Edit: Oh, I forgot to mention that while the keyboard feel is great, its build quality is... questionable. One time a key just got stuck out of nowhere. Just stuck, nothing to do. I tried repairing it myself, but oddly, 3 months ago (not sure about now) there were no guides on how to safely pry it loose, clean it, and put it back.
Unfortunately, I broke it. This has not been my experience with any other laptop or keyboard, ever. I hate this part of Apple's philosophy the most: "It's broken? Please give us a lot of money."
For perspective, I moved to the mac because I didn’t enjoy compiling from source or debug install logs.
I use Homebrew, and a number of packages are still x86-only, needing to run in a Rosetta terminal. My Ruby install is one of these.
For PHP, phpbrew never managed to build the version I need, so I bailed out of it and moved to Docker. Same for MySQL 5 (later versions seem to be available for the M1).
That was right when Docker announced the Desktop pricing change, and I didn’t know what my org would do, so I started checking the other options. Except docker-machine/engine is supported independently only on Intel; M1 chips need Docker Desktop to run containers (at least that’s my take, I’d love to be wrong).
Basically right now my local dev relies on Rosetta and docker, and I seriously think about buying an intel NUC as a side machine where I’d remote into.
I disagree. If you configure it, it will cost you a ton of money. The base configuration doesn't have even basic ports.
On the contrary: today at work we bought, for 900 euros (700 without taxes) plus less than 100 euros for an extra 16GB RAM module, a ThinkPad T14 that has all the features of the base model, but with a 512GB SSD, Ethernet directly on the laptop, a better keyboard, a TrackPoint, more USB ports, a fingerprint reader, and a Windows 10 Pro license.
To me, this laptop doesn't make a lot of sense. Regarding repairability, it's just like any other ThinkPad. As for the modular I/O, what is its purpose? You are adding components that can break, consume power, and waste space. And still you don't get Ethernet integrated on the laptop itself, so you always have to carry around a USB adapter that doesn't work as reliably as an integrated one.
> The base configuration doesn't have even basic ports.
The preconfigured model comes with your choice of 4 ports, the options being USB-C, HDMI, microSD, USB-A, and DisplayPort. The default config is 4x USB-C, but you get to choose.
> a fingerprint reader
Framework has that too.
Anyway, as far as I understand, if you configure it yourself, it will actually cost less money. This is the cost of the preassembled version, but for example, you could buy the "DIY edition" with everything that the preassembled model has, just without Windows, for $909 (~775 euros). The price drops even further if you remove the SSD, RAM, or Wi-Fi module and buy your own.
I understand that everyone has their own priorities when it comes to a laptop. I'm only comparing laptops within the "thin and light" category here. At 1.5kg and 13" x 9" x 0.7", the ThinkPad T14 probably falls into this category, but it is slightly heavier and bulkier than the options I listed. One of my top priorities is thin/light, but I understand those who are more concerned about performance, number of ports, or Ethernet connectivity.
Additionally, I like this company's dedication towards right to repair, and buying the laptop supports them. It's a huge bonus that the laptop has competitive specs, though.
You should be comparing it to the MacBook M1 Air which is $899. Battery life & smoothness/performance they achieved with M1 is phenomenal. Golden handcuffs.
Let's be honest here, windows is going to be slow as a moonwalking Michael Jackson with that processor. Windows is, I kid you not, an order of magnitude slower than linux/macos.
What? Windows 10 isn’t particularly slow on my i7-4770K. You’re telling me that the i5-1135G7 (which benchmarks faster) is going to perform worse than my 8 year old CPU in general OS use?
If you’re only used to Windows, you might not notice how inefficient it is, and you may also be doing the “right” kind of workload for Windows. On an i7-6700HQ ThinkPad I ran the experiment and used both Windows and Linux for an extended period. In my experience, Linux was about 30% faster for typical web development tasks (npm install, Angular builds, …), performed around the same for web browsing, and was much slower in video calls.
Solaris (commercial and open-source derivatives), freebsd, openbsd, at least a dozen linux distros (from debian to things like gentoo, sourcemage, LFS...), MacOS, Windows... a handful of "toy" OSes... I've run a lot of different OSes over the years. I've used heavyweight desktops, lightweight desktops, straight X, straight terminal... across a large variety of hardware from the 90s until maybe 2016 or so. The point being, that I actually do have a lot of experience in a wide variety of environments with a wide range of interactive experiences.
Currently, my primary working environments are some sort of Unix, with CDE, stumpwm, or dwm - pretty ultralight environments by today's standards. I do think these environments can be described as "fast".
As to the actual machine in question (with the i7-4770K), that's my gaming desktop. It's run mostly Windows over the years, with some OS X thrown in there. For the past year or so it's been pretty much just OS X, but I do boot into Windows from time to time for this or that. Neither of these OSes is as fast as the stripped-down environments that I prefer, but neither is appreciably faster or slower than the other, either, in my experience (unless something is broken; I could of course tell you horror stories about both platforms). From my perspective, both of them are enormous inefficient monstrosities, but the hardware is also really fast.
> In my experience linux was about 30% faster for typical web development tasks (npm install, angular build, …)
Yeah, these types of things aren't great on Windows. Especially if you run into corner cases, a lot of tools that were written for Unix environments go to the dogs on Windows. I don't know what most of the individual issues are, and I don't really know that it's even a matter of CPU (vs. per-operation latency, deadlocks, etc.). But the poster I was responding to seemed to be saying that Windows itself is too much for a modern i5 to handle, and that just isn't my experience.
You don't know what you're missing out on. I have a Ryzen 3600 + NVMe, and it takes less time to boot into Linux, unzip a large file, and boot back into Windows than it takes to unzip the same file on Windows, even with anti-virus disabled.
Unzipping is a single threaded task, so it's going to run at the same speed even on a 5950x.
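As an aside, the single-threaded claim is easy to sanity-check: standard zip extraction (e.g. Python's `zipfile` module) decompresses entries one at a time, so wall-clock time tracks single-core throughput rather than core count. A minimal, self-contained sketch (the file names and payload sizes here are made up for illustration):

```python
import io
import time
import zipfile

# Build an in-memory zip archive with a few compressible payloads.
payloads = {f"file{i}.txt": b"hello world " * 50_000 for i in range(4)}
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    for name, data in payloads.items():
        zf.writestr(name, data)

# Time the extraction. zipfile decompresses entries sequentially,
# so this is bound by single-core speed, not by the number of cores.
buf.seek(0)
start = time.perf_counter()
with zipfile.ZipFile(buf) as zf:
    extracted = {name: zf.read(name) for name in zf.namelist()}
elapsed = time.perf_counter() - start

assert extracted == payloads  # round-trip is lossless
print(f"extracted {len(extracted)} files in {elapsed:.3f}s")
```

The same benchmark run on Linux and Windows against the same archive would isolate the OS overhead the commenters above are debating, since the decompression work itself is identical.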
On the hardware side, sure, it would be great to have cutting-edge tech. Money is a factor. Realistically, my hardware is good enough for my present needs.
On the software side, I am absolutely aware of what is available.
File operations do seem pretty slow on Windows. What I've noticed is that latency for individual access seems really bad. Does that latency actually scale with CPU speed?
Edit: btw, what were you using to unzip the file on Windows? How about Linux (unzip?)
What Linux kernel/distro, what sort of zip file (large files vs small, encrypted?)
I have an XPS 9560 and yeah, the build quality ain't great. I had to replace the motherboard and battery after two years, and while the lid is aluminum, the bottom is not, so if you have it hanging over the edge of a desk or on a non-flat surface, it flexes enough that the touchpad button no longer registers. These issues might be resolved in newer models, but I probably wouldn't go with a Dell again, unless it turns out they are the best of the worst after a comparison.
Anyone know of a Linux equivalent to Alfred and BetterTouchTool? I can get by with Gnome but these two tools are such a workflow gamechanger. Workflows and copy buffer with seamless integration in particular.
I almost feel like I have to buy this since it really is the type of laptop I’ve always wanted - but the allure of the M1 is still there, making me second guess myself.
Hear, hear! When coding (or reading), vertical space is much more important than horizontal space. Reading 300 characters wide is impossible; consequently, code also has to have shorter lines. So if your screen is ultra-wide, you either split it into areas (for which a laptop screen is not big enough) or just waste a part of it. Vertical space, OTOH, is almost always used 100%. That's why I have all my widescreen displays in portrait mode too - still enough horizontal space and an amazing abundance of vertical space!
I just got one of these last weekend (I ordered it in early August) and so far it’s really great. The modular I/O and general mission of the company were what initially sold me on it, but now, actually being hands-on with it, I definitely feel secure in my decision to get one. I can’t overstate how good these modular ports are.
I also really like that you can bring your own hardware in a lot of cases. For example I had an extra M.2 SSD laying around, so I ordered mine without one and installed it. You can also do this with the RAM, and even the wifi card.
The only thing I’ve disliked about it so far is the arrow keys on the keyboard. Having full-size keys for left and right but split keys for up and down feels weird. I would have preferred all full-size arrow keys and a small right shift (because let’s be honest, when was the last time you used the right shift key?).
For anyone curious about Linux on it, I’m running Arch and had basically 0 problems specific to the device. It’s my understanding there were some incompatibilities with certain kernel versions before so maybe some of these problems exist in distros like Debian with an older kernel, but I have had no issues.
All in all, it’s just an exciting project and nice to see innovation in the space that isn’t just rounded corners or a sleeker edge or something where they take modularity or performance away for the sake of aesthetics.
> because let’s be honest, when was the last time you used the right shift key?
Worth a try if you don't have the habit yet: use the pinky of your right hand when typing capitals with the left hand and the pinky of your left when typing capitals with the right.
Yeah, this is a weird comment from the OP. I use the right shift for 99% of my shift typing. This is probably because my version of "homerow" for keyboards is left shift, a, w, f, spacebar and spacebar, ., p, [, right shift. Which likely stems from years of gaming.
That's probably what I was taught, but in practice why? What's my left hand going to do with all its hundreds of milliseconds of free time while my right types a capital letter?
Eh? If I hit shift with my left and the letter with my right, that can't possibly be more stretching (it's probably less) than hitting both (the other) shift and the letter with my right?
For me it's more a matter of putting as little stress on my hands as possible. Pressing two buttons (shift and key) with one hand is just a little less comfortable than pressing just one key.
>when was the last time you used the right shift key?
I generally use the shift key closest to the key I'm typing. For keys near the center, I favor the right shift key.
The keyboard would absolutely be the show stopper for me, if I didn't just get a new laptop last year. The lack of dedicated Page Up and Page Down keys is unacceptable.
Finally somebody else mentioned it. I will never buy a laptop without a dedicated Home, End, Page Up, and Page Down keys.
I've used Chromebooks without them before, and I still to this day can never remember how to select while jumping to the end of the document, for example. It's like four keys all pressed at the same time in a very awkward way.
From the Mac side, Fn-Up/Down support seems to be universal for replacing the missing Page Up / Page Down keys. I bet Chrome is Ctrl-Alt-Fn-Down, copying Mac:
Down - Cursor trajectory
Fn - Page instead of Line
Command (Alt) - Document instead of Page
Shift - Text selection mode
(But I had to hit the keys and then look at my hands to figure out what they were, because I'm just used to keeping the modifier layers in muscle memory, so I could be wrong.)
On the contrary, I have those dedicated keys, and I am constantly accidentally hitting Insert when I want Home, which screws up line editing (which is 99% of the time what I'm planning to do after hitting Home).
For what it's worth, here are also [1] the Framework community forum thread about Arch and [2] the Arch wiki page about Framework in case you're interested.
I'm still waiting for mine to arrive (in the next batch) but I plan to install Manjaro when it does, and am cautiously optimistic that it'll be mostly painless.
Might be worth noting (I forgot about this when making my original comment): I wasn't able to get my USB drive to boot without disabling Secure Boot. Secure Boot isn't something I care about, so it wasn't a problem for me. I've heard it can be made to work, but I can't comment on the specifics.
I wish that I needed a laptop so that I could buy one. I sincerely hope this company succeeds so that they are around when I do need a laptop.
> Yesterday, I put my 2019 Thinkpad on my pile of "laptops to refurbish and donate." I've bought a new Thinkpad almost every year since 2006. I think that's over.
It's addressed in the article and elsewhere in these comments, several times. They donate them after a year in order to upgrade. Seems a perfectly reasonable and responsible use case if you want a new laptop every year.
Having read the article, I don't think he really does. He buys a new one, refurbishes the old one, and gives it away. Why that's a reasonable idea isn't explained (or at least not to my satisfaction).
Buying himself a new laptop every year was his motivation for quitting smoking, as he realized 17 years ago he spent about two laptops worth on cigarettes every year. It’s in the article, near the end.
Please get checked for glaucoma if you haven't already. Goes for anyone reading this comment who feels their vision is slowly getting worse. I started getting it in my 30s, it can strike early.
Yep, this is where I'm at too. 13" is just not enough space for me to get anything done productively. On my 13" MBP with my standard font size, my code editor can't show a full line of code without me having to scroll (VSCode with mostly default settings, font size 12).
"Self described early adopter consumer sees early adopter friendly project and supports it"
One might see some cognitive dissonance between a pro-recycling, reduce-and-reuse mindset and being an eager consumer. However, early adopters dance this line and lead the way for the masses to come after.
Only one of them has broken so far, and it was only an issue with the display. I repurposed it into a homelab/Podman host and it's been able to work just fine!
As a quick aside, if you're ever one of the 15 people who will likely do this, buy a Thinkpad dock. They're cheap, and it basically triples your I/O!
Just to continue the question, what do you do with the other laptops? You should have at least 4 or 5 more which are unaccounted for. I imagine only one is used currently.
Not them, but I keep a long trail of old laptops and generally do in fact keep them all in active use on a regular basis. For me part of the appeal was that I like distro-hopping, so multiple machines made it easy to keep rotating OSs without much trouble. The core bits (browser profile, password manager) are synced, and my projects live in version control that's easy to pull to any machine that happens to not have it yet, so I just... grab the closest machine when I want to do something and go. (And I tend to have them laying around multiple rooms so there's always one at hand)
I gave an old workstation to my mom and a spare T440p to my brother, now the x201 and T460s occupy my tinker station and bedroom respectively. Oh, and there's also a T420 that my other brother uses as a media server, but that's not really mine anymore :p
This is kind of a weird piece that really doesn't mean anything at all. I understand being in the "honeymoon phase" with a new piece of technology, but I don't feel like Cory is aware that he's doing that here. But there are some signs: he loved Thinkpads originally, but over the course of several years the company and the quality of the product went down the tubes. Now he's got a brand new laptop that he's only had for one month and is declaring it the best thing since sliced bread. It's not really a fair comparison: a brand-new niche product that hasn't been battle-tested in any way versus a long-established brand that he used for years. What will the Framework be like in many years? He offers extremely optimistic ideas, but obviously nothing concrete, because he just got the laptop.
I dunno, it just felt weird to me to be like "I loved this product I used for years, but it sucks now" and then say "I love this new product I've barely used!" without a hint of self-awareness that all the optimism and initial love for a product in the world won't keep it from turning into a pile of junk. How long until a "I went back to Thinkpads" article? A year, two, three?
Why be so negative? He seems to like how easy it is to tear down the machine and how easy it is to install Ubuntu. That's it. I don't think this will change in 10 years, and I don't want to wait 10 years to hear about his experience.
I did think it odd that he was talking about the durability of a product that he'd barely unwrapped. But his other points seem cogent, and are orthogonal to how long he's used any of the products.
I don’t think he really talks about the durability, as much as the repairability.
For example, he notes that he hasn’t road tested it yet:
> However! Most of my use of this computer was from my sofa, while I was recovering from hip-replacement surgery. I haven't road-tested it at all.
> But I'll note here that if it turned out that a component failed due to my usual rough handling, I could replace it with a standard part in a matter of minutes, myself, in whatever hotel room I happened to be perching in, using a single screwdriver.
That reads to me as a pretty specific disclaimer and that he hasn’t stressed the durability, but more that he has a plan if it isn’t as durable.
(Though it’s a long article, and maybe he said something about durability that I missed)
I have a Lenovo Carbon X1. Can’t get more than 16 GB of RAM. Can’t get larger than 1 TB hard drive. Really crappy wifi chipset that blows up when I use a VPN.
I kinda feel like he explained what happened to ThinkPad. They went from IBM which for $150 per year would send out a tech anywhere in the world to fix your problem to Lenovo who… let’s just say isn’t as good.
How is it weird? He very clearly described the degradation in quality of ThinkPads. For what it's worth, I agree.
Also, using a laptop for a month is more than enough time to get a feel for how things might play out over the long term. He was clearly impressed enough to commit to an opinion in a short space of time. If his opinion changes, I'm sure we'll hear about that too. I think that's fair enough.
This is just how Cory writes in general. You’re not going to get a lot of measured maybes. He is an opinionated guy.
And in this case, what’s the harm? If a laptop doesn’t work out, it can be replaced. This one just got released, so a long-term reliability test isn’t even possible yet.
Or better yet, after 5-10 years of frame.work laptops being wildly successful, they will slowly start killing off the upgradeability/fixability of the laptop.
- we had 5-10 years of great laptops (or more likely 7-12, since people will put more effort into repairing their current models if the new ones are worse)
- if they do this, the market has already been proven, and anyone with a few spare million dollars can go and grab it.
I like what they are doing, but I am sticking with my 2 primary thinkpads for now, which is fine as well and certainly eco :) They are still a perfect level of repairability (t430s and a t470p) even if I can't upgrade the CPUs on them.
I am excited over seeing this project, but these things have stuck out for me:
1. I am concerned about long-term screen hinge strength. I can't see how it's built, but I won't accept cheap hinges after dealing with bad ones (screwed into plastic, not into a metal frame, on a Dell).
2. I am concerned about the durability of the screen. I am not an expert at this, but I want to have no worry about the screen if I chuck it in a bag, or about my cats, who sometimes stand on top of the laptop when I put it on the floor.
3. I watched Louis Rossmann's takes on it - I was hoping he would go into the tactile feel of the keyboard. I can wait though for any commentary on that from a ThinkPad user. I can try to stop using the TrackPoint (would hate unlearning that) but a good laptop keyboard is essential. (I declare the best laptop keyboard I ever experienced is the one I have on my t430s. Lenovo has made the key travel lower and lower over the years. The t470p and others around that year I used is OK, but doesn't compare really.)
If anything were to go wrong though, hey, I can at least repair it :D But, I continue to stare in wonder over this. I'd love to be proven otherwise on these points.
The case/hinges have always been the first thing to fall apart in all my past laptops. This is my third, and the plastic case is showing its age. The other two are still running well, but they can't be moved around at all anymore. Basically I want something as tough as an Apple MacBook but without the Apple crap on it (I need Linux).
My next laptop will have the best case and hinges or there won't be another one. I can't stand seeing perfectly running old machines made unusable by cheap assembly.
I've been using a Thinkpad T470 for the past 2 years, and the hinges are really solid... they look like they'll last many more years. I had a couple of T400s before which basically disintegrated around the hinges, so I know whereof you speak, but at least this series of Thinkpads don't seem to suffer from that problem... looks like they learned their lesson.
> I can try to stop using the TrackPoint (would hate unlearning that)...
Same here. I’m addicted to it and can’t get over the tactile feel. I guess this will go the way of the mobile phone keyboard, assuming it hasn’t already.
I guess a workaround for now (until someone designs a trackpoint keyboard that fits in the framework) could be Lenovo's bluetooth trackpoint keyboard, I use it with every laptop and it's been great
Annoys the crap out of me that Apple tries to claim how "environmentally friendly" they are when the biggest problem is that they make all their computers disposable and extremely difficult to repair. They've gone out of their way to do this by soldering in memory and SSDs, gluing batteries in, etc. Shame on them.
Putting upgradeability aside, Macs typically have longer usable lifespans as evinced by their relatively high resale value. Anecdotally, it's common to find 5-7 year old MacBooks being used by their original owners (I'm typing on one right now), and Apple will offer around ⅓ of the original value on a 5 year old machine as a no hassle trade-in because their refurb partners are able to sell them (you can usually get more selling privately).
It would be nice to see some objective stats on this though.
Macs can have a very long usable lifespan - if they don't require a repair. Otherwise, even young machines are quickly totaled as soon as they are out of warranty. I wonder how many MacBooks ended up in landfill because of broken keyboards which were too expensive to repair (like $600?), though keyboards shouldn't just break, and shouldn't cost more than $100 to exchange...
My 6 year old iMac has a fan which sounds horrible, would be easy to clean if I just could access it... and while the HD is actually still upgradable, I can't reach it any better than the dirty fan.
And of course there are batteries, which do fail quite often after like 5 years and are also not meant to be exchangeable. In Europe, it might be difficult to get Apple to exchange the battery of older laptops though.
Though Apple doesn't encourage DIY battery replacement, it's not that hard. On my old 2013 Air it's really easy apart from sourcing a decent battery, and on the new M1 Air it seems quite doable if you don't mind glue. https://www.ifixit.com/Answers/View/675334/Is+it+possible+to...
Yes, it is possible, a colleague of mine replaced the battery of a 2012 MB Pro. It is possible, but not easy and involved chemicals which should not be handled by anyone without basic lab experience. Things have become somewhat better but still the question is: why isn't it easy and why doesn't Apple offer battery exchange for a longer time?
If they really care about the environment, they should at least offer to fix your old MB at a reasonable price.
(At their typical battery replacement prices, there should be a healthy margin anyway).
The design of the MacBook Pro changed in 2016, and it became less repairable and easier to break (see: issues with butterfly keyboards). It would be difficult to convince me to buy a used old 2016+ MacBook Pro.
I remember the days when you could actually EASILY replace your portable's battery (after 3+ years), EASILY upgrade your RAM and HD... now if any of that fails or the system needs more memory, you have to replace the entire computer. I'm sure this is by design so you spend more money and buy new stuff instead of upgrading what you already have.
Yeah, Apple was really good at replaceable batteries, keyboards, hard drives, RAM, and wireless cards... right up until they decided they didn't want to be. IIRC it was sometime around when they started calling PowerBooks MacBooks... 2006 or thereabouts.
Ehn.. I used to be like you. Once I used Pixelbook, a superthin fanless device, I was in awe. Pixelbook Go was similar fanless device as well. But, then I realized that they have their limitations. With a fan, one can achieve much better performance when needed. So, now, I prefer a laptop that can passively cool for normal day to day work, and then for heavy workloads take advantage of the fan.
I got the framework laptop, which should be able to push the intel chip to the max 28 TDP. I heard the fan is big, and thus not annoying. I am curious to see how it will turn out (My batch 3 order gets delivered tomorrow).
Yeah I guess I don't have a ton of experience here outside of mac laptops, but the idea of a PC laptop that only kicks in the fans when truly intensive tasks are being run sounds pretty nice.
My experience has mostly been that when any work is being done, the fans spin up. It's jarring when you're trying to focus on the problem at hand.
You need to find your fan rpm/noise threshold. On my laptop the fan becomes noticeable above 2500rpm. Someone commented about adjusting the fan curve (I have no idea how), I just manually capped the CPU performance with a powersave profile instead.
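In case it's useful to anyone wanting to try the same thing: a minimal sketch of one way to cap CPU performance on Linux, assuming the `cpupower` tool (from linux-tools) is installed and your system uses the standard cpufreq sysfs interface. The 2 GHz cap is just an example value; the available governors depend on your driver.

```shell
# Check which governor the kernel currently uses
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor

# Switch all cores to the powersave governor (needs root)
sudo cpupower frequency-set -g powersave

# Or cap the maximum frequency instead, e.g. at 2 GHz
sudo cpupower frequency-set -u 2GHz
```

On many desktops, a power profile switcher (GNOME's power modes, TLP, etc.) does the equivalent with one click.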
You're right about cooling for intel/amd, with the note that the M1 is a whole different beast - it can deliver good performance with only passive cooling anyway
Same here. I went from a thinkpad to an m1 air, and although the noise wasn’t something that bothered me before, I do appreciate its absence now. The battery life also is a big draw. I can go to the office without a charger, and it’s not a problem. I’ve never had a laptop that made it past half a workday.
It's a shame ARM isn't a first-class desktop architecture quite yet. I'd probably use one if my software ran on it. Otherwise, I figure in 10-15 years, when it is "fully featured," the architecture will have already been usurped by RISC-V.
I've never met someone before that cares about cpu fan noise. Genuinely. It never bothers me so I have never thought about it. Perhaps also because my workflow involves playing music when I work to make me more productive.
My MacBooks have generally been quiet, but not silent. Having lived with the M1 MacBook Air since it came out, I _never_ want to go back to a fan. Silence is a luxury.
Getting a noiseless and fast laptop is a good improvement. It's like going from a fossil-fuel car to an electric one: so much less noise and vibration. The noise is acceptable, but once you've had silence it's annoying to go back.
At 47:10, they mention that they haven't found anything evil. Ofc, this isn't hard proof, but if I trust anyone's answer, then it's theirs. I think the likelihood of it being malicious is nonzero, but small enough that I'd condemn active backdoors into the realm of conspiracy theories.
There's always the possibility of it being exploited by others, but c'mon: Basically ANY other exploit would be way easier to distribute and activate than one in the PSP.
As much as I love speculating about backdoors and NSA wiretapping, I seriously doubt these MEs are malicious. At this point, managing a modern x86 is tough work, especially if you want to run virtualization, complex threading and maintain high efficiency. It makes total sense that there are mandatory supervisor chips at this point, and without any evidence that these chips are "phoning home," I simply have to assume that its purpose is virtualized KVM for remote management. Worst case scenario, the CIA wakes up my laptop while I'm asleep, big whoop.
It does not even matter if they are actively malicious. They are closed, non-removable, with proven vulnerabilities (which not only CIA can use). What else do you need?
There is an important value, implied by the repairability, that gets little attention here.
When I went to Malaysia, my 7-year-old Asus N550 was malfunctioning and I needed to get a new laptop. At my previous company I got a 13" Zenbook and loved it, but the ones in Malaysia were only sold with 8GB of RAM. I've been using 16 for 7 years and there was no way to go back. I wanted 10th-gen Intel (for the improved integrated GPU), and my only options with 16GB were the Dell XPS ($500 more expensive) and Lenovo ($1000 more expensive), but the latter would arrive in 3 months.
I got a Dell xps as my only option, and I hate it. One reason is that the specific cpu I got did not get the improved GPU (Intel's shady marketing and lack of transparency is to blame). Another reason is that the fingerprint reader has no drivers for Fedora, while Dell advertises the laptop as Linux ready.
Now I need more ram for my new activities and I want to bump it to 32 but I cannot do that.
Manufacturers tell us that they need to solder the RAM to the motherboard to achieve high performance in a thin-and-light form factor. Framework shows that this is not true.
Also, see how Framework tries to be transparent and informative? It clearly tells you that the tiny performance improvement of the i7-1185G7 over the i7-1165G7 is not worth the $400 difference for most people.
I will get rid of this Dell Xps and get a Framework as soon as it's available in my area.
Not the person you are replying to, but I have a XPS 13 9380 (the 2018 model), and the fingerprint sensor is still not supported on Linux. I'm subscribed to the relevant issue on the libfprint bug tracker, so if there was news, I'd hear about it.
It's 7390. When I bought it there was no Linux driver. And then they released an Ubuntu only driver naming it "Linux driver". My device is 27c6:5385 which is not listed in https://fprint.freedesktop.org/supported-devices.html. I remember people trying to reverse-engineer the Ubuntu driver, or re-pack it for Fedora. Unfortunately, I did not dare to install the repacked version.
Very exciting. Hoping they will make something similar with:
. 15 or 16 inch chassis
. 4k display (OLED, if possible)
. largest battery they allow on an airplane
. no keypad
. 2x speakers facing up
. AMD, if possible
. arrow keys as Apple does it
. charge a premium for this - ppl will pay it!!!
Basically just copy what Apple does but with a little customization. I know this would be difficult, but I'm hoping someone from Framework will read this. No one wants to buy a Dell XPS, but it's the most popular Windows laptop in its class for a reason.
This is the one thing keeping me from buying a Framework. That resolution is so close to perfect... but not quite. Too high for 1x, too low for 2x. I'm hoping they will have a better screen option some day, or that there will be a way to replace the existing one.
Really, OSes and applications need to be better at doing arbitrary DPI scaling. I should be able to say “scale everything 1.2x on this monitor”. On Linux, using GDK_DPI_SCALE & QT_SCALE_FACTOR works reasonably well on a single monitor if apps are respecting them.
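For anyone wanting to try this, a minimal sketch of what that looks like in practice. The 1.2 factor is just an example; note that `GDK_DPI_SCALE` scales text in GTK apps (fractional values are fine), while `QT_SCALE_FACTOR` scales the entire UI of Qt apps.

```shell
# Fractional scaling hints for GTK and Qt applications.
# GDK_DPI_SCALE: scales fonts/text in GTK apps (fractional OK).
# QT_SCALE_FACTOR: scales the whole UI of Qt apps.
export GDK_DPI_SCALE=1.2
export QT_SCALE_FACTOR=1.2
```

Put those in `~/.profile` (or your session's environment) and newly launched apps will pick them up; as noted, it works reasonably well on a single monitor when apps respect the variables.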
Personally I think Apple’s PPI target for laptops is too low for 2x scaling.
Lots of legacy apps use bit-mapped graphics instead of vectors. Xaw and Motif are both not resolution-independent toolkits but getting them to support integer scaling (with line-doubling) would be far easier than trying to implement fractional scaling plus anti-aliasing.
Apple literally doubled the resolutions of their pre-HiDPI screens for their current screens and then scaled their interface by 2x for the default OS builds, so if their interface is too big for you then it probably has been since Mac OS X came out.
Sure, and older software not designed for newer paradigms will always be a problem. All major desktop OSes currently make some compromises to handle HiDPI displays.
Apple doubled the base display from pre-Retina laptops (1440x900), but they offered a higher-resolution display (1680x1050) and that’s what I used then. I’ve always adjusted the resolution on Retina MacBooks to be the one-notch higher-PPI option.
It's funny to me that you'd want what I consider one of the worst things about the Apple keyboard: the arrow keys. Having the half-size up/down buttons means I make far more mistakes on those. Thankfully CapsLock-p and CapsLock-n work as well (I remapped CapsLock to Control, of course).
The left, right, and down arrow keys all on one level, and the up arrow key alone above the down arrow key. Makes it easy to find the right key by touch alone.
What they've got now on the Framework is just "arrow keys as Apple did it": they used this full-height/half-height combo for a while but have since returned to the "inverted T" with all half-height as you describe.
I think the sense that space is wasted by those deadzones above the left and right keys is a powerful motivator of these "hybrid" keyboard designs. I do prefer to have them all half-height if any of them are, as it does make it easier for me to "touch type" the arrows.
If everything works then it feels like a luxury, but if something breaks it feels like a necessity.
Being able to upgrade components also lets you buy at a cheaper price initially and then grow it to suit your needs later. Also, the laptop will probably end up being usable for a decade or more.
I recently upgraded my 2015 MBP's SSD from 256GB to 1TB (it's the last upgradeable macbook). It's great to get a significant speed boost on read/write times and more disk space. And it only cost me about $100-150 to make it happen.
The 2015 MBP is still fast and snappy for my purposes, it's hard to justify getting a new laptop yet.
Most people have never had to replace the starter in their automobile, but the ones that have sure are glad it's not spitefully welded in place just to make things difficult for them.
If I remember correctly, replacing the alternator in a 2006 VW Touareg is something like that. Something like a $6k job if you have a shop do it, because it requires dropping the engine. Also, the alternator is liquid-cooled and costs over $1k just by itself.
A vehicle starter's lifespan relative to the lifespan of the whole vehicle is probably very different than that of a component of a laptop relative to the laptop itself. As well as the cost of repair relative to the cost of buying new.
I switched to a Dell XPS + Linux when my Macbook crapped out after only a few months. The Apple store shipped it out and it took over a week before I got it back.
If that laptop had been a Framework laptop, I could have just ordered the part, swapped it out, and moved along.
> I've bought a new Thinkpad almost every year since 2006
But why?? If the current one isn't broken, why would you need to buy a new laptop and go through the hassle of setting it up?
My laptop is from 2013 and is still fine. Parts of my desktop computer are from 2007 -- the case is from 1993, and nothing is more recent than 2015, and everything is running super smooth.
Getting an entire new machine every single year sounds hysterical. But maybe I don't know what I don't know.
If you read later on the author gives their rationale:
"I started buying a new laptop every year as a reward to myself for quitting smoking. ... The environmental consequences of that system weren't lost on me, even given my very good track-record of re-homing my old computers with people who needed them."
If you read a little lower in the article he mentions it being a reward mechanism for stopping smoking. So each year he doesn't smoke, he rewards himself with a new laptop. Good strategy!
You don't quit smoking once. You quit smoking every day. I haven't smoked in >10 years and I'm still quitting. Physical dependence on nicotine takes a long time to recover from.
That's not my experience. I was a heavy smoker for over 15 years (around 2 packs a day); then I quit. It was hard. Extremely hard. In fact I stayed in bed for 2 weeks because I didn't dare stand up and go into the world, and I couldn't think of anything else.
But after the withdrawal symptoms receded, that was that. It was over. I can't even remember what it was that I liked about it.
Of course that's just me, and experiences differ. But quitting smoking is its own reward, and I don't mean physically or health-wise. You're free! You don't need to spend time thinking about where the nearest tobacco shop is and if you have enough change to buy more cigarettes to get through the night.
I would argue that if you need to reward yourself every year for quitting, you didn't actually quit. You stopped putting cigarettes in your mouth, but you're still a "smoker".
I thought it was pretty much a fixed medical aspect of quitting smoking that it would take your body years/decades to regain the ability to regulate adrenaline production properly. That you would suffer occasional adrenaline surges during that period.
I was beginning to wonder if they'd ever subside. So it's good to know someone doesn't experience it. Though I have become used to them over the years and, in some situations, they're not all bad.
Nicotine dependence is personal and genetics do play a role in that. Not all people react to nicotine equally. I can cut smoking abruptly and not be too upset about it. The other day, I ran out of vape liquid but didn't feel like driving to the Vape store. I know a friend who if he doesn't smoke when he wakes up, he goes crazy.
Once an addict, always an addict. I haven't smoked since 2007, but I'm humble enough to know how easy it would be to pick it up again. In fact, if I lived in Poland again, I very well might have. But in the US being a non-smoker has never been easier.
Well, computers used to be more interesting. A 2006 laptop was at least 2x faster than a 2003 laptop, with lower power consumption and better thermal design, a better screen, etc.
Sometimes people describe their problems with some technology and it makes me realize that we live completely different lives. I can't imagine buying the same device year after year, whether that's a phone, laptop, or whatever.
Not only that but when they were using Macbooks they were apparently _buying two_ and keeping them in sync? I get some people can't be without a laptop for long periods of time but that seems absurd to me. Keep a backup disk. They even say they have a pile of old laptops that are presumably still usable.
I'm still curious about the motherboard itself. The old PC cases didn't have a rectangular cut-out for IO panel, and probably didn't have the motherboard stand-off screws in the correct locations.
I suppose with time and care you could modify an old case to take an ATX motherboard, but I'm not sure how well that would work. You'd also likely need to mod the power supply mounting, add cooling and fan mounts, and more.
Over the years I did move the stand-off screws but didn't need to make new holes; plenty of holes were already there. I changed the power supply many times but it always fit, I think it's standard... The fans are on the motherboard; there are no fan mounts directly on the case?
Traveling 27 days a month might sound impressive but it shouldn't really put extra wear on laptops in general. I frequently work from cafes and such and put my laptop into my backpack on an almost daily basis, and it's fine after years. I don't even use a padded sleeve or other special protection.
This is one of those comments I will never understand. Every single time I have tried to use one of those things the only thought that goes through my mind is, "This is the dumbest mouse interface I have ever seen. Why would anyone ever use this thing?"
And then I see comments like this. I don't get it. The ergonomics of those little nubs are awful.
This is one of those religious arguments where no one will ever change their mind, but I feel like a trackpad is the dumbest mouse interface and can't use anything but a trackpoint (or a mouse, of course).
Consider dragging and dropping: with a trackpoint you click with your thumb, then use your pointer finger to move the mouse cursor any distance you like, without moving your hand, then release. With a trackpad you press the pad and then move your finger some distance and then... oops, you hit the edge of the pad. Better abort and try again, starting the motion from the opposite edge of the pad. Except if I'm on a non-Apple trackpad and I have to click near the top of the pad, it's nearly impossible because the click hinge is near the top. Or maybe I do a finger swap to reset the drag position, and hope that the OS doesn't interpret my moment of two fingers touching as some kind of gesture.
Anyway, it does take some time to get used to, so you won't get it in a few minutes or even a few hours. Try a week. Also make sure you're using a lenovo one (the dell/toshiba ones aren't as good), and very importantly, that you have a fresh cap. Worn out caps make it really frustrating.
While I'm not going to argue your preference for a trackpoint, dragging with a trackpad is not as you describe. You just leave your thumb "clicked" and reposition your index finger.
IME, using a trackpoint is okay for smaller laptop screens, but going from monitor to monitor kinda sucks.
macOS has an option (in the accessibility prefs) to enable three-finger trackpad dragging.
It's incredible to use, especially in concert with Magnet (a window management app). Three-finger-drag a window to the top of the screen and it's instantly maximized. Selecting text is also a breeze.
> Three-finger-drag a window to the top of the screen and it's instantly maximized.
Sounds like a lot of work compared to any good tiling window manager. Windows should be automatically maximized on opening, unless there are others on the desktop, in which case just hitting a two finger key chord should do it without requiring contorting a wrist or moving an entire forearm away from the home row.
Gnome is all about touchpad gestures—every time I reinstall fedora from scratch (admittedly not very often) I have to go fiddling around to get the “traditional” trackpad buttons and what not working.
Agreed. If I were to switch to a Framework laptop, the Mac trackpad would be the loss I'd feel the most (closely followed by the DisplayPostscript-powered macOS window manager).
I was a big proponent of the trackpoint for the longest time - the first laptop I used in the mid 90's (a Toshiba Satellite) had one and then I used Thinkpads for years.
Then I got a MBP at work in 2016 and haven't looked back - but they were far better than most other trackpads for the longest time.
Counterpoint. I won't use a keyboard without one. I use a mechanical keyboard with a trackpoint. Not having a trackpoint means that I don't even have to think about whether I will buy a laptop, I already know that I won't. Touchpads have gotten better, but nothing beats the nub.
That being said, every laptop I have ever owned has had a trackpoint, so I'm a definitely biased.
When I am on any laptop, the first impulse for me has always been to reach for the TrackPoint. I always get disoriented for a moment when it isn't on a laptop keyboard. I take the device for granted :)
I really really do hope framework will add a trackpoint option in the future. I would hate to unlearn this comfortable feature now.
> I use a mechanical keyboard with a trackpoint. Not having a trackpoint means that I don't even have to think about whether I will buy a laptop, I already know that I won't.
Did you build that keyboard yourself, or buy it pre-made? If the former, where did you get the trackpoint? I have a Dactyl-Manuform that is nearly perfect, modulo the trackpoint (and wireless, better firmware than QMK, and a few other things), and I'm looking for one that I can use without cannibalizing an innocent thinkpad keyboard.
I started out building my own (story for another account), but now that TEX makes the Shinobi [0], it is my daily driver. There are some split mechs that have been modded [1,2] which are closer to the Dactyl, but that would definitely be a custom job.
There are many other replies about how the trackpoint is ergonomically superior to a trackpad.
I'll add some nuance: it's entirely possible that a trackpoint is most valuable when your workload is majority typing with some mousing, as opposed to the other way around - so maybe you just have a different workload than most trackpoint users.
Separately - how long did you actually spend trying to use one? Most things have learning curves. You can't unlock most of the potential of a keyboard until you've spent dozens of hours learning how to type, so why shouldn't the same apply to a pointing device?
I personally used to swear by the trackpoint because it was vastly superior to the touchpads of the time. A decent touchpad (definitely not the plastic shit they put on most thinkpads these days — they are unusable) blows it away though. Now I find using a trackpoint feels like moving my mouse with arrow keys. It’s just repulsive to me.
But once you've gotten used to it, which takes a couple of days, it's way more efficient than a trackpad for mixed typing and pointing. I don't have to move my arm the way I do with a trackpad.
This is an interesting video:
IBM introduces "Pointing Stick" (TrackPoint) (1990)
I use these exclusively and have the trackpad disabled on my Thinkpad (an old x230).
The #1 reason I use it is to not move my hands when I need to do a pointing operation. You can get very precise with it too, but it takes some getting used to. I'm a klutz with a trackpad now, since I've been using the pointer for so many years. I use a tiling window manager and very few GUI apps (often even using the web in w3m), so mousing isn't as important for me, which is another factor.
I wouldn't play a video game with it, of course, but I'd imagine trackpads suck at those too.
Another opinion: my first laptop was a Toshiba with a trackpoint. That was awesome. I was able to move the cursor much faster and more accurately than with a normal mouse, even disregarding the time it takes to move the hand between a mouse and a keyboard. When I first had to use a touchpad, years later, I was stunned that people put up with such an experience. Nowadays I am on a Thinkpad, but I still feel like Toshiba had it figured out much better. Somehow it isn't as accurate as I remember it from back then.
For touch typing (like I am doing for this comment), using it is much easier than moving my hands off the keyboard to get to the trackpad. I had to get used to it, but now that I have, it's hard to imagine it not being there.
When I am doing casual reading or more hurried work, I tend to use the trackpad more. It feels more suitable to me as a mouse interface when the focus is not typing but finding things.
I learned to use a TrackPoint with some motivation that I would be using it a lot. I have instead found a comfortable middle point with using both of them.
Welp, I used to play Tremulous [1] with it on my ThinkPad X61, slaying people with the granger (= builder) on the "X" server. I often spammed the chat when I got a fancy kill (something like "you got killed by a trackpoint lul", with color). I wonder if anyone remembers it. Hahaha
Trackpoint is like a joystick, very short but sensitive and accurate. The best experience was with a SoftRim cap, which significantly reduced the force required to register.
Even better, using VIM and tiling WM completely eliminates any reason for my hands to leave the resting position on the keyboard. This was simply fantastic, and that's why I still have my X61 around.
I've never had a laptop with one long enough to get used to it, but I assume it's because it's right there in the keyboard where your hands already are, rather than because it's any better at moving the pointer around than a trackpad/mouse.
Maybe it's an acquired taste? I have fond memories of using nubs on ThinkPads. I think Apple's done a great job at improving the trackpad experience on Macbooks but it still doesn't compare to not needing to leave the keyboard and also never needing to lift your finger to continue moving the mouse.
For all my love of chording and old Macs, for that matter, a trackpoint touch typist I will ever remain. I revere Doctorow, but I reward myself otherwise. Smoking a Cuban on occasion. A new used ThinkPad every 2 years, or an SSD. The builds run in OBS and writing can nicely obtain on a light X220.
I used Thinkpads many many years ago and liked (but did not love) the trackpoint. One thing it can be really good at is small, precision movements. Once I got to try a laptop with a decent touchpad, anyway, I saw no reason to go back.
With a trackpoint you don't need to move your hand to move the mouse, so it's great for people who frequently use the laptop in low-elbow room situations like on a plane or train.
The author spent some time explaining the need to have a reliable laptop and spending $150 for 24-hour service and having two Powerbooks at once. But there was no mention of anything about reliability and repairs (well, replacement parts) for the Framework. I'm curious if he plans to keep a second Framework laptop in case parts are sold out, take too long to ship, etc.
He might have to. As far as I can tell they haven’t even started selling individual parts yet. They all just say “Coming soon” on the website. I’m not sure what they are waiting for.
My guess is that they are using up all their manufacturing capacity to sell new full laptops (and DIY kits). Their current manufacturing setup seems to be that they periodically batch a bunch of what are essentially pre-orders and then get things manufactured after the fact.
Once they're more established and have money in the bank (hopefully that happens), they can think about ramping up manufacturing and actually keeping inventory before it's been spoken for.
Until then, at least the commodity parts (RAM, storage, WiFi card) can be replaced with off-the-shelf parts bought from NewEgg or wherever.
Has the Lenovo warranty really gotten worse? The author doesn't specify but speaks of the "incredible" warranty on the Thinkpads in the past tense. I purchased an X220 in 2011 with the 3 year next business day warranty plan and it was awesome the couple of times I had to use it. I recently purchased an X13 to replace it with the same warranty plan, assuming it would still be as good. Not true?
(By the way, Kubuntu installed and ran almost without issues on this 2021 X13 out of the box. Only minor issue was with sleep / suspend but that ended up being a BIOS setting. Apparently there is a "Windows" and a "Linux" sleep state.)
> Apparently there is a "Windows" and a "Linux" sleep state
IIRC, Linux uses an actual S3 suspend, while NT has moved to a low-power mode that never fully halts the processor; Microsoft calls it "Modern Standby" or something.
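For anyone poking at this on their own machine: the kernel exposes the available suspend modes in /sys/power/mem_sleep, with the active one in brackets. A rough sketch of reading that format; the sample string below is illustrative, not from any particular machine:

```python
def parse_mem_sleep(contents: str):
    """Parse the /sys/power/mem_sleep format: modes are
    space-separated and the active one is bracketed,
    e.g. "s2idle [deep]" means "deep" (classic S3) is active."""
    modes, active = [], None
    for token in contents.split():
        if token.startswith("[") and token.endswith("]"):
            token = token[1:-1]
            active = token
        modes.append(token)
    return modes, active

# Hypothetical sample; on a real system you'd read the file:
#   contents = open("/sys/power/mem_sleep").read()
modes, active = parse_mem_sleep("s2idle [deep]")
print(modes, active)  # → ['s2idle', 'deep'] deep
```

If "deep" isn't listed, the firmware only offers the s2idle (software standby) path, which is the state that tends to drain batteries overnight.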
I am excited to see a company pursue user repairable hardware.
As a security and privacy researcher I care more about being able to trust the computer when it is powered on and in use. That means user controllable firmware, and the Librem 14 has no equal in this regard.
It is a real shame I have to choose. I hope these companies will shamelessly trade ideas or merge.
A Framework laptop style hardware with a neutered ME and Heads firmware would not only take my money, but become my top recommendation for all of the companies I provide security advice for.
The Framework is definitely on my radar for next laptop. I kinda want to wait and see how it pans out in the long-term, though. I'll be eagerly awaiting for the Gen 2 version of the laptop, or the "Framework Laptop - 1 year later" reviews.
For the moment, my Dell XPS 13 laptop from 2015 is still going strong, and I'd hate to ditch a brand/model that's been working reliably for me for six years for a new-shiny thing from a company that might go under in a year.
No axe to grind here, but this article contains a lot of misinformed, over-generalizing FUD about modern Thinkpads. It sounds like Cory's experience was with the X series: these Thinkpads are the thinnest, lightest, and most difficult to service.
For a small thickness and weight penalty, the T series are much easier to repair, upgrade, and maintain. My T450s (bought new in 2015) and T480 (bought new last year) both have an easy keyboard swap and Intel graphics (not a proprietary Nvidia GPU). Cory implies that Thinkpads still need a proprietary dock, but any USB-C-enabled Thinkpad (i.e. any Thinkpad made in the past few years) can use any cheap USB-C dock out there.
Nearly every component on the T series is as easy to upgrade or replace as the Framework claims to be. I've swapped my RAM, storage, display panel, and battery. The external batteries on the T480 and older models are even more serviceable than the Framework's: just pull 2 tabs and the battery slides out.
Keep in mind that's only true for the T series. P series ThinkPads (like the P52 albatross round my neck) are a pain in the ass: unserviceable, and finicky to even install Linux on because of the Nvidia GPU you mentioned.
Don't touch a ThinkPad unless it's a T series. But a T series ThinkPad also won't have the sleek form factor that most people want.
> Another caveat. I really miss my Thinkpad Trackpoint (the little nub in the middle of the keyboard) and the three hardware mouse buttons on the trackpad. I'm finding it really hard to reliably hit the right region on my trackpad to get the left-, center- and middle-buttons.
Yes! This is one of the main reasons that I will stick with Thinkpads. There are times that I have a mouse plugged in but still use the trackpoint, since I then don't have to move my hand. I hope that they will add a trackpoint to their laptop.
I bought one the same day I heard about it via one of Andrew Gallant's Twitter threads [1]. With the exception of the screen (I promised myself my next laptop would be 15" instead of 13"), this is pretty much exactly what I've always wanted in a laptop, and I'm very excited for my October ship date.
I can't help but feel as though the people who agree with Apple's (and many copycats') insistence on USB-C only, refusing to put a single USB-A or HDMI port in the computer, are simply out of touch. The only people I know who like it always cite the same thing: "docking stations". But if you had a laptop with one USB-C port you could still use your docking station, and the _vast_ majority of users wouldn't have to struggle to find/buy/maintain/pack up/not lose their USB-C to HDMI and USB-A dongle. Every non-hardcore laptop user I know _hates_ not having an HDMI port and having to use a tiny dongle to plug in their mouse. And the desk docking stations they have all suck. They end up using 4K monitors at 30 Hz because their docking station can't handle more. And _they don't notice_, because they're accustomed to computers sucking.
When I see a fat laptop with ports, I am happy.
The framework wins for me because of repairability. But I know countless people will love the customizability of the ports.
I guess I just don't use my laptop the way you do. I have 3 USB-C ports (and a micro SD slot), and that's it. Right now I have a power cable plugged into one, and my USB-C Yubikey plugged into another. I'll very occasionally use a USB network interface, but it's USB-C. I've replaced any micro-USB to USB-A cables that I still need with micro-to-C cables (I even have a mini-to-C cable for a serial converter breakout board). I don't consider this replacement much of a waste, as I've mostly just replaced them when the cables wear out or break.
I do have a USB-A wired mouse, but I only use that when I'm first-person gaming (not often, maybe 3-4 times a month). But for that I just have the A-to-C adapter permanently attached to the mouse, so it's not like I need to go find an adapter when I want to plug the mouse in.
Only other thing is HDMI, but I keep one of those white Apple dongles I got with a work laptop in the drawer under my TV (I probably pull it out once every 6 months or so). I don't use an external monitor for the most part, but for that I have another dongle permanently attached to the cable plugged into the monitor, so, again, that's not an inconvenience.
When I'm traveling, it's really nice that I only have to bring a single charger that charges both my laptop and phone. I'll usually also bring a Chromecast with me so I don't have to deal with cables at all if I want to watch something on a hotel/Airbnb TV. But that's pretty much all I need.
I don't think I'm "out of touch", though I certainly allow for the truth that other people have different needs than I do.
I never said anything about myself owning or using a laptop. I only spoke about the general public's use of laptops. You seem to have completely missed my point.
For Ethernet you'll want to get a USB-C dongle right now, sadly. But the expansion cards look like they could support an Ethernet NIC; it just has to be made. If Framework is successful, I expect we'll see more expansion cards.
The CPU is not socketed/replaceable because Intel simply does not sell socketed laptop CPUs. If there were a socketed alternative, I'm sure it would be in the Framework.
What is the distinction between 'laptop CPU' and 'desktop CPU' if not BGA (or whatever it is) vs. socketed packaging though, really? Power consumption?
Wouldn't it be nice if there were just 'CPUs', and you could pick whatever was appropriate for your desktop or laptop? Sure, some would maybe only make sense in one package, but there must be considerable overlap. I use my desktop for work because it has upgradeable RAM and I needed that, not because it has a beefier CPU than is available in a laptop.
Yes, it's mostly power consumption and efficiency: with the puny heatsinks found in most laptops, you can't cool significant amounts of power without throttling. Having the CPU on the motherboard also allows for a thinner assembly. I don't think having a socketed CPU just for the sake of it would have been a good choice here.
Yes, power consumption, and, relatedly, heat dissipation. You'd probably get unusable battery life in a laptop with a desktop CPU, and need a lot of fans to cool it.
I've seen a lot of requests for an AMD version, but I always see people complain about AMD's Linux support. Do you plan on running Windows? If you are planning on Linux, is the support really as bad as people complain?
I say that because Intel's Linux support seems top-notch to me. I've never really had driver issues that I can think of.
Hey, AMD (Zen2 3800X) and Linux (NixOS - 5.14.6) user here; other than a weird bug once on temperature reading I haven't had a single issue between Linux and AMD.
Oh, I'm glad I don't. You know what, I dreamt it all up. Thanks for clearing that up.
But honestly, just subscribing to the Linux subreddit I see tons of AMD GPU issues and never hear a peep about Intel. But yes, Nvidia does win the prize for worst.
I've been running Linux with AMD CPU/GPU for years; they tend to be late with initial support, but other than that it's been smooth sailing.
That being said, it will differ from distro to distro, as some are unbearably slow with driver/firmware updates.
> i always see people complain about AMD's linux support
I've always heard the opposite: that AMD's Linux support is amazing and that its Windows drivers are lacking, especially in the GPU department, where it struggles with Minecraft.
This is true, AMD's GPU Linux drivers are open source and in superb shape nowadays (with the exception of freshly released stuff as there's always lag)
Since it's a laptop, I could see that being an issue, though. Intel's integrated GPUs are, I guess, slow and clunky, but they work super well in Linux in my experience.
What are the long term prospects for this company? I'd like to have a repairable/upgradable laptop, but if they go out of business in a year, it's not much of an improvement over a Macbook... granted, it'd be some improvement since I could replace/upgrade commodity components like RAM or the hard drive but if the motherboard or display fails, I have to buy a new laptop.
Not great if we all think like that and 'wait to see'!
It's still better, though: a lot of it is standardised stuff, so you don't need Framework to exist to still be able to replace your M.2 SSD, WiFi card, or DDR4 RAM.
Seems like what they have is in high demand, and they only offer an ANSI keyboard, which is really only used by the US and China (and people that don't mind its limitations).
Once they offer an ISO keyboard layout, you would expect demand to be even higher.
Random feedback to framework on the configurator: it would be much nicer if all the options were just visible on one screen. Getting to the expansion slots page and not being able to remember how much base storage is in the configuration I chose means I have to go back two pages (and the page-loads are quite slow; hopefully that means you're doing great business?). Also, I'm sure limiting SKUs is a business decision, but it would be really nice to decouple CPU and RAM so I could choose an i5 with 32gb, for example.
This got me curious; I feel like if any laptop could allow for a customized keyboard layout, it would be this one.
I'm really yearning for a modern laptop with a keyboard layout that doesn't seem to exist anymore in modern keyboards, something like this [1], where there is a set of 6 nav keys at the upper right corner and a set of full size arrow keys at the lower right corner. If the Framework laptop has an option for a customized keyboard with that layout, I'll literally instantly buy it without even considering other specs.
This was my first thought as well, I'd love to see an ortholinear keyboard, and ideally one I could config with QMK. Not sure how likely or easy either would be though.
I'm just surprised the Framework doesn't provide a customizable graphics card slot. Or offer a dock that can support an external desktop PCI GPU.
It's a laptop for enthusiasts offering flexibility and repairability and future-proofing, and I think that overlaps with people who want to swap out GPUs and not rely solely on the CPU-integrated one.
Perhaps they are waiting for USB-C pass-through GPU enclosures to become a thing.
A discrete GPU comes with thermal and power requirements that change the form factor for the entire laptop. It doesn't make sense to have the same chassis for the with and without GPU use cases.
That is why I suggested a compatible dock with eGPU capabilities. That they haven't considered any way to support it all (even if it's only in the future) is a bit baffling.
Perhaps they will come out with an eGPU enclosure that can just be plugged into an existing swappable expansion bay, and won't need any special design changes? One can hope.
I hope this starts a trend! For the past 25 years that I've been building machines I've never upgraded the processor or motherboard - but I've added memory, added more storage or replaced a power supply. This fills that need just fine! Ultimately it supports up to 64 GB of RAM and 8 TB of storage! This is a machine that could last for years! Imagine now if we start standardizing on a laptop chassis, laptop motherboards, laptop keyboards, and laptop displays! We could have a builder's market just like we do today for desktops!
Honestly, this is one of the best new ideas I've seen in a long time!
P.S. Imagine ten years from now - we may be doing the same thing with phones! Oh yeah, I really hope this idea takes off!
I just bought my very first MacBook Air ever. But seeing a laptop like this, with everything being customisable (in contrast to my Apple laptop: nothing can be customised or replaced by me) I feel a tinge of regret...
I find it kind of strange that they don't have a dual USB-C card. There seems to be more than enough space, and putting two USB-C ports on a single card would allow you to carry a wider variety of cards without losing out on standard ports.
Is this possibly because of charging? It seems like such an easy and obvious thing to do.
From what I read, it was because they couldn't fit the charging circuitry for both into the dongle. Better to not deal with the confusion of one not charging
Maybe the chipset only supports up to four USB-C ports? Seems like USB-C hubs are few and far between (I finally snagged one last summer after looking for over a year), so maybe hub chipsets are hard to come by and difficult to design around?
Thank you for the writeup, it seems like they have done a great job with The Framework. Does it come in that beige-looking color shown on the yellow background?
I think I might want a beige laptop, something with the fine lines of The Framework and the raw aesthetic appeal of Lappy 486.
> The little modular port attachments seemed like a novelty at first, but now it feels absurd that you'd buy a laptop with a bunch of "hardcoded" ports that you can't ever change
That's funny; I remember laptops from the 2000s with those swappable cards with different ports.
One I distinctly remember, because it was a clever way to keep the card ~3mm thin, was the Ethernet card, where the Ethernet port was hidden inside and you'd press it to make it pop out, similar to handleless kitchen cabinets.
I found a picture of those on Wikipedia: https://en.m.wikipedia.org/wiki/PC_Card
The Framework expansion port seems to be a 2021 version of this, although I don't know how standardized these new ones are.
EDIT: From the framework's configuration page:
> > Will you be adding additional Expansion Card types?
> Yes! We'll be adding new Expansion Cards over time, and we're also opening up the design to enable third parties and community members to create their own versions. We'll be making these available in the Framework Marketplace
That's awesome; then in theory an Ethernet expansion card could exist (and use a design similar to the PC Cards above, where the Ethernet port is retractable).
I've been using Linux Mint on Lenovo since 2011. I've never had a problem installing. The biggest ongoing issue is that the Nvidia driver will crap out on some upgrades: sometimes I can't display on an external monitor, sometimes the sound won't go out over HDMI. Hints for all-day laptop use:
1. External keyboard. I always use one even when travelling. There are many models like the JellyComb that will allow one to use the same keyboard with multiple laptops, just hit a button.
2. External mouse. If your hands aren't on the laptop, everything on it stays fresh. Spill coffee on your cheap external keyboard or mouse? No problem, just replace them.
3. I buy the 5-year onsite warranty for $500 and it is great here in Maine. They come to the Starbucks and replace the keyboard or screen.
4. I use LVM snapshots and rsync my important files to a second disk and external disk daily/weekly.
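The routine in point 4 can be sketched roughly as the sequence of commands below. This is only an illustration: the volume group ("vg0"), logical volume ("home"), snapshot size, and mount/destination paths are hypothetical placeholders, and in practice these run as root.

```python
# Sketch of an LVM-snapshot + rsync backup, expressed as the
# commands it would invoke. All names/paths are placeholders.
def backup_commands(vg="vg0", lv="home", dest="/mnt/backup/home"):
    snap = f"{lv}-snap"
    return [
        # Freeze a consistent view of the source via a
        # copy-on-write snapshot before copying from it.
        ["lvcreate", "--snapshot", "--size", "5G",
         "--name", snap, f"/dev/{vg}/{lv}"],
        ["mount", f"/dev/{vg}/{snap}", "/mnt/snap"],
        # -a preserves permissions/times; --delete mirrors removals.
        ["rsync", "-a", "--delete", "/mnt/snap/", dest],
        ["umount", "/mnt/snap"],
        ["lvremove", "-f", f"/dev/{vg}/{snap}"],
    ]

for cmd in backup_commands():
    print(" ".join(cmd))
```

The point of snapshotting first is that rsync then copies from a filesystem frozen at a single instant, instead of one changing underneath it.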
So I just wanted to push back on the notion that Lenovo today is not viable with Linux Mint. I've never had a problem with the installer. I keep up with the upgrades, and with major upgrades I install from a live ISO on USB each time and restore my important files from backup.
I am growing increasingly unhappy with Thinkpads in general, but as far as I know there is no real alternative for TrackPoint users. There are some laptops that have a TrackPoint, but they lack three physical mouse buttons within reach of one's thumbs.
What would be the requirement for someone to build a trackpoint for the Framework laptop? Would they need to design a whole new keyboard plus a touchpad with the three buttons?
I don't really know, but since there are no manufacturers that build good TrackPoints, I think it's either hard or there really is no demand (sadly more likely).
I don't need a touchpad though. Actually, one of the best laptops I've owned was one without a touchpad [1].
> I don't really know, but since there are no manufacturers that build good TrackPoints, I think it's either hard or there really is no demand (sadly more likely).
Those are good points. I'm thinking there could be patents involved too.
> I don't need a touchpad though. Actually, one of the best laptops I've owned was one without a touchpad [1].
That's true! I don't really use the touchpad on my X230, even though it's there.
A few things: the newer TrackPoint caps are not good. I actually had to order some 3D-printed ones from some Japanese guy. Also, my Thinkpad T14 throttles to the extent that I had to install tools [1] (?) that fix this problem. The battery seems to be bad as well. Intel, by the way; I wanted Thunderbolt for an eGPU.
> I actually had to order some 3D printed one from some Japanese guy
Thank you for mentioning this alternative! Just found him on Etsy. I have always preferred the old-style concave nibs for TrackPoints over the new convex ones. This will be a serious improvement for my Thinkpad.
Work gave me a Dell XPS last year. I had it for a week and gave it back, because when I turned it on the fans spun full throttle, and the laptop still throttled itself. So I use my X1E for work. The nipple is as good, but it's flat for the thin profile.
I upgraded all 3 laptops with more or better RAM, WiFi, and SSDs. And I'm currently ordering a Traditional Chinese keyboard for the L5P for the wifey. But I took AMD for the Legions.
I regret not opting for the Ryzen variant, but I really wanted to replace my old desktop computer for the occasional gaming session (although they are rare these days) and hence the need for Thunderbolt.
I've bought it without any OS installed. But of course the general practice may be good reason to avoid Lenovo (if you can live without a TrackPoint, that is).
Some of the latest models in the "classic" lines with one or more RAM slots: P14s Gen 2 AMD/Intel (a.k.a. T14 Gen 2 AMD/Intel); T15 Gen 2 / P15s Gen 2; P15 Gen 2; T15p Gen 2; P1 Gen 4; X1 Extreme Gen 4. Plus some of the slightly more "budget" ones like L14 Gen 2 AMD/Intel and E14 Gen 3 AMD.
Tip: search "<model name> psref" to quickly get to a PDF with specifications.
This looks really cool, but the author... is... replacing their laptop every year? Like, I'm writing this on an 8 year old MBP that has survived as my round-the-house driver because it still does everything well. My daily driver is getting on a bit now (3yr) and my desktop only just got replaced after 5 years.
As the other comments have noted, Cory addresses this further down in the linked post. He further expanded on this in the post he wrote when he quit smoking[0]:
> That was my homework: go away and think of an immediate reason not to smoke. When I came back, I had my answer ready: “I spend two laptops per year on smokes. That money goes directly to the dirtiest companies on Earth, the literal inventors of the science-denial playbook that is responsible for our inaction on climate change. Those companies’ sole mission is to murder me and all my friends. I’m going to quit smoking and I’m going to buy a laptop this year and every year hereafter, and I’ll still be up one laptop per year.”
Well, maybe he's happier this way? There's a classic joke about a lifelong smoker talking to a stop-smoking counsellor:
"With all the money you've spent on cigarettes in your lifetime, you could have bought a Ferrari."
"Do you smoke?"
"No."
"Then where's your Ferrari?"
It's a good question. Most of us have the financial capability to be extremely extravagant with a few select areas of our life, but instead we average everything down to boring mediocrity.
The labor and physical footprint needed to produce modern electronics is completely insane. You're comparing little league basketball to major league baseball, and it's not like a player like Framework is going to change this at all.
There is a severe ecological impact to the wider environment that comes from electronics, let's not kid ourselves. That doesn't mean buying electronics makes you, like, a terrible person, but if you're sitting around proselytizing on blogs like Doctorow's about how these companies are killing you, it's a bit funny to essentially go from a thing that kills people you know in the first world to one that only kills people in the third world you never cared for. Modern comforts like cutting-edge electronics have extreme externalities. Like, okay, let me just throw the "murders people I care about" problem over the fence, where it will surely not be an issue for all those people halfway across the planet from me (that I coincidentally do not care or think about).
In general I'm not trying to be too hard. It's not like anyone else deals with this level of cognitive dissonance much better, and I say that as someone who mostly quit cold turkey over a year ago...
>it's a bit funny to essentially go from a thing that kills people you know in the first world to one that only kills people in the third world you never cared for
I feel like you're not really representing his argument for quitting fairly. He does talk about the effects of tobacco on the developing world, for one, and his overall reason seems to relate more to the wider idea of tobacco companies being pioneers of the misinformation industry.
My mom smoked for about 45 years and stopped the day she found out she was having a granddaughter. She didn't want to smell like smoke around her. Hasn't touched a cig in years. The whole family is better for it.
Kudos to him for quitting. I quit, oh, about a dozen years ago. When I decided to quit, every time I smoked I told myself they taste like shit; every drag off the cigarette, I told myself that. Eventually (about 2 or 3 months, as I recall) it worked: I could no longer stand the taste and haven't touched one since.
I don't care enough to do it myself, but I want someone to fact-check him on the environmental impact of a MacBook's worth of cigarettes versus the MacBook itself. It'd be funny if the MacBook were ultimately worse for nature.
Back when I was making crap wages, I would get the cheapest laptops I could afford that would more or less give me decent performance (on the order of ~$500-600). It's not too hard to find a new laptop that performs well at a reasonable price, but you always run the risk of them reclaiming those costs by cheaping out on all the mechanics of it, and it's not like I was able to afford paying ~$2000 for a high-quality machine. Usually within 2 years, the laptop would just start falling apart, I would get sad, and then I would repeat the pattern.
After the fourth or so time of doing this, and after getting higher-paying jobs, I ended up biting the bullet for a more expensive computer, and it lasted me five years, and I only replaced it because I wanted more RAM.
Point is, if you're lower-income, it's fairly easy to get stuck in the "one laptop a year" trend, because, while probably a better deal in the long term, it's really hard for lower-income to justify a multi-thousand dollar expense. I'm a proper tech bro now so buying a good computer isn't the worst thing in the world for me, but that wasn't always the case.
I bought a low-end laptop back in 2005 and used it for about 2-3 years until it started to fall apart. It just didn't hold up (the hinges started to disintegrate). Its performance was terrible, too, and it couldn't be upgraded.
I got a business class laptop in 2007 for probably 3 times as much. That laptop lasted me until last month. I maxed out the RAM and replaced the HD with an SSD about 7 years ago, but it was ultimately the now-anemic CPU and graphics that got me to buy a replacement. I'd have replaced it last fall but laptop stocks were too low.
Looking at your trend, you've got $500 laptop / 2 year, or $2000 laptop / 5 year, which reduces to $250 laptop / year vs $400 laptop / year. Getting low cost laptops isn't necessarily a worse financial outcome, although it depends on how fast the processor updates are moving; when a 2020 intel cpu is about the same as a 2015 intel cpu, it would probably have been better to pay a little more in 2015 for a faster one; when a 2015 intel cpu smokes a 2010 intel cpu, incremental updates every year or two mean a low cost 2015 cpu is probably better than a high cost 2010 cpu. Plus, you get a battery refresh (even if it's small).
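The amortization math above can be sketched in a few lines. The prices and lifespans are the commenter's rough figures, not real data:

```python
# Naive amortized cost: purchase price spread over useful life.
# Figures are the rough examples from the thread, not real data.

def cost_per_year(price_usd, lifespan_years):
    return price_usd / lifespan_years

cheap = cost_per_year(500, 2)    # low-end laptop replaced every ~2 years
pricey = cost_per_year(2000, 5)  # high-end laptop kept for ~5 years

print(f"cheap:  ${cheap:.0f}/year")   # cheap:  $250/year
print(f"pricey: ${pricey:.0f}/year")  # pricey: $400/year
```

Of course this ignores resale value, battery decay, and the CPU-generation effect described above, which is exactly why the comparison shifts depending on the year.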
I think there's more junk at the low end to avoid, but it's not as if the high end doesn't have a lot of junk to avoid. Either way, you have to do careful shopping.
It's like just my opinion, but a lot of higher end laptop spending seems to be on increasing the screen's DPI, which is then run with scaling, at the cost of more CPU, more RAM, more GPU, and more software BS. Buying a cheaper laptop with fewer pixels that just runs 1:1 saves all that extra computation and BS, and maybe looks a bit less nice. Sometimes glossy screens are reserved for the high cost laptops, which is like wait, I want a matte screen, so I have to save money to get one, great!
Yeah, I've actually done this math too, though I don't think it's quite this simple. When a laptop started falling apart, I usually tried to just put up with it until I couldn't.
For example, I used to have an Asus computer whose plastic surrounding the screen decided to start coming detached from the lid. This made the laptop substantially more fragile and annoying to use, and after a certain point I tried to remedy this with gorilla glue, which led to an ugly mess on the bottom left corner. The laptop still "worked" in the sense that it still did computation, but it was crappier. Then the 7 key broke off the keyboard, and I was unable to put it back on, so I just decided I didn't need the 7 key, since I didn't type 7 that often, and when I did I could still hit the little switch. Again, the laptop still "worked" in the sense that it still did computation, but it was crappier. A bunch of other stuff ended up happening (e.g. the LED for the backlight started to go out and become this flickery mess, the connector to the battery didn't always seem to make contact, etc).
Stuff like that starts to add up, and "experience" is substantially more difficult to quantify. I bought an expensive Macbook, and I never had any issues outside of the inevitable "Moore's law" depreciation.
> I bought an expensive Macbook, and I never had any issues outside of the inevitable "Moore's law" depreciation.
I hope that keeps going. I used a macbook for work for almost 8 years, and they did OK, but I had one that decided not to take external power, and the hard drive wasn't removable; thankfully I noticed it wasn't charging while the battery was near full, so I could pull a backup to a spare work machine. And then there was the year where iTunes would have a 25% chance of spewing high-volume digital noise at me instead of playing music. I guess that was a software problem, because it went away with the next major OS X release, but there was no useful forum content. I think there was something else bothersome too, but I'm not sure anymore.
I have an apple macbook air from mid 2012 that I paid $1200 for. If it survives 6 more months, then I've spent $120/year on laptops over the last 10 years.
I have a 2015 air that was $1k. I expect to get down to $120 per year in a couple years, but I would have to add $10/year for replacing the battery every few years.
Instead of buying a brand-new potato, consider buying a used or refurb'd laptop. Your performance situation doing this is much better now than at any other time in PC history, because PC hardware performance gains have flattened out. Plus, corporations get rid of perfectly good PCs like, every year because they want the latest model for their staff and especially their executives/management. Know where to shop and you'll find a glut of cheap and even free computers. I've been poor myself; used machines are how I got by. That and building my own.
Either way, you'll pay about as much for a used ThinkPad in good condition with good specs as you would for a new HP Stream or other cheaptop.
> Plus, corporations get rid of perfectly good PCs like, every year because they want the latest model for their staff and especially their executives/management.
Tangential, but a bit of a lifehack I figured out awhile ago is that corporations dump off old servers on eBay for basically nothing, and most servers allow you to install a regular desktop graphics card in there. Servers usually have a lot of CPUs and a lot of RAM, so 9 years ago when a broke me needed enough power to do cool stuff on the computer, I would go buy a used server on eBay, and it was good enough for video processing and editing and gaming and distributed computing experiments...as long as I remembered to turn it off when I wasn't using it. Whenever I would accidentally leave it on for a few days, I would end up increasing my power bill by ~$40, a lot of money when you don't have much.
It's a trick I still use occasionally, even now that I make decent money. I semi-recently bought a 48-core, 128 GB RAM server for around $400, which I use for any big computing experiments. Could I just spin up an AWS box with these specs? Probably, but I think there is value in having the hardware locally.
I once scavenged an HP workstation from behind a dumpster. It was just sitting there in the rain. I brought it in, dried it off, and checked the innards for rust or damage. All looked nearly brand new, so I let it dry out for a couple of days, and powered it on -- it worked. Put a hard disk in and it was ready to go. It's a fairly powerful machine, with four cores and 12 GiB of RAM, a real powerhouse for 2012 when it was new. Probably chewed through many a spreadsheet back in the day. Now I'm making it into a build server.
That's awesome. I think my wife would punch me if I got into the habit of dumpster diving, but there have been multiple times where I've seen what looks like awesome equipment (monitors, computers, surge protectors, etc.) being thrown away near universities and office buildings, and I always have to resist my hoarding nature to take them.
Four cores and 12gb of RAM would make a pretty solid build server, with enough room left for a Minecraft and video streaming server to boot! Sounds like a pretty awesome find.
I found it's better to buy a second hand top model, or even last year's best on sale, than brand new low quality stuff.
It's a little less visible for laptops than for, say, kitchen appliances, but even there my thinkpad x220 was bought and upgraded for €400 in 2015, and it did its job well until halfway through this year.
I usually buy top quality laptops second hand from shops that give at least 6 months warranty. Best strategy. You get a $2000 laptop for $500. And honestly, Intel did not do too much in the last decade, so these are of great value.
I was going to say the same - you are often better off buying a quality not-too-old used machine than buying a crappy new low-end one. Better for the world too. However, I tend to keep my gear for a very long time (hello my well-loved 2007 MacBook Pro), so I can justify buying new (w/warranty).
Yeah, I got my fancy macbook pro now because I used to work for Apple and had a pretty substantial discount on it as a result. When I need to replace this one, I'll probably get something decent in the used market and just install Linux on there.
> The environmental consequences of that system weren't lost on me, even given my very good track-record of re-homing my old computers with people who needed them.
If it causes other people not to buy new laptops, it kinda is addressing it. (As long as we assume the people getting the old laptop would have bought a new laptop, which might or might not be the case.)
And apparently it made sense for him to pay $150/year to get his laptop fixed in 24h if needed, and buy two powerbooks at once... I guess what he really should have bought is a Toughbook instead of a ThinkPad?
Buying a new thinkpad every year is especially confusing to me given that Lenovo’s switch from mobile to ultrabook processors in the x40 series meant that for around 5 years, buying a newer thinkpad than the x30 series meant getting a speed downgrade.
Ah, hehe, I got confused opening the tweet. I wonder why I had in my head that this was a woman writing the story, must be the monica-byrne in the url :)
The author, Cory, links to a previous explanation of how, when he quit smoking, he converted the cost into getting a new laptop annually. As he mentioned in the article, he typically finds a new home for the used device. The laptop appears to be his primary device and critical to his work, so annual updates make sense, though needing a whole new device is partially due to the kind of construction Framework eliminates (i.e., riveted or glued components).
He traded smoking for buying a new laptop every year. Now that it's been years, I guess he could quit and not buy a new laptop. But also people do more wasteful things. I do understand though, I drive laptops into the ground over many years but still 4-5 years per laptop
2013 MacBook Air daily driver here. MagSafe? USB-A? SD card reader? User replaceable battery? Runs Linux? All checks. It's light on RAM (which, for just Chrome and light app use, is honestly fine).
yeah, 1440x900 does kinda stink, but the screen is so small it doesn't really bother me too much. plasma does a good job shrinking itself down enough, and virtual desktops help. 99% of the time it's a full-screen chromium window, so who cares.
I would guess that MacBook Airs, especially the current ones, are sufficient for the needs of 80%, maybe even 90%, of the entire laptop market, who I presume just need to be able to use a browser and spreadsheets.
And they last for years and years, and I doubt the cost:performance:longevity ratios can be beat.
What you're supposed to do, apparently, is buy a brand-new laptop, use it for a few months, then flip it on eBay before it gets too old so you can recover most of what you spent on it and buy the next new laptop.
I worked with a guy who practiced this with all his personal hardware.
It's very, very normal for wealthy people to replace their daily-use tools every year, or even more often.
I replace my phone and laptop and iPad every year. I know people who replace their car and wardrobe and luggage every year, too.
In laptops and mobile devices in general, annual updates make a lot of sense as power efficiency is still regularly increasing. The M1 Air, is, for example, a fucking marvel. It's been out for way less than a year. I have an M1 Air, and will upgrade it again in less than a year when the Mx (where x > 1) Macbook Pro comes out.
> The environmental consequences of that system weren't lost on me, even given my very good track-record of re-homing my old computers with people who needed them.
They can just sell the laptop and someone else will use it. For example, I almost never buy new laptops, as perfect Linux support generally lags behind.
Ignoring the (potentially substantial) environmental costs, if you do it "correctly" the total cost of ownership is about the same.
If you buy a $2000 Mac and use it for as long as reasonably possible, it's going to depreciate by several hundred dollars (let's say roughly $300) a year. At a certain point it's worth nearly zero, and you must buy a new laptop. After 6-7 years your total outlay is $2000.
Alternatively, every year or two you can sell the old one for a few hundred dollars less than the new model, and buy the new model. You always have a new laptop. And your total outlay is still only about $2000. Plus you are covered by free AppleCare every time you buy the new one.
Plenty of people do this with mobile phones and automobiles and other things as well.
Please note that I am not advocating it. I was still using my 2015 laptop until very recently. But economically it is not necessarily insane.
(Assuming you are selling the old laptops, that is. It's not clear to me that the author is doing that. He says he's donating/rehoming them. Not sure if that includes selling)
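The two strategies above can be sketched side by side. All figures are the illustrative ones from the comment ($2000 laptop, roughly $300/year depreciation), not real resale data:

```python
# Keep-it-forever vs. sell-and-upgrade, using the thread's
# illustrative numbers: $2000 sticker price, ~$300/year depreciation.

NEW_PRICE = 2000  # sticker price of the hypothetical laptop

def keep_it(years):
    # Buy once and run it into the ground: outlay is just the sticker price.
    return NEW_PRICE

def churn(years, annual_depreciation=300):
    # Sell each year and buy the new model: you pay only the gap between
    # resale value and the new model's price, i.e. the depreciation.
    return years * annual_depreciation

print(keep_it(7))  # 2000
print(churn(7))    # 2100 -- comparable outlay, but always a new machine
```

The catch, as noted, is that this only works if you actually sell the old machines rather than rehoming them for free.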
Though, with Macs, it's trivial - their migration tool is peerless. (As one would certainly expect and demand: since they control the whole software/hardware stack)
Given that the expansion ports are just USB-C ports, the real innovation (one of them) is making integrated dongles. The only issue I see is that the adapters are a bit too large but also a bit too small; the adapters are large enough that there are only 4 but also small enough that making a 2x USB A adapter would be quite challenging.
I think the idea could be expanded upon: dongles that extend beyond the chassis such that they protrude a bit but make flush contact with the laptop chassis. This would let you basically turn a typical USB dock into a permanent extension. Perhaps instead of having two expansion ports on each side, each side of the computer, including the front & back, would have one larger port. I don't think having two USB-C ports does anything to increase overall bandwidth available, since it's probably all shared anyway.
Also once Thunderbolt 4 and USB 4 are available it should make creating standardized expansion adapters a lot easier since you won't have to worry about an adapter needing Thunderbolt on an AMD system.
>the adapters are large enough that there are only 4 but also small enough that making a 2x USB A adapter would be quite challenging.
This was my thought as well, but they're planning on doing more interfaces than what they're listing now, so it may be they went for universality with the size.
Also USB-A is going to slowly die off anyways, so I doubt it's good to build the dongle standard to two USB-As.
But why the dongles? They go against all the principles of the rest of the design. They're proprietary, consumable, and they waste space in the chassis. They limit what you can do with the USB-C they pass through, too.
They are actually open source[1], so you can print your own case on a 3D printer and put the circuit you want inside. About the USB-C limitation, I initially thought it was a limitation too, but I'm not sure after learning that you can use them as Thunderbolt ports. Anyway, I think the notebook seems to be thin enough and the possibilities for dongles are exciting[2] - like MagSafe chargers[3] (which seem to have some patent problems).
Only a guess but they also act as a sacrificial device in place of the motherboard's USB-C connector. I've read here on HN that Macbook USB-C adapters regularly fail.
Louis Rossmann recently reviewed the Framework laptop and basically said the exact same thing, about them taking a lot of the stress off the soldered USB-C portion and transferring it to the chassis instead.
I think I read somewhere that they were planning to open up the specs of those modular ports. That may alleviate some of the points you bring up. (Can't find the source right now though.)
In what way do they limit the USB-C they pass through?
> In what way do they limit the USB-C they pass through?
GP means that if you use one other than the USB-C pass-through module, it's (trivially) a proper subset of the USB-C functionality that you started out with.
I don't really see the issue: the alternative is that you do that in a dongle more external to the laptop anyway?
As a long time thinkpad user I love mine. Only problem I have now is my go-to OS Ubuntu does a really bad job of fractional scaling out of the box and none of the workarounds seem to really solve the problem. Sadly the crappy screens on my thinkpads more or less "hid" this problem from me for years.
I agree. I really like the keyboard on the original Google Pixelbook, the Pixelbook Go, as well as the Framework. Nothing compares to the pre-chiclet Thinkpad keyboard though (IMHO).
I might try that. I've tried Ubuntu 20 and 21 and they still look blurry to me pretty much no matter what I do. maybe I can figure out what PopOS includes in their distro and I can set that up on Ubuntu.
It feels like for years we were constantly told by these corporations "it's impossible to have slim, tiny laptops and make them customizable", and I guess at some level I believed that, and just accepted that laptops with upgradable components were a thing of the past.
After seeing the framework, I'm more than a little annoyed that I fell for this. They proved you can have a slim, clean laptop that's somewhat modular, and more impressively, with something like 1% of Apple's budget to do it. Had I known about it, I probably wouldn't have paid an arm and a leg for a maxed-out Macbook Pro a year ago. MacOS is nice, probably my favorite consumer operating system currently available, but Apple's walled garden approach is beyond annoying.
Linus (the "tech tips" one) said something that sticks in my mind: "The only reason other companies can't do this - and Framework proved it - is because they don't care."
"Imagine being an engineer at a company at Apple, and it being your job to design the mechanism that makes it so that machine cannot start up unless the chassis is fully sealed. Apple spent actual fucking money making sure that product would not work unless it is in the exact chassis they shipped it in."
Apple consistently demonstrates that their most important customers are their shareholders. They are experts at walking the line between maximizing profits and alienating their regular customers. If they felt that a modular computer would have a higher ROI, they would be all over it.
Honestly, I would not be surprised if Apple 'invented' the idea of 'integrated dongles' before their next keynote so they could sell you a $95 usb 2.0 port.
For what it's worth I don't think Apple has actually ever done this, and whatever made him believe they do was probably some other oversight during their disassembly/reassembly of the laptop
To be fair, if a company wants to produce something that only works on one set of hardware, that should be fine. We simply choose not to use it, right? And many of us /do/ choose to use it? But why do we choose to? Because we find that we're too busy to maintain a Linux-based workstation.
While there are questionable practices by Apple and many other machine producers, what you can't argue against is that in limiting the hardware that MacOS has to work with, they're able to deliver a level of stability and user experience that you don't get with Linux.
Sure, it would be great if we could replace the batteries, if we could upgrade the memory, and easily fix broken parts, but that isn't the company's ethos. The company produces devices that are plug and play, high grade consumer electronics. Nobody forces us to buy these products.
Anyway, that being said, the framework machines look super interesting and if they were UK available, I'd probably get one for a non-critical Linux-based workstation.
As if choosing a $1k+ computer to use for years was equivalent to choosing the flavor of ice cream scoops.
The "voting with your wallet" argument doesn't work when there's several variables in play, and the optimal configurations don't exist on the market. Like e.g. I'd like to buy a computer that's just like Macbook, except with repairable/swappable/upgradeable components. Or a phone that's just like iPhone, except with replaceable battery, a headphone jack, and repairable home button. But I can't have them - even if I'm ready to pay a bit extra, and if I'd welcome a thicker device. These options literally don't exist. Nothing similar to them exists. Particularly on the repairability front, every vendor is choosing to just not offer it.
> I'd like to buy a computer that's just like Macbook, except with repairable/swappable/upgradeable components
That's the thing: making something plug and play and mostly "driver-free" would be very hard, almost impossible. Framework laptops look amazing, but they will require at least a bit more maintenance and knowledge, and that is fine too.
You say all this, but you would agree: we really can't be telling private companies or individuals what to do and what not to do with their technologies, right?
1) How do we enforce that at smaller scales?
2) How would we prevent our regulation from squashing innovative solutions to problems, or enhancing safety in critical applications?
I agree with your larger point, regarding limited hardware support, etc.
I agree that Apple shouldn't have to support random mods / hardware components / etc and that their selling point is "it just works".
But then again, they don't have to be dicks about it. If they're able to detect that the hardware has somehow been modified, maybe just show some message along the lines of "you've modified the hardware, we're not supporting this anymore, you're on your own" instead of bricking it.
Where is that quote from? I wasn't able to find it via Google. Anyway, a computer that refuses to turn on after being tampered with does have its uses, particularly if your threat model is government secret services.
Realistically, if your threat model is government secret services, and you're using unmodified consumer grade electronics, then you're in 'danger' no matter what. You can't effectively mitigate a threat at the state level using resources produced under the watchful eye of the same state. All they have to do is ask the producer to swap out the device they gave you with a device that comes compromised out of the box. And that's assuming the tech is perfect. Most likely they just hire someone to defeat the countermeasures. However many resources Apple has, I assure you even the most janky state has more.
Special hardware seems 007 childish to me. What's better, having a high-tech tricked-out phone/laptop, or to just have a random stock Android with an inoffensive sim card in it? It seems obvious to me that if you're being targeted, tailed and tracked and probed, you've already lost.
No, I mean if you're buying a laptop off the shelf and not ripping telemetry components and whatnot (WIFI card/airgap for example). Customizing hardware to foil any out of the box attacks, rather than some sharks-and-lasers config to 'protect it'. Governments do this all the time for even slightly sensitive information.
Commenter above was saying though that the device's anti-tamper tech would save you from state level attacks. I'm just getting at the fact that that's not going to work, since if a proverbial 'they' want to take you out, there's other ways to do so you can't overcome. Just a few examples that came to me about how easy it is to foil anti-tampering measures.
Your "random stock Android" likely has a boatload of exploits open unless it's a Google Pixel.
> It seems obvious to me that if you're being targeted, tailed and tracked and probed, you've already lost.
Depends on which government agency watchlist you are on. If you are some sort of Islamist terrorist, the tools that are open to the government are far more capable than if you are some sort of low-level drug dealer.
There are legit security reasons you’d want this. Giving the owner of the equipment the ability to manage this would have been the appropriate solution.
There are legit security reasons to employ platforms that accommodate in-house repair. 'Security' can also include requirements for traceability at the component level.
Well, the other reason is that customers don't care either. I certainly don't. The last desktop build I made 5 years ago hasn't had a single component changed in that entire time.
It's not just about upgrading components, but also about having more choice during initial specification. It's a real PITA trying to find laptops that have almost everything you want, and inevitably you need to make multiple compromises.
And it's also about replacing broken components without having to ditch the whole laptop.
I'm really excited about how Framework could potentially shake up the whole industry!
> Apple proved that most people don’t want too many choices
They proved that taking away choices is still better than the shitshow their competitors are running.
Apple proved that a few simple product names are less confusing than literally 6 different "brands" of laptops from a single company. That doesn't mean people don't want choices though, they just don't want to feel like they're getting trolled by badly designed websites throwing all the possible laptop configurations in their face. Even if you know what the specs mean it still feels like a major waste of time to try and compare the 20 devices on the screen.
I want to configure every detail of my laptop, not 2 details on one of 200 laptops.
Apple also proved that making devices difficult to upgrade, maintain and repair is harmful for everything and everyone other than Apple.
That doesn't make it impossible to get such configuration options. If I can choose how much RAM I want, they could just as well offer the "no RAM" option so I can keep the two modules of my old laptop. Instead they lie to our faces claiming RAM has to be soldered on for some reason, making that impossible. Same with SSDs etc.
I would understand it if a manufacturer offered some ultra high-end model with custom storage like in the PS5. But if the Framework laptop can have swappable RAM and SSDs with almost the same thickness as a (insufficiently cooled) Macbook it's obvious why that stuff is soldered in.
I'm not pushing back too hard on this idea, because in general you are likely right about almost anything, that most people don't care that much. However, I'm not sure that Apple really proved that most people don't want too many choices. The choice to buy an Apple computer could be for any number of reasons. Like for example, I really like macOS and the integration between my iPhone and my MacBook for things like iMessage. Anytime I've bought an Apple computer it's felt like I have to compromise on the hardware options, but I still do it because I like other aspects of the overall ecosystem.
I don't have market analytics, only single cases, but nobody who has ever asked me for help about a computer has wanted to know what the difference is between this 'Intel' part vs 'Celeron' part vs 'AMD', carried on to graphics, disk technology, etc. They typically not only don't indicate a desired minimum amount of memory, but cannot reliably talk about system ram vs storage.
What they want is to know they are getting a good deal and that they aren't buying a lemon, something that cannot meet their needs.
What Apple did is decide they should really distinguish on classes of identifiable hardware differences, eg. a better larger screen for a "pro" class, have good/better/best distinctions within that, and customization for those who are picky.
I assume the intersection between people who have particular hardware requirements and those who do not understand their hardware requirements is extremely small these days. Apple doesn't sell computers which really fall short these days, so I'm able to focus the conversation on usage, user-impacting hardware features, and long-term budgeting (e.g. planning even as far as the replacement for the machine they are buying)
Look back to Steve Jobs’s return. Long before they had ecosystem lock in, before even the iPod was released, he simplified the range drastically to make it easier for people to make the buying decision.
Also, anyone who says “it’s felt like I have to compromise on the hardware options” is an outlier by definition.
I feel like this is a bit disingenuous. The general populace/average consumer prefers simplified options. They aren't as tech savvy as many here are. When you throw a bunch of specs at them, their eyes glaze over. And then they ask you if they can get on their Facebook.
It makes sense from a business sense to have fewer models with small changes between them. You could have tech workers that assemble every custom order. That costs a lot more than a simplified inventory of a few different models that are already pre-assembled, with no hardware customization.
I like the Framework laptops as well, but I think there's a problem I'm not sure is gonna work out in the end.
If I understood it correctly, you can buy replacement parts only from Framework. They will have supply issues, and customers will be unhappy.
I guess if the laptops don't ever break, or if the customers' need for replacement parts is more theoretical than real, it could work out. Or they somehow manage to get over the small, niche manufacturer hump and become a Lenovo with massive scale. I doubt that's gonna happen.
> If I understood it correctly, you can buy replacement parts only from Framework. They will have supply issues, and customers will be unhappy.
RAM, wireless, and storage aren't chained to Framework, and are effectively the only parts you can reasonably buy for any existing laptop in the current day.
I would not be surprised if battery and screen replacements start popping up, but that's just a guess not something I'd bet on.
That Framework will be the only suppliers of parts that other laptops don't even attempt to make replaceable is not a worrying situation, it's a hopeful one.
AFAIK they generally use the same industry standard interfaces for RAM/SSD/etc as all other non-shit laptops.
The problem would come in if a Framework-specific part breaks, but at least those generally seem to be pretty simple (apart from the motherboard, at least).
I haven't checked everything but the RAM looks like it is standard: "For memory, the Framework Laptop has two SO-DIMM sockets supporting DDR4 DRAM at up to DDR4-3200 speeds"
People care about Right To Repair; that's why the FTC has been pressured into action recently on the matter (not that I entertain any hope that that bunch of bought-and-paid-for bureaucrats will actually achieve anything.) Framework has blown a vast hole through the false arguments offered in opposition. One must simply care. That's all it takes. Every manufacturer that has opposed RTR has the means and talent to do at least as well has Framework has done, and probably better. They just don't care.
Thankfully the vestigial remains of our free market are sufficient to run the experiment.
Do customers care though? Perhaps customer advocates do. And that's probably the best place for it, since it's such a niche and wonky idea.
And that's why "free markets" will never solve this. (And that's whether the "free" in free markets means freedom from regulations, or freedom for people to participate in the market).
IMHO this is why the European system of strong regulatory bodies tends to work better than the US system of "wait for a customer to experience damages, then recoup through the courts, and then the companies learn their lesson."
And yet here we are; despite the 446e6-strong marketplace the EU supposedly represents, their regulatory power has not delivered what we see here. No, instead we have an American company, motivated only by the belief that their product will succeed in the market, kicking open the door.
Well the question is do you want every laptop to have the customizability of Framework's laptops, or should there instead be a minimum bar set for warranty/repairability? I think the second is probably what's needed, and the EU has been better both at imposing standards and ensuring warranties and repairability.
The answer is I want a competitive market filled with options that range from completely sealed, disposable monoliths to machines like this Framework product where components are easily replaced and/or upgraded by me or any qualified or unqualified person I choose. And I want that _without_ the easily circumvented bureaucratic hellscape of lobbyists and captured regulators incestuously welding down the status quo in perpetuity.
Have you considered that the current landscape is a product of a competitive market? Historically laptops were never as repairable as desktops. Most parts of those bulky 1990s Powerbooks, Latitudes, and Compaqs were hard to access due to proprietary screws. Every laptop manufacturer had non-standard components and non-standard ports, and those components and ports would evolve every 6 months. If you wanted replacement parts and you weren't a corporate repair shop, you were shit out of luck before Ebay existed. The adhesive-sealed laptop that you resent is a product of the standardization that corporations and suppliers eventually sought after going through the Wild West phase of the mass market PC.
A competitive market isn't a merry-go-round where every idea gets its turn under the sun for all eternity. It's an arena where some rise and many perish. In the '90s and '00s, many ideas fell through, many companies collapsed, and many technologies became outmoded. What has come out of that is the sealed computer of today.
"Have you considered that the current landscape is a product of a competitive market?"
I have. I note that large numbers of people build PCs from components and that this market is large enough to be a primary concern for a constellation of manufacturers, and has been for decades. You can buy an IC with 1200 contacts and install it yourself on the kitchen table. There is no other segment of the microelectronics world where this level of commoditization exists, and yet it has stood the test of time. Transferring this behavior to mobile machines seems like an inevitable and long overdue step to me.
"Historically laptops were never as repairable as desktops."
History is a poor yardstick here. A number of forces have emerged that change the landscape. Among these are amazing design tools that enable a startup to go from zero to a complete, shipping modular design in 18 months (establishing a de facto standard, btw), tooling that delivers rapid fabrication in small volume, standardized high-performance serial buses that enable simple yet powerful architectures, robust solid-state storage devices, and the integration of some difficult components into CPUs. It used to require the resources of major manufacturers and their proprietary knowledge and capabilities to pull off marketable mobile designs. That era has passed and the commodity era is here.
"The adhesive-sealed laptop that you resent"
I do not resent monolithic products. I own several. I will buy more. I resent the lack of a choice. I expect that modular mobile machines will take their place among the equipment I acquire, and that these will become the major focus of my concern, whereas the monoliths will be relegated to ancillary tasks.
"What has come out of that is the sealed computer."
And they won't go away. The question is how much room is there for modular systems. I believe there is a lot. I imagine a Newegg filled with commodity mix and match mobile components from a vast number of vendors.
Exactly. And in particular, I want companies showing the absolute limit of what's possible if you don't worry about modularity, and other companies like Framework showing how much of that they can provide while also using modular components. That's two different directions of innovation that both need pushing, as useful competitive forces that people care about.
Ideally I would agree with you, but reality demonstrates that markets tend to converge on one standard rather than let two coexist. CISC vs RISC, Firewire vs USB, Floppy vs Zip, IrDA vs Bluetooth, etc. Now it's modular vs integrated.
That happens with technologies where there's a strong benefit to standardization. Standardizing on USB and Bluetooth means your devices can interoperate.
There's already no "standard" laptop design, just a set of desirable properties people want. And there's already no push to converge; there are many laptop vendors. There's plenty of room for a new vendor with different priorities (like modularity); there's more room for such a vendor than there is for one more undifferentiated vendor.
The minimum bar is a theoretical idea. The reality is yet more audits and auditors and internal regulatory staff to produce more documentation that "proves" compliance. It's a huge weight, not lightly welcomed.
If we increase the warranty requirements for companies, the repairability will necessarily increase as well.
However, keep in mind this still may not result in better third-party repairability - it may mean things like the manufacturer more easily reclaiming components off of boards to put into refurbished swap-out units.
Except that regulatory burdens are the kinds of barriers to entry that prevent a company like Framework from existing in the first place.
That just leaves you with entrenched companies doing the bare minimum for compliance, and lobbying for loopholes to protect their own market positions.
Here in the U.S. any kind of basic regulation == communism, but when I imagine "right to repair", I imagine a free market where Apple is allowed to sell glued-in batteries and Framework is allowed to sell repairable products, but all companies must publish the private internal repair documentation they already have and sell the replacement parts they already have, if available. Apple may legitimately not be able to sell replacement batteries, if even Apple themselves can't replace them, but at least there's transparency for the consumer. Eventually I imagine Apple could no longer get away with this practice, not because they are legally forbidden, but because people would become aware of it.
All those companies want to hold out for the idea that they can monopolize some corner of the industry - none of them want to be turned into purveyors of commodity products in the face of heavy competition. But the future likely lies down that path.
A desktop you built using standardized components you sourced from a competitive market with a plethora of alternatives specifically designed for easy assembly. Should any component fail you can obtain a replacement and perform the repair yourself. Doubtless these affordances are part of why you chose to assemble your machine yourself.
You do indeed care. The inability to extrapolate this to laptop machines seems obtuse.
No, I still don't care at all about the customizability. I did it mostly because I had to use Windows but didn't want to deal with an OEM's crap ware. The customizability was actually an impediment to get what I wanted: a box.
I don't doubt that a huge market of interchangeable parts made this easier. But it's important to separate the ends from the means here, when it comes to customer concerns. The customers that want customizability and upgradability are a vanishingly small slice of customers. (Just as are the ones who want to run Linux, and that small slice does include me.)
"I did it mostly because I had to use Windows but didn't want to deal with an OEM's crap ware."
Can you not imagine the vast market of people that might want a laptop not loaded with OEM crap ware? Because that is exactly what could emerge if Framework manages to establish a market of commodity mobile components.
The simpler route would be to have an OEM that didn't install crap ware, rather than having to order a basket of parts and assemble them.
I went the basket route because it required less research for me, because I just wanted to get to my end result as quickly as possible.
Grander goals about establishing ecosystems that serve other eventual end goals is not the way that most money is spent. (Though I do spend my money that way in other areas, such as with climate action, the PC market does not matter that much to me.)
You have just cited two more excellent reasons for modular laptops and commodity components: component selection and the environment. That brings the total to three, including the "OEM crap ware".
You're a potential customer of this product, your cognitive dissonance on the matter notwithstanding.
I've had only laptops for over a decade. And I've had system board failures.
Honestly, I'd rage if I had to throw it out together with the perfectly good CPU, GPU/VRAM and maybe half the RAM and pay for a used replacement board with all of those integrated.
Or what, buy a rework station and risk damaging them or having them work improperly due to shit soldering skills?
I guess I could learn to fix the board itself. But that's pretty hard, there are no schematics, no components for most laptops, failures are not evident and one component can lead to a cascade of failures across the board. A used board was $100. Now they're ~$500 because there's a CPU and GPU there.
My next computer will be a desktop in a handcrafted case (I'm also trying to fit a Li-Ion battery/UPS between the PSU and components).
> (I'm also trying to fit a Li-Ion battery/UPS between the PSU and components)
This should totally be a thing. I wish it was a thing. It doesn't even have to provide hours of runtime, just needs to be enough to handle the occasional stupid California brownout.
You might want to reconsider. I've found a separate UPS to be invaluable because, as you'll find, just getting power to the box is not sufficient. Other devices need power as well, especially network switches and displays.
Yeah, it is the most efficient way to power everything (no conversion losses). 30 minutes is enough tbh.
The battery will fit in an empty PSU case, I just need some custom cables and connectors for the passthrough, my biggest problem is charging and switchover. Looks like I will need a custom board for that. I thought it'd be easier heh
Why handcrafted instead of the framework laptop?
Are you just itching to do that project, or is there some other issue you had with getting the laptop from them?
It's more future proof. And performance is unmatched. Handcrafted because I want the smallest, lightest microATX case. Should I share the design? I have it ready in SolidWorks.
I will still need a laptop away from home and/or as a portable display, that'll be one of my old 17 inchers or the cheapest one I can get (Haswell gen lol).
Whereas any dating of when my last desktop build was would be deceptive because they're replaced piece by piece. The closest occasion I can give is when I went from one to two desktops, but even then half the guts of the one I had before went into the new case, and were replaced in the old case with new purchases.
I mean, I've met people that only wear underwear once or twice before throwing it away, too, but I wouldn't say that's normal. People would generally rather replace a drive, processor, ram, or screen than spend 10x as much on an entirely new system. They don't because the manufacturers make that option difficult or impossible.
People didn't love VCR/TV combos, and people don't love this. Manufacturers love this.
Your desktop doesn't have a battery in it, like laptops do. Batteries are the one component in a laptop that is guaranteed to degrade over the course of a few years and eventually make the product unusable. The battery is glued in on the MacBook Pro, so it will eventually become useless. It's as simple as that, and very unfortunate. It's nice to have the option to upgrade the other parts too, though, and why not expect this, if it's clearly possible, as the Framework laptop shows?
Reminder that Apple isn't the problem. Their customers are. I know movie quotes are seriously lowbrow for this audience, but somewhat relevant here, and correct:
"That system is our enemy. But when you're inside, you look around, what do you see? Businessmen, teachers, lawyers, carpenters. The very minds of the people we are trying to save. But until we do, these people are still a part of that system and that makes them our enemy."
Not to Godwin the thread, but this absurd apologia for anti-freedom products from the standpoint of convenience and apathy reminds me of the old quip about Hitler making the trains run on time.
I hope a decade from now you love the soul-crushing Snow Crash-esque dystopia your choices will have created, because you will have absolutely no moral authority to complain about it.
The problem is that for most people there are no clear incentives to making the "good" choices in those matters, while there are many incentives to make the "bad" choice, like gaining an edge in the local competitive market by virtue of a more efficient tool. Prototypical tragedy of the commons.
But imagine if your next desktop build required you to also throw away your SSD, monitor, keyboard, mouse, and speakers. You can also re-use a case in future builds, and re-use a high quality PSU in future builds.
I also don't really care to replace individual components in my systems, but being able to upgrade laptop hardware without throwing away the chassis and storage seems pretty nice to me.
My next desktop build will require entirely new parts, and the prior build will be demoted to other uses, or given to someone else who has a better use for it. As a whole unit, it still has utility. If I scavenge an essential piece, all the rest go to waste.
If it were a big energy hog (it's not), then it may make sense to put it out to pasture and scavenge the parts for others.
The exception is my keyboard, which is like my toothbrush, if toothbrushes could last for two decades. That I will keep and move to the new computer, as keyboard technology is not advancing.
For laptops, I see even less utility for upgrades than for a desktop, but perhaps that's just me.
Easy upgrades mean easy repair. The opposite is also true.
You can move your keyboard to a new computer because you are able to detach it without melting half the device with a heat gun. With laptops it's not that easy. When the MacBook keyboards broke all the time a few years ago, a keyboard replacement meant also replacing the speakers, battery and touchpad. Not for any technical reason, but because Apple doesn't like screws. The MS Surface Pro and Surface Laptop couldn't be repaired by anyone, not even MS themselves - if a single $20-30 part fails you have to spend $1000 again. Doesn't sound like a great deal if you ask me.
You might not need or want upgrades and maybe you're lucky and nothing ever breaks. But having the choice only comes with upsides.
> The last desktop build I made 5 years ago hasn't had a single component changed in that entire time.
My last desktop build is still going strong nine years later. Swapped out a broken motherboard, upgraded to OCed DDR3, and stuck in a PCIe card for NVMe: good as new.
Only a tiny minority of people probably care about the upgradability of their laptop. Many, many more probably care about the repairability of their laptop, though.
I think you're right. Many people "care" about repairability in the market sense of getting pissed off that a keyboard replacement costs >$500 for some reason. The right kind of marketing could bring them onboard.
That's only the half of it - everyone cares when they have to repair their device. You start to feel the unfairness of it when companies charge you exorbitant prices for common components because they've designed it in a non-standardised way for that particular device. Or when you face the reality of (for e.g.) having to pay to replace the whole board because of a malfunctioning soldered RAM or soldered SSD, and realise how shortsighted it was to buy a device that is designed not to be repaired.
> The last desktop build I made 5 years ago hasn't had a single component changed in that entire time
I upgraded my monitor and changed 2 keyboards (I am hard on them). With a laptop, if you fuck up your keyboard, you probably have to use an external one, or hope that replacing your laptop keyboard is cheap enough and you can find a spare.
My laptop has a single thunderbolt port. I use it for my 5k monitor. If that port goes, I need to get a new laptop. (That new laptop will be the Framework laptop, if they offer a 15" hi-DPI option.)
If my laptop was a Framework laptop, I'd just need to buy a pretty inexpensive new port and swap it out. It's a pretty big deal, in my opinion.
I care. I often swap out parts. My family and friends care, because I help them swap out parts when needed, especially during critical failures when they need it working ASAP and can't risk some corporation formatting the hard drive for no reason.
And my coworkers care because they're fellow techies and do this stuff, too.
A friend of mine has a fairphone and broke its screen while he was staying with me. The fairphone is quite similar insofar as it is designed to be long lived and user serviceable. He ordered a screen, next day delivery, and changed it himself for about £60.
Soldered RAM significantly decreases repair incidents (from unseated laptop RAM) and increases runtime reliability (from direct electrical contact of said RAM). It allows for the machine to ship with an optimized RAM configuration (lane count, timing). It also reduces the part cost and device footprint.
Most RAM comes with a "lifetime" warranty. IMO if the soldered RAM is so great, they should give me a lifetime guarantee they'll replace the MB if the soldered RAM fails. Then I'd be ok with it.
And it's impossible to buy an upgrade for just one component. If you want 16GB of RAM you better be ready to buy an i7. If you want a nice 4k display you better be ready to buy a totally maxed out machine.
Just being able to use my own NVMe disk in something like a Framework translates into savings for me because I don't need a huge disk in my laptop and can reuse one that's too small for my server or desktop.
Linus has now invested in Framework, which constrains how he is allowed to review laptops on his channel because of the possibility of a conflict of interest. He says it's worth it to support what he believes is a great company with an awesome product vision.
"invested" is a bit of a misnomer here. He bought several of them for his employees. He was courted by Framework to buy into the company, but as far as I know, he noped out of that deal.
He posted a video a week ago indicating otherwise and that he is now a stock owner in the company. It's called "I'm Legally Obligated to Disclose This".
Linus has shown his flagrant disregard for impartiality or any form of integrity over and over again. Look through the LTT back catalogue and you’ll see that Intel is big sponsor of theirs. The fact that he gives “honest” reviews of Intel products doesn’t magically make it okay.
You’d never see Dr. Ian Cutress of AnandTech or Steve from GamersNexus pulling this shit.
Of course he makes much more money than both of them combined. Make no mistake that he is an entertainer and a businessman.
GamersNexus reviews products positively and then takes on the same enterprise as a sponsor all the time. They can do that because their viewers know that they will still be very critical of the next product.
LTT also really always had the proper balance. Sponsored reviews are marked, sponsored segments are marked, and they are not holding back on negative reviews for long term channel sponsors. They totally ripped into Intel again and again for the failure to compete with AMD, and at the same time have Intel sponsor new hardware upgrades for team members in a sponsored mini-series. Totally fair.
> at the same time have Intel sponsor new hardware upgrades for team members in a sponsored mini-series. Totally fair.
Of course Intel did a big marketing push right as their products were the least competitive, and I'm sure LMG was paid large. Putting that kind of stunt in the same league as GN reviewing a product from a company that previously sponsored them (which is, of course, all that LTT does) is simply ridiculous.
"Balance" is such a weasel word in this context. They're playing both sides, plain and simple.
AnandTech quit doing SSD endurance testing as soon as vendors started selling trash TLC and QLC. Is that a coincidence? Can you really trust them?
There's a point where you need to put some trust in reviewers because the industry is set up to make them dependent on the manufacturers. However, there's a huge difference between traditional reviewers where employees are doing reviews and new age reviewers where influencers are doing the reviews.
People like Linus and Steve have way more incentive to put their own integrity over short term interests like pleasing a manufacturer, so it's very unlikely you'll ever see them shilling for anyone. Getting caught doing that once would ruin their brand (and credibility) because they are their brand.
In other words, there are no scapegoats in the influencer space, so they have way more incentive to be completely honest and transparent.
I remember when Tom's Hardware was new and I think the current generation of influencers / reviewers are going to obliterate the traditional media companies that have turned into affiliate marketing shills.
I think a lot of people underestimate how financially important it is for groups like LMG and GN to maintain the trust of their core audience.
In the videos Linus does breaking down LMG's revenue, about a third of it comes from a tiny fraction of their audience - merch and direct subscriptions. I'm sure it's a similar chunk if not more for GN through modmats, mousepads, and Patreon.
Even if they sold out their integrity and still got millions of views, it's that "hardcore" audience they can't really get back. In an enthusiast space where a large chunk of the audience are professionals with disposable income, it's a lot to lose.
I haven't seen anything from Linus or Steve to suggest that's the only reason they care so much about their integrity, they both seem to genuinely care, but y'know parasocial relationships etc.
There is a conflict of interest there, but as a counterpoint, pretty much every time he does a build video it seems like he picks AMD (at least, post Ryzen).
This effect exists all across society. Some culture seems to drive laziness / selfishness across the (irl) social network. Everyone would care but nobody can pull the whole network in the right direction. That's how your company ends up without the right tool, the right app, the right something.
As a student, I do want my laptop to be slim and tiny, but I was misled by Apple into thinking slim and tiny is only possible if the laptop isn't at all repairable. Might sell my MacBook Pro for this thing. Though macOS is a guilty pleasure I will miss :(
I reserved a Steam Deck because I wanted to support Valve's efforts to expand support for Windows games (and by extension, apps) to Linux through their open source Proton project. I think the future of Linux is looking very bright.
I did a little research on this, and it seems the main (perhaps only?) problem is that all mobo configs come with an 11th gen Intel CPU with Iris Xe graphics.[1] Since the last Intel Macs used 10th gen chips with Iris Plus graphics, and Apple isn't making any more Intel Macs, it's likely that macOS will never support Iris Xe. What a shame. While I do intend to switch to Linux eventually, the ability to run macOS would have made it easier to switch from a MacBook.
Slim and tiny is what I wanted when thinkpads were smaller and lighter than average.
Once we hit five pounds and I had a bag that stopped caring about smaller laptops? Well that was about the time that desktops died and I could have used a workstation class laptop with some more flexibility.
But I opted for simple and put my energy somewhere else instead. Seems a lot of people did.
It's probably an important distinction between laptops and tablets that a laptop is free to expand into three dimensions when in use.
Strictly speaking, the throw of the keyboard when in use is not limited by the dimensions of the laptop when it's not in use. There is air above and sometimes below that the keys can occupy. Having the keys raise up when opening the lid might be mechanically impractical, but having the lid depress all of the keys is a matter of ignoring key presses until the lid is opened past an angle where it stops touching the top row of keys.
Based on the shape of the smudges on my screen I'm pretty sure that already happens to an extent.
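The lid-gating idea above can be sketched in a few lines. The angle threshold and the event filtering here are purely hypothetical illustrations, not how any real embedded controller works:

```python
LID_CLEAR_DEG = 15  # hypothetical angle past which the lid no longer touches the top row

def filter_key_events(events, lid_angle_deg):
    """Ignore key presses while the lid may still be resting on the keys."""
    if lid_angle_deg < LID_CLEAR_DEG:
        return []  # presses are assumed to be the lid itself, not the user
    return list(events)

print(filter_key_events(["a", "b"], 5))   # lid nearly closed: events dropped
print(filter_key_events(["a", "b"], 45))  # lid open: events pass through
```

With the lid nearly closed the events are discarded; past the threshold they pass through unchanged, which is all the "ignore key presses until the lid clears the keys" scheme requires.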
I don't even care about slim. Light yes. Slim doesn't really do a whole lot for me past a certain point (which for me was a decade ago). Being able to replace parts is way more important.
> we were constantly told by these corporations "it's impossible to have slim, tiny laptops and make them customizable"
Because it’s true. The thing is “power users” and “regular users” look at that tradeoff differently. The bad part of economies of scale is that they reward conformity (you can pick your Model T in any color, as long as that color is black).
The Framework would have a hard time competing in the general laptop market, but luckily for them, they don’t have to. There’s a niche for specialized products and they are taking advantage of it.
I’ve been looking at their laptops since they were announced. I’m just hoping they can release a Ryzen one, and then I’ll be on the fence between theirs and whatever Apple has to show for an ARM pro laptop (14”-15”).
The only reason these two groups look at it differently is because people in the "regular users" group don't know how bad things are, and how good things could be. What makes you think the Framework would have a hard time competing in the general laptop market? The baseline, preassembled model starts at $1000 and comes with Windows 10 Home. It has a quad-core i5, 8 GB RAM, 256 GB storage, a nice 2256x1504 display, and it's thin and light (1.3kg, 11.7" x 9" x 0.6"). Compare that to your other thin and light options at this pricepoint:
XPS 13: $1020
* i5
* 8 GB RAM
* 256 GB storage
* 1920 x 1200 display
* 1.2 kg, 11.6" x 7.8" x 0.6"
MacBook Pro: $1300
* M1
* 8 GB RAM
* 256 GB storage
* 2560 x 1600 display
* 1.4kg, 12" x 8.7" x 0.6"
This isn't even accounting for repairability as a feature. Consumers don't care about that as it stands, because they don't know they should. But once they realize, it will become a selling point, too.
>I’ve been looking at their laptops since they were announced. I’m just hoping they can release a Ryzen one, and then I’ll be on the fence between theirs and whatever Apple has to show for an ARM pro laptop (14”-15”).
I'm in the same boat. While in principle I would love to invest in a powerful AMD laptop with tons of upgradeability and decent Linux support, Apple's next offerings - which may bring upgraded displays, my favorite trackpads, impressive power, and ~20 hours of battery life - are very hard to ignore.
We suspect the Framework's high-brightness, high-resolution display is the culprit for its relatively poor battery life—the XPS 13 at the top of the chart is a 1080p non-touch model, as is the Acer Swift below it. Directly comparing 3:2 resolutions with 16:9 or 16:10 resolutions is an exercise in frustration—but the Framework's display offers noticeably higher pixel density than its competitors here, and that does not come for free.
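The pixel-density gap is easy to quantify. A minimal sketch, assuming a 13.5-inch diagonal for the Framework's 2256x1504 panel and roughly 13.4 inches for a 1080p-class 1920x1200 panel (diagonal sizes are assumptions for illustration):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2256, 1504, 13.5)))  # Framework 3:2 panel -> 201
print(round(ppi(1920, 1200, 13.4)))  # 1080p-class 16:10 panel -> 169
```

The 3:2 panel also has nearly 50% more total pixels to drive (2256*1504 vs 1920*1200), which is consistent with the higher power draw the review suspects.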
As battery tech gets better, you can replace it. And the glued-in battery in the macbook pro won't be 100% capacity after a bunch of charge/discharges.
It's a compromise in the short term maybe, but long term it's so much nicer.
I've replaced a glued-in MacBook Pro battery. It isn't a big deal and is very similar to the Framework, except the Framework has mechanical connections (screws and tabs). The battery replacement kit came with everything needed. It didn't come with additional capacity because the underlying changes in battery chemistry aren't there. The improvements in battery life mostly come from CPUs with lower TDP.
It's not just a status symbol, it's also a fantastic laptop. I honestly don't find it reasonable to recommend normal people anything other than an M1 Mac at this point.
As far as I can tell, this laptop has a 55 Wh battery. A macbook pro of the same size (13in) has a 58 Wh battery and the dell XPS 13 has a 52 Wh battery. What am I missing?
Capacity isn’t the only metric for batteries. The faster you draw down the more power converts to heat. Different battery chemistry changes that a bit, but also aggressive power management to flatten (and lower) the curve matters a great deal.
Apple nailed that during the same generation they introduced the unreplaceable battery. Better density, less packaging, and improved power management virtually doubled the run time on that laptop versus the previous. That was a huge deal at the time.
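A back-of-envelope calculation shows why capacity alone doesn't determine runtime; the draw figures and efficiency factors below are illustrative assumptions, not measurements:

```python
def runtime_hours(capacity_wh, draw_w, efficiency):
    """Estimated runtime; efficiency < 1 models conversion and heat losses."""
    return capacity_wh * efficiency / draw_w

# Hypothetical 55 Wh pack: light browsing draw vs heavy sustained load.
# Higher draw loses proportionally more to heat, so efficiency drops too.
print(runtime_hours(55, 5, 0.95))   # ~10.45 hours
print(runtime_hours(55, 25, 0.85))  # ~1.87 hours
```

The same pack yields wildly different runtimes depending on draw, which is why power management matters at least as much as the Wh rating on the spec sheet.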
> We suspect the Framework's high-brightness, high-resolution display is the culprit for its relatively poor battery life—the XPS 13 at the top of the chart is a 1080p non-touch model, as is the Acer Swift below it.
And the XPS has an i7-1065G7 [1] vs the Framework with an i7-1185G7 [2]. So the Framework has a better screen and a better CPU. I'm not sure I agree with running that benchmark without other data alongside it like a score or the average clock rate.
For example, I put a 2nd battery in a ThinkPad once and it had the effect of locking the CPU clock to <1GHz. The battery was predicted to last much longer than normal, but it was useless as a computer.
The only laptop for which that seems to be true is the MacBook, but no x86 laptop will come close to that. From what I've seen the Framework has a mostly uninteresting battery life, outperforming some likely competitors (like Dell's XPS 13 and MS's Surface Laptop) and outperformed by others (like HP's ProBook x360 and ASUS's Zenbook 13).
Interesting. Tom's Guide says pretty much the exact opposite, with the XPS coming in at roughly 78% the battery life of the Framework.
> In our battery test, which sets the laptop’s screen brightness to 150 nits and tasks it with endlessly browsing the web via Wi-Fi, the Framework lasted 10 hours and 17 minutes. That’s better than the Dell XPS 13 (7:59)
Maybe the XPS 13 configuration was different? Or maybe the tests were different in nature? Ars used PCMark 10, which is a standard benchmark that Dell could have specifically optimized for.
The Ars review does have this to say later on:
> The Framework also manages surprisingly high battery life under Ubuntu—in our semi-scientific video playback test, Framework runs neck and neck with the outstanding Acer Swift 3 at just over five hours, with everything else (including the XPS 13, which in this case is hampered by a 4k touchscreen display) trailing well behind.
I'm impressed that they have managed to take this product to market, and I'm glad that people who value modularity will finally have a viable option.
at the same time, I personally don't see what all the fuss is about. you can upgrade both DIMMs, which is cool, but not exactly unheard of these days (I guess it's getting there in an ultraportable?). you're still stuck with DDR4, which is almost EOL, and the max capacity it entails. it's neat that you can customize your IO options, but how many people are going to do that more than once? being constrained by the chipset, it's not like you're going to be able to "upgrade" your IO in the future.
the most likely parts of a laptop to fail are the SSD and the battery, both of which are fairly easy to replace on almost all laptops. past that, you aren't really gaining that much when you're locked into whatever CPU/chipset was current when you bought the laptop.
>Newer laptops are starting to come with soldered SSDs
Which newer laptops? Other than Macs and crappy $199 Walmart-grade tablet-chromebook thingamajigs, I don't know any mainstream PC laptop that does that (thankfully).
Even super light and super slim laptops still have replaceable storage. Even niche Pocket Computers like the GPD and Valve Steam Deck still have replaceable SSDs.
If Macs are doing it, others are soon to follow. Apple has been a trend setter for years. They were belittled for getting rid of the 3.5mm jack, only for flagship phones to begin doing the same.
Soon when? Apple has been doing it for 5 years now and the rest of the industry hasn't even started.
The truth is, unlike with RAM, it's still cheaper for the other laptop manufacturers to have a single motherboard SKU into which they can later plug whatever cheap COTS SSDs they can get from various sources, rather than waste effort tailoring a motherboard to a specific SSD controller, specific DRAM cache chips, and specific flash chips. That would give them far less flexibility in component sourcing over the production lifecycle and more expense in board design, resulting in more expensive products with no extra margin for them.
Apple can do this economically because they have a very tightly controlled supply chain, high volumes, and little variation in SKUs, so they can just use the same SSD controller on all their products, change the number of flash chips soldered on the board, and call it a day.
IMO, the selling point isn't that you can carry around the ports you might potentially use and swap them out whenever you need them, it's that you can buy a machine that's tailored to your setup and peripherals.
I bought a laptop at the beginning of 2020 after the GPU on my old one fried. What I wanted was something with a Cat6 and DisplayPort built in for when I'm in my office, and multiple USB-A ports for the peripherals I use (mouse/keyboard/mic/speakers). I had to settle for one with a single extra USB-C port, an additional USB-C hub to get enough USB-A ports and a Cat6, and an adaptor for the built-in HDMI port to hook up to the DisplayPort on my monitor, which set me back a total of like $100 on top of the cost of the laptop itself. The laptop also has a headset jack and large card reader that I have yet to use, so that's just wasted space that could have potentially been something I would have actually used.
I have a MacBook Pro w/ Retina Display from Mid-2012. It cannot be fixed for a reasonable price, despite it being still mostly perfect for my daughter's school computing.
This computer definitely interests me (as someone who moved back to Ubuntu / Regolith this year)
we'll have to wait and see whether this actually happens, but if so, that would certainly invalidate my biggest criticism. if they could pull it off on a 14"-15.6" chassis with a discrete gpu, they would probably get my money.
also, pretty sure that is correct english, why the [sic]?
> also, pretty sure that is correct english, why the [sic]?
Perfectly correct English yes - I just meant that I was disagreeing with that, you're not locked to it. (I don't think it's an incorrect use of it, but thinking about it it's not a common one - can just quote and say 'that's not right' after all - so I don't know I bothered, sorry.)
> Perfectly correct English yes - I just meant that I was disagreeing with that, you're not locked to it.
fair enough :)
and I saw that guide too. my skepticism is regarding what happens when the next generation (or an AMD variant) arrives. will the new mainboards be drop-in replacements for the old? if nothing else, this would make it difficult to radically change the cooling solution, which could be a big problem for the dGPU machine I'd like to see.
maybe my initial comment was too harsh. they have delivered a fully user-repairable machine, which is a great thing. but what I want is a fully upgradable machine, in the sense of a DIY desktop build. they have made some vague promises around the latter, but I'll reserve my judgement until I see it actually happen.
To be fair to "these corporations," the modules available for the Framework are basically built-in USB-C dongles. If you really hate the look of dongles, then thats great. If you only use one thing, like HDMI, then you don't have to cart around a bunch of modules as if they are dongles.
The three internal upgradables are nice if you think things will drastically change in RAM, SSD, or WiFi before the CPU, mainboard, or faster connections to faster RAM and SSD make said upgrades amount to gilding a turd. Otherwise, periodically buy the midrange storage and allow the secondary market to absorb your environmental guilt.
Do you actually have any example of a corporation that makes laptops saying that it’s impossible to have slim laptops that are customizable? I would be very surprised if this ever happened, but I’m willing to be proven wrong.
You know, I can't think of a concrete example of that, when pressed, so it is possible this was just something repeated to me by coworkers and friends doing Apple apologia, and I just treated it as a truth. A quick Google doesn't appear to show Apple or any other corporation saying it, so I'll take the L on it.
I think my overall point still stands. I still find it irritating that, until very recently, the only way to get a nice, slim laptop was to accept that everything is hard-wired in. Framework proved that that's not correct.
Your experience matches my own. I haven't heard it from a company but there are definitely Apple zealots/shills who would say that in threads about right to repair.
Yeah, sort of this strange Mandela effect thing I guess; I have a distinct memory of reading an official statement with Apple or Samsung claiming that that was the reason, but that statement does not appear to exist, and it seems like the most likely reason is because my brain just incorrectly extrapolated that memory from stuff non-Apple-non-Samsung folks were saying.
FWIW, I also remember this claim being made, but yes, it may have just been Apple apologists in comments, or perhaps Apple wrt the iPhone battery, ages ago.
> Kyle Weins (iFixit CEO) basically claiming that's the reason
I.e. The last person who can be trusted to report on Apple’s motivations.
It’s an absurd explanation. Some obvious other factors are:
1. The idea that a modular chassis is less robust over time. Not that it can’t be made, but that if you make millions of them, vastly more of them will have problems because of all the connectors etc.
We don’t have any data on the framework. Perhaps they’ll prove this to be a misplaced fear, but it’s also possible that framework laptops in aggregate will need more repairs because of the extra complexity.
2. Limited hardware profiles are easier to support with software. If users can create limitless combinations, it becomes much harder to test. This isn’t an issue for the typical Linux user who can do their own homework and fix their own issues, but it’s a deal breaker for someone who just wants to buy a computer and get work done.
iFixit has repairability scores for products. E.g., this Surface gets a 1 out of 10 because MS used adhesives, among other problems. They traded off repairability for thermal, rigidity, and mechanical concerns.
Many companies do heavily promote the thinness of their device on launch. How many of them also say that it is thin and customizable? (And customisable doesn't mean choosing between 8 gb of soldered ram vs 16 gb of soldered ram). There's your answer.
Unfortunately, Framework currently says it is impossible to have a slim, tiny, customizable laptop with a TrackPoint option (due to the keyboard height). Hopefully we will overcome this one too, and then I would be all in...
This is the barrier for me too. I'm writing this on a 4-year-old Thinkpad. (The "25th anniversary edition", which has my favorite keyboard.) If somebody figures that out, I'm happy to switch, as my feelings on the Thinkpad line are the same as Doctorow's: formerly great, now sadly declining.
I’ve never understood the TrackPoint use case; on the (admittedly few) computers I’ve had with one, I just felt it was a nuisance in the middle of the keyboard.
The same goes for the trackpad for me. I am always disabling them to avoid accidental contact. I guess this is why we need more customizable devices.
According to Clayton Christensen, there's a cycle between integrated and modular as consumer preferences change. At first performance is inadequate, but once it is good enough, people base their buying decisions on other things, like customization.
E.g., by this theory, Android would become more popular than the iPhone.
EDIT: Yes, which happened, favouring the theory; despite the iPhone still leading in performance, due to integration all the way down to the CPU and GPU.
Besides planned obsolescence and other malicious intentions, there are also economies of scale to consider: if making a non-modular/non-repairable laptop costs a producer even one buck less, that's a lot of money when multiplied by the huge number they sell.
That said, I love the concept and plan to buy one next year, when they hopefully will have means to sell in the EU without outrageous shipping+import duties.
I love the Framework idea, and believe in repairability. But I’ll be interested to see what Framework does when they have the (enviable) challenge of manufacturing, distributing, and supporting half a billion of them.
HP is a notable exception to this trend, they still make multiple ranges of slim, customizable laptops (at least non-touchscreen models). I oversee IT for an org of ~150 people and have swapped out RAM and M.2 SSDs on multiple recent HP ProBooks, EliteBooks, and a ZBook in the last few weeks with just a Phillips screwdriver and a spudger.
Nothing epoxied shut or soldered in place, and the metal cases on recent generations of these HP models are sturdier than the older plastic. Our lead tech has replaced HP laptop batteries, keyboards and displays when needed with no issues. HP is also one of the few brands with backlit keyboards standard on most laptops, even down to the lower-end models I've encountered since 2018.
My Dell laptop just crossed three years in June, and the extended warranty also expired. The battery has been deteriorating, with only a few minutes of backup left. I called up customer care and they asked me to instead contact a nearby service center. I tried multiple in Delhi NCR and no one seemed to have a battery; one of them said that no battery is available right now (God knows why).
Finally, I somehow got one via Amazon, supplied from a state 1600 km away, for about $80 in about a week's time. It could very well be more expensive than one from the service center, but I was left with no option.
Why should we believe anything any for-profit company says, without verifying (whenever possible)? Drug companies lie all the time, about having to price their drugs absurdly high. Facebook lies all the time about not being able to fact check, without even attempting to try seriously. And on and on.
The insane thing is not that companies lie. It is that the general public has either given up or been duped into thinking these companies cannot possibly lie. We have created an economic system where profit trumps everything else.
> After seeing the framework, I'm more than a little annoyed that I fell for this.
Me too. And I'm annoyed I fell for the lie that board-level repair is impossible. What the manufacturers really should be saying is "it's impossible for us", because it's obviously possible for third parties to do it and make a business out of it.
I'm willing to pay +$100 for something that's assembled with screws instead of glues.
I wonder if the slowdown in Moore's law had something to do with it? I mean back in the day the performance gap between a new processor and a 5-10 year old one was so substantial it was hardly worth your while
It seems more likely that tech has advanced enough for this, rather than there being some greedy phenomenon where it just so happened that nobody thought to do better before framework.
The Framework is larger and weighs more than contemporary machines that have larger displays (e.g. ThinkPad X1 Carbon 9th generation), and it has worse performance and shorter battery life because soldered-in RAM isn't some kind of scam, it's actually much better.
In short, it was not a lie that you get smaller, lighter, and better laptops with integration. You do, in fact, get all of those things.
The Framework laptop is 1mm thicker and 200 grams heavier than the 9th gen Carbon. We've gone long past the point of diminishing returns when it comes to size/weight vs repairability trade offs.
Is that a property of the components being soldered in, or just a property of the fact that Framework cannot get access to the highest quality components on the market?
Genuine question, I know very little about electrical components.
With RAM, the problem is more that high capacity LPDDR4 modules simply aren't available on SODIMMs.
However this is only because manufacturers don't make them. If Apple asked Micron or Samsung for them, I'm sure it would happen.
A better argument for non-replaceable RAM can be found in the Apple Silicon chips. Building the memory into the SoC provides very tangible performance and efficiency benefits.
>With RAM, the problem is more that high capacity LPDDR4 modules simply aren't available on SODIMMs.
And I don't think they ever will be. From the little bit that I've read, the higher voltage that SODIMMs have to use has to do with noise in transmission. LPDDR4 is connected by direct solder, so it is able to use lower voltages.
I'm not an electrical engineer but I have a hard time believing we can't design a socket that provides a connection as stable and free from interference as a solder joint. It could mean making a PGA or LGA socket similar to what we use for CPUs, but it should still be perfectly doable.
the mounting hardware needed for replaceable dimms inherently takes more space on and above the board. the solder approach also gives more flexibility for board layout, since you don't have to reserve space for the exact size and shape of a standard module. see a teardown of the new blade 14 to see how this can be beneficial.
there is an inherent tradeoff between size, battery life, and modularity. if you can make a modular laptop with good specs and battery life, a competitor will always be able to offer the same thing in a smaller chassis or with a bigger battery.
They have to use DDR4-3200 to be able to put the RAM on a stick. Integrated systems can use LPDDR4x-4267. You can't put that on a stick. It's a trade-off. It turns out it takes extra power to drive high-speed signals across long traces with connectors.
I'm fairly sure the power advantage comes from VDDQ being much lower: .6V for LPDDR4x and 1.2V for DDR4.
LPDDR4x is or at least can be 64 bits wide, just like DDR4.
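To put rough numbers on the trade-off described above, here's a back-of-the-envelope sketch (my own illustration, not from the thread) comparing peak bandwidth of the two configurations and the I/O power implication of the lower VDDQ, using the common approximation that dynamic switching power scales with voltage squared:

```python
# Rough comparison of DDR4-3200 (SODIMM) vs LPDDR4x-4267 (soldered).
# Both buses assumed 64 bits wide, per the comments above.

def peak_bandwidth_gbps(mt_per_s, bus_bits=64):
    """Theoretical peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

ddr4 = peak_bandwidth_gbps(3200)      # 25.6 GB/s
lpddr4x = peak_bandwidth_gbps(4267)   # ~34.1 GB/s
print(f"DDR4-3200:    {ddr4:.1f} GB/s")
print(f"LPDDR4x-4267: {lpddr4x:.1f} GB/s")

# Dynamic I/O power goes roughly as C * VDDQ^2 * f, so dropping
# VDDQ from 1.2 V (DDR4) to 0.6 V (LPDDR4x) cuts I/O switching
# power by roughly 4x, all else being equal.
io_power_ratio = (1.2 / 0.6) ** 2
print(f"Approx. I/O power ratio (DDR4 : LPDDR4x): {io_power_ratio:.0f}x")
```

So the soldered option buys roughly a third more peak bandwidth and a large cut in I/O power; this ignores latency, channel count, and real-world workloads, but it shows why the trade-off is not just marketing.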
Judging by reviews, the Framework loses 10% of CPU+GPU performance versus reference designs. That could be due to their memory subsystem or thermal design, or both. Ars Technica said the battery life was "mediocre" but I would have gone with "terrible". Compared to the Dell XPS 13 the Framework has only 60% the life.
I don't personally want an exciting laptop. I want a laptop that works with all my other stuff (phone, tv, printer, various software) the moment I open it.
Highlighted on the Linus Tech Tips video on the Framework laptop, there are thoughtful touches like drivers installing in unattended mode upon OS install, so the team clearly puts importance on that seamless functional experience too.
I understand the Thunderbolt hub is fundamental to the Framework concept, but I’m really let down by the lack of an AMD option.
I’ve waited months for Lenovo to sort their supply-chain issues and I’m still quarreling with poor drivers but the Ryzen in this T14s I have is just embarrassingly capable.
I don’t really want an exciting laptop. And I don’t want dongles that plug INTO the laptop. I just want a more powerful 1st gen unibody aluminum MacBook. That was the sweet spot for me. Good ports. Removable battery. Removable memory. Removable disk.
What is the point of modular IO if I cannot get an Ethernet port? Or is it that, today, only old-school people like me prefer to use a wired LAN connection whenever possible and hate using Wi-Fi, which is unreliable and slower?
I don't understand that choice, unless it's to make the laptop thin. But no: a ThinkPad T14 is thin (maybe not as thin as a MacBook, but who cares about a couple of millimeters) and it integrates a full-size Ethernet connector.
What is the point of modular IO if the IO that you are giving me is just USB, HDMI and a micro (not even full size) SD card reader?
I would expect a whole range of possible ports. For example, I use RS232 ports a lot; I would have liked to find a module to add a serial port to my laptop.
There's no reason there couldn't be an Ethernet or RS232 port. It's just that there hasn't been one made yet. And don't forget that third parties are able to make them too.
I am currently saving for it, and I can't describe the vague worry I feel that this will become the next big thing, where I'll be refreshing their page every morning for new drops to try to get them before they sell out.
If this is truly popular enough to change the industry, then even big players like Apple will adapt. My bet is that it's super popular in the niche that is the HN audience, but not much more.
I love the idea of the Framework and for my next laptop I will definitely check it out.
One topic, though, that I just cannot let stand without comment is the notion that Thinkpads are bad now. I have a Thinkpad X1 Extreme (gen2) and - apart from the stupid name - it is hands down the best Linux Laptop I ever had.
There simply is no "...secretive Nvidia graphics cards, strange BIOS rubbish...".
At least with Fedora it just works (including rendering offload to the NVidia card if you want that.)
I have a DIY version preordered, due in October - planning on transferring over my NVME SSD and 32GB of RAM from my busted, falling apart XPS15 into the new Framework.
Not sure how, as it's not just a resolution issue, it's a screen size issue. I use the laptop screen as my only screen and don't use an external monitor. So, for me, the size of the screen does matter, and 13" is a little small for my eyes.
I'm sure if one uses external monitor most of the time, then 13" is more than sufficient.
Imagine doing this at a large scale for the whole industry: TVs, mobile phones, dishwashers, microwaves. It would help consumers and our planet's environment too.
I’m not in the market for a laptop in that price range, but if I were I would seriously consider it. I’ve had a Thinkpad X220 for some time now and done plenty of upgrades, but changing out the LCD worries me too much to give it a shot, and the second-gen Intel is starting to be a bit long in the tooth. However, for basic computing I prefer my Pinebook Pro; it’s light and the screen is great.
this was always the promise of the beige towers of the 90s: upgradability! repairability! in practice, it seemed that all components were moving fast enough that upgrades maybe extended the life of something by a year or two at best, but ultimately all standards were constantly moving (buses, ram, cpu sockets/chipsets, storage, cases and even psus). compatibility between parts was often a crapshoot, reliability suffered because they weren't burn in tested together and at the end of the day, any major upgrade involved having to replace 90% of the components anyhow. the fully integrated systems seemed to have longer operational lifetimes, to be honest.
that said, maybe things are moving slower now, but it seems like a bit of a fetishization of a past that wasn't that great to begin with.
the focus on reducing waste is good, but honestly what is more modular here than your average laptop with memory and storage doors?
If and when they’ll come out with a 15 inch or above sized version that’s shipping in Europe and my MacBook Pro 2017 gets long in the tooth, I will be looking at a Framework instead.
The replaceable port idea doesn’t factor into my use case - I’d rather have the 4 thunderbolts or a generous assortment of video output and usb ports plus a card reader, honestly. Everything else is really cool and exciting though!
Thanks to Apple’s privacy bungles I’ve already turned iCloud enabled features and their convenient integration into a nice to have from a necessity and cancelled my subscription so I can switch any time to an alternative vendor and lose very little apart from the clipboard sync, wifi phone calls and airdrop.
So I’m genuinely excited to switch to Linux when I need to.
> Base and Performance configurations ship with Windows 10 Home pre-installed and Professional ships with Windows 10 Pro pre-installed. You can also load your own operating system later, like a Linux distribution.
So... Forced windows tax, even when Dell can manage for Linux. Yuck. Like, really yuck.
And for the specs:
> Base $999.00
> i5-1135G7 | 8GB Memory | 256GB Storage | WiFi 6 | Windows 10 Home
8 gigs of RAM? Well, I guess that's one Chrome or Firefox window open, and an Electron app. And a .25TB SSD? That would have been great for a machine 8 years ago.
I'll stick with Dell's Business line. They're rugged, parts are serialized and easy to order/obtain. Pass on the "Framework".
Every connector and moving part is a cost and failure point.
There is a reason why everything is moving in the other direction.
It’s going to be more expensive more bulky more flaky than the same components in a more integrated system.
> it's no thick-as-a-brick throwback the size of a 2005 Thinkpad – it's approximately the same dimensions as a MacBook.
So, it's a bad laptop in my book (pun not intended).
You see, a laptop needs a good keyboard, and a good keyboard needs height for the keys to travel. So a laptop needs to be kind of "thick as a brick". Maybe not at 2005 levels, but definitely at 2010 levels and no thinner.
However - a laptop which is reasonably easy to disassemble, and hence customize, is definitely something I support in principle. I'm not sure this can catch on if it's essentially a single-vendor thing.
on a fixed income because covid took out my business, how would buying this over what it looks like be any better? granted mac has already a ton of PR issues involving privacy, i guess this laptop would address that?
It seems like there are a lot of people here who change their laptop almost every year or two. Is that common? I ask because I'm still using my mid-2014 MBP, and it's only really this year that it's started feeling underpowered with the fans running quite a bit. Apple laptops are expensive for sure, but compared to upgrading every year it seems like a good deal, not to mention the problem of electronic waste. (I'm not looking for an Apple / PC fight by the way, just haven't heard of people upgrading so regularly before).
It has Intel only, I wish they'd add AMD as well. I was an Intel fan but with their IME and Spectre/Meltdown flurry of problems I've ditched them. Now my household is all Ryzen.
Super interesting product! Does anyone know if they have AMD processors on the roadmap? AFAIK they're beating the pants of Intel at the moment in terms of performance/W.
I have to use a Mac for work rather than Ubuntu, which I have dialed with everything just the way I want it, and it is so frustrating. It drives me batty on a daily basis.
> I'm finding it really hard to reliably hit the right region on my trackpad to get the left-, center- and middle-buttons. I've drawn little hints on in sharpie, and I'm working with Canonical, who make Ubuntu, on remapping the button areas.
Does the trackpad not support pressing anywhere with 1, 2, and 3 fingers for left, right, and middle click, respectively? I'm hoping that it's supported, but the author just doesn't want to use that kind of setup.
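For what it's worth, on Linux that behavior is libinput's "clickfinger" click method, and it can usually be enabled regardless of what the OEM ships as the default. A minimal X11 snippet would look something like this (the file path and Identifier are illustrative; any name under `xorg.conf.d` works):

```
# /etc/X11/xorg.conf.d/30-touchpad.conf
Section "InputClass"
    Identifier "touchpad"
    MatchIsTouchpad "on"
    Driver "libinput"
    # Click anywhere with 1/2/3 fingers -> left/right/middle button
    Option "ClickMethod" "clickfinger"
EndSection
```

Under GNOME on Wayland, the equivalent toggle is `gsettings set org.gnome.desktop.peripherals.touchpad click-method fingers`.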
I checked out the site and found one problem with it: I had no way of knowing either the screen resolution or whether the drive it ships with is an SSD or not.
> Mine was delivered at the end of Aug. I got it set up by the first of September and have been using it ever since. Yesterday, I put my 2019 Thinkpad on my pile of "laptops to refurbish and donate." I've bought a new Thinkpad almost every year since 2006. I think that's over.
I can’t really fathom why anyone would buy laptops at this rate. It seems so incredibly wasteful. The author is free to ship me his 2019 laptop; I will have no problem putting it to good use.
Honestly I have no idea what contradiction you're referring to.
Regardless, his donating the laptops doesn't mean he needs to buy new ones. Why is he going through them so fast in the first place? If he really wants to give laptops away, why not just buy new laptops and give them to people? Or why not just take the money he would put to new laptops and donate the cash to people? Given his actions up until now, I would be willing to bet that he would just dump the Framework laptop and buy a new one (or another) whenever the first (replaceable) component fails. He seems like a paragon of waste of resources. I don't understand why he considers the Framework as necessary for him.
I still use my X61s from 2008. Laptops are not a productive/ergonomic choice, and if you have to replace yours this often I would opt for a Raspberry Pi 4 or Jetson Nano instead.
This is a great leap ahead, and makes me happy. But I would also like to do a shoutout to a local-to-me company that has been around for decades and have always had a decent amount of modularity. https://eurocom.com/ec/main()ec.
They have always gone the way of more power over smaller size.
I used to like being able to control everything about my machine. I'd spend hours hacking away at making my Linux distro just the way I'd like it, and fixing things when they broke, figuring out drivers and oddities. This laptop seems to have the same culture.
Now I just buy a MacBook and the concept of "things breaking" and "having to fix my computer" just isn't a thing anymore.
I bought a Dell and put Fedora on it, and it’s been as stable (maybe more) as my Macs. You don’t have to tinker with Linux too much these days to get a decent experience.
I got one of the first waves, it's awesome. My only complaint so far is getting the fingerprint sensor to function properly in Linux - but that may be more of an issue with fprintd/libfprint than anything.
I'm dual booting Win10 and Debian, and absolutely no issues under Windows so far either.
The battery life is great, the screen is extremely crisp and bright, and the keyboard actually feels decent to type on.
I'm just waiting for an excuse to purchase a Framework laptop. My thinkpad is hanging in there (2014 model, purchased used) and I have a desktop that is more than powerful enough for the type of development I do on my own time. As soon as I get the chance, I'll be ordering one of these. Hopefully there's an AMD version out by then!
To the Framework laptop owners in the thread, I have a question about the power button light. In the image I saw, it seemed rather large (square) as it outlines the fingerprint reader and likely perceptually brighter as a result due to the greater area covered. Do you find it distracting to have it always on in the upper right of your keyboard?
This isn’t a laptop for me, I want different trade offs. But I’m really excited to see it enter the space, and hopefully they will become a viable market contender.
That might both satisfy the demand often expressed, by people like us, for more upgradability/repair ability and make other manufacturers steal some of their best ideas.
You can buy one without Windows installed, but you'll have to install Ubuntu yourself. I reckon it's not worth pre-installing one of the various GNU/Linux distributions, since the average user of such a device will do a clean reinstall anyway. At least it's 200 bucks cheaper without Windows, so there's that.
There isn’t a large market for that. If you want to sell a repairable laptop to the masses you need the features they desire. Windows is one of those things, unfortunately.
I wonder about the trackpad, too. Apple has always smoked everyone else at trackpads. After using theirs for so many years, every time I touch another it feels crusty and unresponsive.
I'm pretty excited about this, but I do wish they had made a design that wasn't an Apple clone. I don't want people to look at my computer and mistake it as a clone/copy. I feel it's a missed opportunity for them to say "look, there's a better way".
The OP talks about how they are going to now only switch the cpu/ram/whatever every year. I feel a cpu is quite an expensive/resource intensive thing to make compared to a cheap plastic case. So while maybe not as bad as an entire device, it still seems pretty wasteful.
> From now on, I can easily see myself upgrading the CPU or the screen on an annual basis
how feasible is this, given that sockets and chipsets change every 2 years? also, the ultra low voltage / TDP parts are rarely available for purchase by end users.
The theoretical process of upgrading the CPU on the Framework is buying a new motherboard from them, so sockets (there aren't really socketed mobile CPUs anyway) and chipsets changing wouldn't be a problem.
How often they'll actually have new options, how much they'll cost, and how long they'll stick to the compatible form factor from these initial units are definite question marks that will only become clear over time.
For Framework themselves, assuming they survive long enough for it to be a problem, the balance between new customers and entirely new machines and upgrades (and what that looks like in terms of margins) will presumably be a potential challenge, also.
That is without a doubt the best laptop customizing process I have ever gone through.
Finally a manufacturer that is offering a more "desktop" build experience in a laptop form factor. Framework will be my first choice when I need a new laptop.
I'm not in the market for a laptop right now, but the next time around two years from now, if there's an AMD variant with dedicated GPU that I can have two HDMI ports for my external screens they will have a sale from me.
This should have been a thing a long time ago. I would take the plunge, but a 13" laptop is not something I would enjoy working on; if they ever release an option for a ~16" frame and monitor I'd jump right in!
If it had a 16-inch version I'd buy one now. I just want a laptop with premier Linux support. Granted, the System76 one is pretty good too. I don't give a crap about the ports; I USB-C all the things. I moved on.
It seems like there's so much potential here for a third-party module market. I wonder how much flexibility these modules provide. Like, could I create a hifi sound card/DAC?
There's so much I love about this laptop, but why is the keyboard layout so Mac-ish even though the aspect ratio is 3:2? This thing could have a much more traditional keyboard layout with that extra vertical space.
This really is cool. I wish there would be an option for more than 4 ports but this is just a minor issue. If there was an option to get HDMI 2.1 rather than 2.0 it would also eliminate need for a separate DisplayPort.
That's not really a reason why you would need/want that, though. My gripe isn't that he's wasting money; it's that there's no point in doing so, because there's no tangible improvement between one generation and the next. So I guess it is an answer to a "why", but not the "why" I mean, if that makes sense.
Once they offer different keyboard layouts ("de" specifically) they'll get my order. I have not heard anything negative about them yet. Reading through HN comments makes me confident with my choice.
These look really good. I'll be in the market for a new linux laptop in a couple months and I really want to give System 76 my money because I like what they're doing, but this looks really good.
I think Framework is great, I'll consider one when they offer a good trackpoint, but Doctorow's description of Thinkpads is…. weird. Like, a heavy X1 Carbon? Replaceable drives that don't go to 2TB? Other generalizations that just don't make sense.
I've been using Thinkpads for a couple of decades now. In some ways the quality has wavered (partially due to contemporary concessions like compromises to be thinner), but they are still in general the best mainstream systems for running Linux, with end-user and field support. I've installed Ubuntu on every system; especially in the last few years it has "just worked," and Lenovo even increasingly offers Ubuntu pre-installed on some models.
It would be great if Windows were not pre-installed, letting Linux lovers opt out and get dollars back (the Windows OS license fee).
Very exciting. Their site allowed me to configure mine with 54 HDMI ports. Will this come in some sort of octopus cable, one USB-C split into 54 HDMIs? Can't wait to find out!
I would love to get one.
The main thing that stops me is that this is only a clever idea as long as Framework is viable as a business. All the components here are proprietary to Framework(?). If they should go out of business, which often happens with these efforts, then you have no way to source parts and you are stuck.

There really is not much customization they offer, at least not now. The expansion bays seem to be the major components for now.

Selectable when you buy the machine: CPU (non-replaceable?), WiFi, storage, memory, power adapter yes/no, operating system yes/no.

Not configurable (yet?): screen, camera, fingerprint reader, battery. Hardware switch for camera: N/A.
A slight look back at laptops:

Old style laptops / enterprise laptops:
----------------------------------------
Battery could be swapped in seconds by sliding open a plastic lock.
Different sized batteries were available, some taking up more space and giving your laptop a "bump".
RAM was beneath a small hatch on the bottom of the unit, directly accessible by removing two screws.
Often the hard drive was easily accessible as well.
Two no-tool-swappable bays: CD drive, extra storage / 2nd hard drive, and some more exotic things.
And then there were PCMCIA cards, a long time ago.
Most of my laptops had two bays: no-tool swappable, with a wide selection.
>Be extremely careful when sliding the Battery connector out,
>as it is very easy to accidentally bend the pins.
>Make sure to slide straight down, and avoid letting the
>connector twist or bend.
You should not have to perform surgery on the box to switch the battery. It should not involve screws at all. With the placement of the battery in the Framework, it is hard to see how they can upgrade it given the size limits. Hopefully, they will offer an extension battery of some sort.
Where is the space for the DVD/CD-ROM? (Some still use it. I love it for live CDs, where the entire OS and file system is read-only. Every time you boot, it's clean.)
The 4 user-selectable expansion cards are unique and cool, as long as they are available and someone makes a wider assortment of them.
It is really cool that you can get at the guts and replace things, and it is great that this is available. I would just want a lot of it to be a lot easier.
I think this is cool. But at some point in my career/life in the tech space, I stopped caring as much about my tools as long as they work to a reasonable degree. I am fine with whatever Dell/Apple/Asus/etc laptop that's out there. Will my productivity increase if I switch to Framework? Maybe? I am not sure how to convince people like me. Obviously, I am not the target audience for this. I wonder what their addressable market is?
I would love one here in New Zealand but unless they partner with someone locally, it seems risky to buy a laptop that requires spare parts to be shipped over from the USA if there is a problem. Especially with how bad global shipping is.
They have a sign-up form where you can be notified when it is available in your country. Right now it's only the USA and Canada.
Though it's worth noting that when you do this you also don't have the SSD, RAM, or WiFi modules installed either. Though on the plus side you can choose not to buy those components from Framework and just supply your own, if you want.
I bought the DIY edition for a combination of wanting to bring my own components and not pay for Windows, and generally liking a little bit of tinkering. Installing the stupid little WiFi module antenna connectors was the only annoying part.
Here's an overview of the state of Linux support, and the community forums are also a great source if you want details on a specific distro. But basically Ubuntu 21.04 works out of the box, and I'm currently on 20.04 LTS, which required some minor workarounds.
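For context, the panel-self-refresh stutter mentioned earlier in the thread is usually worked around by turning PSR off in the Intel i915 graphics driver via a kernel parameter. A sketch of how that's typically done, assuming Ubuntu with GRUB as the bootloader (exact file paths may differ on other distros):

```shell
# Sketch: disable Panel Self Refresh (PSR) for Intel i915 graphics.
# In /etc/default/grub, add i915.enable_psr=0 to the default kernel
# command line:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash i915.enable_psr=0"

# Then regenerate the GRUB config and reboot:
#   sudo update-grub
#   sudo reboot
#
# After reboot, you can check the parameter took effect with:
#   cat /sys/module/i915/parameters/enable_psr
```

This is a config fragment rather than a script; the tradeoff is slightly higher idle power draw, since PSR exists to let the panel refresh itself while the GPU sleeps.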
No PIO port, no high-speed serial port (and before you suggest USB<>serial, that means another external interface). Also, maybe it's coming in the future, but somehow I missed seeing any Ethernet port either.
Also, is there a base dongle that we can add our own hardware/software to? Just having a dongle with an onboard Arduino that can be programmed by the user would help a lot.
I still remain baffled by the popularity of the laptop. Literally everyone I know works with them fixed in one location: their desk. Yes, some people do need mobility, but this appears to be a minority of consumers. We could have had all this repairability and modularity years ago if most consumers just admitted that they really wanted a desktop all along.
Even when I'm home 90% of the time during COVID, I use my laptop in lots of places: my desk, my bed, my dining table, my kitchen counter, my couch. I can't drag my desktop setup all around my apartment like that.
It makes sense if you use the laptop as a desktop, plugging it into a bigger monitor with an external keyboard and mouse. Otherwise it's hard to have good ergonomics with a laptop, and you can't work long hours on it without developing some body pain.
I don’t use my laptop in any of those places because my desktop setup isn’t there. Laptop keyboards are awful to type on, why put yourself through that?
I’d rather get my work done at the desk and then enjoy my hammock or couch without the computer, preferably with a good book.
Literally everybody in my company (a large enterprise) uses a laptop as their main workstation. They work with them docked at their desks, but then unplug them to take them to a meeting room, a shared working space, etc.
Outside of work, 90% of people I know have a laptop as their main computing device at home. Very few have a desktop PC - those are the PC gamers.
Your anecdotal experience doesn't match with my anecdotal experience.
My anecdotal experience matches yours. I work for a smallish business in an active growth phase, and the onboarding process for every employee starts with "Welcome to the company, here's your laptop."
Right, which doesn’t actually contradict what I said in any way. I did not say that companies are giving out desktops, genuinely not sure why everyone is pretending to the contrary.
Really? This has maintained true for you even during the pandemic?
I don't know a single person who uses their laptop in one place. Hell, every one of my co-workers has been at home or in the office with their machine at least a couple times over the past month.
Yes. It has maintained true for me throughout the entire pandemic.
Laptop ergonomics are catastrophically bad. Doing any meaningful work on them for any length of time without my keyboard, monitor, and mouse just plain sucks, so I don’t do it.
My personal machines (gaming and non) are both desktops. I’ve never wanted to move them, and they cost me a fraction of what an equivalent laptop would have.
I keep my personal computer at my desk in my apartment almost all of the time, and for the past decade have used a desktop since I pretty much never needed to move it. But over that decade with a desktop, I found myself wanting - craving - the ability to easily take my computer on trips, or into the living room from time to time, etc. Not a lot of movement, but the possibility for it. But the concept of having files/projects split over a desktop AND a laptop seemed to be a hassle. And syncing seemed to be a not-so-ideal situation (possibly hard to set up, or requiring a paid service, or not reliable, etc). The best solution I found was a dockable laptop setup. I've now been using a dockable laptop setup for almost a year and I honestly can't imagine ever going back.
The only other solution I could envision I'd be happy with is one that doesn't exist: where I have a processor and storage "core" that I can use in a variety of dumb terminals. That's my real dream, but a dockable laptop is kind of similar.
I could do without the iPad and use my phone, I just coincidentally have an iPad I inherited. I will not be buying another iPad once it dies, nor will my next personal computer be a laptop.
The funny thing is that the laptop is actually incapable of doing the one thing I would reasonably want it to do: work outside. It overheats within 5 minutes and slows to a halt if I dare let the sun hit it. Both the iPad and iPhone handle this task easily, weirdly enough.
And this is really the issue I have with laptops; they try to be everything to everyone and they end up sucking at any given task as a result. Compared to my Mac mini, my MBP is expensive and underpowered with compromised ergonomics and thermals. Compared to my iPad, my laptop has poor battery life, weighs a ton, and overheats in the sun. All that buying a MBP has done is waste $2k extra of $CORP's money, and produce a bit more e-waste given the short upgrade cycle they have me on.
You must be out of touch. In the last 20 or so years of work (~8 employers), I have never had a desktop as a primary work machine (I've had desktops as a secondary machine). My primary personal machine has also been a laptop in that time frame.
What? I finally switched from a desktop battlestation PC at work and at home to a high-powered gaming/compute laptop that I dock at work/home in my 3x monitor setup with mechanical keyboards and such at each location - and I'm a 1990's LAN party, lug your giant tower to the basement of your buddy's place nerd.
I was really late to switch and I don't regret it at all. Almost everyone I know made this switch 5+ years ago before me.
I also needed a laptop anyway with the battlestations, because I travel for work. Now it's all in one package and I spent a lot less money on it.
I'm not sure how common it is, but at my old job devs and business analysts would regularly go on site to work with clients. Larger teams would have build boxes for CI and VMs that we could remote into if need be, but other than that laptops were absolutely required.
I'm currently writing this at a coffee shop surrounded by others all working on various things. I really like the option to work from anywhere.
My experience is that everyone claims they want the ability to go to a coffee shop, but very few actually exercise it. Obviously some will, like you, but not most. And unfortunately the perception of need drives behavior more strongly than actual need. Same with “off road” vehicles and trucks, people buy things based on capability they’ll never exercise.
Personally I loved the idea of working from a coffee shop until I actually tried it. Then I found that the glamour of the idea was much more than the experience of trying to do focused work in a noisy environment on a cramped keyboard. So instead I do work in my office, and leave my laptop permanently closed and connected like the world's most expensive Mac Mini.
Carrying it to meetings to be productive or stare at Zombo (back when meetings were in person). Some of my co-workers have a hybrid schedule and work won't give us two machines. Getting things done while attending conferences (if those happen again).
Even if you mostly work at your desk, a desktop computer has just no option for mobility if you ever want to work somewhere else temporarily. It seems like many people value the option of mobility, even if they don't use it often.
But why? Putting a SDR in a laptop seems counterproductive from an electrical and radio interference POV. The USB2/3 ports are the option to plug in a software defined radio. Unless you mean some pcie interface top end SDR, in which case you probably have the money to throw around to get this custom.
If a product manager is listening: you have my interest. I'll be excited when mouse buttons become available. It'll make my shortlist when it has Coreboot, and if Libreboot is on it, I'll be obliged to give you my money.
The article contains a link to the page where one can customize a Framework order https://frame.work/laptop-diy-edition
The problem with this page is that I can't see which component is selected, even after I click on them. Zooming in, I see that selection means coloring the thin border red. About 8% of adult males have at least minor trouble with colors, usually green and red, so that UI is annoying/unusable for many people. I hope their computers are better thought out.
tl;dr Framework's shopping site is broken for even mild color blindness.
It's a 2px border, so I guess that only really works on lower-resolution screens anyway. It probably should have been specified with a unit like `rem` instead (and the background should also stay different, instead of only changing on transition).
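For illustration, a selected state that doesn't rely on color alone might look something like this. The class names are hypothetical, not taken from Framework's actual site:

```css
/* Hypothetical selected-card style: a rem-based outline that scales
   with font size, plus a persistent background change, so selection
   doesn't depend on telling red from green or spotting a 2px hairline. */
.option-card.selected {
  outline: 0.2rem solid currentColor; /* thickness scales with text size */
  background-color: #f0f0f0;          /* stays different while selected */
  font-weight: bold;                  /* a second, non-color cue */
}
```

The point is redundancy: any one of the three cues alone is enough to identify the selected card.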
Any kind of socket would require additional height, probably at least 5mm, and you would certainly want to avoid that on mobile devices. For memory that's less of a problem, because you can plug it in from the side, but CPUs have way more pins, so that won't work.
They also offer instructions for when you need to upgrade it. It's not bring-your-own, but it's still upgradable, which is what I care about, especially with a smaller company obviously trying to keep everything in line with the ethics of repairability.
Even on desktops it feels like the socket+chipset is becoming more and more tied to a specific CPU generation, so that you need to swap the motherboard and the CPU together.
This is a small competitor right now but this is the definition of disruptive innovation and it should scare the cuss out of Apple. This is what people want.
Oh that would be great. Never going to get that from Intel though -- didn't you know that only big enterprises with lots of cash are interested in having ~~reliable hardware~~ECC?! /s
I would love to see Framework become successful and then also enter the phone space. We need a highly polished but open phone platform that isn't a walled garden under control of a tech giant.
Wait, wait, wait. The Framework is supposed to be entirely customizable. Except I have to buy Windows 10 Home, which has an OEM cost of, what, about $100? And I have to have the little Windows icon on my keyboard. This alone gets me thinking about other options.
This guy must be James Bond. I’ve had my 2013 MacBook Pro since it came out and I never needed service on it. My 2016 MacBook Pro, no service. 2020 MacBook Pro, no service.
I like that this company is making completely replaceable parts though. The idea of being able to upgrade RAM again sounds like a real benefit… it's embarrassing that sentence even has to be uttered.
He lost a little bit of credibility saying the Macbooks have terrible build quality… compared to a thinkpad they are on par or better, compared to everything else they are head and shoulders above.